X-Message-Number: 11527
Date: Thu, 08 Apr 1999 00:18:58 -0700
From: Mike Perry <>
Subject: Re: Searle, Consciousness

Bob Ettinger, #11507, writes,

>But when we turn to consciousness and feeling, it is another matter. Here 
>Searle et al are right and the strong AI people are wrong. This becomes clear 
>when we take a hard look at the "hard problem" in consciousness--the 
>anatomy/physiology of qualia--roughly, "feelings" or subjective conditions.
>

I think Searle makes a fundamental error with the Chinese room thought
experiment, which is to confuse emulator and emulatee. To those on the
outside, it appears that the person inside understands Chinese, yet the
man inside understands no Chinese; he is just following elaborate rules.
The man, in effect, becomes the emulator of the real Chinese
conversationist. Under appropriate conditions it would, I think, become
reasonable to say that (1) there is a conversationist in the room who
does understand Chinese, (2) the man is a device that emulates this
conversationist but does not understand what the conversationist
understands, and thus (3) the man in the room is not the conversationist.
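
To make the emulator/emulatee distinction concrete, here is a small
sketch of my own (the rulebook entries are invented, and a real rulebook
would of course be vastly larger and cover free conversation):

# The man's rulebook: purely syntactic input -> output rules.
RULEBOOK = {
    "ni hao ma?": "wo hen hao, xie xie.",          # "How are you?" -> "Fine, thanks."
    "ni jiao shen me ming zi?": "wo jiao Wang.",   # "What is your name?" -> "I am Wang."
}

def man_in_the_room(symbols):
    """The emulator: matches symbols against rules, understanding none of them."""
    return RULEBOOK.get(symbols, "dui bu qi, wo bu ming bai.")  # "Sorry, I don't understand."

# To an outside observer the room as a whole behaves like a Chinese
# conversationist; the man himself is only the device executing the rules.
print(man_in_the_room("ni hao ma?"))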

As for the more general point that Bob makes, basically, that feeling is not
necessarily captured in an emulation, I would again bring up the
possibility, which we have no way of disproving, that we are right now in
an emulation of some sort, and very different physically from what we think
we are, though still isomorphic. I don't see any way in principle to tell
the difference between this condition and being "real," or, for that matter,
to *disprove* the claim that only emulations have real feeling and
consciousness, not, for heaven's sake, the "real" processes! (On this basis,
then, if you accept that you are conscious, you must conclude that you are
in an emulation!) So I don't accept the argument that feeling is something
specifically physical that can't be captured, equivalently, in a different
substrate.
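
To make "physically different, though still isomorphic" a bit more
concrete, here is another toy sketch of my own (the names and the
trivial "process" are invented): two realizations of one abstract
process whose states are made of different stuff, yet correspond
exactly under a simple mapping.

def realization_a(history):
    """Realization A: state is a number, 0 or 1 (a running parity)."""
    state = 0
    for x in history:
        state = (state + x) % 2
    return state

def realization_b(history):
    """Realization B: state is a string, 'even' or 'odd'."""
    state = "even"
    for x in history:
        if x % 2 == 1:
            state = "odd" if state == "even" else "even"
    return 1 if state == "odd" else 0

# The states are represented differently (a number vs. a string), but under
# the mapping 0 <-> "even", 1 <-> "odd" the two pass through corresponding
# states and give identical outputs: the same process in different substrates.
assert realization_a([3, 8, 5, 5, 2]) == realization_b([3, 8, 5, 5, 2]) == 1

Nothing about a toy like this settles whether feeling carries over, of
course, but it is what I have in mind by "isomorphic."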

I think someday we will observe systems that *behave* as if they were
conscious and had feeling, but that are made of different stuff than we
are, and that process their information rather differently from, though
isomorphically to, systems we would consider conscious. And I am
prepared to accept that such
systems too would be conscious, and that I would be conscious also if
expressed, isomorphically, in that type of system rather than as I am now.
(So I would be comfortable with an uploading approach to my reanimation from
cryonic suspension, assuming it was feasible and "our friends of the future"
had no problem with it.) This, of course, proves nothing by way of resolving
the question at hand, but I do think, once again, that there is
no way to resolve this issue by scientific "proof" or "almost-proof" as we
usually understand it. But once systems start to behave convincingly as if
they were conscious, I expect people increasingly to accept that they are. 

Mike Perry
