X-Message-Number: 11808
Date: Mon, 24 May 1999 01:27:21 -0700
From: Mike Perry <>
Subject: Consciousness issues

Thomas Donaldson, #11797:
>I will also note one major change in what you seem to be saying. A robot
>living in the world is not the same as a subroutine or other program
>structure in a computer --- even if that program structure in the computer
>seems to act as if it is a human being in a virtual world.

At least for practical purposes, the two are different, and what suffices for
a program in a virtual world, playing baseball say, may not do for a robot
in the real world playing real baseball. But as far as the subjective
experience is concerned, the differences (if any) may not be so clearcut.

> The first does
>not perform any purely symbolic actions: it acts in the world, and thus
>is not just a symbolic structure (the same as a book). The second is a 
>symbolic structure alone, and acts like a human being only in the sense
>that a REAL human being, reading or seeing its activities in its virtual
>world, would agree that they are like those of a human being.

A book I don't think of as conscious. But a program running in a computer
could be. Note the operative word, *running*. Activity, I think, is at least
necessary for consciousness (unless maybe we could have just an unvarying,
static state of "consciousness," but thinking clearly requires activity).

> But it has
>no more real existence than any other symbolic entity --- like Donald
>Duck. (A computer virus is a REAL life form because it acts on the
>computers in reality. If it only made virtual changes, it would only be
>a virtual computer virus, and again no more real than Donald Duck).
It occurs to me, though, that any running computer program is something
happening in reality--real electrons are moving, etc.

>There is a subsidiary question which Bob Ettinger has brought up several
>times. It deserves attention, but at present I would not claim it is
>decisive in any way. The neural nets of which our brain is composed do
>not work like those in current computing: they grow new connections.
>Whether this becomes a critical issue would need a good deal of thought;

Well, if our old friend quantum mechanics isn't overturned yet again,
everything appears to be computational and digital at a deep level.

From Thomas Donaldson, #11798:

>My reason for saying that dreams are not the same as virtual consciousness
>is simple. They DO depend on previous consciousness, and persist for only
>a finite length of time. It isn't sufficient to keep a brain only
>TEMPORARILY isolated to make its thinking virtual.

In other words, you'd have to be isolated forever to be in virtual reality?

> If I go into my
>room, lie down, and close my eyes to think about something, I am just 
>as isolated, but that is hardly the same as the long term isolation of
>a "virtual person" (if such is possible) in a computer.
But it's not the same as dreaming either, where you can see and experience a
world that seems real, at least for a while, only it isn't.

Bob Ettinger, #11800:

>1. We do not yet understand the physical basis of consciousness.

On a detailed level, yes. But that consciousness, whatever it is, can be
seen as a side effect of the interactions of subatomic particles (electrons,
protons, neutrons, and photons, mainly) seems reasonably well established.
Unless you hold out for mystical things, or the possibility of overturning a
lot of physics. Of course, it is possible that physics will be overturned in
this way--it's happened before. But it seems reasonable to me to bet it
won't be and assume that consciousness is some emergent property of the
interactions of "unconscious" particles, at least as a working hypothesis.
And I think a lot is also known about what sorts of features conscious
systems must have, based on accepted physics.

>2. A sufficiently fast sequential (Turing) computer, supplied with enough 
>information (both about the laws of nature and about the system being 
>studied, in particular a human brain), could in principle predict or describe 
>the behavior or states of that system in all circumstances, even if not in 
>real time.

Yes, that seems to be so.

>Prop (1) by itself ought to be enough to warn against any assumption that any 
>inorganic system, let alone a computer, could be conscious. How can you 
>possibly claim, with confidence, that a system has property A if you don't 
>even know what property A is? The various discussions, such as Dennett's, do 
>not constitute proof of anything. Claiming that consciousness is 
>"computational" again is stating your premise as a conclusion; we do not 
>know, and cannot assume, that consciousness is "computational" in your sense. 

In answer to this I'll once again raise the issue of a system that behaves
exactly as if conscious, both externally and with parts that function
analogously to our brain structures, but is not made as we are, e.g. maybe
inside it is silicon and plastic rather than meat. Is it conscious or just
faking it? I think this question can never be fully answered objectively. I
would vote "yes, it is conscious" on grounds that I would feel no compelling
reason *not* to regard the system as conscious (barring unexpected
discoveries). To vote "no" in such a circumstance, or even "I don't know,"
smacks of solipsism--at least there's a parallel.

>Again I suggest that consciousness may reside in some kind of standing wave 
>which binds space and time--i.e., which includes a non-zero region of space 
>and time. If this or something like it is correct, then a computer could be 
>conscious ONLY if we swallow the "isomorphism is everything" postulate, which 
>once again would lead to the conclusion that a book could be conscious, since 
>nothing matters except relationships between symbols. That last notion seems 
>extremely far-fetched to me, although we don't know enough yet to rule it out 

I don't share Bob's attachment to the "standing wave" idea, but at least we
have to agree he is open-minded.

>(Mike Perry suggests the isomorphism postulate should be modified to forbid 
>replacing time by symbols, while still allowing replacement of space and 
>matter by symbols.

A good observation. I do think that time is important in the expression of
consciousness, in a way that specific details of space and matter are not.

>...At least, Dr. Perry understands my arguments, and 
>agrees that his isomorphism postulate is only that, and does not claim 
>certainty but only expresses leanings, as do I.)
Agreed. We don't want to be dogmatic here.

>Now let's look again at parts of Mr. Crevier's post:
>>there cannot
>>be one kind of mechanisms (the A's) that make physical systems just
>>behave as if they were conscious, and another kind (the B's) that make 
>>them really conscious. 
>Yes there can. The computer predicts or describes behavior, and a book 
>written by a computer could do the same, and a robot controlled by the 
>computer or the book would behave as if conscious. 

A robot puppet, controlled entirely by a list of instructions from a book
... is this what you have in mind? It would be like a mechanical doll of the
18th century that went through an elaborate but predictable sequence of
actions. I don't see how it could interact or adapt. Even if it couldn't,
though, we can ask if such a device could ever be conscious. Note that this
is different from a mere static record. Suppose the book was very
large and detailed, and the "robot" was made of, well, carbon, hydrogen,
oxygen, etc., and in short, was a wide-awake human being. After creating it,
all according to instructions (using advanced technology of the future), the
book says, "stick it with a pin." If you do that, will the person feel pain?
I don't see why not, yet the steps are all laid out beforehand. (Of course
there is an unpredictability in events at the quantum level here too, but is
that property really crucial for consciousness? You could, for example, run
the experiment over and over, so many times as to end up repeating the
unpredictable events in their many variations--at least this is what quantum
mechanics appears to be telling us.)

Mike Perry