X-Message-Number: 12980
From: 
Date: Tue, 21 Dec 1999 22:32:29 EST
Subject: careful--zombies

Some brief responses to Daniel Crevier's summary of the arguments for the 
possibility of consciousness in computers.

1. I agree--and have always said--that consciousness in computers is 
POSSIBLE, as far as we know. In fact, I don't know of any informed person who 
categorically denies that it is possible. My argument is with the "strong AI" 
people who think their case is overwhelmingly strong. 

2. >Argument for the reproducibility of conscious behavior in machines.
>Let us start with digital computers. They can simulate any physical system,
>and therefore can simulate a human brain.

"Simulation" is a potentially deceptive term. 

First, there can be partial isomorphisms which one could call simulations, 
but there would not be a one-to-one mapping. Only a small fraction of the 
numbers generated by the computer correspond to parameters of the system 
simulated.

Second, a digital computer could not (with full accuracy) simulate a 
continuous universe; and we have no assurance that our universe is discrete 
at the level of a digital computer. 
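The second point can be made concrete with a toy numerical sketch (purely illustrative--the example system and step counts are my own choices, not anything specific to brains): a digital machine must replace a continuous flow such as dx/dt = x with finitely many finite steps, and for any finite step size the computed value only approximates the exact continuous solution e^t.

```python
import math

def euler_simulate(x0, t_end, steps):
    """Discrete-step (Euler) 'simulation' of the continuous system dx/dt = x."""
    x = x0
    dt = t_end / steps
    for _ in range(steps):
        x += x * dt  # a finite step standing in for continuous change
    return x

exact = math.exp(1.0)  # the true continuous solution at t = 1
for steps in (10, 1000, 100000):
    approx = euler_simulate(1.0, 1.0, steps)
    print(steps, abs(approx - exact))  # the error shrinks but never reaches zero
```

Shrinking the step reduces the discrepancy but never eliminates it; an exact match would require infinitely many infinitesimal steps, which no digital machine can take.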

Third, the word "simulation" fudges the time element. If a linear computer 
"simulates" a dynamic system, then one could equally well say that the mere 
program and initial data store themselves simulate the system, without the 
computer running. In fact, this appears to be close to Moravec's position, 
that "existence" at bottom is just abstract relationships.

Fourth (and this is partially redundant), a mere description is not an 
emulation. I could (in principle, assuming that present quantum mechanics is 
the be-all and end-all, and assuming enough information and enough time) 
write out in long hand a description of the present and future states of any 
system, e.g. you and your environment. To imagine that such a description 
would BE you and your future life seems like nonsense, apologies to Moravec.

3. >Argument from the uploading thought experiment.
>You are operated upon by a robot surgeon which analyses a small part of your
>brain, and constructs an electronic circuit (either analog or digital, it
>doesn't matter) with the same input-output properties.
 
There is no assurance that this is possible, even in principle.

>At the end of the process, all of your brain will have been replaced by
>circuitry, and you will have become a 'simulation' of yourself. Yet you will
>still be conscious. If not, at what point in the substitution process did
>you lose your awareness?

If consciousness depends on a special organic mechanism (the "self circuit"), 
you lost consciousness when that was removed or damaged.

>Further, if you did lose your awareness, then it slipped away from you
>without your being aware of it, because at the end you will still maintain
>that you are conscious.

I don't concede that; see above. In any case, the vocal assertion is not 
proof.

>Losing one's awareness without being aware of it sounds like quite a
>contradictory proposition.

On the contrary--you cannot be aware of losing awareness, by definition. 
(Well, you might be aware of it fading in some cases, but you could not be 
aware of having lost it.)

4. >Argument from the unobservability of 'true' consciousness.
>If there can never be any observable difference between a zombie and a
>conscious being, does it make any objective sense to talk about 'true' vs
>'make believe' consciousness? If it doesn't, acting as if conscious and
>being conscious must be one and the same thing.

Depends what you mean by "observable." Lots of real differences in systems 
are not observable in practice. One could also ask about the "reality" of 
events outside and inside the horizon of a black hole--but our case is 
simpler. We could, in principle, observe the workings of a "self circuit" 
with noninvasive scans of one sort or another, or possibly some type of 
telepathy.

Time lurches on, and I'll mercifully stop here, at least for now.

Robert Ettinger
Cryonics Institute
Immortalist Society
http://www.cryonics.org
