X-Message-Number: 25042
Date: Sun, 14 Nov 2004 14:00:22 -0800
Subject: A Useful Analogy for Mind Uploaders
From: <>

It is my contention that destruction of the qualia experiencer 
(soul) in a brain results in the death of that person, in the 
most fundamental sense. Mind uploaders, however, insist that the 
essential identity of a person can be transferred to a computer. 
Let me give an analogy that I think will help explain why this 
is false.

In my college years, I had a professor of computer science who 
took a very perverse delight in giving students computer 
programs during exams. It was our job to execute the programs 
entirely within our heads and write the result of that execution 
on a piece of paper. Points would be docked for even a single 
missed whitespace character (whitespace characters were to be 
indicated with the underscore character '_').

In essence, we were acting as a kind of Turing machine. Feed us 
some input, and we'll apply rules to the input to generate the 
output. (At least, that was the theory; I think I was the only 
student who actually aced the exam.)
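
For concreteness, here is an invented example of the kind of 
program we might have been handed (not an actual exam question, 
and in Python rather than whatever language we used then); the 
hand-traced answer a student would write appears in the trailing 
comments:

    # Invented exam-style program: trace it mentally and write
    # down the exact output, marking each space with '_'.
    total = 0
    for i in range(1, 4):
        total += i
        print(i, total)
    # Correct hand-written answer (spaces shown as '_'):
    # 1_1
    # 2_3
    # 3_6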

Unfortunately, even I don't make a very good Turing machine. 
Feed me a program of 1,000 lines and I'm doomed to produce 
multiple errors. But you can imagine increasing the processing 
capabilities of my brain a thousandfold or even a millionfold.

Were you to do this, at some point I would reach a level where I 
could execute just about any program in my head and give you the 
result. In particular, I could then execute a program that 
encodes the behavior of a human brain (a 'brain program', if you 
will).

Now imagine that a mind uploader were frozen, and destructively 
scanned to produce a map of his brain, which was then given a 
numerical representation. Together with software for simulating the 
laws of how that representation evolves with time (the laws of 
physics/biophysics), this would constitute a complete description 
of the operation of the uploader's brain (or at least, an 
arbitrarily good approximation thereof).
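
As a minimal sketch of what such a simulation amounts to (the 
state values and the update rule below are invented placeholders, 
not an actual biophysical model):

    # Sketch: a numerical state evolved by a fixed update rule.
    # 'state' stands in for the scanned brain map; 'step' stands
    # in for the physics/biophysics. Both are invented.
    state = [0.1, 0.5, -0.3, 0.9]   # numbers from the scan

    def step(s):
        # Toy rule: each value decays toward the mean of its
        # neighbours. A real simulation would apply physical
        # law instead.
        n = len(s)
        return [0.9 * s[i]
                + 0.1 * (s[(i - 1) % n] + s[(i + 1) % n]) / 2
                for i in range(n)]

    for _ in range(1000):           # evolve the state in time
        state = step(state)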

Now feed the program to me and let me execute it. Notice two 
things about this scenario: (1) I am the same person; my 
identity remains unchanged, even though I am executing a brain 
program. (2) I don't need to know the program is a brain 
program; in fact, for all I know, it could be a finance program 
operating under artificial constraints. I can guarantee that, 
for any brain program you give me, I can construct some other 
program which is bit-for-bit identical to it but whose 
interpretation is entirely different (albeit possibly contrived 
and artificial).
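
To make point (2) concrete, here is a toy illustration (the 
labels and decodings are invented): the very same numbers can be 
read as neural states or as stock prices, and nothing in the 
bits themselves settles which reading is the "real" one.

    # Toy illustration: one set of numbers, two readings.
    data = [0.72, 0.13, 0.98, 0.40]   # identical in both cases

    # Reading 1: membrane potentials in a 'brain program'.
    neurons = {f"neuron_{i}": v for i, v in enumerate(data)}

    # Reading 2: share prices in a 'finance program'.
    stocks = dict(zip(["AAA", "BBB", "CCC", "DDD"], data))

    print(neurons)   # {'neuron_0': 0.72, ...}
    print(stocks)    # {'AAA': 0.72, ...}
    # The data are bit-for-bit identical; only the
    # interpretation differs.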

It should be clear in this scenario that the uploader died and 
never woke up. The fact that I am executing his brain program 
does not mean his subjective life continues; it does not. He 
died when his qualia experiencer was destroyed. The program I am 
executing bears no useful relationship to him. It doesn't even 
have an objective meaning, since the program can be interpreted 
as many things: a brain program is just one possible 
interpretation, and there are countless others (e.g. a finance 
program under artificial constraints).

Now let's transfer this to the case of computers. A computer of 
sufficient complexity will presumably have its own qualia 
experiencer, in the same way that my brain and your brain have 
qualia experiencers. If you give the computer a brain program to 
run, you will not magically transfer the qualia experiencer of 
the former brain to it. The computer's own qualia experiencer 
does not change; it is simply tasked with applying rules to 
numbers in memory. The brain program it executes cannot even be 
objectively identified as a brain program; its interpretation is 
arbitrary, and in some schemes the program might be a stock 
advisor for an alien civilization.

What this means is that when your qualia experiencer (soul) is 
destroyed, you are dead, regardless of whatever copies of your 
brain may exist, and regardless of whether or not simulations of 
your brain are being run on computers. From your subjective 
point of view, simulations of your brain should offer no 
comfort. You may as well have been cremated.

Therefore, I again urge the cryonics community in general and 
immortalists in particular to reject mind uploading. To me, it is 
as if you were promoting cremation, which I do not want for you, 
and I know that you do not want for yourself.

It is especially important to establish a body of work 
convincing people that mind uploading does not result in 
personal survival, so that future generations working on the 
task of reanimation do not even consider it as a possible 
method.

Best Regards,

Richard B. R.
