X-Message-Number: 25263
Date: Fri, 10 Dec 2004 18:10:49 -0800 (PST)
From: Scott Badger <>
Subject: Re: Identity

I wrote:

>Let's try something else. How about the
>classic thought experiment where the natural brain's
>neurons are replaced, one at a time, by artificial
>neurons which are precise duplicates, until the
>natural brain is completely replaced by an artificial
>brain? Where is the line at which the QE is destroyed,
>and how do you justify the existence of that line?

Richard: Assuming the scenario is possible, then the
QE is NOT destroyed. Remember the criterion for
survival: you survive from time T0 to T1 if at all
points T in [T0, T1], your physical system is
capable of experiencing qualia. This implies that
replacing the neurons of your brain one at a time with
(as you say) precise functional duplicates would not
result in your personal destruction.

Scott: So we finally agree on something. The human
mind CAN BE sustained on a non-biological substrate
with the original QE.
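
In symbols, the criterion as I understand it (my
notation, not Richard's) is:

  \text{Survives}(T_0, T_1) \iff \forall\, T \in [T_0, T_1] : Q(S(T))

where S(T) is my physical system at time T and
Q(S(T)) means that system is capable of experiencing
qualia.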

Richard: Contrast this with the near-instantaneous
replacement of all your neurons (e.g., mass
disassembling and then recreation): there is a time
when your physical system lacks the ability to
experience qualia---when you exist as 'information'
(which is another way of saying, you don't exist at
all, since nothing can exist as information). Since
the physical system that was your brain ceased to have
a QE (i.e. became incapable of experiencing qualia),
you did not survive; and assembling from your remains
another QE, even in the likeness of your own, is not a
continuation of your inner subjective life.

Scott: So if the instantaneous replacement of all your
neurons with artificial duplicates destroys the QE,
just how quickly CAN I replace them if I do so one
at a time (assuming there is no limit to how quickly
I could replace them)? Just a fraction of a second
short of instantaneous?
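
To put the worry precisely (again my notation):
suppose my n neurons are replaced one at a time, with
an interval of \Delta t between swaps, so the whole
process takes roughly n \Delta t. Your criterion
appears to say

  \forall\, \Delta t > 0 : \text{Survives}(\Delta t)
  \quad\text{but}\quad \neg\,\text{Survives}(0)

i.e. survival is discontinuous exactly at \Delta t = 0,
and I don't see what justifies drawing the line there.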

Also, your last post made it clear that you're pretty
critical of the idea of exceeding human limitations or
enhancing human capabilities (with the notable
exception of extending your own maximum life span),
but what if I wanted a bigger, more powerful brain?
Assume my brain has successfully been replaced with,
let's say, silicon-based artificial neurons and my QE
survived. Assume also that these silicon neurons can
be easily manufactured and I can add as many of them
to my existing system as I wish. The size of my skull
would restrict how many I could add to my system
internally, but it seems to me that a silicon version
of a corpus callosum could allow my mind to expand its
reach to an external silicon-based system. Or perhaps
a physical cord wouldn't be necessary at all, and a
wireless connection could be used.

I don't think there's anything so far about this
scenario that's disallowed by your theory, right?

Now let's say the external system I choose to build
and expand into is a replica of the brain system
inside my skull, so that my internal system gets
connected to an identical external system. Being
identical to my internal system, the external system
also has the structures necessary to experience qualia
and begins doing so as soon as I expand into it. The
two systems have thereby become one system, and qualia
are happening to both of my QEs, both of which would
be having the same internal subjective experience. So
now I have a redundant brain system, and the
accidental or intentional loss of either set of QE
structures would not result in the destruction of the
original QE, since the overall physical system remains
capable of experiencing qualia and continuity would be
maintained.

Have I broken any of your theory s rules yet?

If not, haven't I just uploaded without losing my
original QE?
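
Spelling the argument out against your criterion
(again my notation): let A be my internal system, B
the external replica, and S(T) the physical system
existing at time T.

  [T_0, T_1] : S = A,          Q(A) holds
  [T_1, T_2] : S = A \cup B,   Q(A \cup B) holds
  [T_2, T_3] : S = B,          Q(B) holds

Q(S(T)) holds at every T in [T_0, T_3], so by your
own criterion I survive from T_0 to T_3, even though
by T_3 my mind is running entirely on the external
system B.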


Scott: The closest thing to what Richard is describing
that I've seen is Michael Gazzaniga's "Interpreter"
theory.

Richard: If these quotes are representative, and I
have interpreted them correctly, then Michael is in
gross error.

Scott: I see. Given Dr. Gazzaniga's considerable
experience and reputation in this field, and given the
empirical evidence he cites to support his theory,
you'll forgive me for not readily accepting your
sweeping dismissal of his ideas, since I'm unaware of
your experience and reputation in this field.

Richard: Evolution does not produce a little man in
the brain whose purpose is to passively observe the
macroscopic operation of the system while effecting no
change.

Scott: I assume you base this on an earlier assertion
you made which I believe to be false. I've always
heard that the idea that evolution is so efficient
that its products always have survival value or
purpose is a myth. Natural selection involves a lot of
trial and error, a lot of dead ends. Am I wrong?

Best regards,

Scott
