X-Message-Number: 14987
Date: Tue, 21 Nov 2000 19:06:53 -0500
From: Jeffrey Soreff <>
Subject: Re: Moving closer to agreement on identities vs persons.

Henri Kluytmans wrote:
>And of course, also we non-Pizerists want to survive as a person too!
>Because only surviving in a static state is not living. I.e. when 
>cryo-suspended, I don't want to survive as a frozen body; I want to 
>be re-animated again.

>And I'm quite sure Jeffrey Soreff wants this too, although he did 
>say : "but am quite satisfied by survival of my identity." Many non-
>Pizerists make the same mistake of mixing up the terms identity and 
>person! (And the terms memory and information.)

You are right in a sense.  Let me try to clarify my view:
I do indeed want to be re-animated again.  I agree with the view that
a static state is not living.  What I don't care about is exactly how
many (nearly identical) copies of me exist, provided there is at least
one.  If some process creates and destroys (painlessly) a thousand
instances of me, it doesn't bother me, as long as at least one is left
standing.
I _don't_ insist that every single copy of me that ever gets created
must be preserved at all costs.  In that sense I care about the
survival of my identity, but not of every person that I will ever be.

Dave Pizer wrote:

>>Dave,
>>what do you think of other people choosing to make backups or
>>copies, even if you find the prospect unattractive yourself?

>If a backup could be made without damage to the original, I think it would
>be worthwhile.  No one can know at this early stage, for sure, that the
>backup is, or is not, the original, so it would be worth doing.  I would
>do it, even though I have doubts that the backup is me.  What if I am wrong?

Thanks!  If you've looked at the nanodot discussion, you'll have seen
that Mark Gubrud was so absolutely sure that backups were worthless,
and so squicked by the possibility of creating them, that he proposed
using State power to forbid them.  Thanks very much for being
more accepting of this possibility.

Lee Corbin wrote:
>In my opinion, this is almost right.  Though the computer could be
>distributed, often one part would need to know results from another
>part, just as your hippocampus needs to be influenced by what has
>happened in a parietal lobe.  So (according to us) "you" live as
>long as the entire causal calculation is done, and it doesn't matter
>by what.  It is NOT sufficient for parts to proceed in ignorance of
>each other.

>>Well, remember the sculptor? "Inside every slab of marble there is a
>>beautiful sculpture. All the sculptor has to do is remove the covering." And
>>all the observer of emulation has to do is recognize those parts of the
>>universe where some set of atoms (or whatever) are arranged in an order that
>>could be interpreted as constituting the right symbols; and a chronon later,
>>somewhere else, he perceives an arrangement that could be interpreted as a
>>succeeding quantum state.

>This is the "Theory of Dust" talked about by Greg Egan in the science
>fiction novel "Permutation City".  He had a catchy name for it, but it's
>an old idea.  Moravec alludes to it in the appendix of Mind Children.

I agree with the importance of causal connections, for two basic reasons:
1) If you build a system with the causal connections between successive
   states broken, it cannot respond to inputs from the real world
   anymore.  In a _very_ general sense, it cannot possibly defend itself.
2) Just as it cannot respond to external events, it cannot respond to
   dialog.  We might think of it as representing a brain state, but if
   it cannot speak, how can it pass the Turing test and let us see that
   it is indeed conscious?  (The toy sketch below makes this concrete.)
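
Everything in the sketch (the names, the update rule, the numbers) is
my own illustration, nothing more: a causally connected process
consults its input at every step, while a replay of prerecorded
states, however perfect, is deaf to the world.

    def causally_connected(state, inputs):
        # Each successive state depends on the previous state AND on
        # whatever the world delivers at that step.
        history = [state]
        for signal in inputs:
            state = state + signal    # stand-in for a real update rule
            history.append(state)
        return history

    def replayed(recorded_states, inputs):
        # Every "state" was fixed in advance; the inputs are never
        # consulted, so the process cannot respond to anything.
        return list(recorded_states)

    print(causally_connected(0, [1, 2, 3]))   # [0, 1, 3, 6] -- responds
    print(replayed([0, 1, 3, 6], [9, 9, 9]))  # [0, 1, 3, 6] -- deaf to input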

There are a _lot_ of twists that one can put into a nonbiological
system which still preserve what I think of as consciousness.  One
can speed up or slow down clock rates.  One can change the
implementation of many processes between physically realistic
simulations and pre-computed table lookups.  One can change the
granularity of simulation between interactions with the external
world.  Once one starts reordering interactions with the external
world, however, I'm _very_ skeptical that you really have a
conscious system anymore.
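
To make the table-lookup point concrete, here is another toy sketch
(again entirely my own illustration, with an arbitrary update rule):
a precomputed lookup reproduces a computed rule exactly, yet
reordering the inputs to a stateful system does change what it does.

    # "Physics" computed on the fly; the rule itself is an arbitrary toy.
    def computed_update(state, signal):
        return (2 * state + signal) % 16

    # The same rule, precomputed once and then answered by lookup only.
    TABLE = {(s, i): computed_update(s, i)
             for s in range(16) for i in range(16)}

    def lookup_update(state, signal):
        return TABLE[(state, signal)]

    # The substitution is behaviorally invisible on every input:
    assert all(computed_update(s, i) == lookup_update(s, i)
               for s in range(16) for i in range(16))

    def run(update, inputs, state=0):
        for signal in inputs:
            state = update(state, signal)
        return state

    # But reordering the interactions is not invisible:
    print(run(lookup_update, [1, 2]))   # 4
    print(run(lookup_update, [2, 1]))   # 5 -- a different final state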

This ties back to my comments about static data.  A static data
structure doesn't interact with the world and is, again,
defenseless in a very strong sense.  I consider it acceptable to
be in that state in preference to complete annihilation, e.g. when
cryopreserved or under deep anesthesia, but I regard
this state as "time at risk" and prefer to be put back into an
active state as quickly as possible.

                                 Best wishes,
                                 -Jeffrey Soreff
