X-Message-Number: 19865
From: "davepizer" <>
Subject: You must remain you for you to survive.  
Date: Wed, 21 Aug 2002 18:41:40 -0500

I do not believe that making copies of an individual and then destroying the 
original can ever be called survival of the original.  I believe that holds true 
whether the copy is a carbon-based brain similar to the original brain, a 
silicon-based brain, or any other type of duplicate.  Only *the* original is 
the original.  Below I take issue with Francois, whom I do not know.  Here are 
his or her comments, and mine, for those who are not completely bored with the 
long-running "a duplicate is or is not you" debate.

From: "Francois" <>

Francois said:

"The aim of most of CryoNet's subscribers is to achieve immortality. This, in
principle, can be done in many ways, some emotionally more satisfying than
others."

I agree that achieving immortality is a great goal, but I don't think it can be 
done in "many ways."  It seems to me that there is only one way *you* can become
immortal, and that is for *you* to become immortal.  This may sound funny at 
first, but what I mean is this: if a person tries to create a copy of himself in 
some way but does not capture *himself* in the copy (which I think may be 
impossible), then even if that copy becomes immortal, once the original is 
destroyed, the original person is not immortal; only the copy is.

 "The preferred vision of immortality emerging from the many posted
messages seems to be purely biological in nature. Either people die, are
cryopreserved and revived when the needed technology becomes available, or
medical science advances fast enough to vanquish aging before people die.
The resulting individuals look exactly as we do today, except that they
forever remain physically young."

"This, however, presents a problem. To illustrate it, suppose one of our
distant ancestors, a Homo Habilis for instance, became immortal three
million years ago. How this could have happened is irrelevant, it is enough
to imagine that it did. This immortal Habilis could then still be alive
today, but he would obviously be completely obsolete from the point of view
of intelligence, having been left far behind by our much better and keener
minds. Evolution would not have stopped just because he became immortal, and
it would have quickly transformed him into an actual living fossil."

But old Habilis would still be immortal, and that is what most of us are 
shooting for.  I have no fear of competing with machines in the future, as long 
as I have access to machines myself for help in the areas where they can help.
But that does not mean I want to be destroyed and have some machine exist that
everyone calls David.

There are ways for Habilis to cope with a superior race (us) in our world, and 
ways for us to cope with others in the future, but they do not include making a 
copy of the original and letting the original be destroyed.  When you do that, 
and the original is destroyed, the original is no longer immortal; he is now dead.

You don't have to become the machine to have access to machine technology.  Me 
being me, or you being you, is not how fast you think or how much information 
you have stored in your cortex; it is the feeling of awareness that you have and
that I have.  As long as that is preserved, it can access lots of other stuff 
without having to become the other stuff.

"Purely biological immortals will always suffer this fate."

Why?  Are we so dumb that we can't push a button and get an answer from a 
machine?
" It is not easy for
a living creature to "upgrade" itself. Normally, old individuals die long
before this becomes a problem, but we will prevent this from happening.
Immortal Homo Sapiens will suffer the fate of my hypothetical Homo Habilis,
and probably much sooner than he would have because we will be faced with
entities that can "evolve" much faster than any biological organisms."

This kind of thinking is called, "That's the way it was, therefore that's the 
way it will be."   If you hold that, then you must hold that cryonics won't work
in the future, because it has never worked in the past.

"speaking of the intelligent machines that will have to exist at the time of
our reanimations. Biological brains have very strict limits, and we are very
close to them right now. Machine brains don't have those limits. It has been
demonstrated that machines can enhance their capacities up to literal
infinity. The only way for us to cope with this problem will be to join the
machines on their own turf and convert ourselves into machine entities. Then
our minds will acquire the infinite enhancement potential of the machines.
Biological immortality can only be seen as a temporary stepping stone to a
different realm of existence."

First of all, there are no "machine brains" that can be compared to our brains at
this time.  It may turn out that silicon cannot produce a feeling of 
self-awareness the way carbon can.  The jury is still out on that.  But 
without a feeling of awareness, a thing is not like us.  A machine that can do 
computations a trillion times faster and better than us, but does not have 
self-awareness, is less like us than a bug is.

But I would agree that silicon-based machines can help carbon-based brains 
answer problems better.  It may be that in the future our brains will get 
assistance from machines, but that does not mean that our brains can become 
those machines.  It is the small part of our brain where self-awareness lies 
that makes us each our own person.  If that does not survive, then the 
original person is dead.

I don't think we are doomed in the form we are in.  I think we will figure out 
how to get help from machines, just as we are doing now.  It may be that the
machine is hooked directly to your present brain for faster or better access, 
but I would hold that the part of your brain that now feels self-awareness is 
the only essential part of any new arrangement that will still be you.
