X-Message-Number: 11064
From: 
Date: Sun, 10 Jan 1999 10:32:49 EST
Subject: (1) Strout (2) Turing

1. Joe Strout (#11056) wrote:

>I have a theory of personal identity which [  ] concludes that a copy *IS*
>the same person as the original; that "copy" and "original" are just
>meaningless labels we use to keep track of instances, not identities.

I--and doubtless others--would be interested to see that theory. Although I
don't rule it out, I have several problems with the conclusion that a copy of
you "is" you.

One of the problems has been mentioned repeatedly here and elsewhere. If there
are many known copies of "you"--and perhaps many more unknown or possible
copies somewhere in spacetime--then apparently one "ought" to strive to
maximize the total (or the maximum) satisfaction of the whole set. This seems to mean
that the present instance ("you") ought to be willing to tolerate extreme and
prolonged discomfort, if that would somehow benefit the other instances. Now,
I recognize that something that is counter-intuitive may nevertheless be
correct; this is not my problem. 

My problem (or part of it) is that there seems to be no way to separate this
scenario from a broader one involving "copies" of much lower fidelity. Even if
initial "copies" are identical by agreed criteria, they will rapidly diverge,
as Mr. Strout notes, and after a while may be no more similar than identical
twins or clones with different histories. There is some plausibility in the
quantitative approach (a copy sufficiently similar, whether by design or
accident, is partly you), but plausibility isn't enough. 

I also see a problem with the whole concept of "instance" as opposed to
original/copy. This in SOME ways reminds us of the idea in physics that
elementary particles have no individuality and cannot in principle be labeled
as e.g. "electron A" and "electron B."  Yet in SOME cases they can (and must)
be so labeled. In many experiments there are different electrons with
different histories, and only one may be available or appropriate for
consideration. 

Mr. Strout correctly points out that, if we reject the idea that you "survive"
in a copy, we might also (depending on future developments in physics) have to
reject the idea that we "survive" from day to day in the ordinary course of
events, or after cryostasis, etc. I don't reject that possibility either,
although I emphasize that we still lack important relevant information about
the nature of nature, and in particular about time.

As far as I can see--and subject to review of Strout's theory--there is
presently NO satisfactory or consistent answer to the question of correct
criteria of survival. Why not accept that and push on to get the needed
information and concepts? 

2. Since I haven't done it recently, perhaps I should add a bit of detail to my
comment yesterday that passing a "Turing Test" is neither necessary nor
sufficient to prove humanity or awareness.

a) If a programmer tries to fool a correspondent into thinking that the
messages are coming from a real person instead of a computer, obviously this
contest could go either way. There are already programs that have, in fact,
fooled many people. If the programmer is very smart, and/or willing to invest
a great deal of time and effort, he could (eventually) fool most of the people
most of the time. Hence it is clear that passing a Turing Test is not
sufficient to prove humanity or awareness.
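
(As an aside, for readers who have not seen how such programs work: the trick is
usually nothing deeper than keyword matching and canned replies, in the style of
Weizenbaum's old ELIZA. The little Python sketch below is purely illustrative--the
patterns, names, and replies are invented here, and it stands in for no actual
program--but it conveys the flavor of how a machine can seem conversational with
no awareness at all.)

import random
import re

# Keyword-triggered response templates, in the spirit of ELIZA.
# All patterns and replies here are invented for illustration only.
RULES = [
    (re.compile(r"\bI am (.+)", re.IGNORECASE),
     ["How long have you been {0}?", "Why do you say you are {0}?"]),
    (re.compile(r"\bbecause\b", re.IGNORECASE),
     ["Is that the real reason?", "What other reasons come to mind?"]),
    (re.compile(r"\b(mother|father|family)\b", re.IGNORECASE),
     ["Tell me more about your family."]),
]

DEFAULT_REPLIES = ["Please go on.", "I see.", "Can you elaborate on that?"]

def reply(line):
    """Return a canned response keyed to surface patterns in the input.
    No understanding is involved -- only string matching."""
    for pattern, templates in RULES:
        match = pattern.search(line)
        if match:
            return random.choice(templates).format(*match.groups())
    return random.choice(DEFAULT_REPLIES)

if __name__ == "__main__":
    print("Say something (Ctrl-C to quit).")
    while True:
        print(reply(input("> ")))

Nothing in that loop models meaning; it only recognizes surface strings, which is
exactly why a sufficiently clever elaboration of it could pass the test without
being aware.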

b) If the person being tested is immature or handicapped or not sufficiently
motivated, his conversation could easily fail to convince the correspondent
that the testee is human or aware. Hence passing the Turing Test is not
necessary to prove (or for consistency with) humanity or awareness.

Robert Ettinger
Cryonics Institute
Immortalist Society
http://www.cryonics.org
