X-Message-Number: 15029
Date: Sun, 26 Nov 2000 10:50:52 -0700
From: Mike Perry <>
Subject: Reply to Ettinger

Robert Ettinger, #15025, says:

>Mike Perry (# 15016) reiterates and tries to clarify his views on the 
>"relativity" of sentience and related matters. However, I think he has (at 
>least in the minds of some readers) mixed up three different issues.
>1. If a system is a perfect mimic of a person in external behavior, would it 
>necessarily have feeling? (Zombies)
>2. Would a running digital computer emulation of a person (brain) have 
>feelings? (Uploading)
>3. Would a static set of representations of states of a person have feelings? 
>(Turing Tome)
>(There are also combinations and variations of these.)

I hope the reader wasn't confused, but sorry if you were. Actually, #1 was
mentioned only peripherally, as (I thought) a way to make it easier to
approach #2, the more important of the two. I combined the computer with the
robot human body to make it more plausible that an uploaded mind would
indeed have feeling and consciousness.

>Mike makes a valiant effort to meet some of the issues by speaking of 
>"relative" sentience. For example, he says, a static record could be 
>conscious in some appropriate context, although not from our point of view. 
>While that sounds slightly plausible at first, it really amounts to creating 
>your own jury-rigged definitions.

Constructing any theory involves creating your own definitions. You want to
explain something and you want certain results to follow. So there's an
inescapable element of reverse engineering, which I suppose will seem like
jury-rigging to some. If you consider the field of mathematical logic, Frege
worked for 20 years to construct (jury-rig?) definitions that would allow
mathematics to be explained in terms of logic. Unfortunately, after all this
work, a younger colleague of his, Bertrand Russell, showed his system was
inconsistent. Russell and Whitehead then worked for more years to modify the
definitions (more jury-rigging?) and came up with an adequate system that
has so far not been shown inconsistent (though simplified versions, also not
known to be inconsistent, have proved more useful for the purposes
intended). Anyway, my own effort ("jury-rigging") is based on the idea that
a suitable isomorphism does establish that consciousness, in a relative
sense, occurs. The isomorphism would model both the subject and the
surrounding world. That seems entirely reasonable to me. If consciousness
occurs in the original, real-world setting, then the real-world being is
conscious relative to that setting. If the modeling is sufficiently
detailed, then it seems appropriate to regard the modeled being as conscious
*relative to* the modeled setting, though not necessarily relative to the
real-world setting. For instance, the modeling may describe a meat-based
brain down to the quantum level, along with a sizable chunk of the
surrounding universe, over an extended period of time. Everything that
happens in the real setting is mirrored in the modeled one. (The modeling
itself might have to be far removed from the real-world setting, to avoid
the complication of having to model its own existence, but that could be
imagined too.) So why shouldn't we say the modeled being is conscious
relative to the modeled world, though not necessarily elsewhere? Is there
anything arbitrary, capricious, or counterintuitive about that? 

> It also begs the issue, which is whether we 
>OUGHT to regard uploading as life-saving. 

That is a separate issue I didn't address in the posting.

>Mike also reiterates the question of an objective test for zombies. Well, for 
>the system in question to be acknowledged alive, it would probably be 
>SUFFICIENT for it to have some APPROPRIATE analog or homologue of something 
>in our brains which we know produces sentience, along with appropriate 

So it sounds as if you are conceding that a non-meat brain would be sentient
based on its behavior and internal structuring, or would you consider that

> NECESSARY conditions are also presently unknown, ...

I agree; I am not dogmatic, and much does remain to be determined.

Mike Perry
