X-Message-Number: 6809
From: 
Date: Tue, 27 Aug 1996 17:04:28 -0400
Subject: SCI. CRYONICS more EI

Previously I have discussed some aspects of Peter Unger's "Experience
Inducer" (EI) in his book, IDENTITY, CONSCIOUSNESS & VALUE. In particular, I
have shown that he erred in his argument based on a thought experiment
involving the EI--an argument intended to show that we may value external
things more than our own future subjective experiences or brain states. 

However, one aspect I neglected was whether in fact an EI is possible--even
in principle. The main purpose of this discussion is just to show how easy it
is even for professionals, in their own specialties, to make faulty
assumptions. (These are only notes, not reviewed or revised or polished.)

To recapitulate, the EI is a device, built around an extremely sophisticated
computer, acting somewhat like a highly advanced Virtual Reality machine,
applied (say) to a person lying in a cocoon on life support, with electrodes
in the brain, etc. The net result is supposed to be that the subject thinks
she is experiencing ordinary reality, while in fact the EI can be programmed
to induce her to "live" through any prescribed scenario, even for a lifetime.
It can also edit her memories, at least to the extent of making her forget
that she chose to enter the EI.
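
To fix ideas, the EI's task can be pictured as a closed-loop controller: it
must translate each scene of the script into stimuli, using some internal
model of the subject's makeup. Here is a toy sketch in Python, purely
illustrative--every class, name, and number is my own invention, not anything
specified in Prof. Unger's book:

class Subject:
    """Stand-in for the person in the cocoon; her response depends on
    her own hidden makeup, which the EI can only estimate."""
    def __init__(self, quirk):
        self.quirk = quirk                  # idiosyncratic psychology

    def respond(self, stimulus):
        return stimulus + self.quirk        # she reacts her way, not the script's

class ExperienceInducer:
    def __init__(self, script, model_quirk):
        self.script = script                # prescribed responses, scene by scene
        self.model_quirk = model_quirk      # the EI's ESTIMATE of her makeup

    def run(self, subject):
        for target in self.script:
            stimulus = target - self.model_quirk   # invert the (approximate) model
            actual = subject.respond(stimulus)
            print(f"wanted {target:+.2f}, got {actual:+.2f}")

ExperienceInducer([1.0, 2.0, 3.0], model_quirk=0.5).run(Subject(quirk=0.7))

Unless the EI's model of her is exact (model_quirk equal to quirk), every
delivered moment already deviates from the script.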

Since the computer is tailored to her own psychology and neurology,
supposedly she could be made to "live" her most cherished dreams, or even her
potentially most cherished dreams. Yet--even if we forget all engineering or
other practical considerations--could this really happen?

Prof. Unger does not say whether the tailoring of the program (to her
individual psychology and neurology) extends down to the finest level;
neither does he engage the question of the limits of the EI's memory and its
simulation capabilities. In order for it to work as advertised, the EI would
have to know PRECISELY how to evoke a desired response, and PRECISELY what
the psychological/neurological response would be to every smallest stimulus
or change in stimulus.

In other words, an EI cannot just be programmed (say) with the command: "Make
this subject live through the life of the heroine in FOREVER AMBER." The
subject simply would not react to situations like the fictional heroine, and
almost instantly the scenario would be off the rails. 
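
A toy illustration of this "off the rails" problem, under my own simplifying
assumption (not Unger's) that each moment of experience unfolds from the
subject's current mental state: even a one-percent error in the EI's model of
her dynamics compounds step by step. In Python, with all numbers invented for
illustration:

def true_response(state, stimulus):
    return 2.00 * state + stimulus      # the subject's actual dynamics

def modeled_response(state, stimulus):
    return 1.98 * state + stimulus      # the EI's model, off by one percent

state_true = state_model = 1.0
for step in range(1, 11):
    stimulus = -state_model             # the EI plans stimuli from its model
    state_true = true_response(state_true, stimulus)
    state_model = modeled_response(state_model, stimulus)
    print(f"step {step:2d}: drift = {abs(state_true - state_model):.4f}")

The printed drift grows geometrically; within a handful of steps the
subject's actual state and the EI's picture of it have parted company, and
the scenario is off the rails.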

If somehow COMPELLED to react like Amber--to believe she is making the
movements and experiencing the emotions etc.--then in effect she has been
radically altered, and the original premise is void.

Could the EI be so advanced and resourceful that it could deliver an
APPROXIMATELY outlined scenario, adjusting as needed to the subject's
individual responses, without violating her psychological integrity? I don't
think so. For example, what if the subject decides she wants to do scientific
investigation, or geographical exploration, or have tea with a pen pal in
Minsk or Pinsk? In at least some of these cases, the EI cannot possibly have
enough information in its banks--and indeed the information may not even
exist in all the libraries and data banks of the world. If she is actually
doing original empirical scientific investigation, the result (if it occurred
in the real world) might be beyond any computer's ability to predict, even in
principle.
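
To make the information-bank point concrete, suppose (purely for
illustration) that the EI's simulated world is backed by a finite store of
recorded facts. A genuinely novel empirical question then has no answer on
file, and the EI must either break the illusion or fabricate the world
wholesale. In Python, with hypothetical entries:

WORLD_FACTS = {                         # everything the EI's builders recorded
    "boiling point of water (C)": 100.0,
    "road distance, Minsk to Pinsk (km)": 300.0,   # hypothetical figure
}

def simulate_measurement(question):
    if question in WORLD_FACTS:
        return WORLD_FACTS[question]    # replay a stored fact
    # She has just done original science; nothing is on file.
    raise KeyError(f"no recorded answer for {question!r}")

for q in ("boiling point of water (C)",
          "mass of an undiscovered particle (GeV)"):    # the second is novel
    try:
        print(q, "->", simulate_measurement(q))
    except KeyError as err:
        print("EI stuck:", err)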

Another possible problem: The subject's psychology does not allow a "happy"
script of the kind on the menu. She has problems bordering on masochism or
whatever. She will change every silk purse into a sow's ear, plucking defeat
from every victory--at least to the extent of mangling the scenario. In such
a case, the EI can deliver the promised happiness only by altering her
makeup, her personality, which is a no-no.

The experimenter could, of course, back off from promises of scenarios, and
just promise pleasure, period. But now we are back to the opium smoker, and
for well-known reasons there would be few takers.

My conclusions: (1) In at least some cases, the EI is not possible, even in
principle; and (2) as previously discussed, even if we allow the EI, the
thought experiments Prof. Unger uses do not demonstrate that there is ever a
case where someone values externalities over her own subjective states.

Robert Ettinger

