X-Message-Number: 10956
From: Thomas Donaldson <>
Subject: Re: CryoNet #10949 - #10955
Date: Tue, 22 Dec 1998 23:08:21 +1100 (EST)

Hi again!

It seems to me that someone with self-esteem would NOT want to be an Eloi
-- but that's perhaps a matter of opinion. The Eloi in Wells' book seemed
to have good self-esteem.

And yes, in some abstract case in which the person/creature/machine/God
that takes care of you always knows AND takes account of your desires
and needs, the arguments against becoming an Eloi would lose their
validity. But the real kicker is that such a possibility is very
unlikely. Even a robot which STARTS by knowing and taking account of
your desires will over time move away from caring about you at all.
Why? Because there is no reason other than its internal design why it
should bother, and over time that internal design will change for the
same reasons we all change. The drift would be gradual rather than
sudden, though you might come to realize it very suddenly. Sure, we
might design the robot so that it changes more slowly than a person
does, but we won't be able to stop the change entirely. Think of it in
human terms: the robot gets nothing at all from you, and you get
everything from the robot. That's not a recipe for a relationship that
will last.

Not only that, but what you care about and want will CHANGE: the robot
will have to be designed to follow that change. If we think about how
this might be done, the most efficient way is for the robot to ask you;
and once it does, you can no longer be an Eloi. You'll have to think
about what you want, and since our wants depend on our knowledge,
you'll have to learn about your situation too.

Put simply, there just isn't any way to crawl back into the womb, or
even back into childhood. Some may try, of course, just as some try
now. And given the extra technology, their failure may prove even more
spectacular than the failure of those who try today.

			Best and long long life,

				Thomas Donaldson
