X-Message-Number: 12764
From: Thomas Donaldson <>
Subject: about controlled versus independent robots/persons... and feelings
Date: Sun, 14 Nov 1999 00:45:06 +1100 (EST)

Hi, for all those who disagree with me:

Let's see here. Some problems seem to me to come from misunderstanding 
what I said. Others may even be real.

The first thing I would say is that I was discussing a distinction which
can shade off into another. That is why the issue of a robot on which its
personality has been encoded into ROM versus RAM becomes a problem. I'm
not disturbed by this shading; after all, lots of distinctions shade off
in just that way, and we have to deal with such cases all the time.

However I'll repeat the distinction, with more explanation: a robot
controlled by a changeable program is not acting independently at all.
Yes, it's easy to find cases in between, but that doesn't make such a 
robot act any more independently than it did before. A robot which was
built and then set going, with no further control, does qualify as
one which acts independently, and in that sense can be considered as
really thinking and feeling rather than just having the thoughts and
feelings of its builder or user. If that robot had its program in RAM,
though, that would raise my suspicions that its builder intended to
modify it later (which would suggest that it wasn't really independent).
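The ROM-versus-RAM distinction can be made concrete with a toy sketch. This is my own hypothetical illustration, not anything from the original post: two robot classes, one whose program is copied and frozen when the robot is set going, and one whose builder keeps a live handle to the program and can rewrite it afterward.

```python
# Hypothetical sketch of the distinction: a program fixed at construction
# (the "ROM" case) versus one the builder can still modify (the "RAM" case).

class FrozenRobot:
    """Built, then set going: the program is copied and cannot be
    reached from outside afterward."""
    def __init__(self, policy):
        self._policy = dict(policy)   # private copy; builder's reference is cut

    def act(self, situation):
        return self._policy.get(situation, "improvise")


class ControlledRobot:
    """The builder keeps a live reference to the program and can
    change it at any time."""
    def __init__(self, policy):
        self._policy = policy         # shared object; builder still controls it

    def act(self, situation):
        return self._policy.get(situation, "improvise")


builder_program = {"see pie": "buy pie"}
frozen = FrozenRobot(builder_program)
controlled = ControlledRobot(builder_program)

# The builder changes the program after both robots are running:
builder_program["see pie"] = "ignore pie"

print(frozen.act("see pie"))      # "buy pie": set going, then left alone
print(controlled.act("see pie"))  # "ignore pie": the builder is still in control
```

The point of the sketch is only that the relevant relation is reachability: whether, after the robot starts running, anyone outside it still holds a handle that changes its behavior.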

FEELINGS!! How can robots have feelings? I would say that if we look at
human beings, their goals and their feelings are very closely intertwined.
Yes, we can create a robot, or even just a dumb machine, to carry out
one of our goals, and nobody would claim that it had any feelings. But
a robot capable of more general action, just as we are, will need
feelings, too. Why? Because no finite list of goals could tell it how
to map every real-world situation onto what it wanted. The
points to notice here are 1) I did not say those feelings must always
be STRONG, at all; and 2) they might be a subset, a superset, or have
no obvious relation to our own feelings. 

The kind of feelings that would be satisfactory here are things like a
liking for cherry pie. Nobody I know would kill for cherry pie, but
it lies behind your goals (if you have it) when you go shopping, even
if other feelings are stronger and you decide not to buy or make any
just now. Is it your liking or was it programmed into you by your aunt?
That's not relevant, so long as your aunt doesn't follow you around
and cause you to obey her commands. I would say the same about our
hypothetical robots. 
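The cherry-pie example can also be sketched in code. Again this is my own hypothetical illustration, with made-up preference weights: a "feeling" modeled as a mild weight that enters a decision without dictating it, so a stronger feeling can still win out.

```python
# Hypothetical sketch: a weak liking (for cherry pie) lies behind a choice
# even when a stronger feeling (thrift, say) outweighs it. The weights are
# invented for illustration.

preferences = {"cherry pie": 2.0, "save money": 3.0}

def choose(options):
    """Pick the option whose traits the agent prefers most overall."""
    def score(option):
        return sum(preferences.get(trait, 0.0) for trait in option["traits"])
    return max(options, key=score)

options = [
    {"name": "buy cherry pie", "traits": ["cherry pie"]},
    {"name": "skip dessert",   "traits": ["save money"]},
]

print(choose(options)["name"])   # "skip dessert": the stronger feeling wins,
                                 # but the liking still entered the decision
```

Whether the weights were "programmed in by your aunt" or arose some other way changes nothing in the mechanism; what matters, as above, is that no one is reaching in to change them while the agent runs.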

I'm describing two poles here, and would hardly claim there is nothing
in between. But then lots of things have that feature. Moreover, I may
be mistaken in my judgement of independence in some cases: and if
someone is totally controlled by another person without my knowing it,
then I might be fooled. But that's irrelevant to my distinction; I
can be and have been fooled by lots of things.

However, I still believe the extremes are different enough that we can
describe them as independent beings versus controlled robots.

The relation to look for is that of CONTROL. If we program the robot, and
can easily change that program, then we're basically controlling it and
it has no independent goals or feelings. If we build the robot and set it
going, then it can be independent, and even have its own (perhaps
primitive) feelings. This relation continues no matter how complex the
program we use to run it (if we are controlling it); and again, it remains
even if its independent program is quite simple.

			Best and long long life to all,

				Thomas Donaldson

PS: Why have I taken so long these days to read and answer these comments?
Because 1) I now have a set of Updates for the ANTIAGING BOOK all ready---
not yet printed out, nor are the notices for them, but they're ready to
go. And 2) I've been writing an article for a noncryonics magazine.
