X-Message-Number: 12668
From: Thomas Donaldson <>
Subject: more on intelligence, consciousness, feeling, goals
Date: Sat, 30 Oct 1999 00:28:45 +1000 (EST)

Hi everyone!

More on goals and feelings.

For Bob Ettinger, perhaps I did not make myself clear enough. What
matters for a goal, and the feelings belonging to it, is that it comes
with the creature itself. Programming a computer to work towards some
aim doesn't do that; it is just the way we ask our computers to do
something towards OUR goals. I'll also emphasize here that the level of
feeling shown by a device which wanders about looking for an electrical
outlet is VERY low --- just as its goal is very low. But compared with
microorganisms, I can't find anything in the behavior or structure of
such devices which makes me think they are devoid of any goal (and the
accompanying primitive feeling). Sure, they aren't put together the
same way life forms are put together, but how and why is that important?

Basically, executing a program doesn't count, even if the program and
computer are advanced enough that the computer can seem to act like
a person. It is on distinctions such as these that I agree with Bob:
we cannot just use behavior, we need to know more about how it came
about. (Yes, I'd hardly be surprised if some people on Cryonet disagree
with me here). The thing to look at is the ORIGIN of the behavior.

For Daniel Crevier I have something to say, too. The very first issue
is that of whether consciousness and AI have anything in common, and
if so what. I see no reason why we could not make a machine which 
by all OBJECTIVE tests would be very intelligent, i.e. pass all our
IQ tests, but remain quite unconscious and without either feeling or
goals. Someone might argue that a certain level of intelligence is
a PRECONDITION for consciousness, but that does not equate intelligence
with consciousness at all. I can also imagine an artificial creation
which is VERY conscious, but relatively unintelligent, i.e. it's devised
so that it notices (in the sense of records) everything in its
environment, but never draws any conclusions or makes any theories 
about what it perceives. (Any animal like this would not survive very
long at all, which is why this hasn't happened naturally). Such a 
creation would have to have SOME intelligence, since even to notice
something involves such abilities as recognition of shape, sound, or
other forms and recognition of behaviors... but it need not have 
very much, at least as we measure it. (Note that even a film doesn't so
much record events as record the consequences of light falling on it; WE
do the recognition of shape etc. when we see the film --- so I'm not
considering something so lacking in intelligence as a film. Even
recognition of shapes is something computer scientists have worked hard
to produce, and even now haven't produced completely, though they're
getting there.)

So just what is the relation between AI and consciousness? Or AI and
feeling, for that matter? 

I ask this not because I think that producing an intelligent and conscious
machine is impossible, but because I think they are (almost) independent.
Yet some people seem not to believe that. Why?

			Best and long long life for all,

				Thomas Donaldson
