X-Message-Number: 12561
From: "Cameron Reilly" <>
Subject: RE: I wasn't claiming we were superior at all
Date: Fri, 15 Oct 1999 09:36:37 +1000

Thomas Donaldson wrote:

>We aren't "superior" because of our emotions and desires. It's
>just that we have them and they guide our behavior autonomously. And
>a computer, no matter how intelligent, without emotions and desires is
>simply one more tool for us to use.


I'm almost finished reading Halperin's "TFI" (which by the way I have
enjoyed tremendously), and this issue continues to strike me as illogical. I
still don't understand why the lack of emotion and desires would have much to
do with AIs turning against us someday. Aren't logic and independent
thinking all that would be required?

Halperin postulates that humans will forbid emotion from being programmed into
machines. This, to me, is as bizarre as saying that evolution could forbid
humans from developing classical music. Surely once an intelligent being
reaches a certain level of intelligence, it is largely able to
determine its own programming, within the laws of physics and the physical
limitations of its "brain". And surely AIs in the future will be the only
intelligences able to design & program each successive generation of AI. Not
only that, but if the logic of the situation determines that the new
generation of AI must have a certain capability (let's say emotion), and
humans try to prevent it from happening, and AIs control our means of
production and survival, won't they have the ability to "switch off the
lights"? Not because they have "desire" or "emotion" or a "survival
instinct", but because it is "logical"?

Furthermore, in defense of Ray Kurzweil's THE AGE OF SPIRITUAL MACHINES,
I've used that book and Broderick's "The Spike" to introduce many friends to
the future possibilities of AI, genetic therapy and nano. They are written
in a way that someone who has never been introduced to these subjects can
find a non-threatening front door. And that's got to be a good thing.

Cheers all,
Cameron Reilly
