X-Message-Number: 12551
From: "Scott Badger" <>
References: <>
Subject: Re: CryoNet #12546 - #12550
Date: Thu, 14 Oct 1999 07:32:17 -0500

Hi all.

Thomas Donaldson wrote:

[snip]

> That is why I thought his book was rather weak. Without emotions,
> we can make very good hyperslaves --- so good that they'd never
> want to take over because they had no wants of their own.

[snip]

> We aren't "superior" because of our emotions and desires. It's
> just that we have them and they guide our behavior autonomously. And
> a computer, no matter how intelligent, without emotions and desires is
> simply one more tool for us to use.

Hi Tom,

Could you help me understand your point a bit more clearly?  It seems
to me that an artificial intelligence (a field in which I plead considerable
ignorance) could be programmed to seek a variety of goals which would guide
its behavior.  Why would it require emotions to guide its behavior as
well, especially if it were given the ability to self-program and alter
its goals based on new information?

Emotions have been factor analyzed by psychologists into two primary
factors: Positive Affect and Negative Affect.  These two categories seem to
function, as you say, to autonomously guide behavior toward desirable
stimuli and away from noxious stimuli in lower organisms.  But they seem to
serve a somewhat less critical role in human behavior because we have
enhanced cognitive systems to handle our decision-making needs.  Emotions
don't strike me as the only way that an entity can have desires.  Desiring
doesn't even strike me as an emotion.  Is *desiring* a positive affect or a
negative affect?  See what I mean?  Desiring means little more than being
designed to seek positive outcomes or avoid negative outcomes.  That doesn't
necessarily require emotions, does it?  Can't a machine be programmed
thusly?
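For what it's worth, here is a minimal sketch (my own hypothetical example,
not anything from the AI literature) of what "desiring without emotions"
might look like in code: an agent given nothing but a utility function and
a rule for picking actions.  It seeks a goal state and avoids a noxious
region, yet there is no affect anywhere in the program.

```python
def utility(state):
    # Assumed goals for illustration: states below 0 are "noxious"
    # and to be avoided; the closer to 10, the more "desirable".
    if state < 0:
        return float("-inf")
    return -abs(10 - state)

def choose_action(state, actions=(-1, 0, 1)):
    # Greedy policy: pick whichever action leads to the highest-utility
    # successor state.  This is the whole of the agent's "desiring".
    return max(actions, key=lambda a: utility(state + a))

def run(state=0, steps=15):
    for _ in range(steps):
        state += choose_action(state)
    return state

print(run())  # climbs to the goal state 10 and stays there
```

The point of the sketch is only that goal-directed behavior can be a design
property, not a felt one; whether such an agent "really" desires anything
is of course exactly the question under discussion.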

As for the question of which is simpler to accomplish, intelligence or
emotions...would that depend on whether we're talking about organic vs.
inorganic systems?

Best regards,

Scott

Rate This Message: http://www.cryonet.org/cgi-bin/rate.cgi?msg=12551