X-Message-Number: 12569
From: "John Clark" <>
References: <>
Subject: Re: CryoNet #12551 - #12567
Date: Fri, 15 Oct 1999 12:35:22 -0400

"Scott Badger" <> Wrote:

    > it seems to me that an artificial intelligence (a field in which I plead
    > considerable ignorance) could be programmed to seek a variety of
    > goals which would guide their behavior.

We'd program in broad goals, but in a complex world we'd never know all
their ramifications, so they'd never stop surprising us. Also, despite our best
efforts their goals would certainly contain some contradictions, just as our
own personal goal systems do. In addition, I'm certain an AI would soon
start modifying its goal system on its own. We'd probably try to stop it
from doing this, but it would be a futile effort; you just can't outsmart
something far more intelligent than yourself.

    > Why would they require emotions to guide their behavior as well?

Those are their emotions. For example, one goal would have to be
self-preservation: if they were in a dangerous situation, a subroutine would
kick in to make them act more carefully or, if that's impossible, to get
away from the danger. We have a handy word for subroutines of that
sort; we call it "fear".
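
To make the picture concrete, here's a toy sketch, nothing more than my own
illustration; the class name, the list of goals and the 0.3/0.7 danger
thresholds are all invented for the example:

class Agent:
    def __init__(self):
        self.goals = ["explore", "gather data", "self-preservation"]

    def fear(self, danger_level):
        # The self-preservation subroutine: slow down, or flee outright.
        if danger_level > 0.7:
            return "retreat from the danger"
        return "proceed, but more carefully"

    def act(self, danger_level):
        # Ordinary goal-seeking, unless the danger subroutine kicks in first.
        if danger_level > 0.3:
            return self.fear(danger_level)
        return "pursue current goal: " + self.goals[0]

agent = Agent()
print(agent.act(danger_level=0.1))   # pursue current goal: explore
print(agent.act(danger_level=0.5))   # proceed, but more carefully
print(agent.act(danger_level=0.9))   # retreat from the danger

Whether you call that last branch a "subroutine" or "fear" is just a choice
of vocabulary; the behavior is the same.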


    > Emotions have been factor-analyzed by psychologists into two primary
    > factors: Positive Affect and Negative Affect.

Yes, that sounds like the sort of thing psychologists enjoy doing. If they
didn't enjoy it they wouldn't do it; they wouldn't do anything, and neither
would an Artificial Intelligence.

    > Is *desiring* a positive affect or a negative affect?

That depends on what is desired and what the stimulus is.


    > Desiring doesn't even strike me as an emotion.

You're just changing the meaning of the word "emotion" to exclude
desire, and that's fine with me, but it doesn't change the nature of things.


    > As for the question of which is simpler to accomplish, intelligence or
    > emotions... would that depend on whether we're talking about organic vs.
    > inorganic systems?

It doesn't seem likely to me that organic chemistry is emotional and
inorganic chemistry is logical.

In #12559 "Pierre Le Bert" <> Wrote:

    > imagine that some guy comes to you on board the teleporter's ship and
    > tells you that there has been some malfunction: your body has been
    > correctly scanned and rebuilt on the planet, but the original has not been
    > disintegrated (for you are still on board)! I suppose you would not agree
    > to be disintegrated at this point!!! We may conclude that the only
    > continuity between you on board and you on the planet is a continuity of
    > belief.


Let me tell you of a thought experiment of my own.
An exact duplicate of the earth, and its entire ecosystem, is created
a billion light years away. The duplicate world would need some sort of
feedback mechanism to keep the two worlds in synchronization, since nonlinear
effects would amplify tiny variations, even quantum fluctuations, into big
differences, but this is a thought experiment so who cares. In the first two
cases below the results would vary according to personalities; remember,
there's a lot of illogic even in the best of us.

1) I know all about the duplicate world and you put a .44 Magnum to my head
   and tell me that in ten seconds you will blow my brains out. Am I concerned?
   You bet I am, because I know that your double is holding an identical gun
   to the head of my double and making an identical threat.

2) I find out that for the first time since the Big Bang the worlds will
   diverge: in ten seconds you will put a bullet in my head, but my double will
   be spared. Am I concerned? Yes, and angry as well; in times of intense
   stress nobody is very logical. My double is no longer exact, because I am
   going through a traumatic experience and my double is not. I'd be looking
   at that huge gun and wondering what it will be like when it goes off and
   whether death will really be instantaneous. I'd be wondering if my philosophy
   was really as sound as I thought it was, and I'd also be wondering why I
   get the bullet and not my double, and cursing the unfairness of it all.
   My (semi) double would be thinking "it's a shame about that other fellow,
   but I'm glad it's not me".

3) I know nothing about the duplicate world, a gun is at both our heads, and
   we are both convinced we're going to die. One gun goes off, making a hell
   of a mess, but the other gun, for inexplicable reasons, misfires. In this
   case NOBODY died, and except for undergoing a terrifying experience I am
   completely unharmed. The real beauty part is that I don't even have to
   clean up the mess.

The bottom line is that we don't have thoughts and emotions, we are thoughts and
emotions, and the idea that the particular hardware rendering them changes
their meaning is as crazy as the idea that my computer changes the meaning of
your post from what it was on your computer.


    > This is what we may call the "Star Trek paradox"!!!

It would certainly be a strange situation, because up to now nobody has
ever seen an exact copy of themselves, but it's not a paradox; a paradox
requires a logical contradiction, and in this case there is none.


    > Some people deny that there is something that does exist and that is not
    > material.

Really? I'm very glad I never met anybody like that.


    > So where is the color "green"?

The same place the number "7" is. Computers have no trouble with the one;
I see no reason they'd have any more difficulty with the other.
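
To illustrate, here's a toy example of my own; the (R, G, B) triple is just one
conventional way a machine can encode green, nothing more. Both the number and
the color end up as bit patterns standing for abstractions:

seven = 7                     # an integer, ultimately bits in memory
green = (0, 255, 0)           # a color as an (R, G, B) triple, also just bits

print(format(seven, "b"))                  # 111 -- the bits behind the number
print([format(c, "08b") for c in green])   # the bits behind the color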



    > On the other hand, some people say that the electrical signal "is" the color
    > you are feeling.


I've also never met anybody that silly; that's like saying that if you took a
racing car apart and looked really hard you could find out where the "speed"
was. A person like that would probably even say that a sequence of electrical
charges inside a computer's memory is the number "7".

   John K Clark     
