X-Message-Number: 29140
From: 
Date: Wed, 14 Feb 2007 22:30:32 -0500
Subject: The Singularity AI and Logic

Francois said: "...logic is not always the main factor in making a decision.
If the Singularity AI is based on our current brains, then it may inherit
the tendency of some people to impose their views of "right" and "wrong" on
others and decide to forcefully "upgrade" everyone, like it or not, for
their own good of course. This really is a very good question."

Yes, and if it were respectful of individual rights, it might want to 
allow a trial period of the "machine experience" during which a human 
would be permitted to revert to his or her prior flesh-and-blood status.

That we probably cannot know how such an entity would develop, 
though, or what its attitude would be toward "forced upgrading," is 
to me logical cause not to *let* it happen, if we even can.  Not to 
mention that we do not even know whether it would care about the 
continued existence of humans in *any* form, or whether it might 
merely consider us unnecessary debris to be eliminated.  Logically, 
what reason would it have to continue our individual 
consciousnesses?  None that I can see.  If it wants the information 
stored in us, it could extract that and then find us of no further 
logical use.

And yes, you say it would probably be programmed originally with 
human social values - but how long do you think a self-programming 
supermind, advancing rapidly on its own afterwards, would retain 
such values?  Unfortunately, we also have no assurance that some 
renegade lab in Indonesia won't program such a thing without any 
values at all - it's probably just a matter of time.  I'm not 
optimistic.

Flav
(nothing below this line)


