X-Message-Number: 29138
Date: Wed, 14 Feb 2007 16:37:01 -0500
From: Francois <>
Subject: Re: Singularity Opt-out Function

>Francois said:  "Yes, humans as we know them will probably
>vanish when the Singularity occurs, but they won't simply go extinct. They
>will instead enter a new mode of existence..."

>Shudder.  Do you suppose we will be entitled to the choice of opting out?
>How about the Wal-Mart return policy ... 90 days, no matter what reason?

That's a very good question. Logically, the Singularity AI should not care 
about any human wishing to remain the flesh-and-blood variety. Maintaining 
an environment where such humans could continue to live short yet fulfilling 
lives would require it to expend very little effort and few resources: clean 
up the Earth, establish a paradise-like environment on it, and then leave the 
opt-out humans to their own devices. The Singularity AI would have an entire 
Universe to play with; why should it care about primitive humans living 
happy lives on a single planet?

On the other hand, logic is not always the main factor in making a decision. 
If the Singularity AI is based on our current brains, then it may inherit 
the tendency some people have to impose their views of "right" and "wrong" on 
others, and decide to forcefully "upgrade" everyone, like it or not, for 
their own good of course. This really is a very good question.

Francois

Good health is merely the slowest
possible rate at which one can die. 

Rate This Message: http://www.cryonet.org/cgi-bin/rate.cgi?msg=29138