X-Message-Number: 27917
From: "Michael P. Read" <>
Subject: Pulling the Plug
Date: Sun, 7 May 2006 00:29:46 -0700

>>The day a computer becomes self-aware, is the day it has the same
>>rights as the rest of us.

>So I gather you are one of the short-sighted "big brains"

Huh?

>who wants to 
>build one of these things and give it control over it's own energy 
>supply.

Humans have control over their own energy supply, and they don't run around
killing or overpowering others (with a few exceptions, of course).  The
vast majority just live their lives in peace.  I hope we don't have to
debate that.

>If so you are just as much a threat to the continuance of humanity 
>and individuality, as the upcoming Singularity 

I had never even heard of the "Singularity" until I got involved in cryonics
maybe 4 or 5 years ago.  I'm not even sure what it means.  Though, it does
have a religious flavor to it.  The people I've asked about it seem to
have only a vague idea.  You brought it up.  So, I am asking you.

>you are helping build.

The only thing I am doing here is asking you questions to get your point of
view and making counter points based on my own understanding of life in
general.

>>I take it that your idea of the Singularity is when computers become
>>self-aware?  Why exactly do you believe that is a threat?

>Murphy's Law, and the probability that a self-aware computer will quickly 
>develop the ability to overpower humans, especially if it has no effective 
>"off" switch.

People don't have an off switch and (for the most part) they don't cause
serious problems for each other.

Also, probability does not mean certainty.

>>It doesn't have to perpetuate our existence.  It simply has to leave
>>us alone.

>What makes you think it will do that?

What makes you think it won't?  I admit that I don't know, and it's OK not
to know.  The future is uncertain, just as it has always been and always
will be.

>Murphy's Law says that at some point 
>it will not.

How is that different from the people I see every day?  They have the
capability to do crazy things too.  Murphy's Law works both ways.

>>What makes you think it would care one way or the other?

>Caring is really irrelevant; in fact, it might not even have emotions 
>developed in its programming.  It could decide pragmatically, or merely by 
>accident.

That's a good reason for us to upload a lot of literature into it.  That
way, it won't have to spend time learning things the hard way like we
did.  It would have the benefit of our collective experience.  Of course, it
would have the choice to act on what it learned.  It might not do that.  It
might do that.  Or, it might do both.  It will have choices at every moment,
just like the rest of us.

Mike Read