X-Message-Number: 31376
From: 
Date: Thu, 5 Feb 2009 11:45:00 EST
Subject: singularity

FD (along with quite a few others) has written about the "dangers" of the "singularity." The only relevance I see of this topic is that it distracts people from serious endeavors. Here are some of the reasons to ignore this stuff:
 
1. There's essentially nothing our few cryonicists can do about it anyway. Computer advances will happen regardless.
 
2. The notion that advanced computers (AIs) would in effect be competing life forms has very little going for it. It isn't even clear that such a computer (or network) would or could have feelings (consciousness) or be alive in our sense, or have desires or aversions or be either "friendly" or "unfriendly" to humans. You couldn't even program a computer that way if you wanted to, because the terms are too vague and too difficult to allow of algorithmic representation.
 
3. While there may be some nuts or terrorists among programmers, the majority will be sane and working to prevent takeovers by terrorists or nutcases.
 
4. Advanced computers need not, and probably will not, be autonomous. Instead, they will be slaved to human brains. (Right now, the larger parts of our brains are basically computers, with only a fraction dedicated to the self or consciousness or qualia.)
 
Robert Ettinger
 
 
 


