X-Message-Number: 29708
Date: Mon, 6 Aug 2007 12:42:49 EDT
Subject: Re: SIAI Promotes Singularity, May Cause End of Humanity As W...

Science fiction has pointed to the dangers of AI taking over for so long 
that nobody is going to give a singularity computer control of The Bomb, nor 
much other physical power, nor the power to send messages to corrupt other 
computers either. They will be very careful for a long, long time. (A sample 
from 50 years ago, Arthur C. Clarke I think: 
They turned on the ultra-computer and asked it the ultimate question: Is 
there a God? 
"Yes, NOW there is!" it said. 
They leaped to pull the plug, but a bolt of lightning from the clear blue 
sky struck them dead and fused the plug to the socket.) 
We know it's a danger. We have heard it many times.
Will we ever get there? The Singularity means an intelligence so far above 
our own that we cannot predict what it will do. There are many ways in which 
a higher intelligence may arise: human-computer combinations, smart AIs, or 
people improved through gene selection and manipulation. We are already 
finding genes that affect intelligence. Selective breeding, creating a 
hundred embryos in vitro for each wanted child and then selecting the one 
with the best IQ genes, plus gene transfer and manipulation, should allow us, 
in 20 years or so, to create generations 15 IQ points smarter than their 
parents. Going from 85 to 100 will in some cases improve society (fewer 
dangerous louts around). From 100 to 115 will give college graduates instead 
of store checkers. But going from rare people with 160 IQs to children with 
ultra-rare 175s will be the big change. Instead of a few of those per 
generation there will be thousands, and they are the type that makes the 
great discoveries, the Newtons and Feynmans. And they will study the problem 
of genes and IQs...
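The pick-the-best-embryo arithmetic can be sanity-checked with a quick simulation. Every number here is my own illustrative assumption, not anything established: sibling embryos are assumed to vary around the parental average with a within-family genetic spread of about 7.5 IQ points, and the screening is assumed to rank them perfectly.

```python
import random
import statistics

# Hypothetical assumptions, not data from the post: each embryo's genetic
# IQ potential is a normal draw around the parental average (taken as 0)
# with a within-family standard deviation of 7.5 points, and the screen
# identifies the best embryo without error.
WITHIN_FAMILY_SD = 7.5
EMBRYOS = 100
TRIALS = 20_000

random.seed(1)
gains = []
for _ in range(TRIALS):
    scores = [random.gauss(0.0, WITHIN_FAMILY_SD) for _ in range(EMBRYOS)]
    gains.append(max(scores))  # keep the embryo with the best predicted IQ

# Average gain of the chosen embryo over the parental mean, in IQ points.
print(round(statistics.mean(gains), 1))
```

The best of 100 normal draws lands around 2.5 standard deviations above the mean, so under these assumed numbers the expected gain comes out near 19 points per generation -- the same ballpark as the 15-point figure, and smaller if the screening is imperfect.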
Meanwhile progress has hardly stalled in computers. Storage size and speed 
keep increasing by factors of 100, data transfer rates will go up 40 times 
with new schemes that use light instead of wires and electrons, and chips 
do continue to improve -- if not quite as fast as before, perhaps because 
they were already so fast that there was little point in making them any 
faster, at least until other bottlenecks had been fixed. And progress 
continues on quantum computing, which promises unimaginable speeds.
AI can come in two ways -- we figure out how to write smart programs *or* we 
map the neurons of a brain and simulate their interactions. We can do that 
for sea slugs with 12 neurons now, I think. Obviously we have a ways to go 
to work our way up to the roughly hundred billion neurons (and trillions of 
synapses) of the human brain, but there are sudden jumps in our capacity to 
map things, as when we went from taking a week to map a gene to mapping the 
whole human genome in a couple of years, and now hoping to map it for $1000.
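To make "simulate their interactions" concrete, here is a toy sketch -- my own illustration, using the textbook leaky integrate-and-fire neuron model with made-up parameters, not anything from an actual brain-mapping project. Two model neurons: the first is driven by a constant input, and each time it fires it gives a synaptic kick to the second.

```python
# Toy leaky integrate-and-fire network: neuron 0 drives neuron 1.
# All parameters are illustrative, not biological measurements.
DT = 1.0          # time step, ms
TAU = 20.0        # membrane time constant, ms
THRESHOLD = 1.0   # fire when membrane potential reaches this
WEIGHT = 0.8      # synaptic kick from neuron 0 to neuron 1

v = [0.0, 0.0]        # membrane potentials
spikes = [[], []]     # spike times for each neuron
for t in range(200):
    inputs = [0.08, 0.0]          # constant drive to neuron 0 only
    for i in (0, 1):
        v[i] += DT * (-v[i] / TAU) + inputs[i]  # leak toward 0, plus input
    if v[0] >= THRESHOLD:         # neuron 0 fires...
        spikes[0].append(t)
        v[0] = 0.0                # ...resets...
        v[1] += WEIGHT            # ...and kicks neuron 1
    if v[1] >= THRESHOLD:
        spikes[1].append(t)
        v[1] = 0.0

print(len(spikes[0]), len(spikes[1]))  # spike counts for the two neurons
```

Neuron 0 charges up and fires regularly; neuron 1 needs two accumulated kicks to reach threshold, so it fires at half the rate. Whole-brain simulation is this idea scaled up enormously, with measured connectivity and far more realistic neuron models.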
For 20 years they've been predicting the Singularity in 40 years or so -- it 
does keep slipping further into the future -- but in fact we are closer now 
than 20 years ago. We know some IQ genes, can find genes much more easily, 
have much faster computers, and have mapped a primitive brain. As with 
perfected cryonics, we are not there yet, but we keep making progress and 
there is every reason to believe we will get there some day.


