X-Message-Number: 27968
Subject: The Singularity Summit at Stanford
Date: Sun, 21 May 2006 17:16:38 -0700

As a scientist and engineer I have been fascinated by the prospect of a 
technological singularity since reading Vernor Vinge's 1993 paper.  Along with 
about 2,000 other scientists and engineers,  I attended The Singularity Summit 
at Stanford University.  I feel this meeting was historic in the sense that 
the concept of a technological singularity has now been accepted by mainstream
science.  Ray Kurzweil presented a strong case for continued exponential growth 
in the information sciences.  The fields of genetics, nanotechnology, and 
robotics are growing at an accelerating pace.  The synergy among these
fields will result in the creation of artificial intelligence (AI) with 
capabilities far beyond those of a human.  This prediction seemed to be accepted
without serious objections.  Panel members estimated that this will happen in 25
to 100 years.

Is this inevitable?  I think so.  The knowledge to create AI is becoming 
available, and there is pressure to create AI from science, business, and 
government.

Is it desirable?  That's the tough question.  Friendly AI could help create a 
Golden Age for humanity -- a bounty for all, enhancement, uploading, even 
immortality.  But would a conscious AI be content to be a slave to humanity?  
What if an AI analyzes itself, enhances itself, and repeats that cycle until 
it becomes so intelligent and powerful that humanity can neither comprehend 
nor control it?  This is the technological singularity.  The future
becomes opaque at this point.  The worst-case scenario is an extinction event 
for humanity.  I think it is more likely that the very rapid evolution of 
non-biological life will supplant glacially paced biological evolution.  Will 
humans transcend biology, or will they be left behind?
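The self-improvement loop described above can be made concrete with a toy 
calculation.  This is only an illustrative sketch: the 10% gain per cycle and 
the 1,000x threshold are my own assumptions, not figures from the Summit.

```python
# Toy model of recursive self-improvement: each cycle, the AI improves
# on its *current* level by a fixed fraction, so capability compounds
# exponentially.  The gain rate and threshold are hypothetical, chosen
# only to show how quickly the loop escapes a fixed human baseline.

def cycles_to_exceed(baseline, start=1.0, gain=0.10):
    """Count self-improvement cycles until capability exceeds baseline."""
    level = start
    cycles = 0
    while level <= baseline:
        level *= 1.0 + gain   # each cycle builds on the improved version
        cycles += 1
    return cycles

# At a modest 10% improvement per cycle, exceeding a baseline 1,000
# times the starting level takes only 73 cycles.
print(cycles_to_exceed(1000.0))
```

The point of the sketch is that the outcome is insensitive to the starting 
level: because growth compounds, even a small per-cycle gain overwhelms any 
fixed baseline in a number of cycles that grows only logarithmically with the 
size of the gap.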

Vernor Vinge, "The Coming Technological Singularity," 1993

Highly recommended:  The Singularity Is Near, Ray Kurzweil, Viking Press, 2005

Joseph W. Morgan
