X-Message-Number: 15950
From: 
Date: Mon, 26 Mar 2001 11:32:41 EST
Subject: goo and you

The Atkins/Swayze/et al. discussions about possible friendly Artificial 
Intelligence supervision/restriction of human activity to some extent 
overplay or underplay certain features of existence.

First of all, many people for some strange reason think large calamities are 
worse than small ones. A war is worse than a funeral. But if the funeral is 
yours, it is worse than a war--the whole universe ends, as far as you are 
concerned (leaving aside certain scenarios such as those of religion or 
Tipler's Omega Point etc.). So let's keep it in perspective. Grand scenarios 
may be more exciting for some people, but "All the world's a stage" only in 
your imagination, and if your meat decays your imagination will have no more 
playtime. Right now, the FDA is a bigger threat than gray goo.

Secondly, the madman scenario is probably overdone. The chance of a 
psychopath developing a machine with such a lead, or such superiority, as 
to crush all opposition seems very remote. The logistical problem is too big.

Thirdly, there is probably more danger from totalitarianism than from 
anarchy. What will happen if there are one or two small but scary terrorist 
incidents, such as a biological agent in some city's water supply? Police 
state tendencies might emerge. One of the alternatives--more difficult to 
promote--might be more alertness and cooperation on a voluntary basis.

Fourth, the very concept of AI/uploading is flawed, and in any case completely 
unproven. Of course, if apparently conscious automata are developed, some 
people will ascribe sentience to them and will believe themselves capable of 
being uploaded. (Many already do.) But again, the time frame of mere decades 
is far from certain. The "exponential increase of information" can be very 
misleading; weight of printout is not necessarily the right measure.

Of course, the Atkins scenario is in a sense self-sufficient. If the friendly 
AI awakens before the nasties do, that is that. Almost by definition, any 
really super AI will take over, if it wishes, simply by persuasion, since it 
will be a super salesman, with plenty of goodies for bribes and no threats 
necessary.

But we have more immediate concerns.

Robert Ettinger
Cryonics Institute
Immortalist Society
http://www.cryonics.org
