X-Message-Number: 15880
Date: Sat, 17 Mar 2001 10:32:12 -0800 (PST)
From: Scott Badger <>
Subject: Re: [off topic] Singularity... Bah Humbug!

To James Swayze:

First, let me say that I don't believe this thread to
be far off topic. The singularity has a decent chance
of occurring before the technology to revive cryonics
patients is developed, and it would have an important
impact on our goals, and for that matter on everyone
else in the world.

I'm certainly not the most qualified to respond to
your concerns about machine AI, but I think you're
making some rather remarkable assumptions when you
suggest that a human-based AI will be much less
dangerous.

I suspect the most likely scenario is that a single
human-based super AI entity will develop before the
general population has access to such a
transformation. I would also expect that AI to quickly
evolve into an entity as superior to the average human
as the average human is to an insect. The notion that
such an entity would retain its "humanity" simply
because its origin was human is, in my opinion,
difficult to defend.

Imagine you're the entity for a moment and the next
most sentient creatures on the planet are ants. They
have a rudimentary language and a simple social
system. Though you were once an ant yourself, they are
quite unable to conceive of what you are now and of
what you plan to become. Significant issues arise. You
need resources to continue your growth and expansion.
You may not see any reason to destroy the ants to get
the resources you need (then again you may), but the
real problem is that the ants have figured out how to
create someone like you which means they will do it
again. So do you want to allow another super AI to
come into existence to compete for the limited
resources available? And don't assume the AI would be
"lonely" or "sympathetic" or "ethical" by our
standards. Those are human traits and are not likely
to apply to a super AI. After all, how many traits do
you share with ants?

So ... if you choose to avoid competition, what must
you do? Perhaps you can try to prevent the ants from
creating another AI by destroying their technology,
but won't they simply rebuild unless you keep a
constant eye on them? Perhaps you could leave the
planet in search of resources (i.e., matter/energy)
elsewhere in the cosmos. If that were the optimal
choice, then the ants would be quite curious and
disappointed because every super AI they create
promptly leaves the planet. Then again, how important
would the ants' lives really be to you if their
continued existence represented such a serious threat?
To what lengths would you go to spare them? My real
point is that you can't answer that question for a
super AI, because you aren't one and cannot know its
mind. It is unfathomable.

The singularity and the idea of creating a super AI,
whether machine-based or human-based, scare me, but I
don't see how it's not going to happen. Too many
people around the world are working on artificial
intelligence. The temptation will be too great to
resist. Someone's going to flip that switch and when
they do, for better or for worse, nothing is likely to
be the same again.

Best regards,

Scott Badger

"Vita Perpetuem"

