X-Message-Number: 27305
From: "mike99" <>
Subject: Bill Gates & Ray Kurzweil chat about the Singularity
Date: Sun, 30 Oct 2005 17:13:47 -0700

Singularity chat between Bill Gates and Ray Kurzweil
from Ray Kurzweil's book THE SINGULARITY IS NEAR (pp. 374-376):

BILL GATES: I agree with you 99%. What I like about your ideas is that they
are grounded in science, but your optimism is almost a religious faith. I'm
optimistic also.

RAY KURZWEIL: Yes, well, we need a new religion. A principal role of
religion has been to rationalize death, since up until just now there was
little else constructive we could do about it.

BILL: What would the principles of the new religion be?

RAY: We'd want to keep two principles: one from traditional religion and one
from secular arts and sciences. From traditional religion, the respect for
human consciousness.

BILL: Ah yes, the Golden Rule.

RAY: Right, our morality and legal system are based on respect for the
consciousness of others. If I hurt another person, that's considered
immoral, and probably illegal, because I have caused suffering to another
conscious person. If I destroy property, it's generally okay if it's my
property, and the primary reason it's immoral and illegal if it's someone
else's property is because I have caused suffering not to the property but
to the person owning it.

BILL: And the secular principle?

RAY: From the arts and sciences, it is the importance of knowledge.
Knowledge goes beyond information. It's information that has meaning for
conscious entities: music, art, literature, science, technology. These are
the qualities that will expand from the trends I'm talking about.

BILL: We need to get away from the ornate and strange stories in
contemporary religions and concentrate on some simple messages. We need a
charismatic leader for this new religion.

RAY: A charismatic leader is part of the old model. That's something we want
to get away from.

BILL: Okay, a charismatic computer then.

RAY: How about a charismatic operating system?

BILL: Ha, we've already got that. So is there a God in this religion?

RAY: Not yet, but there will be. Once we saturate the matter and energy in
the universe with intelligence, it will "wake up," be conscious, and
sublimely intelligent. That's about as close to God as I can imagine.

BILL: That's going to be silicon intelligence, not biological intelligence.

RAY: Well, yes, we're going to transcend biological intelligence. We'll
merge with it first, but ultimately the nonbiological portion of our
intelligence will predominate. By the way, it's not likely to be silicon,
but something like carbon nanotubes.

BILL: Yes, I understand. I'm just referring to that as silicon intelligence
since people understand what that means. But I don't think it's going to be
conscious in the human sense.

RAY: Why not? If we emulate in as detailed a manner as necessary everything
going on in the human brain and body and instantiate these processes in
another substrate, and then of course expand it greatly, why wouldn't it be
conscious?
BILL: Oh, it will be conscious. I just think it will be a different type of
consciousness.
RAY: Maybe this is the 1% we disagree on. Why would it be different?

BILL: Because computers can merge together instantly. Ten computers or one
million computers can become one faster, bigger computer. As humans, we
can't do that. We each have a distinct individuality that cannot be bridged.

RAY: That's just a limitation of biological intelligence. The unbridgeable
distinctness of biological intelligence is not a plus. "Silicon"
intelligence can have it both ways. Computers don't have to pool their
intelligence and resources. They can remain "individuals" if they wish.
Silicon intelligence can even have it both ways by merging and retaining
individuality at the same time. As humans, we try to merge with others also,
but our ability to accomplish this is fleeting.

BILL: Everything of value is fleeting.

RAY: Yes, but it gets replaced by something of even greater value.

BILL: True, that's why we need to keep innovating.
