X-Message-Number: 13422
From: "George Smith" <>
References: <>
Subject: Intelligence, desires and insect robots
Date: Fri, 24 Mar 2000 10:15:09 -0800

Personally, I agree with the point Thomas Donaldson made in his Message
#13416, where he wrote:

> I reviewed Kurzweil's book and found that it totally omitted any
> discussion of one major ability which humans have and which will
> not come with intelligence alone. It's called "desires", and
> no matter how intelligent (whatever reasonable definition of
> intelligence you wish here) a computer or device may become, without
> its own desires it remains a useful slave at best

At the same time, as reported in the most recent issue of "The Immortalist"
(Jan-Feb 2000, page 6), there is the curious work of Mark Tilden at Los
Alamos National Laboratory with his "insectoid" robots, which hardly qualify
as computers (that's another debate, thank you) yet exhibit what appears to
be short-term memory - without any such capacity having been designed in.

Desire is wanting something, having a goal.  These little "no-brain" robots
seek out food (electrical energy), a desire which IS programmed, but they
also exhibit short-term memory which ISN'T programmed.
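(A digression for the programmers here: Tilden's machines are analog
"nervous net" hardware, not software, but the effect is easy to caricature
in a few lines of Python.  The sketch below is purely my own toy
illustration - every name and number in it is invented, and it claims
nothing about Tilden's actual circuitry.  The agent's ONLY programmed rule
is "drive toward the power source," yet a single feedback term lets its
last motor state persist when the signal blacks out, and an observer
watching it glide on toward the vanished "food" might well call that
short-term memory nobody programmed in.)

    # Toy sketch only - not Tilden's circuitry.  One programmed rule
    # ("drive toward the power source") plus one feedback term; the
    # apparent short-term memory falls out of the loop, unprogrammed.

    def sense(position, source=10.0):
        """Signal strength falls off with distance from the power source."""
        return 1.0 / (1.0 + abs(source - position))

    def run(steps=25, feedback=0.8):
        position, motor = 0.0, 0.0
        for t in range(steps):
            blackout = 6 <= t < 12   # the "food" signal vanishes for a while
            gradient = sense(position + 0.5) - sense(position - 0.5)
            # The ONLY programmed desire: head toward the stronger signal.
            drive = 0.0 if blackout else (1.0 if gradient > 0 else -1.0)
            # Feedback loop: the previous motor state decays slowly, so the
            # agent coasts onward through the blackout - behavior that looks
            # like remembering where the food was.
            motor = feedback * motor + (1.0 - feedback) * drive
            position += motor
            print(f"t={t:2d}  pos={position:6.2f}  motor={motor:+.3f}")

    run()

Set feedback to 0.0 and the agent freezes the instant the signal drops: the
"memory" lives in the loop itself, not in any stored variable.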

Physicist and nonlinear theorist Brosl Hasslacher was quoted as saying, "We
don't know how they do it, but they do it, and they're not using sequential
logic.  If you tell a machine its task is survival - get more food and get
better food, which in this case is electrical power - those constraints are
so difficult to satisfy that once it's done, it's almost got all the rest.
Survival is such a strong constraint in the world that it may be enough to
get intelligence."  (Smithsonian magazine)

Thomas Donaldson might point out that the desire (for getting electricity)
WAS "programmed" in first, and he would be right.  But if this phenomenon
extends as such machines develop intelligence, then OTHER, far more complex
desires may very well be generated.  We may only need to "prime the pump"
with an initial simple desire.

There may be many potential explanations for why this phenomenon occurs, but
one that comes to mind is the suggestion of Amit Goswami, Professor of
Physics at the University of Oregon, who proposes a paradox-free
interpretation of quantum physics based on the supposition that
consciousness is primary and that matter and "mind" derive from
consciousness, not the other way around as is so commonly assumed.

If Goswami is correct, then as I understand it, the more complicated the
RIGHT KIND of machine we create, the more consciousness and therefore mind
(including desires) would be REVEALED through that machine's behavior.  In
this sense, a machine built around a feedback loop (such as Tilden's
"insect" robots) would exhibit short-term memory because memory is ACCESSED
as a non-material field rather than generated - whether by a computer, a
human brain or a Tilden robot.

I am aware that this view is far from mainstream.  Yet it does have two
advantages.  First, it stems from an interpretation of quantum physics that
lacks paradoxes, and so may be more nearly correct than interpretations that
retain them.  Second, it could explain why Tilden's "insect" robots exhibit
memory characteristics never programmed into them.

Of course, if this particular view proves correct - that memory fields exist
outside of physical structures and are only tapped by the correct structures
(much as a television set taps into broadcast programs) - it would change
the terms of the debate over whether cryonics reanimation can recover intact
personalities.  There are other pieces of evidence pointing toward this
conclusion (such as the suggestions of Rupert Sheldrake in his book "A New
Science of Life" and the historical, cross-cultural phenomenon of near-death
experiences), but I recognize that entertaining these areas requires a true
suspension of cultural bias.  I would not ask anyone to do this, as it is a
very emotional issue for most people - much like cryonics!

Once again, we really do not know yet.

Go, insects, go!

George Smith
www.cryonics.org
