X-Message-Number: 9285
From: Thomas Donaldson <>
Subject: Re: CryoNet #9273 - #9282
Date: Sun, 15 Mar 1998 15:06:31 -0800 (PST)

Hi everyone!

To Mr. Metzger:
I do hope you have read the Siegelmann paper. Some of your statements made me
wonder about that. I claim NOTHING about the device it discusses except that
it provides an example of a computer which is not a Turing computer. That is,
it is a counterexample in the mathematical sense. And as a counterexample,
it forces us to reconsider some of our ideas about computers --- perhaps they
aren't as universal as we thought. Counterexamples don't have to be BETTER;
they just have to show that something we thought was universal is not.

Furthermore, both you and Mike Perry have a rather truncated idea of what 
neurons do. I really wish that you would go and learn a bit of neuroscience
before getting involved in this discussion.

As to the complexity of neurons, to be brief, each one works more like a 
small computer than like a single chip in a bigger one. So far as I am 
aware, neurons still have far more connections with other neurons than
any single chip, and show a variation in response according to their 
electrical and chemical inputs which is much larger than (say) a single
processor. Not only that, but the "random jitteriness" you allude to 
in their behavior has turned out to be part of the processes by which they
work with other neurons. It is not just a "mistake"; it's part of their
design. 

As to recursiveness, I can cite computer science papers which claim that
recursion is one feature of Turing machines. Remember recursive languages?
As for executing a recursive algorithm on a parallel machine: recursive
algorithms are not parallel as presented. Usually a bit of thought will yield
ANOTHER algorithm which is NOT recursive but comes to the same result. I was
hardly claiming that to be
impossible, since I've done it lots of times myself. However to claim that
two algorithms are the SAME because they reach the same result completely
misunderstands the notion of algorithm. 
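To make that concrete, here is a sketch of my own (not from this exchange): two DIFFERENT algorithms, one recursive and one not, that reach the SAME result. Calling them "the same" confuses the result with the algorithm.

```python
# Two different algorithms for the same function (my illustration).

def factorial_recursive(n):
    """The recursive algorithm: defined in terms of itself."""
    return 1 if n <= 1 else n * factorial_recursive(n - 1)

def factorial_iterative(n):
    """ANOTHER algorithm, NOT recursive, with the same result."""
    result = 1
    for k in range(2, n + 1):
        result *= k
    return result

# Same result, yet the algorithms differ in structure and resource use:
assert factorial_recursive(10) == factorial_iterative(10) == 3628800
```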

Do Turing machines have stacks? (Not all recursive algorithms need them,
anyway). Well, it should be easy to write the Turing machine's software
so that it creates a stack, if you want one.  As someone who has used
Forth a lot, I can tell you that Forth depends on the presence of at least
one stack, and usually needs more (one for characters, one for integers,
one for fp numbers, etc.).
You seem to be confusing one particular kind of implementation of a 
recursive algorithm with all such implementations. (Note my wording here!).
If you need a stack then create one.
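As a sketch of "create one if you need one": a plain list serves, and the word names below are my own Forth-flavored illustration, not real Forth.

```python
# Creating a stack when you need one; a Python list will do.

stack = []

def push(x):
    stack.append(x)        # put a value on top of the stack

def pop():
    return stack.pop()     # take the top value off the stack

push(2)
push(3)
push(pop() + pop())        # like Forth's "+": add the top two items
assert pop() == 5
assert stack == []         # everything has been consumed
```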

You commit a second error when you claim (admittedly I may have 
misunderstood you) that your computer can do several things at once even
though it is not parallel. As you know, it does this by jumping from one
job to
another. To complicate the issue, MMX and other such things are parallel,
and parallelism has found its way into high end ordinary computers through
the design of the processor. But to do two or more things at once, you must
have parallelism SOMEWHERE. Yes, it's possible to produce an ILLUSION of
parallel activity if you work fast enough.
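That illusion can be sketched in a few lines: one sequential interpreter jumping between jobs fast enough to look concurrent. The jobs ("A" and "B") are invented for illustration.

```python
# One processor, several jobs: jump between them, one step at a time.

def job(name, steps):
    for i in range(steps):
        yield f"{name} step {i}"    # one unit of work, then yield control

def round_robin(jobs):
    """Run jobs on a single 'processor', interleaving their steps."""
    trace = []
    while jobs:
        current = jobs.pop(0)
        try:
            trace.append(next(current))
            jobs.append(current)    # jump away; come back to it later
        except StopIteration:
            pass                    # this job is finished
    return trace

trace = round_robin([job("A", 2), job("B", 2)])
# The steps come out interleaved, though nothing ever ran in parallel:
assert trace == ["A step 0", "B step 0", "A step 1", "B step 1"]
```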

Unfortunately, a lot of the speed of current single processors is wasted 
in trying to look like they're doing lots of things at once. I do not 
retract what I said about sequential computers, and Turing computers,
ultimately being poor tools with which to understand the world.

For that matter, the point about weather prediction was hardly that it 
was impossible, but that it required great speed --- which our computers
most definitely have, at least comparatively. Weather prediction, as a 
mathematical problem, still remains chaotic, so that the farther in 
the future your prediction goes, the more accuracy you must start with.
And parallelism is a good way to get great speed: no matter how fast your
single processor is, if you can join 100 of them together to work on 
a job, you'll get something faster than any single one (though you do
have to remember that you're programming 100 processors rather than just
one, and choose your algorithms accordingly).
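As a sketch of choosing the algorithm for the processor count (the task, the worker count, and the function names below are my invention):

```python
# Splitting one job across several processors: each worker sums the
# squares in its own chunk, and the chunks are combined at the end.

from multiprocessing import Pool

def partial_sum(bounds):
    """Sum of k*k over one worker's half-open range [lo, hi)."""
    lo, hi = bounds
    return sum(k * k for k in range(lo, hi))

if __name__ == "__main__":
    n, workers = 1_000_000, 4
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    with Pool(workers) as pool:     # one chunk per processor
        total = sum(pool.map(partial_sum, chunks))
    assert total == sum(k * k for k in range(n))
```

Note that the sequential algorithm ("add the next square") had to be rephrased as "each worker sums a chunk, then combine the chunks" --- exactly the extra step of thought the paragraph above warns about.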

I will assume that you have heard of chaos, though your response shows
no sign of that. Weather predictions require such computer power exactly
because their accuracy decreases with time. You may have done elementary
DEs in college; most of the DEs you studied were very simple, and lacked
this chaos problem (although so far as I know, algorithms to solve time
varying problems in computers can easily show chaos despite the properties
of the original equation: this comes because of the finite accuracy of 
any computation done in a computer ... and by a human being, too).
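The chaos point can be illustrated with the logistic map (a textbook chaotic system, chosen by me, not mentioned above): two starting values differing by one part in ten billion soon disagree completely.

```python
# Finite accuracy at the start ruins the prediction (my illustration).

def logistic(x, r=3.9):
    """One step of the logistic map, chaotic at r = 3.9."""
    return r * x * (1.0 - x)

a, b = 0.3, 0.3 + 1e-10           # initial values differ by 1e-10
sep = []                          # separation at each step
for _ in range(80):
    a, b = logistic(a), logistic(b)
    sep.append(abs(a - b))

# At first the trajectories agree closely; tens of steps later the tiny
# initial difference has been amplified until they disagree outright ---
# which is why a longer forecast demands more initial accuracy.
assert sep[0] < 1e-8
assert max(sep[60:]) > 0.1
```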

Finally I have a few words to say to Mike:

First of all, it's very easy to imagine some system and decide that what
you've imagined will be superior to our biological construction. However,
all that remains no more than a dream, and it's trivially easy to have a
dream which surpasses reality in every respect (except for its reality!). 

Several major features have made biological devices superior to computers
so far. The first and most obvious one is self-repair: any device into which
I was read must have at least equal abilities at self-repair, and ideally
far superior such abilities. Furthermore, as structures, brains are much
more complex, with many more processors, than any computer so far built. 
Among their interesting features is that they aren't laid out on flat
boards like computers, but are essentially 3-dimensional. These features
make them much more compact, and allow them to do important tasks such
as recognition and interpretation which computers are currently struggling
with.

You mention speed particularly. Given that speed is useless in itself
(to be useful there must also be input of some kind), it's not clear that
the kinds of things human beings do now require all that much speed. Sure,
you may wish to trundle around in a racing car, but you'll feel just as
unhappy in traffic as someone driving an old sedan. (Besides which, we
built computers to do computing fast for us!). 

Nor, supposing we became software, would that make us immune to any 
natural or deliberate attack. We already know of computer viruses, and 
they will probably grow more and more elaborate. For that matter, computers
cannot withstand the range of temperatures and environments that biological
people can. Of course, our "medical" problems will become different, but
we will not escape them simply by becoming software.  

The idea of turning ourselves into computers, even with lots of peripherals,
just doesn't look good  --- unless you think of a dream computer. 

This is NOT an argument against changing ourselves. Immortality by whatever
means is just such a change, and we surely will see others. But there seems
to be some idea going about that computers, for some magical reason, are
"superior" to us. Certainly they can compute faster than we can, but then
a steel girder is "superior" to us because it is stronger. Computing is not
even the same as thinking (a much broader concept). If you were to seriously
look at improvements with the aim of really seeing what kind would be 
useful, then I would not suggest looking at computers first of all. Sure,
we want to reconstruct ourselves, but lots of methods to do that can be
imagined; we need not limit ourselves to one technology or range of ideas.
And that technology has some basic faults which make me think we'll have
to be broader if we aren't just dreaming. 

			Best wishes and long long life,

				Thomas Donaldson
