X-Message-Number: 21681
Date: Sun, 27 Apr 2003 09:49:20 -0400
From: Thomas Donaldson <>
Subject: CryoNet #21671 - #21680

For Mike Perry:

Please answer my questions. So far you have not given any argument 
that an ability to solve the Halting Problem is required for a machine
not to be a Turing machine. Nor have you told me how the world 
is symbolic, as distinct from parts of it being described symbolically.
(I connected your idea of a language which did not require any input
to learn how to interpret the world with your notion that the world
was symbolic: the two notions are very close.) Nor, for that matter,
is isomorphism enough for a symbolic system to represent the world...
where do you argue that it is? 
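[For readers unfamiliar with the Halting Problem referred to above, Turing's
classical diagonal argument can be sketched in Python. The `halts` oracle
below is hypothetical and deliberately unimplementable; the sketch only
shows why no such total function can exist:]

```python
def halts(program, arg):
    # Hypothetical oracle: would return True iff program(arg)
    # eventually halts. Assumed for the sake of contradiction;
    # it cannot actually be implemented.
    raise NotImplementedError("no such decider exists")

def paradox(program):
    # Do the opposite of whatever the oracle predicts about
    # running `program` on its own source.
    if halts(program, program):
        while True:   # oracle said "halts", so loop forever
            pass
    else:
        return        # oracle said "loops", so halt at once

# paradox(paradox) would halt if and only if it does not halt --
# a contradiction, so halts() cannot be a total computable function.
```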

You do raise a good point about physics and parity. Yes, if both
parties knew of this phenomenon in nuclear physics, they might be
able to define "right" and "left" to one another. So this proposed
fully symbolic language requires that the two parties attempting to
communicate also have virtually identical sets of knowledge about
the world? And if they don't, how does one party tell the other
party how to do parity experiments? Even mathematics involves some
knowledge of the world (though some theorists would claim differently).

For Bob Ettinger:

The semianimals you describe may not be very intelligent, but their
design imitates that of real animals in one essential feature:
they do not interpose a symbolic step between perception and
reaction. A great deal of work and theory would be involved in
making such creatures aware, but because of their direct relation
to the world they are candidates for awareness, while computers are not.

              Best wishes and long long life for all,

                  Thomas Donaldson
