X-Message-Number: 11660
Date: Sun, 02 May 1999 11:11:06 +0000
From: Damien Broderick <>
Subject: emulation

>From: Thomas Donaldson <>
>Subject: To D. Broderick, on the subject of awareness once more

>I will note that you have brought into the discussion various sensors etc
>that we have and through which we perceive the world. That's fine, and it
>makes us not at all the same as the classical Turing machine.

But as Mike Perry replies:

>A sequential device can emulate a parallel device, albeit at a loss of
>speed. It may not interact in real-time with processes on the outside, but
>it can emulate just the same.

Actually, while this is so, pitching the discussion in terms of Turing's
archaic model of a computational device is absurdly restrictive and
unnecessarily constricts the imagination.  

Yes, it seems to me obvious that if one had a sequential linear machine
able to operate billions or trillions of times faster than a human brain,
and it possessed a large rewritable storage buffer, it could emulate a
brain without anyone being the wiser.  But why bother with such strained
thought experiments?  What's at issue, surely, is whether the kinds of
machines we already have, and can expect shortly, will have the formal
powers to emulate (or rather instantiate) a consciousness.  
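The claim that a sequential machine can emulate a parallel one is not just hand-waving; it can be made concrete with a toy sketch (mine, not anything from this exchange — the "unit" update rule and names are invented for illustration). A single loop steps many notionally-simultaneous units one at a time, double-buffering the state so the result is identical to a truly parallel update, only slower:

```python
# One sequential loop emulating a synchronous parallel update: every
# "unit" reads only the previous global state and writes into a fresh
# buffer, so the loop's serial order cannot affect the outcome.

def parallel_step(state, update):
    """One 'simultaneous' step of all units, computed one at a time."""
    return [update(i, state) for i in range(len(state))]

def update(i, state):
    # Hypothetical toy rule: each unit averages itself with its left
    # neighbour (wrapping around at the ends).
    return (state[i] + state[i - 1]) / 2.0

state = [0.0, 0.0, 1.0, 0.0]
for _ in range(3):                  # three ticks of "parallel" time
    state = parallel_step(state, update)
print(state)                        # → [0.375, 0.125, 0.125, 0.375]
```

The double-buffering is the whole trick: because no unit ever sees a neighbour's *new* value mid-step, the serial machine's answer is exactly what a bank of simultaneous processors would have produced.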

And don't forget that even a Turing machine needs connections to the
outside world - someone has to insert the program and read the output.

Mike adds:

>But, Thomas, a computer-synthesized world need not be an airtight domain
>totally cut off from the "real" world. Instead, there are various ways of
>establishing a communication link. One way would be to imagine a computer
>program that has characters, some of which are indigenous to the program,
>i.e. contained entirely inside it, others controlled from the outside. 

Just so.  We need not make this a dichotomous thought experiment.  The
density and excess of the Real is always in principle available to any AI
system through sensors and effectors, just as it is to us, sealed inside
our bony heads.  I believe this discussion should *start* from that agreed
understanding, and go somewhere more interesting.
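Mike's picture of a computed world that is not airtight — some characters indigenous to the program, others driven from outside over a communication link — can also be sketched in a few lines (again a toy of my own; the class names and the queue-based "link" are assumptions for illustration):

```python
from collections import deque

# Toy world: an indigenous agent acts from a fixed internal rule, while
# an "avatar" agent is steered by commands arriving over a link from
# outside the simulation. Inside the world the two are indistinguishable.

class Indigenous:
    def __init__(self):
        self.position = 0
    def act(self):
        self.position += 1          # entirely internal behaviour

class Avatar:
    def __init__(self, link):
        self.position = 0
        self.link = link            # channel from the "real" world
    def act(self):
        if self.link:               # consume one outside command, if any
            self.position += self.link.popleft()

link = deque([5, -2])               # commands inserted from outside
world = [Indigenous(), Avatar(link)]
for _ in range(3):                  # three ticks of simulated time
    for agent in world:
        agent.act()

print([a.position for a in world])  # → [3, 3]
```

Both kinds of character obey the same interface; only the provenance of their inputs differs — which is exactly the point about sensors and effectors linking the simulated domain to the Real.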

Damien Broderick
