X-Message-Number: 8545
From: Andre Robatino <>
Subject: Re: CryoNet #8533 - #8540
Date: Fri, 5 Sep 97 20:33:42 EDT

> Message #8538
> From:  (Thomas Donaldson)
> Subject: more comments: computers, intelligence, etc
> Date: Thu, 4 Sep 1997 20:45:50 -0700 (PDT)
> 
> Hi again!
> 
> To Andre Robatino:
> Come now! By ignoring what has been done by neurophysiologists and neuro-
> scientists who are trying to understand how real physical brains work, your
> statement of what is known is awfully weak. So far, you're right that a 
> lot more needs to be learned, but we know more than you say. 
> 
<references snipped>

  Thanks for the references; I'll take a look at them if I can find them.
You're right that I haven't studied the subject in depth; my background is
in math and physics.  But I think that, in general, biologists are skeptical
of the idea, held by Penrose and others, that the brain employs exotic
and/or unknown physics in its operation.  True quantum computation requires
sophisticated error correction to compensate for decoherence, and there is
a lot of work to do before (and if) we can build useful artificial quantum
computers.  I find it hard to believe it could be done in the warm, noisy,
decoherence-prone environment of living tissue.  On the other hand, there
was an article in Science a while back showing how it could be done with a
large ensemble of nuclear spin states and NMR ("quantum computation in a
coffee cup").  But even if it's possible, evolution would still have had to
discover it.  Unless the necessary architecture can be reached through a
series of small modifications, that is unlikely.  If it happened, it
probably happened early on, so we could probably find evidence in some lower
organism like C. elegans or somesuch.

> Finally, about SPEED: for John de Rivas and others. The major word in what
> I said was OPTIMUM. It is not always optimum to be the best on one particular
> parameter, because doing so may cost far too much and leave you open to
> very simple attack in other ways. For instance, suppose that we would all
> work more speedily if nerve conduction went along gold wires (gold is a 
> very good conductor ... better than the commonly used ones, certainly). But
> that leaves out some other issues: gold is rare and hard to find, and takes
> a good deal of energy to refine. An animal which wired together its brain
> using gold would have to expend all that energy to get the gold, one way
> or another. This means that the speed must be enough to allow it to acquire
> that energy, or it will soon be outcompeted by slower but far less expensive
> creatures, who just overwhelm it with numbers. For instance, they may eat up
> all its food sources, and even though it can chase away a few, they keep
> coming back and coming back and coming back, while it hasn't even gotten
> enough gold to reproduce itself.
> 
> This is only a thought experiment, but I hope it makes the issue clear.
> 
> What is optimum at one time and in one environment need not be optimum in
> others, either. We can't use this argument to claim that "all is for the
> best". But it helps to understand our past condition if we want to 
> try for something better.
> 
  As I said above, it's not just a question of whether the architecture is
truly optimal, but of whether evolution is capable of finding it.  A lot of
biology seems to get set early in evolution and then be conserved in higher
organisms.  Some of these solutions may be truly optimal, but others may be
only locally optimal, and getting anything better may require such a large
change that evolution is vanishingly unlikely to yield anything that can
survive competition with existing organisms long enough to reach the new
local maximum.  Obviously we don't have a good overview of the entire space
of possibilities (we're still struggling with our own little piece of it),
so we don't know.
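The "local maximum" point can be sketched with a toy hill-climber (the
fitness landscape and all parameters here are invented for illustration; this
is a crude stand-in for selection acting on small mutations, not a model of
any real organism).  A search that only accepts small non-deleterious changes
gets stuck on the nearer, lower peak even though a higher one exists across a
valley:

```python
import random

random.seed(1)  # deterministic for the example

# Invented fitness landscape: a local maximum at x = 2 (height 1.0) and a
# better global maximum at x = 8 (height 2.0), separated by a flat valley.
def fitness(x):
    return max(0.0, 1.0 - (x - 2.0) ** 2) + max(0.0, 2.0 - (x - 8.0) ** 2)

def hill_climb(x, steps=10_000, step_size=0.1):
    """Accept only small changes that never decrease fitness."""
    for _ in range(steps):
        candidate = x + random.uniform(-step_size, step_size)
        if fitness(candidate) >= fitness(x):
            x = candidate
    return x

x_final = hill_climb(0.0)
print(f"stuck near x = {x_final:.2f}, fitness {fitness(x_final):.2f}")
```

The climber reaches the peak at x = 2 and stays there: every small step off
that peak lowers fitness and is rejected, so the higher peak at x = 8 is
never reached.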

> Message #8539
> Date: Thu, 4 Sep 1997 21:01:11 -0700 (PDT)
> From: John K Clark <>
> Subject: Digital Shakespeare
> 

<snip>

> 
>         >People never built significant analog machines, because they were
>         >so darned hard to create and use.
> 
> 
> No. Analog computers are not hard to make, they are impossible to make,
> and that's not a word I often use.
>                
  You're defining a computer as a device whose output is a discrete,
deterministic function of its input.  That property is certainly necessary
for computing prime numbers or doing one's taxes, but not for interacting
with physical systems, which constitutes a large fraction of applications,
since physical systems themselves are neither discrete nor deterministic.
See below.

<snip>

> In  #8531  Andre Robatino <> On Wed, 3 Sep 97 Wrote:
>                                   
>         >This is ironic - you posted a positive message a few months ago
>         >regarding Seth Lloyd's quantum simulator article in Science.  But
>         >these are special-  purpose analog computers for computing the
>         >evolution of quantum systems
> 
> 
> I don't see the irony because they're special purpose quantum computers not
> analog. As I said, an analog computer is a continuous device so it must
> contain an infinite number of internal states, the great advantage of a
> 64 qbit quantum computer is that instead of performing an operation on one 64
> bit number it performs an operation on all 64 bit numbers, all 2^64 of them.
> That is certainly a huge improvement, but falls well short of infinity.
> 
  Yes, Seth Lloyd's quantum simulators do indeed have an infinite number of
possible internal states.  Don't let the fact that the input and output are
discrete fool you.  Though an electron's measured spin has just two possible
values, +1/2 and -1/2, its spin state is described by a superposition of pure
spin states of the form a|+1/2> + b|-1/2> where a and b are complex numbers.
Thus the state lies in a continuum.  A plain-vanilla quantum simulator does
not correct for decoherence and dissipation, so these effects perturb the
coefficients of the output state.  Even though the input and output are
discrete
(when measured), the output has a nonzero probability of being wrong if one
attempts to use it for any form of digital computation.  So instead of the
uncertainty being manifested by a small error in the output, it shows up as a
nonzero probability of getting the wrong discrete answer.  Lloyd's simulators
are incapable of performing _any_ form of deterministic computation.  They are
definitely not computers by your definition.  However, they will be extremely
useful, as will flexible analog computers of the type described in the NYT
article.
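  To make that concrete, here is a toy sketch (not Lloyd's construction; the
amplitudes and the leakage figure are made up for illustration) of how
uncorrected decoherence shows up as a nonzero probability of a wrong discrete
answer rather than as a small analog error:

```python
import random

random.seed(0)  # deterministic for the example

def measure(a, b):
    """Collapse a|+1/2> + b|-1/2> to a discrete outcome.

    Measurement yields +1/2 with probability |a|^2, -1/2 with |b|^2.
    """
    return +0.5 if random.random() < abs(a) ** 2 else -0.5

# Ideal device: prepared in |+1/2> (a = 1, b = 0), it always reads +1/2.
# Uncorrected decoherence leaks a little amplitude into |-1/2>
# (eps is a hypothetical leakage figure, not from any real device).
eps = 0.1
a_noisy = complex((1 - eps ** 2) ** 0.5)  # renormalized: |a|^2 + |b|^2 = 1
b_noisy = complex(eps)

trials = 100_000
wrong = sum(measure(a_noisy, b_noisy) == -0.5 for _ in range(trials))
wrong_rate = wrong / trials
print(f"wrong-answer rate ~ {wrong_rate:.3f} (expected {eps ** 2:.3f})")
```

Roughly one readout in a hundred comes out wrong, which is exactly why such
a device cannot be used for deterministic computation without error
correction.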
  Performing digital, deterministic computation requires that decoherence
and dissipation be dealt with by sophisticated error correction.  This is
what distinguishes a quantum computer from a quantum simulator, and it is
not yet clear that it can be achieved.  Error correction would be required
for number-theoretic applications such as discrete logarithms and factoring;
simulators are useless for these.

> 
>         >Practical quantum simulators, on the other hand, are essentially a
>         >sure thing.
>               
> I hope you're right, I'm very optimistic, but I wouldn't say it's a sure
> thing.                             
> 
  It is a sure thing.  We can draw up blueprints right now for crude, somewhat
useful devices.  In fact, they're probably being built as I write this.  The
difficulty in building them increases gracefully with the number of qubits,
so getting really powerful ones is just a matter of moving along the normal
development curve.

>         >The problem of finite accuracy of input affects any simulation of a
>         >physical system, regardless of what form of computation is used.
> 
> 
> The problem is not just the accuracy of the input but the accuracy of the
> output too, neither can be measured with infinite accuracy. Another problem
> is that unlike digital machines internal errors are cumulative.
> 
  If the output is being used to control a physical system (normally the one
the input is coming from), accuracy of output is even less of a problem.  As
for internal errors, a flexible analog computer (or whatever you want to
call it) can be used to do a nonchaotic simulation of any physical system
which is itself nonchaotic.  On the other hand, if the physical system is
chaotic, one's screwed anyway, since small errors in the input grow
exponentially and limit how far the calculation can go regardless of what
form of computation is used.  Making the calculation itself precise buys at
most a small increase in the length of time the simulation remains valid,
because the input error dominates.
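  A toy numerical illustration of that point, using the logistic map as a
stand-in chaotic system (the map, the starting point, and the error sizes are
my choices for illustration, not anything from the discussion above):

```python
# Logistic map x -> r*x*(1-x); at r = 4 it is chaotic on (0, 1), with
# small separations roughly doubling per step on average.
R = 4.0

def trajectory(x0, steps):
    xs = [x0]
    for _ in range(steps):
        xs.append(R * xs[-1] * (1.0 - xs[-1]))
    return xs

def steps_until_divergence(err):
    """First step at which an input error of size `err` has grown past 0.1."""
    exact = trajectory(0.3, 80)
    perturbed = trajectory(0.3 + err, 80)
    return next(n for n, (a, b) in enumerate(zip(exact, perturbed))
                if abs(a - b) > 0.1)

n12 = steps_until_divergence(1e-12)  # input known to ~12 digits
n15 = steps_until_divergence(1e-15)  # a thousand times more accurate
print(f"12-digit input: useless after {n12} steps; "
      f"15-digit input: useless after {n15} steps")
```

A thousandfold improvement in input accuracy buys only about ten more usable
steps, because the error grows exponentially; the same ceiling applies
whether the intermediate arithmetic is analog or exact.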

> A digital computer can not be infinitely accurate but it can be arbitrarily
> accurate, the accuracy of a "analog computer" is strictly limited by the laws
> of physics.

  ...as is the accuracy of the devices providing the interface with the
simulated system.  Though analog computers are useless for inherently
discrete problems, that is no grounds for rejecting them outright, any more
than one would reject parallel machines because not all problems can make
good use of all those processors.
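  For contrast, here is what "arbitrarily accurate" means on the digital
side, sketched with Python's decimal module, where precision is just a
parameter you dial up:

```python
from decimal import Decimal, getcontext

# On a digital machine, precision is a design choice, not a physical
# limit: to get more accuracy, carry more digits.
for digits in (10, 50, 200):
    getcontext().prec = digits
    root = Decimal(2).sqrt()
    # Squaring the result recovers 2 to within the working precision.
    err = abs(root * root - Decimal(2))
    print(f"{digits:3d} digits: sqrt(2) = {root}  (residual {err})")
```

An analog device, by contrast, is stuck at whatever accuracy its physics
allows, typically only a handful of significant digits.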
