X-Message-Number: 5396
Date: Sun, 10 Dec 1995 18:36:30 -0800
From: John K Clark <>
Subject: SCI.CRYONICS Uploading, quantum effects and speed of cognition

-----BEGIN PGP SIGNED MESSAGE-----

In #5386 on 10 Dec 1995, Isak Sebastian Lytting Wrote:

              >If the basis for cognition actually turns out to be quantum
              >effects in microtubules inside neurons (as has been suggested
              >by Penrose, amongst others), why would substituting electronic
              >switches for neurons actually lead to any significant increase
              >in speed of cognition? I should think that the quantum effects
              >would already be about as fast as they were going to get in
              >a biological brain.

Interesting question: 

First, nearly everybody thinks Penrose is wrong. I think he's
probably wrong too, but I'm not quite as certain as some that
his ideas have no value at all. 

Second, even Penrose would admit that not all processing is
quantum mechanical and that a vast amount is performed by
conventional electrochemical signals that are MUCH slower than
electronics. That's why computers can already do many things
MUCH faster than we can; if the brain could do quantum calculations,
it's hard to understand why we're so much slower than electronic
computers at arithmetic.

As you say, Penrose thinks that electrons in the microtubules
of cells are doing quantum calculations. How the electrons
manage to maintain quantum coherence in the hot, chaotic
environment inside the microtubules he does not say. There is no
question that as a mathematician and physicist Penrose is 
absolutely world class, but most biologists think he's a
crackpot who shouldn't be poaching in their domain. I don't
think he's right either but he's no crackpot. Interestingly, he
finds his most sympathetic audience among computer designers.
To understand how it MIGHT work it's best to look into the
Everett interpretation of Quantum Mechanics.              

Quantum Mechanics has been more successful, over a wider range
of conditions than any theory I know of, but it has always
suffered from philosophical problems. Is that a wave or a particle? 
In what direction is that photon polarized? The standard Copenhagen 
interpretation, believed by most Physicists, is that nothing is anything 
until a measurement is made and the wave function collapses. 
If you decide to use a photographic plate, an electron is a particle. 
If you decide to use an interferometer, an electron is a wave. 
This works fine except that it's a bit vague about what a measurement is
and who is taking the measurement. Are human beings the only ones able
to make an observation and change a potential reality into an actual one?
What about a chimp, or a dog, or a bug?                

In 1957 Hugh Everett solved the measurement problem but at a very high
price, too high most thought. He said that when any particle
undergoes the smallest detectable change (a quantum jump) the
entire universe, including you the observer, duplicates itself
and splits in two, but we have (almost) no way of communicating
with the other universe. If true this would mean there is an
infinite number (some say just an astronomical number) of universes. 

Many universes would differ from this one in only one tiny detail,
but some would be very different. Even EXTREMELY unlikely events
would happen someplace. In one I was just elected The Pope, in
another I graduated from Ringling Brothers Clown College, and in
yet another one all the molecules of air diffused by chance to the 
other side of the room as I type this and I suffocate.              

The idea behind quantum computing is that when the computer
splits into an astronomical number of different machines in
different universes, each computer could work on a part of the
problem in parallel, then you could reintegrate the computer to give
you the finished answer. Richard Feynman and David Deutsch did some
early work on this. Recently Seth Lloyd published in the most
respected scientific journal in the country a design for a
quantum computer, although some still say it won't work; see
"A Potentially Realizable Quantum Computer" by Seth Lloyd in the
September 17, 1993 issue of Science, page 1569.
I know all this sounds a bit crazy and perhaps it is, time will tell.
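To make the "every universe works on a part of the problem" picture a
little more concrete, here is a toy classical simulation of a quantum
register. This is my own illustration, not Lloyd's design: the register
is just a vector of amplitudes, and the function f is an arbitrary
reversible example I picked for the sketch.

```python
import numpy as np

# Toy classical simulation of quantum parallelism.  An n-qubit
# register is a vector of 2**n complex amplitudes; putting the
# register in a uniform superposition means one application of f
# acts on every basis state at once.  (Illustration only -- a real
# quantum computer cannot simply read out all the answers.)

def uniform_superposition(n_qubits):
    dim = 2 ** n_qubits
    return np.full(dim, 1 / np.sqrt(dim))   # amplitude 1/sqrt(2^n) each

def apply_function(state, f):
    # Map basis state |x> to |f(x)>.  For the evolution to be
    # unitary, f must be reversible (a permutation of the states).
    out = np.zeros_like(state)
    for x, amp in enumerate(state):
        out[f(x)] += amp
    return out

state = uniform_superposition(3)                    # 8 basis states at once
state = apply_function(state, lambda x: (x + 3) % 8)  # a reversible example f
probs = np.abs(state) ** 2                          # Born-rule probabilities
print(probs.sum())                                  # still normalized
```

The point of the sketch is only that one "step" touched all eight
inputs; the hard part, which Lloyd's paper addresses, is building
hardware that does this without losing coherence.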

Our own Charles  Platt has written a popular account of Lloyd's
work in the  March 1995 issue of Wired magazine. Charles Platt
by the way is also the author of the best science fiction novel
about uploading ever written " The Silicon Man".               

Peter Shor recently proved that IF a quantum computer is possible, it could
perform tasks that a regular computer could not, because on a regular
computer the problems would be intractable. A problem with X elements is
considered tractable if it can be solved in polynomial time, that is, if
the time to solve it is proportional to X^N where N is some fixed number.
A problem is not tractable if the time to solve it is proportional to N^X
for any N greater than 1; the reason is that even a small increase in X
makes the time to solve it jump exponentially, so that no conceivable
computer could finish its computations before the heat death of the
universe. Many problems are suspected of being intractable in this way but
very few have been proven to be so. Nevertheless, all this would be
devastating to my position about uploading except for one thing: human
beings aren't one bit better at solving this sort of problem than
computers are. Actually, computers are better at it than people, but
still, I admit, not very good.
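The gap between X^N and N^X is easy to see numerically. In this quick
comparison the exponent 3 and the base 2 are arbitrary choices for the
example; any fixed exponent versus any base greater than 1 shows the
same pattern.

```python
# Compare polynomial growth (X**3) with exponential growth (2**X)
# for a few problem sizes X.
for X in (10, 20, 50, 100):
    poly = X ** 3       # tractable: polynomial in the problem size
    expo = 2 ** X       # intractable: exponential in the problem size
    print(f"X={X:4d}  X^3={poly:10d}  2^X={expo}")

# By X = 100 the exponential term has about 31 digits -- more steps
# than any conceivable computer could finish.
```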

Probably the most famous problem of this sort is the factoring problem.
It's easy to multiply two prime numbers together, but given the composite
number it is very difficult to find the factors; this is believed,
but not proven, to be intractable. Nearly all public key ciphers
are based on this fact. When Peter Shor of AT&T proved last year
that IF a quantum computer could ever be built, factoring would
be tractable, that is, it could factor a number in polynomial
time, it caused a minor panic in the cryptographic community.
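The asymmetry can be demonstrated at toy scale. The primes below are
tiny stand-ins I chose for the example; real public key ciphers use
primes hundreds of digits long, and trial division is just the naive
factoring method, but the shape of the problem is the same.

```python
# Multiplying two primes is one operation; recovering them is a search.
# Trial division does work proportional to the square root of the
# composite, which is exponential in the number of digits.

def factor(n):
    # Return the smallest prime factor of n by trial division.
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n  # n itself is prime

p, q = 10007, 10009              # two small primes (toy example)
n = p * q                        # easy direction: one multiplication
print(n)                         # 100160063
print(factor(n), n // factor(n))  # hard direction: ~10007 trial divisions
```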


                                             John K Clark     

-----BEGIN PGP SIGNATURE-----
Version: 2.6.i

iQCzAgUBMMuUh303wfSpid95AQFcKATvQqOa9/oofM4jqx673cCxn70ttClZGnff
a7r0w62N+ISt6/iCfo4F4eRJt3J3Pf9U8WwV2TMxFuYVFZk05LLIPJrt9fii/ac4
dVEL71d8kRGoD2AvTNuCnLbTQtvfQnHyJMmwxpVrIDETX0eX67BsK4PHJDglThhi
fmxJOkif9oyyNm8HdrTzaKlxrDEqXnol6BmfOV6m1OQU8xQw+zQ=
=bIo5
-----END PGP SIGNATURE-----

