X-Message-Number: 94
From arpa!Xerox.COM!merkle.pa Wed Jun 14 11:04:36 PDT 1989
Received: from Cabernet.ms by ArpaGateway.ms ; 14 JUN 89 11:04:59 PDT
Date: Wed, 14 Jun 89 11:04:36 PDT
Subject: CRYONICS: Energy Limits to the Computational Power of the Human Brain

Energy Limits to the Computational Power of the Human Brain
by Ralph C. Merkle

Xerox PARC 
3333 Coyote Hill Road 
Palo Alto, CA 94304 


This article will appear in Foresight Update #6

The Brain as a Computer

The view that the brain can be seen as a type of computer has gained
general acceptance in the philosophical and computer science communities.
Just as we ask how many mips or megaflops an IBM PC or a Cray can perform,
we can ask how many operations the human brain can perform.  Neither the
mip nor the megaflop seems quite appropriate, though; we need something
new.  One possibility is the number of synapse operations per second.

A second possible "basic operation" is inspired by the observation that
signal propagation is a major limit.  As gates become faster, smaller, and
cheaper, simply getting a signal from one gate to another becomes a major
issue.  The brain couldn't compute if nerve impulses didn't carry
information from one synapse to the next, and propagating a nerve impulse
using the electrochemical technology of the brain requires a measurable
amount of energy.  Thus, instead of measuring synapse operations per
second, we might measure the total distance that all nerve impulses
combined can travel per second, i.e., total nerve-impulse-distance per
second.

Other Estimates

There are other ways to estimate the brain's computational power.  We might
count the number of synapses, guess their speed of operation, and determine
synapse operations per second.  There are roughly 10**15 synapses operating
at about 10 impulses/second [2], giving roughly 10**16 synapse operations
per second.
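
As a sanity check, the arithmetic behind this estimate can be written out
in a few lines of Python (a minimal sketch; the figures are the rough
values quoted above, not precise measurements):

    # Synapse-operations-per-second estimate from the figures above.
    num_synapses = 1e15       # approximate number of synapses [2]
    firing_rate_hz = 10       # approximate impulses per second per synapse [2]

    synapse_ops_per_sec = num_synapses * firing_rate_hz
    print(f"synapse ops/sec ~ {synapse_ops_per_sec:.0e}")   # ~1e16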

A second approach is to estimate the computational power of the retina, and
then multiply this estimate by the ratio of brain size to retinal size. The
retina is relatively well understood so we can make a reasonable estimate
of its computational power.  The output of the retina -- carried by the
optic nerve -- is primarily from retinal ganglion cells that perform
"center-surround" computations (or related computations of roughly similar
complexity).  If we assume that a typical center-surround computation
requires about 100 analog adds and is done about 100 times per second [3],
then computation of the axonal output of each ganglion cell requires about
10,000 analog adds per second.  There are about 1,000,000 axons in the
optic nerve [5, page 21], so the retina as a whole performs about 10**10
analog adds per second.  There are about 10**8 nerve cells in the retina
[5, page 26], and between 10**10 and 10**12 nerve cells in the brain [5,
page 7], so the brain is roughly 100 to 10,000 times larger than the
retina.  By this logic, the brain should be able to do about 10**12 to
10**14 operations per second (in good agreement with the estimate of
Moravec, who considers this approach in more detail [4, pages 57 and 163]).
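
The same scaling argument can be laid out step by step (a minimal sketch,
again using only the rough figures quoted above):

    # Retina-scaling estimate of the brain's computational power.
    adds_per_computation = 100      # analog adds per center-surround computation [3]
    computations_per_sec = 100      # center-surround computations per second [3]
    optic_nerve_axons = 1e6         # axons in the optic nerve [5]

    retina_ops_per_sec = adds_per_computation * computations_per_sec * optic_nerve_axons
    # ~1e10 analog adds per second for the whole retina

    retina_neurons = 1e8                                  # nerve cells in the retina [5]
    brain_neurons_low, brain_neurons_high = 1e10, 1e12    # nerve cells in the brain [5]

    brain_low = retina_ops_per_sec * brain_neurons_low / retina_neurons    # ~1e12
    brain_high = retina_ops_per_sec * brain_neurons_high / retina_neurons  # ~1e14
    print(f"brain estimate: {brain_low:.0e} to {brain_high:.0e} ops/sec")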

The Brain Uses Energy

A third approach is to measure the total energy used by the brain each
second, and then determine the energy used for each "basic operation."
Dividing the former by the latter gives the maximum number of basic
operations per second.  We need two pieces of information: the total energy
consumed by the brain each second, and the energy used by a "basic operation."
The total energy consumption of the brain is about 25 watts [2].  Inasmuch
as a significant fraction of this energy will not be used for "useful
computation," we can reasonably reduce this figure to 10 watts.

Nerve Impulses Use Energy

Nerve impulses are carried by either myelinated or un-myelinated axons.
Myelinated axons are wrapped in a fatty insulating myelin sheath,
interrupted at intervals of about 1 millimeter to expose the axon.  These
interruptions are called "nodes of Ranvier."  Propagation of a nerve
impulse in a myelinated axon is from one node of Ranvier to the next --
jumping over the insulated portion.

A nerve cell has a "resting potential" -- the outside of the nerve cell is
0 volts (by definition), while the inside is about -60 millivolts.  There
is more Na+ outside a nerve cell than inside, and this chemical
concentration gradient effectively adds about 50 extra millivolts to the
voltage acting on the Na+ ions, for a total of about 110 millivolts [1,
page 15].   When a nerve impulse passes by, the internal voltage briefly
rises above 0 volts because of an inrush of Na+ ions.

The Energy of a Nerve Impulse

Nerve cell membranes have a capacitance of 1 microfarad per square
centimeter, so the capacitance of a relatively small 30 square micron node
of Ranvier is 3 x 10**-13 farads (assuming small nodes tends to
overestimate the computational power of the brain).  The internodal region
is about 1,000 microns long, 500 times longer than the 2 micron node, but
because of the myelin sheath its capacitance per square micron is about 250
times lower [5, page 180; 7, page 126], so its total capacitance is only
about twice that of the node (500/250 = 2).  The total capacitance of a
single node and its internodal region is thus about 9 x 10**-13 farads.
The energy in joules held by such a capacitor at 0.11 volts is
1/2 x C x V**2, or 1/2 x 9 x 10**-13 x 0.11**2, or 5 x 10**-15 joules.
This capacitor is discharged and then recharged
whenever a nerve impulse passes, dissipating 5 x 10**-15 joules.  A 10 watt
brain can therefore do at most 2 x 10**15 such "Ranvier ops" per second.
Both larger myelinated fibers and unmyelinated fibers dissipate more
energy.  Various other factors not considered here increase the total
energy per nerve impulse [8], causing us to somewhat overestimate the
number of "Ranvier ops" the brain can perform. It still provides a useful
upper bound and is unlikely to be in error by more than an order of
magnitude.
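
The arithmetic behind this bound can be written out explicitly (a minimal
sketch; the areas, capacitances, and voltage are the approximate figures
quoted above):

    # Energy per "Ranvier op" and the resulting upper bound.
    membrane_capacitance = 1e-6          # farads per square centimeter
    node_area_cm2 = 30e-8                # 30 square microns, in cm^2
    node_capacitance = membrane_capacitance * node_area_cm2          # ~3e-13 F

    # The internode has ~500x the node's area but ~250x lower capacitance
    # per unit area, so roughly twice the node's capacitance.
    internode_capacitance = node_capacitance * 500 / 250             # ~6e-13 F
    total_capacitance = node_capacitance + internode_capacitance     # ~9e-13 F

    voltage = 0.11                       # volts (resting potential plus Na+ gradient)
    energy_per_impulse = 0.5 * total_capacitance * voltage**2        # ~5e-15 J

    brain_power_watts = 10               # "useful" fraction of the ~25 W budget
    ranvier_ops_per_sec = brain_power_watts / energy_per_impulse     # ~2e15
    print(f"{energy_per_impulse:.1e} J/impulse, {ranvier_ops_per_sec:.0e} Ranvier ops/sec")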

To translate "Ranvier ops" (1-millimeter jumps) into synapse operations we
must know the average distance between synapses, which is not normally
given in neuroscience texts.  We can estimate it:  a human can recognize an
image in about 100 milliseconds, which allows time for at most 100 one-millisecond
synapse delays.  A single signal probably travels 100 millimeters in that
time (from the eye to the back of the brain, and then some).  If it passes
100 synapses in 100 millimeters then it passes one synapse every millimeter
-- which means one "synapse operation" is about one "Ranvier operation."
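
The spacing argument, written out (a minimal sketch of the figures above):

    # Average distance between synapses along a signal path.
    recognition_time_ms = 100       # time to recognize an image
    synapse_delay_ms = 1            # delay per synapse
    max_synapses = recognition_time_ms / synapse_delay_ms    # at most ~100 synapses

    path_length_mm = 100            # eye to the back of the brain, roughly
    mm_per_synapse = path_length_mm / max_synapses           # ~1 mm per synapse
    print(f"~{mm_per_synapse:.0f} mm per synapse: one synapse op ~ one Ranvier op")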

Discussion

Both "synapse ops" and "Ranvier ops" are quite low-level.  The higher level
"analog addition ops" seem intuitively more powerful, so it is perhaps not
surprising that the brain can perform fewer of them.

While the software remains a major challenge, we will soon be able to build
hardware powerful enough to perform more such operations per second than
can the human brain.  There is already a massively parallel multi-processor
being built at IBM Yorktown with a raw computational power of 10**12
floating point operations per second: the TF-1.  It should be working in
1991 [6].  When we can build a desktop computer able to deliver 10**25 gate
operations per second and more (as we will surely be able to do with a
mature nanotechnology) and when we can write software to take advantage of
that hardware (as we will also eventually be able to do), a single computer
with abilities equivalent to a billion to a trillion human beings will be a
reality.  If a problem might today be solved by freeing all humanity from
all mundane cares and concerns, and focusing all their combined
intellectual energies upon it, then that problem can be solved in the
future by a personal computer.  No field will be left unchanged by this
staggering increase in our abilities. 
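
The "billion to a trillion" figure follows directly from dividing the
assumed 10**25 gate operations per second by the brain estimates developed
in this article (a minimal sketch):

    # Human-brain equivalents of a 10**25 gate-op/sec machine.
    machine_ops = 1e25
    brain_ops_low, brain_ops_high = 1e13, 1e16    # estimated brain ops/sec (see Conclusion)

    print(f"{machine_ops / brain_ops_high:.0e} to {machine_ops / brain_ops_low:.0e} "
          "human-brain equivalents")    # ~1e9 to 1e12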

Conclusion

The total computational power of the brain is limited by several factors,
including the ability to propagate nerve impulses from one place in the
brain to another.  Propagating a nerve impulse a distance of 1 millimeter
requires about 5 x 10**-15 joules.  Because the total energy dissipated by
the brain is about 10 watts, this means nerve impulses can collectively
travel at most 2 x 10**15 millimeters per second.  By estimating the
distance between synapses we can in turn estimate how many synapse
operations per second the brain can do.  This estimate is only slightly
smaller than one based on multiplying the estimated number of synapses by
the average firing rate, and two orders of magnitude greater than one based
on functional estimates of retinal computational power.  It seems
reasonable to conclude that the human brain has a "raw" computational power
between 10**13 and 10**16 "operations" per second.

References

1.  Ionic Channels of Excitable Membranes, by Bertil Hille, Sinauer, 1984.
2.  Principles of Neural Science, by Eric R. Kandel and James H. Schwartz,
2nd edition, Elsevier, 1985.
3.  Tom Binford, private communication.
4.  Mind Children, by Hans Moravec, Harvard University Press, 1988.
5.  From Neuron to Brain, second edition, by Stephen W. Kuffler, John G.
Nicholls, and A. Robert Martin, Sinauer, 1984.
6.  "The switching network of the TF-1 Parallel Supercomputer" by Monty M.
Denneau, Peter H. Hochschild, and Gideon Shichman, Supercomputing, Winter
1988, pages 7-10.
7.  Myelin, by Pierre Morell, Plenum Press, 1977.
8.  "The production and absorption of heat associated with electrical
activity in nerve and electric organ" by J. M. Ritchie and R. D. Keynes,
Quarterly Review of Biophysics 18, 4 (1985), pp. 451-476.

Acknowledgements
The author would like to thank Richard Aldrich, Tom Binford, Eric Drexler,
Hans Moravec, and Irwin Sobel for their comments and their patience in
answering questions.
