Brains greatly exceed digital computers in computation per volume per watt

[Figure: a human brain and a BlueGene parallel computer.]

You are living during an interesting moment in human history in that, according to some estimates, the computational capability of the most powerful digital computers is now roughly comparable to that of a single adult human brain. The most powerful current digital computers (called supercomputers) can carry out about 100 petaflops, or 10^17 floating point operations per second, where a floating point operation (or flop) is a calculation that combines the bits of two 64-bit computer words to produce a new 64-bit computer word.

Biological brains operate on different principles than digital computers, and these principles are not yet well understood, so it is difficult to compare directly the computational capability of a human brain with that of a supercomputer. But some order-of-magnitude estimates (say, based on the number of action potentials transmitted per second throughout an entire human brain, whose approximately 10^11 neurons share about 10^14 interconnections) suggest that human brains also carry out the equivalent of 100-1000 petaflops of information processing.
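A minimal sketch of how such an estimate goes, using the neuron and synapse counts quoted above. The mean firing rate and the number of flop-equivalents per synaptic event are illustrative assumptions, not measured constants; plausible variations in them account for the factor-of-ten spread in the estimate:

```python
# Back-of-the-envelope estimate of the brain's information-processing rate.
# The synapse count comes from the text; the firing rate and flops-per-event
# values are assumed round numbers for illustration.

N_SYNAPSES = 1e14          # total interconnections (synapses) in a human brain (from text)
MEAN_FIRING_RATE = 10.0    # assumed average action potentials per neuron per second (Hz)
FLOPS_PER_EVENT = 100.0    # assumed flop-equivalents per synaptic event (rough guess)

# Each synapse sees roughly one event per presynaptic action potential.
synaptic_events_per_s = N_SYNAPSES * MEAN_FIRING_RATE
flops_equivalent = synaptic_events_per_s * FLOPS_PER_EVENT

print(f"Synaptic events per second:   {synaptic_events_per_s:.1e}")
print(f"Flop-equivalents per second:  {flops_equivalent:.1e}")  # ~1e17, i.e. ~100 petaflops
```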

But consider the huge differences between the power and volume requirements of these two computing systems. A 100-petaflop supercomputer requires about 15,000,000 watts (enough power to support a city of about 10,000 homes), occupies roughly the area of an American football field with its interconnected cabinets of CPUs, and requires a sophisticated and expensive cooling system to remove the large amount of heat produced. In contrast, your brain, even when solving a difficult physics problem, consumes about 15 watts (the power needed to keep a rather dim light bulb lit) and has a volume of about two fists. (And of course, brains are more impressive than supercomputers in other ways: they self-assemble from a single cell, and they are self-learning entities that can master physics, math, language, art, music, and sports without being explicitly programmed.)
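Putting the round figures quoted above side by side gives the energy efficiency of each system in flops per watt (a sketch of the arithmetic, using only the numbers already stated):

```python
# Energy efficiency comparison using the round numbers quoted above.
SUPERCOMPUTER_FLOPS = 1e17   # ~100 petaflops
SUPERCOMPUTER_WATTS = 1.5e7  # ~15,000,000 W

BRAIN_FLOPS = 1e17           # same order-of-magnitude estimate as above
BRAIN_WATTS = 15.0           # ~15 W

super_eff = SUPERCOMPUTER_FLOPS / SUPERCOMPUTER_WATTS
brain_eff = BRAIN_FLOPS / BRAIN_WATTS

print(f"Supercomputer: {super_eff:.1e} flops/W")                  # ~7e9 flops/W
print(f"Brain:         {brain_eff:.1e} flops/W")                  # ~7e15 flops/W
print(f"Brain is ~{brain_eff / super_eff:.0e} times more efficient")  # ~1e6
```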

So this raises a challenging and relevant biophysics question for 21st century science and society: what are the tricks that evolution has discovered and uses to achieve so much computation in such small volumes and with so little power? Could we use these tricks to eliminate the many large power-intensive computer centers that companies like Amazon, Google, Microsoft, IBM, and Apple are using to maintain their cloud information systems?

As physicists, we feel obliged to ask further related questions about the relation between computation and power. Although brains are impressive compared to supercomputers, how good are they compared to bounds set by the laws of physics? For example, if one has a computing device of any kind, biological or not, that consists of molecules and operates at room temperature (300 K), what is the maximum amount of computation that can be carried out per watt? The answer is not known and is being studied by a variety of physicists, engineers, computer scientists, and biologists.
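One relevant reference point, though not a complete answer to this question, is Landauer's principle, which states that erasing a single bit at temperature T dissipates at least k_B T ln 2 of energy. A minimal sketch of what that implies at 300 K; note that this bounds irreversible bit erasures, not flops, since a 64-bit flop involves many elementary bit operations:

```python
import math

# Landauer's principle: erasing one bit at temperature T dissipates
# at least k_B * T * ln(2) of energy. This bounds irreversible bit
# erasures per watt; it does not directly bound flops.
K_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

energy_per_bit = K_B * T * math.log(2)   # ~2.9e-21 J
bit_ops_per_watt = 1.0 / energy_per_bit  # ~3.5e20 erasures per second per watt

print(f"Minimum energy to erase one bit:  {energy_per_bit:.2e} J")
print(f"Max irreversible erasures per watt: {bit_ops_per_watt:.2e} /s")
```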

The maximum amount of computation per unit mass is roughly known if one removes the restriction of operating at room temperature and with molecular components. Seth Lloyd and others have used back-of-the-envelope estimates based on quantum mechanics, special relativity, statistical physics, and general relativity to show that the maximum amount of computation that one can carry out with a kilogram of matter is about 10^50 flops, far beyond the approximately 10^17 flops that a human brain achieves. But it is utterly impractical to achieve this limit since it requires heating the kilogram of matter to a super-hot plasma of 10^9 K and simultaneously crushing the plasma into a black hole. The computational bound is then attained just as the plasma passes irreversibly through the black hole's event horizon.
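For a sense of where the 10^50 figure comes from: Lloyd's estimate rests on the Margolus-Levitin theorem, which limits a system of average energy E to at most 2E/(pi*hbar) elementary operations per second. Taking E = mc^2 for one kilogram gives (a sketch of the arithmetic, not a derivation):

```python
import math

# Lloyd's ultimate limit for 1 kg of matter, via the Margolus-Levitin
# bound: ops/s <= 2E / (pi * hbar), assuming all mass-energy E = m c^2
# is available for computation.
HBAR = 1.054571817e-34  # reduced Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s
m = 1.0                 # mass, kg

E = m * C**2                           # ~9.0e16 J of mass-energy
max_ops_per_s = 2 * E / (math.pi * HBAR)

print(f"Mass-energy of 1 kg:       {E:.2e} J")
print(f"Margolus-Levitin bound:    {max_ops_per_s:.1e} ops/s")  # ~5e50, i.e. ~1e50 flops
```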
