The Thermodynamics of Brains and Computers

[Images: a human brain (upper left) and the Blue Gene parallel computer (upper right)]


If you have ever used a laptop computer on your lap, you have likely noticed that it can get hot, especially when it is doing intensive processing such as playing a movie. This simple observation raises several interesting and difficult questions related to thermal physics: Does a computer have to generate heat to carry out a computation? If so, what is the least amount of heat that must be generated for a given amount of computation? Or, alternatively, what is the most computation one can obtain from a given amount of energy? How do the details of heat generation depend on the physical components of the computer, and how can one choose those components to reduce the production of heat?

To give you some perspective on these issues, let's compare a biological computer, the adult human brain (see upper left image), with the Blue Waters parallel computer (see upper right image), which, when it becomes available for academic research in 2011 at the National Center for Supercomputing Applications in Illinois, will be one of the world's most powerful computers. It is difficult to compare brains with digital silicon-based computers because they work on such different principles (and because we do not yet know all the principles by which brains process information). But a rough estimate, based on the number of digital electrical pulses (action potentials) that the approximately $10^{11}$ neurons in a human brain send to one another each second via approximately $10^{14}$ contacts (synapses), suggests that an adult human brain carries out about one thousand trillion ($10^{15}$) logical operations per second, that is, about a petaflop of computational capability. The Blue Waters supercomputer will also have a raw hardware capability of about one petaflop. (So you are living during an interesting time in history, when digital computers are just starting to attain the raw computational capability of an adult human brain.)
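To see how an estimate of order $10^{15}$ operations per second can come about, here is a minimal back-of-the-envelope sketch in Python. The average firing rate of roughly 10 action potentials per neuron per second is an assumed figure, not stated in the passage; each pulse arriving at a synapse is counted as one elementary logical operation.

```python
# Back-of-the-envelope estimate of the brain's computational throughput.
# The ~10 Hz average firing rate is an assumption (not given in the text);
# the neuron and synapse counts are the ones quoted above.

n_neurons = 1e11      # approximate number of neurons in an adult human brain
n_synapses = 1e14     # approximate number of synaptic contacts
firing_rate = 10.0    # assumed average action potentials per neuron per second (Hz)

spikes_per_second = n_neurons * firing_rate            # ~1e12 action potentials/s
synapses_per_neuron = n_synapses / n_neurons           # ~1e3 contacts per neuron

# Count each pulse arriving at a synapse as one elementary "logical operation".
ops_per_second = spikes_per_second * synapses_per_neuron   # ~1e15 operations/s

petaflop = 1e15
print(f"Estimated brain throughput: {ops_per_second:.1e} ops/s "
      f"(~{ops_per_second / petaflop:.0f} petaflop)")
```

With these (rough) assumptions the estimate lands at about one petaflop, the same order of magnitude quoted for Blue Waters.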

Although a human brain and the Blue Waters computer are roughly comparable in computational power, they differ enormously in their thermodynamic properties. The Blue Waters computer will occupy approximately 2,000 square meters of floor space, will consume about 15,000,000 watts, will require an intricate cooling system (hidden under the floor) to dissipate the resulting heat, and will be utterly non-portable. A human brain is about the size of your two fists, consumes about 15 watts (even when you are solving a hard physics problem), and is completely practical to carry around, since it has a mass of about 1.5 kg. (It is also worth pointing out that brains, unlike computers, assemble themselves and are able to program themselves.)
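Using only the numbers quoted above (about one petaflop for each machine, 15 watts for the brain, and 15,000,000 watts for Blue Waters), a short Python sketch makes the efficiency gap explicit in terms of operations per joule.

```python
# Crude comparison of computational energy efficiency, using only the
# figures quoted in the text.

ops_per_second = 1e15        # ~1 petaflop, for both the brain and Blue Waters

brain_power = 15.0           # watts
blue_waters_power = 15e6     # watts

brain_efficiency = ops_per_second / brain_power            # ~7e13 operations per joule
computer_efficiency = ops_per_second / blue_waters_power   # ~7e7 operations per joule

print(f"Brain:       {brain_efficiency:.1e} operations per joule")
print(f"Blue Waters: {computer_efficiency:.1e} operations per joule")
print(f"The brain is roughly {brain_efficiency / computer_efficiency:,.0f} times "
      f"more energy efficient on this crude accounting.")
```

On this accounting the brain performs about a million times more computation per joule, which is the comparison taken up in the next paragraph.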

So brains are much more impressive than the current best supercomputers in terms of computational capability per unit volume and per unit power. But as physicists, we are obliged to ask an additional question: although brains are impressive compared to computers, are they impressive in some absolute physical sense? That is, has evolution led to brains that are close to optimal, in the sense that no other physical system operating at room temperature and consuming 15 watts can achieve a petaflop or more of computation? The answer is not known and is being studied by a variety of physicists, engineers, computer scientists, and biologists.
