

Maxwell’s demon, entropy and the destruction of information

STAR SCIENCE - Alfonso M. Albano, Ph.D.

The Scottish physicist James Clerk Maxwell was one of the scientific giants of the 19th century. In Einstein’s view, Maxwell’s work was “the most profound and the most fruitful that physics had experienced since the time of Newton.”

Einstein was undoubtedly referring to Maxwell’s unification of electricity, magnetism, and optics. At the turn of the 19th century, these three were more or less separate and unrelated fields. By the late 1860s, Maxwell had united all three fields. “Maxwell’s Wonderful Equations,” as they are sometimes called, showed that electricity and magnetism are but different aspects of the same set of phenomena, and that light is a propagating electric and magnetic disturbance — an electromagnetic wave.

Maxwell’s great unification often overshadows his other scientific achievements, which were considerable. His contributions to thermodynamics, by themselves, would have been enough to secure his place among the scientific greats of the century.

Thermodynamics is summarized by laws of such remarkable generality that they characterize the interactions of all physical systems involving exchanges of heat, energy, and matter. They account for many aspects of the behavior of the gases that chemistry students manipulate in their laboratories, of automobile engines, of galaxies or of the entire universe.

The first of these laws recognizes that heat is a form of energy, and that while the energy of an isolated system may change from one form to another, its total value remains constant. Only processes that conserve energy can occur, the first law says. But not all energy-conserving processes do occur. The second law delineates which energy-conserving processes actually occur.

The second law says that spontaneous energy transformations in an isolated system tend to make less energy available for doing work. This second law is expressed in a large variety of ways: “perpetual motion machines are impossible,” “heat spontaneously flows from hot to cold,” “life is the gradual degradation of the useful into the useless.”

These various versions are elegantly brought together by a concept introduced by Rudolf Clausius in 1865. He called it “entropy,” a measure of the energy that is unavailable for doing work. Clausius’ statement is so pithy that it could be put on a bumper sticker: “The entropy of an isolated system never decreases.”
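
For readers who prefer symbols, the two laws can be compressed into the standard textbook notation below. The notation is mine, not Clausius’s, but it says the same thing as the bumper sticker.

```latex
% Standard textbook statements of the two laws (illustrative notation, not
% quoted from the column). First law: the internal energy U of a system
% changes only through heat Q absorbed and work W done by the system:
\[
  \Delta U = Q - W .
\]
% Clausius's entropy: heat absorbed reversibly at absolute temperature T
% changes the entropy S by
\[
  dS = \frac{\delta Q_{\text{rev}}}{T},
\]
% and the second law for an isolated system is simply
\[
  \Delta S \ge 0 .
\]
```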

A newly boiled egg dropped into a bowl of cool water loses heat to the water until both are equally lukewarm. But a lukewarm egg does not draw heat from lukewarm water to become, once again, a hot egg in cool water. Spontaneous heat flow from hot to cold is allowed by the second law; flow the other way around is not. The second law defines the direction of spontaneously occurring processes; it defines an “arrow of time.”

At a time when the notion that matter is constituted of molecules was just beginning to be accepted, sometimes grudgingly, Maxwell helped lay the foundations for an explanation of thermodynamics based on the behavior of molecules. Similar work was done independently by Ludwig Boltzmann, who eventually brought it to completion in the 1870s.

Boltzmann’s formulation added a new twist to entropy. In addition to its thermodynamic meaning, Boltzmann established that entropy is also a measure of “missing information”: the extra information needed to specify certain molecular details of the system. This notion of entropy was eventually used by Claude Shannon in the 1940s to create what is now known as “Information Theory,” except that instead of dealing with information about molecules, Shannon dealt with information about messages being transmitted along telephone lines (he was working for Bell Telephone Laboratories at the time).
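
The kinship between the two ideas is easiest to see with the formulas side by side. These are the standard expressions attributed to Boltzmann and Shannon; they do not appear in the column’s text.

```latex
% Boltzmann: the entropy of a macroscopic state that can be realized by W
% distinct molecular arrangements (k_B is Boltzmann's constant):
\[
  S = k_B \ln W .
\]
% Shannon: the average missing information, in bits, of a message whose
% symbols occur with probabilities p_1, ..., p_n:
\[
  H = -\sum_{i=1}^{n} p_i \log_2 p_i .
\]
```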

One of the tenets of the new molecular explanation of thermodynamics is that temperature is a measure of the energy of random molecular motion. The faster the molecules move — the more energetic they are — the hotter the body is. Maxwell made use of this concept in 1867 to create what is now known as “Maxwell’s Demon,” a fiend of molecular proportions that has since bedeviled generations of physicists.

Imagine a box containing a gas. The box is divided into two by a wall. The two sides initially have the same temperature. That is, their molecules have the same distribution of velocities. Now, imagine a hole in the wall with a trapdoor that is controlled by the demon who selectively allows fast molecules to go from one side to the other, and slow molecules to go the other way. Eventually, there are more fast molecules on one side than on the other. Faster molecules mean more energy, higher temperature. The demon has created a temperature difference apparently without causing any other changes.
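
A toy simulation makes the demon’s trick concrete. The sketch below is purely illustrative (the molecule counts, the speed threshold, and the helper names are all invented for this example): it starts with two sides whose molecules are drawn from the same speed distribution, lets a “demon” pass fast molecules one way and slow molecules the other, and compares a crude temperature proxy, the mean squared speed, before and after.

```python
import random

# Toy illustration of Maxwell's demon (illustrative only; all numbers and
# names are invented for this sketch). Each "molecule" is a 1-D velocity
# drawn from the same distribution on both sides, so the two sides start
# at the same temperature, measured here as the mean squared speed.

random.seed(1)
left = [random.gauss(0.0, 1.0) for _ in range(5000)]
right = [random.gauss(0.0, 1.0) for _ in range(5000)]

def temperature(side):
    """Crude temperature proxy: mean squared speed on one side."""
    return sum(v * v for v in side) / len(side)

print("before:", round(temperature(left), 3), round(temperature(right), 3))

# The demon inspects molecules that happen to reach the trapdoor and lets
# fast ones cross from left to right, slow ones from right to left,
# blocking everything else.
THRESHOLD = 1.0
for _ in range(20000):
    if left:
        v = random.choice(left)
        if abs(v) > THRESHOLD:       # fast molecule: allow left -> right
            left.remove(v)
            right.append(v)
    if right:
        v = random.choice(right)
        if abs(v) <= THRESHOLD:      # slow molecule: allow right -> left
            right.remove(v)
            left.append(v)

print("after: ", round(temperature(left), 3), round(temperature(right), 3))
# The right side ends up hotter (it collected the fast molecules) and the
# left side cooler: a temperature difference created by sorting alone.
```

Nothing in the sketch does work on the gas; the demon merely sorts, which is exactly what appears to threaten the second law.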

This seems to be a clear violation of the second law — it is as if heat had flowed from one part of the gas to another; as if a lukewarm egg had absorbed heat from lukewarm water, making the egg hot and the water cool once more!

Since Maxwell’s time, there have been numerous attempts to explain why the demon does not “really” violate the second law. There have also been equally numerous refutations of the explanations. The currently unrefuted explanation is due to Charles Bennett, who brings information into the discussion using results of Rolf Landauer, a former colleague of his at IBM. “Information is physical,” Landauer wrote, and processes manipulating information have physical implications. Using information theory, computer science, and thermodynamics, Landauer proved that while recording information may not have entropic consequences, destroying information does. Destroying information is irreversible. Forgetting is costly. It increases the entropy of the universe.
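
Landauer’s result is usually quoted as a lower bound on the cost of erasing a single bit. The inequalities below are the standard statement of that bound; the numbers are not from the column.

```latex
% Landauer's bound (standard statement): erasing one bit of information
% increases the entropy of the surroundings by at least
\[
  \Delta S \ge k_B \ln 2 ,
\]
% which, at absolute temperature T, corresponds to dissipating at least
\[
  E \ge k_B T \ln 2
\]
% of energy as heat (roughly 3e-21 joules per bit at room temperature).
```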

In the process of determining which molecules it should allow to go to one side of the container or the other, the demon had to gather information about the molecules. So, it was not just the distribution of molecules that was altered. The demon, too, had changed. Before we concentrate our attention on what has happened to the molecules, we must first restore the demon to its original state. All the information it obtained has to be erased — but erasure increases entropy. Bennett showed in 1982 that the increase is enough to make up for the entropy decrease that resulted from the separation of the fast molecules from the slow ones.
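
Bennett’s bookkeeping can be compressed into a pair of inequalities, again in standard notation rather than anything quoted above: whatever entropy the demon’s sorting removes from the gas, erasing its record of the measured bits puts at least as much back.

```latex
% Bennett's accounting (standard notation, illustrative): if the demon's
% record holds n bits, sorting can lower the entropy of the gas by at most
\[
  |\Delta S_{\text{gas}}| \le n \, k_B \ln 2 ,
\]
% while erasing those n bits raises the entropy of the surroundings by at least
\[
  \Delta S_{\text{erase}} \ge n \, k_B \ln 2 ,
\]
% so the total entropy of gas, demon, and surroundings never decreases.
```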

The existence of a connection between the entropy of information and the entropy of material systems is truly significant, and it is turning out to be useful in an area that Maxwell could not have anticipated — computer science. This is not surprising, since what computers do is manipulate information. There is work underway to design implementations of logical operations that minimize entropy production, and to use physical processes that do the same. The less entropy a computer generates, the less energy it wastes — an important consideration not only for the development of more efficient computers but also for the responsible stewardship of our planet’s limited resources. After more than a century, Maxwell’s demonic creation seems to have been exorcised by the destruction of information and transformed into a benign spirit.
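
One concrete way to see the connection is to compare an ordinary logic gate with a reversible one. The short sketch below is illustrative (the function names are invented for this example): an AND gate maps four input pairs onto only two outputs, so it discards information every time it fires, while the Toffoli gate, which maps (a, b, c) to (a, b, c XOR (a AND b)), is a bijection on three bits and in principle need not erase anything.

```python
from itertools import product

# Illustrative sketch (names invented for this example): counting distinct
# outputs shows which gate destroys information and which does not.

def and_gate(a, b):
    """Ordinary AND: several inputs share one output, so information is lost."""
    return a & b

def toffoli(a, b, c):
    """Toffoli gate: flips c only when a and b are both 1; a bijection on 3 bits."""
    return (a, b, c ^ (a & b))

and_outputs = {and_gate(a, b) for a, b in product((0, 1), repeat=2)}
toffoli_outputs = {toffoli(a, b, c) for a, b, c in product((0, 1), repeat=3)}

print(len(and_outputs), "distinct outputs from 4 AND inputs")          # 2: irreversible
print(len(toffoli_outputs), "distinct outputs from 8 Toffoli inputs")  # 8: reversible
```

By Landauer’s argument, the information the AND gate throws away must eventually show up as entropy; a computer built from reversible gates can, in principle, avoid that cost.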

* * *

Dr. Alfonso M. Albano is Marion Reilly Professor Emeritus of Physics at Bryn Mawr College in Bryn Mawr, Pennsylvania, USA. His research interests are in nonlinear dynamics and the use of nonlinear dynamical techniques for the analysis of complex biological and biomedical signals. He can be reached at [email protected].
