Life's Ratchet: How Molecular Machines Extract Order from Chaos

Maxwell’s demon illustrates why there is a connection between missing information and entropy. As discussed in Chapter 3, the reason the entropy of a gas at equilibrium is high is that it could be in many different microstates, all of which would be compatible with the observed macrostate (pressure, temperature, and so on). But we do not know what the particular microstate of the gas is at each moment in time. Maxwell’s demon, however, would know the microstate of the system, and thus, in some sense, reduce the entropy of the system just by having more information about it (reducing the missing information). But in order to repeatedly learn more about the microstate of the gas (which changes all the time as gas molecules collide), the demon would need to erase old information to make space for the new, turning knowledge back into missing information. This erasure, according to Landauer and Bennett, comes with an energy cost. In other words, each time you erase information, you dissipate energy and increase entropy.

In the case of the demon, erasing information restores or “resets” the system to its original state, allowing a new measurement cycle to begin. But since this erasure leads to an increase in entropy, the demon could do his demonic deed only once without paying a price. He would not be able to do it “for free” repeatedly. The same applies to machines. To make a small machine perform repeated motions, a reset step is needed, as the machine needs to be returned to its original state before it can begin a new cycle. And it is this reset step that leads to an inevitable increase in entropy.

More recently, it has been pointed out that the entropy increase due to erasure of information assumes the second law, and therefore cannot be used to prove the second law. That would be a circular argument. Maxwell’s demon may thus continue to haunt physicists’ dreams. Whatever the correct answer to Maxwell’s demonic challenge, my personal feeling is that Landauer and Bennett were on the right track. Clearly, such demons do not exist—otherwise, highballs would boil spontaneously, as in Gamow’s tale of Mr. Tompkins. Every attempt to build a demon has failed (as evidenced by our inability to devise a perpetual-motion machine). Moreover, recent nanotechnology-based experiments have confirmed Landauer’s conjecture that erasure of information creates entropy.

At the 2011 American Physical Society meeting in Dallas, physicists Yonggun Jun and John Bechhoefer of Simon Fraser University in Burnaby, Canada, reported on an experiment in which they stored and erased information using a 200-nm plastic bead suspended in water. Following its motion with a microscope, they could calculate the heat dissipated each time a bit of information was erased. A bit is a unit of information, contained in a “yes” or “no,” a “1” or a “0.” Landauer had postulated a minimum amount of dissipated energy when a bit of information is erased: Boltzmann’s constant, times the temperature at which the bit is erased, times the natural logarithm of 2. Jun and Bechhoefer found that sometimes the heat released was less than Landauer’s limit, because the surrounding water molecules occasionally helped the bead along. But this happened only at random times. In the long run—that is, in the statistical limit—Landauer’s limit stood. This work brilliantly confirmed Smoluchowski’s and Maxwell’s hunch that the second law can be violated, but not repeatedly and not predictably. It is a statistical law.
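Landauer’s minimum is easy to put a number on. A minimal sketch, using the standard values of Boltzmann’s constant and room temperature (the specific numbers are my illustration, not from the experiment described above):

```python
import math

# Landauer's limit: erasing one bit of information dissipates at least
# k_B * T * ln(2) of heat. Here we evaluate it at room temperature.

k_B = 1.380649e-23      # Boltzmann's constant, in joules per kelvin
T = 300.0               # room temperature, in kelvin

limit_joules = k_B * T * math.log(2)
print(f"Landauer limit at {T:.0f} K: {limit_joules:.2e} J per bit")
# -> roughly 2.87e-21 J: a minuscule amount, but unavoidable on average
```

The number is tiny by everyday standards, which is why the cost of erasure only matters for machines as small as Jun and Bechhoefer’s bead, or for life’s molecular machines.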

Reversibility
 

In 1918, Smoluchowski (posthumously) published a paper titled, “About the Concept of Chance and the Origin of the Laws of Probability in Physics.” When this paper was published, the kinetic picture of matter, pioneered by Maxwell and Boltzmann, was already solidly accepted. Even the old critics, who had given Boltzmann such a hard time, had grudgingly accepted the existence of atoms and molecules. Probability now reigned supreme in theories of gases and liquids. The final blow to the anti-atomists was Perrin’s experimental results, which completely confirmed the Einstein-Smoluchowski theory of Brownian motion.

Yet, just a few decades earlier, Boltzmann seemed to be fighting a losing battle. The kinetic theory was heavily criticized. One of the main objections was what Manfred Eigen would much later call Loschmidt’s demon. Boltzmann had tried to show that the second law was a direct result of the motions of atoms—in other words, even in the case of just a few interacting atoms, the second law would hold. Boltzmann reasoned as follows: If molecules were initially in some low-entropy state, their collisions with other molecules would make their distribution of velocities more random, and increase entropy. However, the Austrian physicist Josef Loschmidt (1821–1895), a friend of Boltzmann’s, pointed out that if that were true, what would happen if we reversed time? Loschmidt’s demon was a powerful creature that could reverse time at will. In the realm of atoms, a collision obeys all known laws of physics, no matter if you play time forward or backward. Think of two billiard balls: Ignoring the player and the cue, concentrate on the moment when the two billiard balls collide. If you were to film this instant and play it to an audience forward or backward in time, it would be impossible to tell which is which. Simple elastic collisions, like collisions between molecules, are time reversible—they look the same run forward or backward. With this in mind, how could a time-irreversible law, like the second law, emerge from the reversible mechanics of molecules?

The answer to this conundrum was twofold. First, for molecules to move toward a more probable velocity distribution, they must be starting out with a less probable distribution. Thus, to see the second law in action, we have to assume that initially, the velocity distribution was improbable, and the entropy low. Then collisions shook things up, making the distribution more probable and increasing entropy. Thus, irreversibility came from the fact that the initial system was not at equilibrium. That is, it was not in a state of maximum entropy. This has consequences for the entire universe we live in: If there is such a thing as the arrow of time, which points from past to future, this arrow can only be there because the universe started in a very low-entropy state. Stars, galaxies, planets, and living beings have been feeding off the low entropy ever since.

The second part of the answer to how irreversibility can emerge from the reversible mechanics of particles is that the system has to be large enough—must contain enough molecules—so that collisions always mix things up. This is because in a large system, motions are generally uncorrelated, and molecular chaos reigns. If this is the case, what would happen in small systems?

In the late 1990s, a Los Alamos nuclear physicist, Christopher Jarzynski, derived an equality that electrified physics, especially the study of small molecular systems. Jarzynski’s equality quantified how often small molecular systems violate the second law. As we have seen, small systems can violate the law at random times—and this is why, strictly speaking, we should not apply the second law to such small systems. Leaving this caveat aside, how often do molecules violate the law, and what would be the consequences?

The second law tells us any directed motion of a system will always encounter the resistance of friction. Friction is the result of many randomly moving molecules scavenging energy away from any nonrandom motion. Now let’s imagine that a clever high school student has just learned about the conservation of energy. She devises a scheme for measuring the height of a mountain: Roll a ball down the mountain, starting the ball from rest, and measure its speed at the bottom. Then calculate the height. This calculation is an easy exercise, and I give problems like this to my introductory physics students. All you have to do is realize that according to energy conservation, the initial gravitational energy (ball on top of the mountain) has to equal the final kinetic energy when the ball reaches the bottom of the mountain. Gravitational potential energy is proportional to height, and thus, equating the two energies, we can solve for the height (it comes out to h = v^2/(2g), where h is the height of the mountain, v is the speed of the ball at the bottom, and g is the acceleration due to gravity). But in practice, the measurement always falls short of the calculation. The kinetic energy of the ball at the foot of the mountain is a little bit less than the gravitational potential energy at the top of the mountain. This is because some of the energy is lost as heat, due to friction. Now the clever high school student decides to improve her accuracy by repeating the experiment a hundred times. Would that help? Not really; friction will always be there, and every single measurement will fall short.
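The student’s calculation can be sketched in a few lines. The mountain height and the size of the friction loss below are made-up numbers for illustration:

```python
import math

# Energy conservation: m*g*h = (1/2)*m*v^2, so h = v^2 / (2g).
# A frictionless prediction versus a "measured" speed that falls short
# because some energy has been lost as heat along the way.

g = 9.81                      # acceleration due to gravity, m/s^2

def height_from_speed(v):
    """Height implied by energy conservation, ignoring friction."""
    return v**2 / (2 * g)

true_height = 100.0                            # m (illustrative)
ideal_speed = math.sqrt(2 * g * true_height)   # speed with zero friction
measured_speed = 0.95 * ideal_speed            # friction steals ~10% of the energy

estimate = height_from_speed(measured_speed)
print(f"estimated height: {estimate:.1f} m")   # 90.2 m -- short of 100 m
```

Because the speed enters squared, losing 5 percent of the speed means losing almost 10 percent of the inferred height, and no amount of repetition removes the bias.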

Now let us imagine a similar experiment at the nanoscale. Shrunk to the nanoscale, our high school student repeats her experiment on a nanosize mountain. Most of the time, her measurements show the same trend as the macroscopic measurements: The speed is less than expected from the height of the mountain, because friction has taken its toll. But much to the nanoscale student’s surprise, rarely and at completely random times, the speed of the ball is more than what is expected. The randomly moving atoms in the surroundings did not resist the motion of the ball, as one would expect, but actually pushed the ball along! When systems are small enough, there is a finite probability, though rare, that the atomic chaos surrounding the system actually adds energy to the system, rather than stealing energy.

Is there a way to combine measurements and find the height of the nanomountain? Yes, there is. Jarzynski’s equality makes this possible, by averaging over an exponential function of the kinetic energy and not over the measured kinetic energies. Jarzynski showed that theoretically, you could obtain energy differences between two states (for example, top of the mountain and bottom of the mountain) from measurements in the presence of molecular chaos, and thus friction.
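Jarzynski’s trick of averaging an exponential rather than the raw measurements can be demonstrated numerically. A minimal sketch, assuming (my assumption, not from the book) that the measured work values scatter around the true energy difference in a Gaussian way, with a mean dissipation tied to the spread as in a simple linear-response model; energies are in units of kT:

```python
import math
import random

# Toy demonstration of Jarzynski's equality: <exp(-W/kT)> = exp(-dF/kT).
# Assumption: work values W are Gaussian with standard deviation sigma,
# and the average dissipation is sigma^2/(2*kT) above the true
# free-energy difference dF (the condition under which the equality
# holds exactly for a Gaussian work distribution).

random.seed(42)
kT = 1.0
dF_true = 5.0        # true energy difference between the two states
sigma = 1.0          # spread of the measured work values
n = 200_000          # number of simulated "pulls"

# Each pull dissipates energy on average, so plain averaging overshoots dF.
works = [random.gauss(dF_true + sigma**2 / (2 * kT), sigma) for _ in range(n)]

naive = sum(works) / n
jarzynski = -kT * math.log(sum(math.exp(-w / kT) for w in works) / n)

print(f"plain average of work: {naive:.3f}")      # biased high: friction's toll
print(f"Jarzynski estimate:    {jarzynski:.3f}")  # recovers dF_true
```

The exponential weighting gives the rare, second-law-violating pulls (those with unusually small work) an outsized influence, and that is exactly what cancels the bias from friction.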

Experimental confirmation of this astonishing theorem did not have to wait long: Using laser tweezers, the biophysicists Carlos Bustamante and Jan Liphardt at the University of California–Berkeley pulled on a single RNA molecule containing a loop. They wanted to know the energy difference between RNA molecules with the loop closed and with the loop open. But how could this be measured? Each time they pulled, they got a different answer. The surrounding water molecules created friction and made the measured energy difference between the open- and closed-loop states larger than the actual energy difference between the two states. One way to get close to the correct answer was to do the measurement very, very slowly. Going slow helps, because slow motion is associated with low kinetic energy, and if the kinetic energy is low, the surrounding atoms cannot steal as much. However, when they pulled on the loop at high speed, the measured energy difference was almost always higher than the values measured at slow speeds. Friction had taken its toll.

Sometimes, they saw the opposite, and the energy difference they measured was less than the minimum energy required to open the loop. This meant that the second law was occasionally violated. In these rare cases, randomly moving water molecules helped open the loop instead of resisting. Applying Jarzynski’s formula, Bustamante and Liphardt averaged all their data, and the correct answer emerged. It was now experimentally confirmed: Nanoscale systems occasionally violate the second law of thermodynamics. At the molecular scale, entropy can sometimes spontaneously decrease (although, strictly speaking, entropy is not defined at this scale). When that happens, it is as if time has reversed.

Thus at the nanoscale, and for short times, Loschmidt’s and Maxwell’s demons can rouse from their slumbers and seemingly violate the second law. Could life’s machines be Maxwell demons, creating order out of chaos by relying on the rare and unpredictable occasions when the second law is violated?

Perpetuum Mobile
 

The answer to the question posed in the previous section is clearly no. All available evidence shows that life is not based, in any shape or form, on violating the second law. How do we know this? Because Lavoisier, Helmholtz, and many others determined that our bodies do not create energy, but rather waste it. The efficiency of a human body (i.e., the amount of physical work obtained compared with the food energy intake) is about 20 percent. The rest (80 percent of the food energy intake) is either directly turned into heat through friction or serves to maintain basic metabolic processes in our cells.

Helmholtz and Mayer had already realized that any violation of the first or second law by living organisms would mean that a perpetual-motion machine was possible—a machine that could either make energy out of nothing or operate at 100 percent efficiency. Both are impossible. The impossibility of a perpetuum mobile provided the main argument against the existence of a vital force. The argument went as follows: If there were a mysterious force that did not come from physical energy conversions, but was somehow inherent in life itself, such a force could add additional energy to a living system—energy that was not supplied as food. Such a living system would be a perpetual-motion machine, which is impossible. Helmholtz’s obituary in the Proceedings of the Royal Society made a similar argument: “Helmholtz was led to the discussion of this subject [the conservation of energy] by reflections on the nature of the ‘vital force.’ He had convinced himself that if it were true that living organisms could restrain or liberate the action of chemical or physical forces, perpetual motion would be realized.” A life force could create energy out of nothing—a prospect that was clearly absurd. Even in the eighteenth century, most physicists firmly believed that the creation of a perpetual-motion machine was—in principle—impossible.
