Life's Ratchet: How Molecular Machines Extract Order from Chaos
Peter M. Hoffmann
We are now close to a solution of Schrödinger’s conundrum. A living organism is not based on a solid. It is not clockwork. And statistical mechanics can teach us a lot about how it works.
As we’ve seen in this chapter, randomness is here to stay. Far from the destructive force it has been made out to be through the millennia, it is good for us—or at least good for life as a whole. In the chapters that follow, we will learn how randomness is part of every aspect of life—even in the simple act of lifting an arm or converting food into motion.
How can we visualize the relationship between necessity, the laws of nature, and randomness? One way was suggested by biochemist Manfred Eigen in his 1975 book The Laws of the Game: How the Principles of Nature Govern Chance. Eigen won a Nobel Prize for the study of ultrafast chemical reactions and realized that the interplay of necessity and chance resembles games. Different games can represent different phenomena we may encounter in the principles that govern life: chemical reactions, population growth, the regulation of enzymes in cells, or evolution. A good game combines elements of necessity (it must have rules), chance (there must be surprise), and sufficient complexity. Chess has simple rules, but the totality of all chess games ever played and the variety of chess strategies used show that chess is a game full of subtle complexity. Games are models of emergence—the appearance of unexpected features arising from the interactions of many different parts, rules of the game, chance, and space and time. Life can best be understood as a game of chance—played on the chessboard of space and time with the rules supplied by physics and mathematics. To gain a physicist’s understanding of life, we need to begin with the rules the game of life obeys. To start, let us learn about the kind of games atoms play.
It has been [our] principal indeavour to enlarge and strengthen the Senses . . . by . . . outward Instruments. . . . By this means [we] find . . . that those effects of Bodies, which have been commonly attributed to Qualities, and those confess’d to be occult, are perform’d by the small Machines of Nature, which are not to be discern’d without these helps.
—Robert Hooke, Micrographia
So nat’ralists observe, a flea
Hath smaller fleas that on him prey,
And these have smaller fleas that bite ’em,
And so proceed ad infinitum.
—Jonathan Swift
When Robert Hooke peered through his primitive microscope, he found a new world of tiny “Machines of Nature,” from dimples on poppy seeds to the sting of a bee. The “machines” he saw through his microscope in the late 1600s were just the beginning: As microscopes improved, all of Hooke’s machines were found to be made of cells, which themselves were entire factories of even smaller machines, each made of smaller parts yet—all the way down to atoms and molecules.
It became clear that living things, while immensely complex, were made of the same stuff as the rest of nature.
How do atoms and molecules assemble into a flower or a human? Where do we cross the threshold from lifeless atoms and molecules to living organisms? What makes an object alive? These hard questions have puzzled scientists and philosophers for millennia. Yet, we may be the first generation to glimpse answers to these questions. To understand these answers, we must begin with the basic building blocks of nature: atoms and molecules. Atoms are tiny clumps of matter, so tiny that it takes 300,000 carbon atoms to span the width of a single human hair. A humble E. coli bacterium is only one-quadrillionth (10⁻¹⁵) the mass of a human, and yet it contains 100,000 billion atoms. Molecules are assemblies of atoms bound together by strong electrical bonds. Molecules can contain as few as two atoms and as many as tens of thousands.
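These scales can be checked with simple arithmetic. Here is a rough back-of-the-envelope sketch; the hair width and atom diameter are assumed, order-of-magnitude values, not figures from the text:

```python
# Back-of-the-envelope check of the scales described above.
# Assumed values: a human hair is roughly 90 micrometers wide,
# and a carbon atom is roughly 0.3 nanometers across.
hair_width_m = 90e-6       # width of a human hair, in meters (assumed)
carbon_atom_m = 0.3e-9     # diameter of a carbon atom, in meters (assumed)

atoms_across_hair = hair_width_m / carbon_atom_m
print(f"Carbon atoms spanning a hair: {atoms_across_hair:,.0f}")
```

With these assumptions the count comes out to about 300,000, matching the figure in the text.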
Atoms and molecules are restless. Democritus, Epicurus, and their fellow atomists already understood this important point. In air, molecules of nitrogen, oxygen, carbon dioxide, and water vapor randomly swirl around, colliding at high speeds. Without noticing, we are continuously bombarded by supersonic gas molecules from the surrounding air. The calmness we see around us is an illusion. We are surrounded—no, immersed—in chaos. Yet from such chaos order can arise: On cold winter days, randomly swirling water molecules, high in the clouds, find each other and create beautiful, symmetric snowflakes. The world we see around us—the macroscopic world—is one of order and regularity. A book on a table does not jump suddenly; nor does it spontaneously burst into flames. Yet, seen at a very small scale, a book is a mass of atoms that rattle and shake, collide, and send each other hurtling off into space. How can visible order and life’s complexity arise from such chaos?
In the late 1800s, this question occupied physicists such as Ludwig Boltzmann in Austria, James Clerk Maxwell in Scotland, and Josiah Willard Gibbs in the United States. For them, the relatively simple example of a gas provided the perfect starting point. A macroscopic volume of gas follows simple laws that relate pressure, volume, and temperature, but how did these laws arise? To find the answers, these scientists turned to the new science of statistics and invented statistical mechanics. This discipline applies statistics to the mechanics of atoms and molecules. In their thinking, if statistics can describe the height of a thousand men or the marriage age of a thousand women, it surely should be able to describe the behavior of a billion billion atoms. In everyday life, we use statistics to calculate such figures as average income, IQ (which is defined in terms of standard deviations from average intelligence), and income distributions. Similarly, physicists discovered how to calculate averages, deviations, and distributions of the speeds and energies of atoms. Although atoms move randomly, their collisions conform to physical rules. Maxwell and Boltzmann showed that each velocity component of the molecules in a gas follows a normal distribution—the same bell curve originally derived to generalize Pascal’s gambling formula—and that the molecular speeds follow the closely related Maxwell–Boltzmann distribution. Applying statistics to the chaos of atoms and molecules, they found that, averaged over time and space, the randomness of atomic motion gives way to order and regularity.
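This emergence of regularity from randomness is easy to demonstrate numerically. In the sketch below (an illustration I am adding, not the physicists' own calculation; the nitrogen mass and temperature are assumed textbook values), each molecule's velocity components are drawn from a normal distribution, and the average kinetic energy of a large sample settles onto the theoretical value of (3/2)kT:

```python
import numpy as np

# Illustrative gas: nitrogen-like molecules at room temperature.
k_B = 1.380649e-23      # Boltzmann constant, J/K
T = 300.0               # temperature, K (assumed)
m = 4.65e-26            # mass of one N2 molecule, kg (assumed)
n = 1_000_000           # number of simulated molecules

rng = np.random.default_rng(0)
sigma = np.sqrt(k_B * T / m)             # std. dev. of each velocity component
v = rng.normal(0.0, sigma, size=(n, 3))  # random (vx, vy, vz) for each molecule

# Individual molecules are wildly random, but the sample average is orderly:
mean_ke = 0.5 * m * np.mean(np.sum(v**2, axis=1))
print(f"Mean kinetic energy per molecule: {mean_ke:.3e} J")
print(f"Theoretical (3/2) k_B T:          {1.5 * k_B * T:.3e} J")
```

With a million molecules, the simulated average agrees with the theoretical value to a fraction of a percent, even though any single molecule's energy is unpredictable.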
Life is based on molecules. These molecules are subject to the underlying chaos of the molecular storm—which at first glance seems to be a destructive force. How can life survive and possibly benefit from this chaos? This was Schrödinger’s famous question. Schrödinger saw a contradiction between the chaos of atoms and the structure of life. But today we know that the chaotic motions of atoms and molecules—controlled by life’s intricate structure—give rise to life’s activity. There is no contradiction. Life emerges from the random motions of atoms, and statistical mechanics can capture the essence of this emergence.
When I was a graduate student in Baltimore in the early 1990s, I had the unfortunate experience of being robbed at gunpoint. All I had on me was ten dollars, so it was not a big loss, but it was an upsetting experience nevertheless. Money is what makes the world go round, but it also makes people do unpleasant things. In physics and biology, we have a different currency that makes things happen—we call it energy. Money and energy have a lot in common. In a transaction, where one party gives money to another party, the total amount of money is conserved: The robber gained ten dollars, and I lost ten dollars. Is it possible, in the same transaction, for me to lose eight dollars and the robber to gain ten dollars? No. Money does not appear out of nowhere.
The same is true of energy. As Helmholtz had shown, energy conservation is the strictest law of nature. In “energy transactions,” the energy before the transaction equals the energy after it. Energy can be transferred from one object to another or converted to a different form, but energy is never gained or lost.
Imagine you are standing on the moon, where there is no air friction, and you pick up a rock. As the rock rests in your outstretched hand, it has gravitational energy (stored in the attraction between the rock and the moon). When you drop the rock, it accelerates as it falls. Motion is associated with a form of energy called kinetic energy. Where does the energy for the rock’s motion originate? It comes from the gravitational potential energy stored in the attraction between the rock and the moon when you lifted the rock off the ground. The falling rock “pays” for the kinetic energy (motion) by using up gravitational energy. Throughout the rock’s fall, the total energy of the rock (gravitational plus kinetic) is always constant. Energy is conserved.
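This bookkeeping can be verified numerically. The sketch below (with assumed illustrative values: lunar gravity of about 1.62 m/s², a 1 kg rock, a 1.5 m drop) steps a falling rock forward in time and confirms that gravitational plus kinetic energy stays essentially constant:

```python
# Track gravitational potential and kinetic energy of a rock
# falling on the moon, with no air friction. All values assumed.
g = 1.62        # lunar surface gravity, m/s^2
m = 1.0         # rock mass, kg
h0 = 1.5        # drop height, m
dt = 1e-4       # time step, s

h, v = h0, 0.0
while h > 0.0:
    v += g * dt                        # the rock speeds up under gravity
    h -= v * dt                        # ... and falls
    potential = m * g * max(h, 0.0)    # energy "stored" in the attraction
    kinetic = 0.5 * m * v**2           # energy of motion
    total = potential + kinetic        # stays (numerically almost) constant

print(f"Initial energy: {m * g * h0:.4f} J")
print(f"Final energy:   {total:.4f} J")
```

At every step, the kinetic energy gained is "paid for" by gravitational energy lost; the small numerical drift comes only from the finite time step.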
Suddenly, your rock hits the ground in a cloud of moon dust and stops moving. Having already spent its gravitational energy during the fall, it now loses its kinetic energy as well. If energy conservation is true, where does the energy go on impact?
Throughout the eighteenth and nineteenth centuries, physicists studied what happened when moving objects were slowed down by impact or friction. The law of energy conservation was new, and situations like our falling rock presented a conundrum. This changed when Sir Benjamin Thompson, Count Rumford, studied the heat generated while boring a cannon from a cylinder of metal. Finding that motion (kinetic energy) could be continuously turned into heat, he concluded that heat had to be a form of energy. Before this realization, heat was thought to be a fluid (called caloric) that flowed from hot to cold objects, a fluid that eventually ran out. But Rumford refuted this idea: “It is hardly necessary to add that anything which any insulated body can continue to furnish without limitation cannot possibly be a material substance; and it appears to me to be extremely difficult . . . to form any distinct idea of anything capable of being excited and communicated, in the manner the heat was being excited and communicated in these experiments, except it be motion.” With the knowledge that heat could be created from motion, or kinetic energy, scientists wondered, “What kind of energy is heat?”
Matter is made of atoms, which are in perpetual motion. How do we know this? For a gas sealed in a container, an increase of temperature is always associated with an increase in pressure. Early work by Maxwell, Boltzmann, and others in the kinetic theory of gases explained this pressure increase by relating both temperature and pressure to the motion of atoms. In this view, pressure was the result of innumerable impacts by atoms with the walls of the container. The faster the atoms moved, the harder they hit the walls, and therefore, the greater the pressure. The atoms could be made to move faster if heat was added and temperature was increased. Temperature seemed to be related to the kinetic energy of the atoms in the gas. Kinetic theory neatly explained the macroscopic laws governing gases, but it presumed the existence of hypothetical, continually moving atoms—tiny objects no one had ever observed. Boltzmann, the Austrian father of statistical mechanics, suffered extreme ridicule for suggesting the existence of atoms. This added to his deep depression, which ended in his suicide in 1906.
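The link between temperature and molecular motion can be made quantitative with the standard kinetic-theory relation (1/2)m⟨v²⟩ = (3/2)kT. Here is a quick illustrative calculation for nitrogen, the main component of air (the molecular mass below is an assumed textbook value):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
m_N2 = 4.65e-26      # mass of one N2 molecule, kg (assumed)

def rms_speed(T):
    """Root-mean-square molecular speed from (1/2) m <v^2> = (3/2) k_B T."""
    return math.sqrt(3 * k_B * T / m_N2)

# Warmer gas -> faster molecules -> harder wall impacts -> higher pressure.
for T in (273.0, 300.0, 373.0):
    print(f"T = {T:5.1f} K: rms speed = {rms_speed(T):6.1f} m/s")
```

At room temperature this gives a root-mean-square speed of roughly 500 m/s, faster than the speed of sound, which is the sense in which we are "bombarded by supersonic gas molecules."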
Yet the idea that gases such as air were made of restless atoms was not new. Atomic motion was indirectly discovered by the botanist Robert Brown in 1827. Like his successor, Charles Darwin, Brown made his mark as a naturalist serving on a British surveying expedition. During his voyage, Brown collected thousands of Australian plant specimens, many of them previously unknown species—only to lose most of them in a shipwreck. Nevertheless, he became a well-respected naturalist, who is credited with naming the cell nucleus. Despite his daring travels, physicists best remember Brown for a discovery he made in the safety of his own home. Brown observed that pollen grains suspended in water perform a jittery dance, as if pushed by an invisible, random force. Today we call this dance Brownian motion. In some sense, Brown really only rediscovered what Democritus had observed two thousand years earlier—the “motes in the air” that were “always in movement, even in complete calm.”
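Brownian motion is well modeled as a random walk: each molecular impact delivers a small random kick. A minimal sketch (the step counts and number of walkers below are arbitrary illustrative choices) shows the walk's signature statistical regularity, a mean squared displacement that grows in proportion to the number of steps:

```python
import random

random.seed(1)

def random_walk_msd(steps, walkers):
    """Mean squared displacement of 1-D random walkers after `steps` kicks."""
    total = 0.0
    for _ in range(walkers):
        x = 0
        for _ in range(steps):
            x += random.choice((-1, 1))  # one random kick from the molecular storm
        total += x * x
    return total / walkers

# The mean squared displacement grows roughly linearly with the number
# of steps (i.e., with elapsed time), the hallmark of Brownian motion:
for steps in (100, 400, 1600):
    print(f"{steps:5d} steps: MSD = {random_walk_msd(steps, 2000):7.1f}")
```

Each individual walker jitters unpredictably, just like Brown's grains, yet the average over many walkers obeys a clean law; this is exactly the relation Einstein later exploited to pin down the reality of atoms.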