Life's Ratchet: How Molecular Machines Extract Order from Chaos
Peter M. Hoffmann
During energy transformation, it is difficult to keep track of the energy among all of the atoms involved. Just like the socks in your room, some energy will be left here, some there—in the form of random atomic motion. Friction and impact are great randomizers of energy. When a rock hits the ground, energy is “lost” to thermal motion of atoms. A tidy situation turns messy. Energy that is completely organized, concentrated, and tidy is a rather artificial, low-probability situation. If we let a system do whatever it “wants,” it behaves like an unruly teenager. Energy becomes scattered, dispersed, messy—and unusable.
This tendency of energy to become more and more dispersed and thus unusable is what the second law of thermodynamics is all about. The second law states that in any transformation in a closed system, entropy always increases. The room gets messier, energy disperses, and the money slips through my fingers.
The second law of thermodynamics is one of the most profound, far-reaching, and misunderstood laws of physics (because people do not read the fine print!). Let me restate the second law, but in a more precise fashion: There can be no process whose *only* result is to convert high-entropy (randomly distributed) energy into low-entropy (simply distributed, or concentrated) energy. Moreover, each time we convert one type of energy into another, we always end up overall with higher-entropy energy. In energy conversions, *overall* entropy always increases.
In these statements of the second law, I put certain words in italics—*only* and *overall*. As innocuous as these words may look, they are of great importance in our understanding of the second law and how it relates to life. By ignoring these words, creationists have been able to claim that life and evolution violate the second law of thermodynamics. Not at all!
Let’s look at what these words mean, starting with *only*. The second law does not say that a process that converts high-entropy (distributed) energy into low-entropy (concentrated) energy is impossible. If it were, you could not eat ice cream tonight, because a refrigerator is a machine that locally reduces entropy (by cooling things down). But cooling ice cream is not the *only* result of refrigeration. Your fridge also gobbles up electricity (a low-entropy source of energy) and turns most of it into heat (high-entropy energy), which it releases into the kitchen. This is why you cannot cool down your kitchen by leaving the refrigerator door open. *Overall*, your refrigerator increases entropy by a large amount even though it locally decreases entropy. You can locally reduce entropy, but you need to do a lot of work and consume a lot of low-entropy energy in the process (think of tidying your room—it is not impossible, but you end up sweating and cursing—and increasing the entropy of your surroundings by burning low-entropy food energy and turning it into high-entropy heat and waste). The same applies to how life works: Life uses a low-entropy source of energy (food or sunlight) and locally decreases entropy (creating order by growing) at the cost of creating a lot of high-entropy “waste energy” (heat and chemical waste). The creationist claim that the second law does not allow for life or evolution does not hold water.
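To make the refrigerator’s bookkeeping concrete, here is a minimal Python sketch of the standard entropy balance. All the numbers (temperatures, heat, work) are invented for illustration; none come from the book:

```python
# Entropy bookkeeping for an idealized refrigerator (illustrative numbers).
T_cold = 258.0      # K, inside the freezer (about -15 C)
T_kitchen = 295.0   # K, the room (about 22 C)

Q_cold = 1000.0     # J of heat pulled out of the cold box
W = 300.0           # J of electrical work eaten by the compressor
Q_out = Q_cold + W  # J dumped into the kitchen (energy conservation)

dS_box = -Q_cold / T_cold       # local entropy decrease (cooling)
dS_kitchen = Q_out / T_kitchen  # entropy increase of the surroundings

print(f"box:     {dS_box:+.2f} J/K")      # about -3.88 J/K
print(f"kitchen: {dS_kitchen:+.2f} J/K")  # about +4.41 J/K
print(f"overall: {dS_box + dS_kitchen:+.2f} J/K")  # positive, as required
```

The cold box loses entropy, but the kitchen gains more than the box loses, so the *overall* entropy change is positive, exactly as the second law demands.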
Entropy and the second law of thermodynamics are among the most important concepts of physics and among the most misunderstood. Part of the problem is that there are several definitions of entropy, and none of them is straightforward. Often, entropy is simply equated with disorder. This is not a good description unless we clearly understand what is meant by disorder. Again, creationists have exploited this confusion, claiming that increases in entropy are incompatible with life. Their argument is that because entropy is disorder and life is order, the second law proves that life could not have emerged spontaneously.
Equating entropy with disorder is convenient, but it is not a precise definition by any means. Scientists run into this problem frequently: when we cast (usually mathematical) definitions into everyday language, details get lost in translation. Entropy is not the same as the disorder we think of every day (our tidy-versus-messy-room example was only an analogy to explain the concept of microstates). Instead, entropy measures the degree to which energy is spread out. Sometimes an orderly-appearing system may have energy that is more dispersed than that of a “disordered” system. Such a system, although seemingly more ordered, would have higher entropy.
A surprisingly simple example illustrating this difference between entropy and the everyday concept of disorder is a collection of hard spheres (think of marbles). As we put more and more marbles into a container, the marbles reach a critical density (number of marbles per volume) at which the highest-entropy state (i.e., the one we would expect to be most disordered) is the state in which the marbles are perfectly stacked in an ordered pattern. How is this possible? Marbles can be put into a container in two fundamentally different ways: We can just pour them into a jar and let them fall where they fall—or we can carefully stack them into an ordered array. Just pouring them leads to a situation where marbles are touching, but are otherwise in random positions. Such an arrangement is called *random stacking* by physicists. When marbles are randomly stacked, some marbles will become completely wedged and will not be able to move. Random stacking reduces freedom of motion, leading to a narrower energy distribution and thus lower entropy. Nicely stacked marbles, on the other hand, have on average a little bit more wiggle room, and therefore higher entropy. Marbles are a good illustration of a simple system where higher entropy means more order, not less. Simply equating entropy with disorder can be misleading. In biological cells, there are many ordered structures that form spontaneously while increasing entropy. These include assemblies of proteins, cell membrane structures, and fibers. In these cases, the entropy is increased by exporting the disorder to the surrounding water molecules. More about how this works in Chapter 4.
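The wiggle-room argument can be caricatured with a toy “cell model” of the kind statistical physicists use: give each marble an entropy proportional to the logarithm of the space it can rattle around in. The Python sketch below uses invented numbers purely for illustration; the point is only that, for roughly the same total free space, sharing it evenly (ordered stacking) beats an uneven split with many wedged marbles (random stacking):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def cell_entropy(free_path):
    # Toy cell-model entropy of one sphere: S ~ k_B * ln(free volume),
    # with free volume ~ free_path**3 for a sphere rattling in its cage.
    return k_B * math.log(free_path ** 3)

# Hypothetical wiggle room per marble, in units of a marble diameter.
ordered = [0.10] * 10  # every marble gets the same modest room
random_stack = [0.30, 0.02, 0.01, 0.25, 0.02,
                0.01, 0.02, 0.01, 0.30, 0.01]  # a few lucky, many wedged

S_ordered = sum(cell_entropy(x) for x in ordered)
S_random = sum(cell_entropy(x) for x in random_stack)
print(S_ordered > S_random)  # True: the ordered stack has higher entropy
```

Because the logarithm is concave, concentrating the free space on a few marbles always loses to spreading it out evenly; that, in caricature, is why the ordered stack wins at high density.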
Once again, we find that the second law does not preclude the emergence and continued presence of life. Entropy can be reduced locally if it is increased globally (in our refrigerator, for example), and sometimes, an increase in entropy (hard spheres, for example) leads to more order. Life takes advantage of both of these apparent loopholes of the second law.
Life reduces entropy locally while increasing it globally. The concept of entropy describes this situation reasonably well, but it has one drawback. To square life’s processes with the second law of thermodynamics, we need to analyze the entropy of an organism *and* the entropy of its surroundings (because the surroundings ensure that the second law is not violated). However, details about the environment may not be well known, and all we really want to know is whether heat (and entropy) can flow to the environment or not. For this we only need to know the temperature of the surroundings. This fact led scientists to establish a new way to analyze thermodynamic systems that are immersed in constant-temperature environments (for example, a living organism or a cell).
How can the temperature of a system be kept constant? If you are a chemist and your system is a test tube, you can place the test tube into a large bath maintained at a constant temperature (for example, a bucket of ice and water at 0 degrees Celsius). If the system begins to heat up from an exothermic (heat-releasing) reaction, the heat will quickly flow into the surrounding bath until the system’s temperature equals that of the bath. If the system cools down (an endothermic reaction), it will draw heat from the surrounding bath. If the bath is large enough, the heat flow from the system will not change the temperature of the bath. Thus the bath serves as an energy reservoir and a temperature control.
Living organisms also act as heat baths. Our cells are immersed in a large 37-degree-Celsius temperature bath—our bodies. All chemical reactions in our cells happen at this constant temperature. If we want to apply the second law to this situation, it becomes tricky. The second law states that *overall* entropy increases. Therefore, if I look at a reaction happening in one of my kidney cells, I would need to know not only how the entropy of the molecules participating in the reaction changes, but also how the entropy of the surrounding tissue changes. I would need to analyze the change of entropy in legions of complicated cells just to determine the outcome of one tiny reaction.
There is an easier way. The exchange of energy and entropy with the surrounding heat bath can be represented simply by the temperature, as long as we know the temperature is kept constant by the bath. To do this, we use the concept of *free energy*. Free energy F is the total energy E minus the product of temperature T and the entropy S of the system (or F = E − TS). Because entropy represents how much energy has become dispersed and useless, free energy represents that part of the energy that is still “concentrated” and useful (because we are subtracting the useless part, TS).
To analyze a system within the constant-temperature bath of our own bodies, we need to know how the system’s free energy changes. In the language of free energy, the second law is restated this way: At constant temperature, a system tends to decrease its free energy, until, at equilibrium, free energy has reached a minimum. The second law tells us that useful energy will become degraded, and eventually we will only be left with dispersed, unusable energy.
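In symbols, this constant-temperature version of the second law can be written compactly. This is a standard textbook formulation, consistent with the book’s F = E − TS, not notation from the book itself:

```latex
F = E - TS, \qquad \Delta F \le 0 \ \text{(spontaneous change at constant } T\text{)}, \qquad F = F_{\min} \ \text{at equilibrium}.
```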
Because free energy includes both energy and entropy, the concept of free energy explains many of the paradoxical situations we discussed above. For example, in the random-stacking example, there is not much difference between the energy of marbles stacked randomly or orderly, but the entropy is higher for the ordered stacking. Therefore, the free energy, given by energy *minus* entropy times temperature, is *lower* for the ordered marbles, and because the system tends to minimize its free energy, it will tend to an ordered state at equilibrium.
We can also think of examples where both the entropy and the free energy of the system are lowered. What is so special about this? When we lower entropy, the free energy tends to go *up*, because we subtract the entropy term from the total energy. For example, if the total energy of a molecule is 5 eV, and entropy times temperature is 2 eV, then the free energy is F = 5 eV − 2 eV = 3 eV. Now, if the entropy term decreases to 1 eV (for example, by cooling the system), the free energy would *increase* to 5 eV − 1 eV = 4 eV.
Can both free energy and entropy decrease at the same time? Keeping the temperature constant, this is only possible if the total energy of the system, E, also decreases, and decreases by more than the entropy term. For example, if the total energy decreases from 5 eV to 3 eV, while the entropy term decreases, as before, from 2 eV to 1 eV, the new free energy would be 3 eV − 1 eV = 2 eV. This is less than the original 3 eV, despite the fact that the entropy decreased.
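The chapter’s eV arithmetic is easy to verify directly from F = E − TS. Here is a short Python sketch that, like the text, treats the entropy term TS as a single number in eV:

```python
def free_energy(E, TS):
    # F = E - TS, with the total energy E and the entropy term TS in eV.
    return E - TS

# Lowering only the entropy term (for example, by cooling) raises F:
print(free_energy(5.0, 2.0))  # 3.0 eV
print(free_energy(5.0, 1.0))  # 4.0 eV

# But if the total energy drops by more than the entropy term,
# the free energy and the entropy can fall together:
print(free_energy(3.0, 1.0))  # 2.0 eV, below the original 3.0 eV
```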
Why doesn’t this example violate the second law? Isn’t entropy supposed to increase? When the total energy of the system changed from 5 eV to 3 eV, the “missing” energy had to go somewhere (remember energy conservation). Energy passed into the environment, heated it up, and increased the environment’s entropy. As before, *overall* (system + environment) entropy increased, even though, by itself, the entropy of our small system decreased. You can see how the language of free energy makes it easier to think about such a scenario: As long as the free energy of a system decreases, we are obeying the second law.
Does nature provide examples of spontaneous decreases in entropy? All the time! Take the creation of a snowflake. Compared with a liquid drop of water, a snowflake has much lower entropy (it is also much more ordered). Yet, snowflakes form spontaneously under the right conditions. This is because the energy (E in our formula) of a snowflake is much lower than the energy of a water droplet. Once the temperature has fallen below the freezing point of water, the entropy reduction is overwhelmed by this reduction in energy, and the free energy is reduced when water freezes into beautiful, ordered crystals. Thus, at low temperature, a snowflake has lower free energy than a water droplet. At high temperature, however, the entropy term (which is multiplied by the now higher temperature) wins again over the energy (E) term, and water turns liquid. At higher temperature, liquid water has lower free energy than frozen water. This is shown in Figure 3.1.
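The freezing crossover can be made quantitative with the handbook latent heat of fusion of water (about 6.01 kJ/mol at 273.15 K). The Python sketch below makes the simplifying assumption that the energy and entropy changes of freezing do not depend on temperature; the sign of ΔF = ΔE − TΔS then flips exactly at the melting point:

```python
# Sign of the free-energy change for freezing, dF = dE - T * dS.
# Simplification: dE and dS are treated as temperature-independent,
# and dE is approximated by the latent heat of fusion.
H_FUS = 6010.0        # J/mol, latent heat of fusion of water
T_MELT = 273.15       # K, melting point of ice
dE = -H_FUS           # energy released on freezing
dS = -H_FUS / T_MELT  # entropy drop on freezing, about -22 J/(mol K)

for T in (253.15, 273.15, 293.15):  # -20 C, 0 C, +20 C
    dF = dE - T * dS
    winner = "ice" if dF < -1e-6 else ("liquid" if dF > 1e-6 else "coexistence")
    print(f"{T - 273.15:+6.1f} C: dF = {dF:+7.1f} J/mol -> {winner}")
```

Below 0 degrees Celsius the energy term wins and ice has the lower free energy; above it, the entropy term wins and the liquid does; at the melting point itself the two phases tie, which is just the condition for ice and liquid water to coexist.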
We can also think about the competition between liquid and frozen water in the language of the molecular storm. At low temperatures, the forces between atoms are stronger than the shaking of the molecular storm and draw atoms together to form structures; at high temperatures, the forces between atoms are no match for the more violent molecular storm, and snowflakes melt. The concept of free energy captures the tug-of-war between deterministic forces (chemical bonds) and the molecular storm—or in other words, between necessity and chance, in one elegant formula, F = E − TS.