In Pursuit of the Unknown
Author: Ian Stewart
Entropy, and the mathematical formalism that goes with it, provides an excellent model for many things. It explains why heat engines can only reach a particular level of efficiency, which prevents engineers from wasting valuable time and money looking for a mare's nest. That's not just true of Victorian steam engines; it applies to modern car engines as well. Engine design is one of the practical areas that has benefited from knowing the laws of thermodynamics. Refrigerators are another. They use an evaporation-and-compression cycle to pump heat out of the food in the fridge. It has to go somewhere: you can often feel the heat rising from the outside of the fridge's motor housing. The same goes for air-conditioning. Power generation is another application. In a coal, gas, or nuclear power station, what is initially generated is heat. The heat creates steam, which drives a turbine. The turbine, following principles that go back to Faraday, turns motion into electricity.
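To put a number on that 'particular level of efficiency': the Carnot bound, which depends only on the temperatures of the hot and cold reservoirs, takes two lines to compute. This is my illustrative sketch, not the book's; the temperatures are made-up round numbers.

```python
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Maximum fraction of input heat that any engine can turn into work.

    Temperatures are in kelvin; no real engine can beat this bound.
    """
    return 1.0 - t_cold_k / t_hot_k

# A rough Victorian steam engine: boiler around 450 K, exhaust around 300 K.
print(round(carnot_efficiency(450.0, 300.0), 3))  # 0.333
```

Even with perfect engineering, such an engine wastes two thirds of its heat; that is the kind of dead end the second law saves engineers from chasing.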
The second law of thermodynamics also governs the amount of energy we can hope to extract from renewable resources, such as wind and waves. Climate change has added new urgency to this question, because renewable energy sources produce less carbon dioxide than conventional ones. Even nuclear power has a big carbon footprint, because the fuel has to be made, transported, and stored when it is no longer useful but still radioactive. As I write there is a simmering debate about the maximum amount of energy that we can extract from the ocean and the atmosphere without causing the kinds of change that we are hoping to avoid. It is based on thermodynamic estimates of the amount of free energy in those natural systems. This is an important issue: if renewables in principle cannot supply the energy we need, then we have to look elsewhere. Solar panels, which extract energy directly from sunlight, are not directly affected by the thermodynamic limits, but even those involve manufacturing processes and so on. At the moment, the case that such limits are a serious obstacle relies on some sweeping simplifications, and even if they are correct, the calculations do not rule out renewables as a source for most of the world's power. But it's worth remembering that similarly broad calculations about carbon dioxide production, performed in the 1950s, have proved surprisingly accurate as predictors of global warming.
The second law works brilliantly in its original context, the behaviour of gases, but it seems to conflict with the rich complexities of our planet, in particular, life. It seems to rule out the complexity and organisation exhibited by living systems. So the second law is sometimes invoked to attack Darwinian evolution. However, the physics of steam engines is not particularly appropriate to the study of life. In the kinetic theory of gases, the forces that act between the molecules are short-range (active only when the molecules collide) and repulsive (they bounce). But most of the forces of nature aren't like that. For example, gravity acts at enormous distances, and it is attractive. The expansion of the universe away from the Big Bang has not smeared matter out into a uniform gas. Instead, the matter has formed into clumps: planets, stars, galaxies, supergalactic clusters... The forces that hold molecules together are also attractive (except at very short distances, where they become repulsive, which stops the molecule collapsing), but their effective range is fairly short. For systems such as these, the thermodynamic model of independent subsystems whose interactions switch on but not off is simply irrelevant. The features of thermodynamics either don't apply, or are so long-term that they don't model anything interesting.
The laws of thermodynamics, then, underlie many things that we take for granted. And the interpretation of entropy as 'disorder' helps us to understand those laws and gain an intuitive feeling for their physical basis.
However, there are occasions when interpreting entropy as disorder seems to lead to paradoxes. This is a more philosophical realm of discourse, and it's fascinating.
One of the deep mysteries of physics is time's arrow. Time seems to flow in one particular direction. However, it seems logically and mathematically possible for time to flow backwards instead, a possibility exploited by books such as Martin Amis's Time's Arrow, the much earlier novel Counter-Clock World by Philip K. Dick, and the BBC television series Red Dwarf, whose protagonists memorably drank beer and engaged in a bar brawl in reverse time. So why can't time flow the other way? At first sight, thermodynamics offers a simple explanation for the arrow of time: it is the direction of entropy increase. Thermodynamic processes are irreversible: oxygen and nitrogen will spontaneously mix, but not spontaneously unmix.
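The mix-but-never-unmix behaviour is easy to see in a toy model (my illustration, not the book's): put a thousand labelled molecules in the left half of a box and let randomly chosen molecules hop from one half to the other.

```python
import random

# Toy mixing model: N molecules, all starting in the left half of a box.
# Each step, one molecule chosen at random hops to the other half.
random.seed(1)
N = 1000
on_left = [True] * N          # start fully unmixed

for _ in range(100_000):
    i = random.randrange(N)   # pick any molecule...
    on_left[i] = not on_left[i]  # ...and move it across

left_count = sum(on_left)
print(abs(left_count - N // 2) < 100)  # True: close to an even split
```

Each individual hop is perfectly reversible, yet the count drifts to roughly half-and-half and, for any realistic number of molecules, never finds its way back to "all on the left".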
There is a puzzle here, however, because any classical mechanical system, such as the molecules in a room, is time-reversible. In the mathematical equations, if at some instant the velocities of all particles are simultaneously reversed, then the system will retrace its steps, back-to-front in time. The entire universe can bounce, obeying the same equations in both directions; similarly, if you keep shuffling a pack of cards at random, then eventually it will get back to its original order. So why do we never see an egg unscrambling?
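The reverse-the-velocities claim can be checked numerically. Here is a minimal sketch (mine, not the book's) for one particle on a spring, using a time-symmetric integration scheme: run it forward, flip the sign of the velocity, run it the same number of steps again, and it retraces its path back to the starting state.

```python
def step(x, v, dt=0.01):
    """One velocity-Verlet step for a unit-mass particle with force F = -x."""
    a = -x
    x = x + v * dt + 0.5 * a * dt * dt
    v = v + 0.5 * (a + (-x)) * dt   # average of old and new acceleration
    return x, v

x, v = 1.0, 0.0                     # initial state
for _ in range(5000):
    x, v = step(x, v)               # forward in time

v = -v                              # reverse the velocity...
for _ in range(5000):
    x, v = step(x, v)               # ...and the motion retraces itself

print(abs(x - 1.0) < 1e-6, abs(v) < 1e-6)  # True True: back where it started
```

With an egg, the same trick would work in principle; the catch, as the rest of the chapter explains, lies in setting up those reversed velocities.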
The usual thermodynamic answer is: a scrambled egg is more disordered than an unscrambled one, entropy increases, and that's the way time flows. But there's a subtler reason why eggs don't unscramble: the universe is very, very unlikely to bounce in the required manner. The probability of that happening is ridiculously small. So the discrepancy between entropy increase and time-reversibility comes from the initial conditions, not the equations. The equations for moving molecules are time-reversible, but the initial conditions are not. When we reverse time, we must use 'initial' conditions given by the final state of the forward-time motion.
The most important distinction here is between symmetry of equations and symmetry of their solutions. The equations for bouncing molecules have time-reversal symmetry, but individual solutions can have a definite arrow of time. The most you can deduce about a solution, from time-reversibility of the equation, is that there must also exist another solution that is the time-reversal of the first. If Alice throws a ball to Bob, the time-reversed solution has Bob throwing a ball to Alice. Similarly, since the equations of mechanics allow a vase to fall to the ground and smash into a thousand pieces, they must also allow a solution in which a thousand shards of glass mysteriously move together, assemble themselves into an intact vase, and leap into the air.
There's clearly something funny going on here, and it repays investigation. We don't have a problem with Bob and Alice tossing a ball either way. We see such things every day. But we don't see a smashed vase putting itself back together. We don't see an egg unscrambling.
Suppose we smash a vase and film the result. We start with a simple, ordered state â an intact vase. It falls to the floor, where the impact breaks it into pieces and propels those pieces all over the floor. They slow down and come to a halt. It all looks entirely normal. Now play the movie backwards. Bits of glass, which just happen to be the right shape to fit together, are lying on the floor. Spontaneously, they start to move. They move at just the right speed, and in just the right direction, to meet. They assemble into a vase, which heads skywards. That doesn't seem right.
In fact, as described, it's not right. Several laws of mechanics appear to be violated, among them conservation of momentum and conservation of energy. Stationary masses can't suddenly move. A vase can't gain energy from nowhere and leap into the air.
Ah, yes. . . but that's because we're not looking carefully enough. The vase didn't leap into the air of its own accord. The floor started to vibrate, and the vibrations came together to give the vase a sharp kick into the air. The bits of glass were similarly impelled to move by incoming waves of vibration of the floor. If we trace those vibrations back, they spread out, and seem to die down. Eventually friction dissipates all movement. . . Oh, yes, friction. What happens to kinetic energy when there's friction? It turns into heat. So we've missed some details of the time-reversed scenario. Momentum and energy do balance, but the missing amounts come from the floor losing heat.
In principle, we could set up a forward-time system to mimic the time-reversed vase. We just have to arrange for molecules in the floor to collide in just the right way to release some of their heat as motion of the floor, kick the pieces of glass in just the right way, then hurl the vase into the air. The point is not that this is impossible in principle: if it were, time-reversibility would fail. But it's impossible in practice, because there is no way to control that many molecules that precisely.
This, too, is an issue about boundary conditions, in this case initial conditions. The initial conditions for the vase-smashing experiment are easy to implement, and the apparatus is easy to acquire. It's all very robust, too: use another vase, drop it from a different height... much the same will happen. The vase-assembling experiment, in contrast, requires extraordinarily precise control of gazillions of individual molecules and exquisitely carefully made pieces of glass, without any of that control equipment disturbing a single molecule. That's why we can't actually do it.
However, notice how we're thinking here: we're focusing on initial conditions. That sets up an arrow of time: the rest of the action comes later than the start. If we looked at the vase-smashing experiment's final conditions, right down to the molecular level, they would be so complex that no one in their right mind would even consider trying to replicate them.
The mathematics of entropy fudges out these very-small-scale considerations. It allows vibrations to die away but not to build up. It allows friction to turn motion into heat, but not heat to turn back into motion. The discrepancy between the second law of thermodynamics and microscopic reversibility arises from coarse-graining, the modelling assumptions made when passing from a detailed molecular description to a statistical one. These assumptions implicitly specify an arrow of time: large-scale disturbances are allowed to die down below the perceptible level as time passes, but small-scale disturbances are not allowed to follow the time-reversed scenario. Once the dynamics passes through this temporal trapdoor, it's not allowed to come back.
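Coarse-graining can itself be put in a toy model (again my sketch, not the book's): ignore which molecules are where and track only the macrostate "how many are in the left half of the box", measuring its Boltzmann entropy as the logarithm of the number of microstates that look the same. The microscopic hops are reversible, yet the coarse-grained entropy climbs.

```python
import math
import random

def entropy(n, n_left):
    """ln of the number of ways to have n_left of n molecules on the left."""
    return (math.lgamma(n + 1)
            - math.lgamma(n_left + 1)
            - math.lgamma(n - n_left + 1))

random.seed(2)
n, n_left = 500, 500        # start: everything on the left, one microstate
s_start = entropy(n, n_left)   # ln 1 = 0

for _ in range(20_000):     # each move: a random molecule hops across
    if random.randrange(n) < n_left:
        n_left -= 1
    else:
        n_left += 1

s_end = entropy(n, n_left)
print(s_start == 0.0, s_end > 300)  # True True: entropy has risen
```

Nothing irreversible happened at the molecular level; the entropy increase appears only because we threw away the fine detail, which is exactly the temporal trapdoor described above.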
If entropy always increases, how did the chicken ever create the ordered egg to begin with? A common explanation, advanced by the Austrian physicist Erwin Schrödinger in 1944 in a brief and charming book, What is Life?, is that living systems somehow borrow order from their environment, and pay it back by making the environment even more disordered than it would otherwise have been. This extra order corresponds to 'negative entropy', which the chicken can use to make an egg without violating the second law. In Chapter 15 we will see that negative entropy can, in appropriate circumstances, be thought of as information, and it is often claimed that the chicken accesses information, provided by its DNA, for example, to obtain the necessary negative entropy. However, the identification of information with negative entropy makes sense only in very specific contexts, and the activities of living creatures are not one of them. Organisms create order through the processes that they carry out, but those processes are not thermodynamic. Chickens don't access some storehouse of order to make the thermodynamic books balance: they use processes for which a thermodynamic model is inappropriate, and throw the books away because they don't apply.
The scenario in which an egg is created by borrowing entropy would be appropriate if the process that the chicken used were the time-reversal of an egg breaking up into its constituent molecules. At first sight this is vaguely plausible, because the molecules that eventually form the egg are scattered throughout the environment; they come together in the chicken, where biochemical processes put them together in an ordered manner to form the egg. However, there is a difference in the initial conditions. If you went round beforehand labelling molecules in the chicken's environment, to say 'this one will end up in the egg at such and such a location', you would in effect be creating initial conditions as complex and unlikely as those for unscrambling an egg. But that's not how the chicken operates. Some molecules happen to end up in the egg and are conceptually labelled as part of it after the process is complete. Other molecules could have done the same job: one molecule of calcium carbonate is just as good for making a shell as any other. So the chicken is not creating order from disorder. The order is assigned to the end result of the egg-making process, like shuffling a pack of cards into a random order and then numbering them 1, 2, 3, and so on with a felt-tipped pen. Amazing: they're in numerical order!
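The felt-tipped-pen trick takes three lines of code (an illustration of the analogy, not anything from the book): shuffle a deck, then number the cards in whatever order they happen to lie.

```python
import random

random.seed(0)
deck = list(range(1, 53))
random.shuffle(deck)          # the old labels are now thoroughly disordered

# Renumber the cards 1, 2, 3, ... in the order they now lie.
relabelled = list(range(1, len(deck) + 1))

print(relabelled == sorted(relabelled))  # True: "amazing, numerical order!"
```

The "order" was assigned after the shuffle, not created by it, which is precisely what the chicken does with its molecules.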
To be sure, the egg looks more ordered than its ingredients, even if we take account of this difference in initial conditions. But that's because the process that makes an egg is not thermodynamic. Many physical processes do, in effect, unscramble eggs. An example is the way minerals dissolved in water can create stalactites and stalagmites in caves. If we specified the exact form of stalactite we wanted, ahead of time, we'd be in the same position as someone trying to unsmash a vase. But if we're willing to settle for any old stalactite, we get one: order from disorder. Those two terms are often used in a sloppy way. What matters is what kind of order and what kind of disorder. That said, I still don't expect to see an egg unscrambling. There is no feasible way to set up the necessary initial conditions. The best we can do is turn the scrambled egg into chickenfeed and wait for the bird to lay a new one.