
Figure 43: The evolution of the entropy of a divided box of gas. The gas starts with most molecules on the left, and the distribution evens out in time, as we saw in Figure 42. The entropy correspondingly rises, as there are more ways for the molecules to be distributed evenly than to be mostly on one side or the other. For convenience we have plotted the entropy as a fraction of its maximum value, so the highest value attainable on this plot is 1.
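To make the figure concrete, here is a minimal simulation sketch of the divided box, assuming a simple random-hop model (an Ehrenfest-style urn, my own illustration rather than anything from the book): at each step one randomly chosen molecule passes through the hole to the other side, and the entropy of the macrostate “n molecules on the left” is the logarithm of the number of ways to pick those n molecules, normalized by its maximum.

```python
import math
import random

def normalized_entropy(n_left, n_total):
    """S = log W for the macrostate 'n_left molecules on the left',
    with W = C(n_total, n_left), divided by the maximum entropy
    (reached at an even split) so the curve tops out at 1."""
    w = math.comb(n_total, n_left)
    w_max = math.comb(n_total, n_total // 2)
    return math.log(w) / math.log(w_max)

random.seed(0)
N = 100        # total number of molecules (illustrative)
n_left = 80    # start with most molecules on the left, as in Figure 42

for step in range(301):
    if step % 50 == 0:
        print(f"t = {step:3d}: {n_left:3d} on the left, "
              f"S/S_max = {normalized_entropy(n_left, N):.3f}")
    # a randomly chosen molecule hops through the hole to the other side
    if random.random() < n_left / N:
        n_left -= 1
    else:
        n_left += 1
```

Run it, and the printed entropy climbs toward 1 and then hovers there with small fluctuations, which is the behavior plotted in Figure 43.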

All of this seems superficially plausible and will turn out to be basically true. But along the way we made some “reasonable” leaps of logic, which deserve more careful examination. For the rest of this chapter we will bring to light the various assumptions that go into Boltzmann’s way of thinking about entropy, and try to decide just how plausible they are.

USEFUL AND USELESS ENERGY

One interesting feature of this box-of-gas example is that the arrow of time is only temporary. After the gas has had a chance to even itself out (at around time 150 in Figure 43), nothing much happens anymore. Individual molecules will continue to bounce between the right and left sides of the box, but these fluctuations will tend to average out, and the system will spend almost all of its time with approximately equal numbers of molecules on each side. Those are the kinds of configurations that correspond to the largest number of rearrangements of the individual molecules, and correspondingly have the highest entropy the system can possibly have.

A system that has the maximum entropy it can have is in equilibrium. Once there, the system basically has nowhere else to go; it’s in the kind of configuration that is most natural for it to be in. Such a system has no arrow of time, as the entropy is not increasing (or decreasing). To a macroscopic observer, a system in equilibrium appears static, not changing at all.

Richard Feynman, in The Character of Physical Law, tells a story that illustrates the concept of equilibrium.129 Imagine you are sitting on a beach when you are suddenly hit with a tremendous downpour of rain. You’ve brought along a towel, but that also gets wet as you dash for cover. Once you’ve reached some cover, you start to dry yourself with your towel. It works for a little while because the towel is a bit drier than you are, but soon you find that the towel has gotten so wet that rubbing yourself with it is keeping you wet just as fast as it’s making you dry. You and the towel have reached “wetness equilibrium,” and it can’t make you any drier. Your situation maximizes the number of ways the water molecules can arrange themselves on you and the towel.130

Once you’ve reached equilibrium, the towel is no longer useful for its intended purpose (drying you off). Note that the total amount of water doesn’t change as you dry yourself off; it is simply transferred from you to the towel. Similarly, the total energy doesn’t change in a box of gas that is isolated from the rest of the world; energy is conserved, at least in circumstances where we can neglect the expansion of space. But energy can be arranged in more or less useful ways. When energy is arranged in a low-entropy configuration, it can be harnessed to perform useful work, like propelling a vehicle. But the same amount of energy, when it’s in an equilibrium configuration, is completely useless, just like a towel that is in wetness equilibrium with you. Entropy measures the uselessness of a configuration of energy.131

Consider our divided box once again. But instead of the divider being a fixed wall with a hole in it, passively allowing molecules to move back and forth, imagine that the divider is movable, and hooked up to a shaft that reaches outside the box. What we’ve constructed is simply a piston, which can be used to do work under the right circumstances.

In Figure 44 we’ve depicted two different situations for our piston. The top row shows a piston in the presence of a low-entropy configuration of some gas—all the molecules on one side of the divider—while the bottom row shows a high-entropy configuration—equal amounts of gas on both sides. The total number of molecules and the total amount of energy are assumed to be the same in both cases; the only difference is the entropy. But it’s clear that what happens in the two cases is very different. In the top row, the gas is all on the left side of the piston, and the force of the molecules bumping into it exerts pressure that pushes the piston to the right until the gas fills the container. The moving piston shaft can be used to do useful work—run a flywheel or some such thing, at least for a little while. That extracts energy from the gas; at the end of the process, the gas will have a lower temperature. (The pistons in your car engine operate in exactly this way, expanding and cooling the hot vapor created by igniting vaporized gasoline, performing the useful work of moving your car.)

Figure 44: Gas in a divided box, used to drive a piston. On the top, gas in a low-entropy state pushes the piston to the right, doing useful work. On the bottom, gas in a high-entropy state doesn’t push the piston in either direction.

On the bottom row in the figure, meanwhile, we imagine starting with the same amount of energy in the gas but in an initial state with a much higher entropy—an equal number of particles on each side of the divider. High entropy implies equilibrium, which implies that the energy is useless, and indeed we see that our piston isn’t going anywhere. The pressure from gas on one side of the divider is exactly canceled by pressure coming from the other side. The gas in this box has the same total energy as the gas in the upper left box, but in this case we can’t harness that energy to make the piston move to do something useful.
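A back-of-the-envelope version of this comparison, using the ideal-gas law P = NkT/V (with illustrative numbers of my own, not the book’s), shows the net force on the divider in the two rows of Figure 44:

```python
k_B = 1.380649e-23  # Boltzmann's constant, in joules per kelvin

def pressure(n_molecules, temperature, volume):
    """Ideal-gas pressure P = N k T / V (SI units)."""
    return n_molecules * k_B * temperature / volume

# Illustrative numbers: 10^23 molecules at room temperature, each half
# of the box holding half a liter, and a piston face of 10 cm^2.
N, T, V_half, area = 1e23, 300.0, 0.5e-3, 1e-3

# Top row: all the gas on the left, vacuum on the right.
f_low_entropy = (pressure(N, T, V_half) - pressure(0, T, V_half)) * area

# Bottom row: an even split, so the two pressures cancel exactly.
f_high_entropy = (pressure(N / 2, T, V_half) - pressure(N / 2, T, V_half)) * area

print(f"low-entropy start:  net force on piston = {f_low_entropy:.0f} N")
print(f"high-entropy start: net force on piston = {f_high_entropy:.0f} N")
```

With these numbers the low-entropy arrangement pushes the piston with hundreds of newtons of force; the high-entropy one pushes it with none, despite containing the same total energy.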

This helps us understand the relationship between Boltzmann’s viewpoint on entropy and that of Rudolf Clausius, who first formulated the Second Law. Remember that Clausius and his predecessors didn’t think of entropy in terms of atoms at all; they thought of heat as an autonomous substance with its own dynamics. Clausius’s original version of the Second Law didn’t even mention entropy; it was the simple statement that “heat never flows spontaneously from a colder object to a hotter one.” If we put two objects with different temperatures into contact with each other, they will both evolve toward a common middle temperature; if we put two objects with the same temperature into contact with each other, they will simply stay that way. (They’re in thermal equilibrium.)

From the point of view of atoms, this all makes sense. Consider the classic example of two objects at different temperatures in contact with each other: an ice cube in a glass of warm water, discussed at the end of the previous chapter. Both the ice cube and the liquid are made of precisely the same kind of molecules, namely H₂O. The only difference is that the ice is at a much lower temperature. Temperature, as we have discussed, measures the average energy of motion in the molecules of a substance. So while the molecules of the liquid water are moving relatively quickly, the molecules in the ice are moving slowly.
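For reference, the standard kinetic-theory form of that statement for a monatomic ideal gas (a textbook relation, not quoted from this book) equates temperature with the average kinetic energy per molecule, where k_B is Boltzmann’s constant:

```latex
\left\langle \tfrac{1}{2} m v^2 \right\rangle = \tfrac{3}{2} \, k_B T
```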

But that kind of condition—one set of molecules moving quickly, another moving slowly—isn’t all that different, conceptually, from two sets of molecules confined to different sides of a box. In either case, there is a broad-brush limitation on how we can rearrange things. If we had just a glass of nothing but water at a constant temperature, we could exchange the molecules in one part of the glass with molecules in some other part, and there would be no macroscopic way to tell the difference. But when we have an ice cube, we can’t simply exchange the molecules in the cube for some water molecules elsewhere in the glass—the ice cube would move, and we would certainly notice that even from our everyday macroscopic perspective. The division of the water molecules into “liquid” and “ice” puts a serious constraint on the number of rearrangements we can do, so that configuration has a low entropy. As the temperature of the water molecules that started out as ice equilibrates with that of the rest of the glass, the entropy goes up. Clausius’s rule that temperatures tend to even themselves out, rather than spontaneously flowing from cold to hot, is precisely equivalent to the statement that the entropy as defined by Boltzmann never decreases in a closed system.

None of this means that it’s impossible to cool things down, of course. But in everyday life, where most things around us are at similar temperatures, cooling things down takes a bit more ingenuity than heating them up. A refrigerator is a more complicated machine than a stove. (Refrigerators work on the same basic principle as the piston in Figure 44, expanding a gas to extract energy and cool it off.) When Grant Achatz, chef of Chicago’s Alinea restaurant, wanted a device that would rapidly freeze food in the same way a frying pan rapidly heats food up, he had to team up with culinary technologist Philip Preston to create one. The result is the “anti-griddle,” a microwave-oven-sized machine with a metallic top that attains a temperature of -34 degrees Celsius. Hot purees and sauces, poured on the anti-griddle, rapidly freeze on the bottom while remaining soft on the top. We have understood the basics of thermodynamics for a long time now, but we’re still inventing new ways to put them to good use.

DON’T SWEAT THE DETAILS

You’re out one Friday night playing pool with your friends. We’re talking about real-world pool now, not “physicist pool” where we can ignore friction and noise.132 One of your pals has just made an impressive break, and the balls have scattered thoroughly across the table. As they come to a stop and you’re contemplating your next shot, a stranger walks by and exclaims, “Wow! That’s incredible!”

Somewhat confused, you ask what is so incredible about it. “Look at these balls at those exact positions on the table! What are the chances that you’d be able to put all the balls in precisely those spots? You’d never be able to repeat that in a million years!”

The mysterious stranger is a bit crazy—probably driven slightly mad by reading too many philosophical tracts on the foundations of statistical mechanics. But she does have a point. With several balls on the table, any particular configuration of them is extremely unlikely. Think of it this way: If you hit the cue ball into a bunch of randomly placed balls, which rattled around before coming to rest in a perfect arrangement as if they had just been racked, you’d be astonished. But that particular arrangement (all balls perfectly arrayed in the starting position) is no more or less unusual than any other precise arrangement of the balls.133 What right do we have to single out certain configurations of the billiard balls as “astonishing” or “unlikely,” while others seem “unremarkable” or “random”?

This example pinpoints a question at the heart of Boltzmann’s definition of entropy and the associated understanding of the Second Law of Thermodynamics: Who decides when two specific microscopic states of a system look the same from our macroscopic point of view?

Boltzmann’s formula for entropy hinges on the idea of the quantity W, which we defined as “the number of ways we can rearrange the microscopic constituents of a system without changing its macroscopic appearance.” In the last chapter we defined the “state” of a physical system to be a complete specification of all the information required to uniquely evolve it in time; in classical mechanics, it would be the position and momentum of every single constituent particle. Now that we are considering statistical mechanics, it’s useful to use the term microstate to refer to the precise state of a system, in contrast with the macrostate, which specifies only those features that are macroscopically observable. Then the shorthand definition of W is “the number of microstates corresponding to a particular macrostate.”

For the box of gas separated in two by a divider, the microstate at any one time is the position and momentum of every single molecule in the box. But all we were keeping track of was how many molecules were on the left, and how many were on the right. Implicitly, every division of the molecules into a certain number on the left and a certain number on the right defined a “macrostate” for the box. And our calculation of W simply counted the number of microstates per macrostate.134
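As a concrete check on this counting, here is a sketch of my own (not from the book): W for the macrostate “n molecules out of N on the left” is just the binomial coefficient, and Boltzmann’s entropy S = log W (in units where his constant is 1) is largest for the even split.

```python
import math

N = 100  # total molecules in the divided box (illustrative)

# W(n) = number of microstates with n molecules on the left: the number
# of ways to choose which n of the N labeled molecules those are.
for n_left in (0, 25, 50, 75, 100):
    W = math.comb(N, n_left)
    S = math.log(W)  # Boltzmann entropy, S = log W
    print(f"{n_left:3d} on the left: W = {W:.3e}, S = {S:5.1f}")
```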

The choice to just keep track of how many molecules were in each half of the box seemed so innocent at the time. But we could imagine keeping track of much more. Indeed, when we deal with the atmosphere in an actual room, we keep track of a lot more than simply how many molecules are on each side of the room. We might, for example, keep track of the temperature, and density, and pressure of the atmosphere at every point, or at least at some finite number of places. If there were more than one kind of gas in the atmosphere, we might separately keep track of the density and so on for each different kind of gas. That’s still enormously less information than the position and momentum of every molecule in the room, but the choice of which information to “keep” as a macroscopically measurable quantity and which information to “forget” as an irrelevant part of the microstate doesn’t seem to be particularly well defined.
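To see that this ambiguity has real consequences, here is a small counting exercise of my own (not drawn from the book): describe a microstate by which quarter of the box each of N = 100 molecules occupies, and compare the entropy assigned to the same evenly spread gas by two different choices of macrostate.

```python
import math

N = 100

# Coarse macrostate: "50 molecules in each half," quarters unspecified.
# Choose which 50 sit on the left, then let every molecule pick either
# of the two quarters within its half.
W_halves = math.comb(N, 50) * 2**50 * 2**50

# Finer macrostate: "exactly 25 molecules in each quarter," a
# multinomial count: 100! / (25!)^4.
W_quarters = math.factorial(N) // math.factorial(25) ** 4

print(f"track halves only:  S = log W = {math.log(W_halves):.1f}")
print(f"track all quarters: S = log W = {math.log(W_quarters):.1f}")
```

Keeping more macroscopic information can only shrink the set of compatible microstates, so the finer description assigns a lower entropy: the number we call “the entropy” depends on which details we choose to keep.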
