The Fabric of the Cosmos: Space, Time, and the Texture of Reality

Author: Brian Greene
Entropy, the Second Law, and the Arrow of Time

The tendency of physical systems to evolve toward states of higher entropy is known as the second law of thermodynamics. (The first law is the familiar conservation of energy.) As above, the basis of the law is simple statistical reasoning: there are more ways for a system to have higher entropy, and "more ways" means it is more likely that a system will evolve into one of these high-entropy configurations. Notice, though, that this is not a law in the conventional sense since, although such events are rare and unlikely, something can go from a state of high entropy to one of lower entropy. When you toss a jumbled stack of pages into the air and then gather them into a neat pile, they can turn out to be in perfect numerical order. You wouldn't want to place a high wager on its happening, but it is possible. It is also possible that the bumping and jostling will be just right to cause all the dispersed carbon dioxide molecules to move in concert and swoosh back into your open bottle of Coke. Don't hold your breath waiting for this outcome either, but it can happen.¹⁰

The large number of pages in War and Peace and the large number of gas molecules in the room are what makes the entropy difference between the disordered and ordered arrangements so huge, and what causes low-entropy outcomes to be so terribly unlikely. If you tossed only two double-sided pages in the air over and over again, you'd find that they landed in the correct order about 12.5 percent of the time. With three pages this would drop to about 2 percent of the tosses, with four pages it's about 0.3 percent, with five pages it's about 0.03 percent, with six pages it's about 0.002 percent, with ten pages it's 0.000000027 percent, and with 693 pages the percentage of tosses that would yield the correct order is so small—it involves so many zeros after the decimal point—that I've been convinced by the publisher not to use another page to write it out explicitly. Similarly, if you dropped only two gas molecules side by side into an empty Coke bottle, you'd find that at room temperature their random motion would bring them back together (within a millimeter of each other), on average, roughly every few seconds. But for a group of three molecules, you'd have to wait days, for four molecules you'd have to wait years, and for an initial dense blob of a million billion billion molecules it would take a length of time far greater than the current age of the universe for their random, dispersive motion to bring them back together into a small, ordered bunch. With more certainty than death and taxes, we can count on systems with many constituents evolving toward disorder.
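The percentages above follow from a single counting formula: n double-sided pages can land in n! orders, each with 2^n face-up/face-down combinations, and only one arrangement is the correct one. A minimal Python sketch (the function name is illustrative, not from the book) reproduces the quoted figures:

```python
from math import factorial

def correct_order_percent(n):
    """Chance, as a percentage, that n tossed double-sided pages land
    stacked in the right order: one good outcome out of n! orderings
    times 2**n possible face-up/face-down orientations."""
    return 100.0 / (2 ** n * factorial(n))

for n in (2, 3, 4, 5, 6, 10):
    print(f"{n} pages: {correct_order_percent(n):.10f} percent")
```

Evaluating the 693-page case requires exact integer arithmetic (the denominator overflows an ordinary float), and the resulting percentage has well over a thousand zeros after the decimal point, which is presumably why the publisher balked.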

Although it may not be immediately apparent, we have now come to an intriguing point. The second law of thermodynamics seems to have given us an arrow of time, one that emerges when physical systems have a large number of constituents. If you were to watch a film of a couple of carbon dioxide molecules that had been placed together in a small box (with a tracer showing the movements of each), you'd be hard pressed to say whether the film was running forward or in reverse. The two molecules would flit this way and that, sometimes coming together, sometimes moving apart, but they would not exhibit any gross, overall behavior distinguishing one direction in time from the reverse. However, if you were to watch a film of 10^24 carbon dioxide molecules that had been placed together in the box (as a small, dense cloud of molecules, say), you could easily determine whether the film was being shown forward or in reverse: it is overwhelmingly likely that the forward time direction is the one in which the gas molecules become more and more uniformly spread out, achieving higher and higher entropy. If, instead, the film showed uniformly dispersed gas molecules swooshing together into a tight group, you'd immediately recognize that you were watching it in reverse.
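The film thought experiment is easy to mimic numerically. The toy model below is entirely hypothetical (a one-dimensional box with random jitter standing in for real collisions): it coarse-grains positions into ten bins, ignoring fine details just as the text prescribes, and shows the binned entropy climbing as a bunched-up gas spreads out:

```python
import random, math

def coarse_entropy(positions, cells=10, size=100.0):
    """Shannon entropy of the positions after coarse-graining the box
    into `cells` equal bins (fine molecular details are ignored)."""
    counts = [0] * cells
    for x in positions:
        counts[min(int(x / size * cells), cells - 1)] += 1
    n = len(positions)
    return -sum(c / n * math.log(c / n) for c in counts if c)

random.seed(0)
# A low-entropy start: every "molecule" bunched near one wall.
pos = [random.uniform(0.0, 5.0) for _ in range(1000)]
start = coarse_entropy(pos)

# Let each molecule jitter randomly, clamped at the walls 0 and 100
# (a crude stand-in for reflection off the container).
for _ in range(1000):
    pos = [min(max(x + random.uniform(-3.0, 3.0), 0.0), 100.0) for x in pos]
end = coarse_entropy(pos)

print(start, end)  # entropy climbs toward its maximum of log(10) ≈ 2.3
```

Run the loop in reverse order of frames and the entropy falls, which is exactly the cue that tells you a film of a real gas is being played backward.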

The same reasoning holds for essentially all the things we encounter in daily life—things, that is, which have a large number of constituents: the forward-in-time arrow points in the direction of increasing entropy. If you watch a film of a glass of ice water placed on a bar, you can determine which direction is forward in time by checking that the ice melts—its H₂O molecules disperse throughout the glass, thereby achieving higher entropy. If you watch a film of a splattering egg, you can determine which direction is forward in time by checking that the egg's constituents become more and more disordered—that the egg splatters rather than unsplatters, thereby also achieving higher entropy.

As you can see, the concept of entropy provides a precise version of the "easy versus difficult" conclusion we found earlier. It's easy for the pages of War and Peace to fall out of order because there are so many out-of-order arrangements. It's difficult for the pages to fall in perfect order because hundreds of pages would need to move in just the right way to land in the unique sequence Tolstoy intended. It's easy for an egg to splatter because there are so many ways to splatter. It's difficult for an egg to unsplatter, because an enormous number of splattered constituents must move in perfect coordination to produce the single, unique result of a pristine egg resting on the counter. For things with many constituents, going from lower to higher entropy—from order to disorder—is easy, so it happens all the time. Going from higher to lower entropy—from disorder to order—is harder, so it happens rarely, at best.

Notice, too, that this entropic arrow is not completely rigid; there is no claim that this definition of time's direction is 100 percent foolproof. Instead, the approach has enough flexibility to allow these and other processes to happen in reverse as well. Since the second law proclaims that entropy increase is only a statistical likelihood, not an inviolable fact of nature, it allows for the rare possibility that pages can fall into perfect numerical order, that gas molecules can coalesce and reenter a bottle, and that eggs can unsplatter. By using the mathematics of entropy, the second law expresses precisely how statistically unlikely these events are (remember, the huge number on pages 152-53 reflects how much more likely it is that pages will land out of order), but it recognizes that they can happen.

This seems like a convincing story. Statistical and probabilistic reasoning has given us the second law of thermodynamics. In turn, the second law has provided us with an intuitive distinction between what we call past and what we call future. It has given us a practical explanation for why things in daily life, things that are typically composed of huge numbers of constituents, start like this and end like that, while we never see them start like that and end like this. But over the course of many years—and thanks to important contributions by physicists like Lord Kelvin, Josef Loschmidt, Henri Poincaré, S. H. Burbury, Ernst Zermelo, and Willard Gibbs—Ludwig Boltzmann came to appreciate that the full story of time's arrow is more surprising. Boltzmann realized that although entropy had illuminated important aspects of the puzzle, it had not answered the question of why the past and the future seem so different. Instead, entropy had redefined the question in an important way, one that leads to an unexpected conclusion.

Entropy: Past and Future

Earlier, we introduced the dilemma of past versus future by comparing our everyday observations with properties of Newton's laws of classical physics. We emphasized that we continually experience an obvious directionality to the way things unfold in time but the laws themselves treat what we call forward and backward in time on an exactly equal footing. As there is no arrow within the laws of physics that assigns a direction to time, no pointer that declares, "Use these laws in this temporal orientation but not in the reverse," we were led to ask: If the laws underlying experience treat both temporal orientations symmetrically, why are the experiences themselves so temporally lopsided, always happening in one direction but not the other? Where does the observed and experienced directionality of time come from?

In the last section we seemed to have made progress, through the second law of thermodynamics, which apparently singles out the future as the direction in which entropy increases. But on further thought it's not that simple. Notice that in our discussion of entropy and the second law, we did not modify the laws of classical physics in any way. Instead, all we did was use the laws in a "big picture" statistical framework: we ignored fine details (the precise order of War and Peace's unbound pages, the precise locations and velocities of an egg's constituents, the precise locations and velocities of a bottle of Coke's CO₂ molecules) and instead focused our attention on gross, overall features (pages ordered vs. unordered, egg splattered vs. not splattered, gas molecules spread out vs. not spread out). We found that when physical systems are sufficiently complicated (books with many pages, fragile objects that can splatter into many fragments, gas with many molecules), there is a huge difference in entropy between their ordered and disordered configurations. And this means that there is a huge likelihood that the systems will evolve from lower to higher entropy, which is a rough statement of the second law of thermodynamics. But the key fact to notice is that the second law is derivative: it is merely a consequence of probabilistic reasoning applied to Newton's laws of motion.

This leads us to a simple but astounding point: Since Newton's laws of physics have no built-in temporal orientation, all of the reasoning we have used to argue that systems will evolve from lower to higher entropy toward the future works equally well when applied toward the past. Again, since the underlying laws of physics are time-reversal symmetric, there is no way for them even to distinguish between what we call the past and what we call the future. Just as there are no signposts in the deep darkness of empty space that declare this direction up and that direction down, there is nothing in the laws of classical physics that says this direction is time future and that direction is time past. The laws offer no temporal orientation; it's a distinction to which they are completely insensitive. And since the laws of motion are responsible for how things change—both toward what we call the future and toward what we call the past—the statistical/probabilistic reasoning behind the second law of thermodynamics applies equally well in both temporal directions. Thus, not only is there an overwhelming probability that the entropy of a physical system will be higher in what we call the future, but there is the same overwhelming probability that it was higher in what we call the past. We illustrate this in Figure 6.2.

This is the key point for all that follows, but it's also deceptively subtle. A common misconception is that if, according to the second law of thermodynamics, entropy increases toward the future, then entropy necessarily decreases toward the past. But that's where the subtlety comes in. The second law actually says that if at any given moment of interest, a physical system happens not to possess the maximum possible entropy, it is extraordinarily likely that the physical system will subsequently have and previously had more entropy. That's the content of Figure 6.2b. With laws that are blind to the past-versus-future distinction, such time symmetry is inevitable.

Figure 6.2 (a) As it's usually described, the second law of thermodynamics implies that entropy increases toward the future of any given moment. (b) Since the known laws of nature treat forward and backward in time identically, the second law actually implies that entropy increases both toward the future and toward the past from any given moment.

That's the essential lesson. It tells us that the entropic arrow of time is double-headed. From any specified moment, the arrow of entropy increase points toward the future and toward the past. And that makes it decidedly awkward to propose entropy as the explanation of the one-way arrow of experiential time.

Think about what the double-headed entropic arrow implies in concrete terms. If it's a warm day and you see partially melted ice cubes in a glass of water, you have full confidence that half an hour later the cubes will be more melted, since the more melted they are, the more entropy they have.¹¹ But you should have exactly the same confidence that half an hour earlier they were also more melted, since exactly the same statistical reasoning implies that entropy should increase toward the past. And the same conclusion applies to the countless other examples we encounter every day. Your assuredness that entropy increases toward the future—from partially dispersed gas molecules' further dispersing to partially jumbled page orders' getting more jumbled—should be matched by exactly the same assuredness that entropy was also higher in the past.
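The double-headed claim can be checked in a toy model. The sketch below (hypothetical throughout) uses free motion on a periodic box, a dynamics that, like Newton's laws, is exactly reversible: starting from a low-entropy snapshot "now", running the same rules forward or backward both yields higher coarse-grained entropy.

```python
import random, math

def coarse_entropy(xs, cells=10, size=100.0):
    """Shannon entropy of positions coarse-grained into `cells` bins."""
    counts = [0] * cells
    for x in xs:
        counts[int(x / size * cells) % cells] += 1
    n = len(xs)
    return -sum(c / n * math.log(c / n) for c in counts if c)

def evolve(xs, vs, steps, size=100.0):
    """Free motion on a periodic box: deterministic and fully reversible."""
    return [(x + v * steps) % size for x, v in zip(xs, vs)]

random.seed(1)
n = 2000
# "Now": a low-entropy snapshot, molecules bunched mid-box with
# random velocities.
now_x = [random.uniform(45.0, 55.0) for _ in range(n)]
now_v = [random.gauss(0.0, 1.0) for _ in range(n)]

future = evolve(now_x, now_v, 200)              # the laws run forward
past = evolve(now_x, [-v for v in now_v], 200)  # the same laws, run backward

print(coarse_entropy(now_x))   # low
print(coarse_entropy(future))  # high
print(coarse_entropy(past))    # just as high: the arrow points both ways
```

Reversing the velocities of the "future" state and evolving again recovers the "now" configuration, which is precisely the time-reversal symmetry the argument rests on.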

The troubling thing is that half of these conclusions seem to be flat-out wrong. Entropic reasoning yields accurate and sensible conclusions when applied in one time direction, toward what we call the future, but gives apparently inaccurate and seemingly ridiculous conclusions when applied toward what we call the past. Glasses of water with partially melted ice cubes do not usually start out as glasses of water with no ice cubes in which molecules of water coalesce and cool into chunks of ice, only to start melting once again. Unbound pages of War and Peace do not usually start thoroughly out of numerical order and through subsequent tosses get less jumbled, only to start getting more jumbled again. And going back to the kitchen, eggs do not generally start out splattered, and then coalesce into a pristine whole egg, only to splatter some time later.

Or do they?
