
PART III

GAIA SINGS THE BLUES

CHAPTER 8

THERMOSEMIOSIS

Boltzmann’s Sleight, Trim’s Hat, and the Confusion concerning Entropy

THERMODYNAMICS STARTED OFF bright enough, practical and blond, saving the world from its limits. But then, overcome by shadows, its shiny children got dirt in their fingernails, soot in their hair; the world darkened with a foreboding of smokestacks. To the injury of overpopulation was added the attractiveness of thermodynamics as an incentive for geek speak, theoretical discussions that, with poetic justice, generated more heat than light.

Unlike economics, a different kind of dismal science, thermodynamics was an indisputable success, its application helping ignite the Industrial Revolution and its theory, in the form of Maxwell’s demon, helping kindle computers and the information age. Indeed, thermodynamics may be responsible for your existence, as well as most of the nitrogen atoms in your body. In early 1912 the German chemists Fritz Haber and Carl Bosch produced inexpensive ammonia using nitrogen from the air and hydrogen gas. This in turn enabled heavily populated countries to make cheap ammonia-based fertilizers, staving off starvation on a global scale. An interesting feedback loop: technology is man-made and man now is factory-made.

Even if you are a vegan eating organic food, some 50 percent of the nitrogen atoms inside your body, including in the amino acids that make up your proteins and in your DNA, are synthetic: they were made under high pressures and temperatures in giant factories that use 2 percent of Earth’s energy, breaking the strong covalent bonds of atmospheric nitrogen molecules and converting the element into forms that can be taken up by crops, eaten by food animals and by us. According to Thomas Hager, these “giant factories, usually located in remote areas, that drink rivers of water, inhale oceans of air . . . burn about 2 percent of all the earth’s energy. If all the machines these men invented were shut down today, more than two billion people would starve to death.”
1
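The chemistry those factories run can be written in one line. A minimal sketch of the overall Haber-Bosch reaction, with rough operating conditions (the exact figures vary from plant to plant), looks like this:

N2 + 3 H2 → 2 NH3 + heat (run at a few hundred atmospheres and roughly 400 to 500 degrees Celsius over an iron-based catalyst)

The brute-force conditions are what it takes to crack the triple bond holding the two nitrogen atoms of N2 together.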

Yet despite its importance, the essence of thermodynamics remains confusing. Perhaps the enormous success of thermodynamics, in both academic theory and industrial production, has over the past century led experts to neglect simple descriptions of what the second law means. The astronomer Arthur Eddington said,

The law that entropy always increases—the second law of thermodynamics—holds, I think, the supreme position among the laws of Nature. If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations—then so much the worse for Maxwell’s equations. If it is found to be contradicted by observation—well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.
2

Appointed to give the Sir Robert Rede Lecture on May 7, 1959, Charles Percy Snow chose to critique higher education. Snow—a baron of the city of Leicester, England, as well as a physicist, mystery writer, defender of the realist novel, and author of the seven-volume Strangers and Brothers (made into a BBC series)—prodded his august audience in words that were, for all intents and purposes, the first shots in what would become the culture wars:

A good many times I have been present at gatherings of people who, by the standards of the traditional culture, are thought highly educated and who have with considerable gusto been expressing their incredulity at the illiteracy of scientists. Once or twice I have been provoked and have asked the company how many of them could describe the Second Law of Thermodynamics. The response was cold: it was also negative. Yet I was asking something which is about the scientific equivalent of: “Have you read a work of Shakespeare’s?”

I now believe [he added later to published versions of his remarks] that if I had asked an even simpler question—such as, What do you mean by mass, or acceleration, which is the scientific equivalent of saying, “Can you read?”—not more than one in ten of the highly educated would have felt that I was speaking the same language. So the great edifice of modern physics goes up, and the majority of the cleverest people in the western world have about as much insight into it as their Neolithic ancestors would have had.
3

According to the chemist Frank L. Lambert, who has written extensively about simple thermodynamics,4 even Snow neglected to define the essence of thermodynamics he said was so important. It has nothing to do with mandating an inevitable increase in disorder, as all sorts of cultural theorists and science geeks believe. Rather, the elegant essence of the phenomenon the second law describes is that energy, if not hindered, spreads.

The intellectual sin of focusing on disorder is wildly democratic in its choice of victims. It afflicts not only the Caltech astrophysicist Sean Carroll, in From Eternity to Here, but Pope Pius XII, who offers the second law as proof of the existence of God, because only God had the wherewithal, in creating organized life and man, to resist this all-inclusive law of ever-increasing disorder. Yet hardly anyone is safe from this widespread mistaken meme, not even the most scientific-seeming rationalists and atheists. For example, the Darwinist philosopher Daniel Dennett repeats a version of this same mistake when he writes that life-forms “are things that defy” and constitute a “systematic reversal” of the second law.5 Superficially, this may seem to be the case, as some of life’s key chemicals concentrate rather than spread energy. But it is crucial to realize that, overall, living systems spread energy and that their partial molecular concentration of energy, and production of gradients, abets this process. Saying that life defies the second law is like saying that Robin Hood is against the spread of wealth because he gives it to the poor. A watch must have its watchmaker, the intuition runs; a car does not put itself together from parts. Nonetheless, moving atoms do join to form compounds and more complex molecules. Unhindered, energy spontaneously spreads out.

While a red Ferrari doesn’t assemble itself from spare parts in a junkyard during a windstorm, this has little to do with the organization we see in life. The macro-objects of our everyday life do not behave in the same way as atoms and molecules. Car parts in a junkyard don’t routinely whiz by at two hundred to two thousand miles an hour, colliding with one another, fusing and releasing so much energy that they become white-hot. Such behavior, however, is normal for molecules.

The vast majority of compounds, some quite complex, form easily. But molecules are not atoms mixed up at random like the batter around a chicken leg in a bag of Shake ’n Bake. When three or more atoms aggregate to make a molecule, they possess a precise order. Their atoms, in a relatively fixed geometric relationship, generally stay stable. When atoms “bond” after their violent collisions, they aggregate into molecules so stable that temperatures of thousands of degrees are needed to pry them apart. Melt amino acids together (to make them move more rapidly), and they form huge new compounds: “proteinoids” with hundreds to thousands of amino acid units firmly joined by the same kind of bonds that hold proteins together. The result is not useful or valuable proteins, but it does show how easily gigantic complex molecules can form naturally.

There are millions of compounds that have less energy in them than the elements of which they are composed; they are the result of “downhill” reactions, formed easily, resulting in the spread of energy. Their formation is no more mysterious than a glob of toothpaste appearing at the end of a squeezed tube.
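One concrete downhill case, with a figure from standard thermochemical tables (rounded here), is the burning of hydrogen:

2 H2 + O2 → 2 H2O (liquid water), releasing roughly 286 kilojoules of heat per mole of water formed

The product holds less energy than the hydrogen and oxygen did, and the difference spreads out into the surroundings as heat.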

The rules of energy science favor the formation of complex, geometrically ordered molecules. But there are also compounds that require, like objects in a factory, additions of energy from outside, leaving them with more energy in them than the elements from which they formed. Such molecules may result, for example, when simpler molecules are energized by lightning.
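Lightning-made nitric oxide is a textbook instance of such an uphill, energy-storing reaction; the figure below is a rough value from standard tables:

N2 + O2 → 2 NO, absorbing roughly 90 kilojoules per mole of NO formed, the energy supplied by the extreme heat of the lightning channel

The product molecules bank energy that can be released in later reactions.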

While less likely, such reactions that create higher-energy molecules happen all the time. Alkanes, for example, are among the simplest of organic compounds, composed only of carbon and hydrogen and built of sections in which a single carbon atom holds two or three hydrogen atoms. Both simple and complex alkanes have been detected by spectroscopic methods in space. Simple alkanes with two to five carbon atoms joined to one another (and hydrogens attached to each carbon) all contain less energy than their elements. More complex alkanes, with six or more carbon atoms joined to make their molecules, have more energy in them than the elements from which they come.

These alkanes, like life’s key energy-storage molecule, ATP—adenosine triphosphate, structurally a cousin to DNA—require energy to be formed. ATP is an amazing molecule. Not only is it as omnipresent in life as DNA or RNA, but because it is built up from energy and spent in metabolism, it is like cash in a casino: You synthesize and break up roughly your entire stock (about 8.8 ounces) of ATP each day.

Alkanes and similar so-called endergonic chemicals contain more energy in them than the elements that go into them because they are forged via input of external energy—ultraviolet or X-rays, both plentiful in many parts of the universe. This energy is not so hard to come by. Indeed, high-energy cosmic rays are penetrating your body this very moment. On average, about five cosmic rays penetrate every square inch of your body every second, right now, even as you are reading this sentence: gamma rays, X-rays, subatomic particles. Some come from the sun, but a lot come from supernova explosions.

Such abundant energy—and UV radiation was more plentiful on the early Earth—bombarded simple chemicals all the time, sometimes creating more energetic compounds that stored energy which, released later, created a cascade of delayed reactions. So even with science’s splendid emphasis on connecting humans and life to natural cosmic processes, it is easy to see where the notion that life defies the second law comes from, but it is wrong. It is far better to think of living systems as temporarily deferring second law-based predictions of immediate gradient breakdown while taking part in a process of greater overall and longer-lasting energy delocalization.

Temperature measurements by low-flying airplanes over the H. J. Andrews Experimental Forest in Oregon corroborated thermal satellite pictures showing that rain forests in summer are (because of cloud cover) as cool as Siberia in the winter. Quarries and clear-cut forests had higher temperatures than a twenty-five-year-old Douglas fir plantation and a natural forest of the same age, and neither of these was as cool as a four-hundred-year-old old-growth forest. At first one might be tempted to explain these data by saying that the cooler areas, unlike the more fallow ones, capture solar energy by building up energy-storing chemicals that prevent energy’s spread.

Yet there is another, quite different interpretation. Consider a refrigerator, keeping itself cool internally while generating excess heat. Is not this the essence of the grasslands compared with the desert, the forests compared with grasslands, and the great jungles compared with temperate forests? Most of the solar energy in the plants goes not into “blocking” the second law to make energy-storing compounds but into the thermodynamically open process of evapotranspiration. The latent heat carried aloft by the evaporated water is released when it condenses into clouds and rain. Given the solar gradient, the difference between the hot sun and cold space, the cooling provided by evapotranspiration-produced clouds must, like a refrigerator, lead to energy spread, entropy production, farther out.

Like natural nonliving complex systems, and like our cooling machines that require an outside source of energy and dump heat into their surroundings, organisms have impressive internal structure. Yet, seen as energetic processes rather than firmly bound things, and compared with less-organized regions of matter, they produce more heat even as they keep themselves cool. They spread more energy. Constitutively open systems, they do not defy the second law. Rather, their order is connected to a more effective, elegant, and continuous production of entropy, of dispersed energy.
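To make the refrigerator comparison concrete, here is a minimal back-of-the-envelope sketch in Python, assuming round illustrative temperatures (sunlight behaving as if it came from a roughly 5,800 K source, the planet re-radiating at roughly 255 K) and ignoring the finer thermodynamics of radiation. It shows that every joule degraded across the solar gradient ends up in a more spread-out, higher-entropy form:

# Rough entropy production per joule processed across the solar gradient:
# energy absorbed from a hot source and re-emitted to a cold sink.
# The temperatures are illustrative round numbers, not measurements.
T_SUN = 5800.0    # effective temperature of sunlight, kelvins (assumed)
T_SKY = 255.0     # effective radiating temperature of Earth, kelvins (assumed)

def entropy_export_per_joule(t_hot: float, t_cold: float) -> float:
    """Entropy gained by the cold sink minus entropy lost from the hot
    source when one joule flows from hot to cold, in J/K per joule."""
    return 1.0 / t_cold - 1.0 / t_hot

print(f"{entropy_export_per_joule(T_SUN, T_SKY):.4f} J/K per joule")
# Prints about 0.0037: the joule is still there, but it is far more spread out.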

Entropy is a confusing word. In 1854 the German physicist Rudolf Clausius combined the word energy with “tropos,” Greek for transformation, to come up with entropy for a change in energy, dq, relative to the temperature T at which it is transferred: ΔS = dq/T.6
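A quick numerical sketch, using made-up round numbers, shows why this ratio captures the spreading of energy: when heat leaves a hot body and enters a cooler one, the cooler body gains more entropy than the hot one loses, so the total always rises.

# Entropy bookkeeping for 100 joules of heat flowing from a 400 K block
# to a 300 K block (illustrative numbers, not from the text).
Q = 100.0       # joules of heat transferred
T_HOT = 400.0   # kelvins
T_COLD = 300.0  # kelvins

dS_hot = -Q / T_HOT    # the hot block loses entropy: -0.25 J/K
dS_cold = Q / T_COLD   # the cold block gains entropy: about +0.33 J/K
print(f"net entropy change: {dS_hot + dS_cold:+.3f} J/K")
# Prints about +0.083 J/K: heat flowing downhill spreads energy out.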
Clausius’s definition was later given a statistical formulation by Ludwig Boltzmann (1844–1906), one of the founders of modern thermodynamics. Boltzmann, in a single statement, is probably responsible for the lion’s share of our confusion about the conceptual meaning of entropy. After more than four hundred pages of heavy math, in a common-language summation, Boltzmann writes (in Lectures on Gas Theory [Vorlesungen über Gastheorie]) that the universe, “or at least most of the parts of it surrounding us are initially in a very ordered—therefore very improbable—state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”7

This concept—of entropy as “disorder” and thus any type of disorder as “entropy”—was dominant throughout the twentieth century. In the equations of thermodynamics the symbol for entropy is S. Boltzmann had developed an equation for the entropy change, ΔS, in terms of energy states, but he could not do actual calculations because he did not know how to discover the value of k, now known as Boltzmann’s constant. Before Boltzmann committed suicide, but without his hearing about it, the physicist Max Planck established that k was equal to R/N—the gas constant divided by the number of molecules in a mole. The equation S = k log W (engraved on Boltzmann’s 1906 tombstone) is actually a version of this equation, formulated around 1900 by Planck.
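A short sketch, assuming nothing beyond the constants just named, shows how the pieces fit together: k is simply the gas constant per molecule, and S = k log W (the logarithm is the natural one) converts counts of microstates into the same units Clausius used.

import math

R = 8.314        # gas constant, J/(mol*K)
N_A = 6.022e23   # Avogadro's number, molecules per mole

# Planck's identification: Boltzmann's constant is the gas constant per molecule.
k = R / N_A      # about 1.38e-23 J/K

# S = k log W: if each molecule in a mole of gas gets twice as many places
# to be (the gas expands into double the volume), W is multiplied by
# 2 to the power N_A, so the entropy rises by k * N_A * ln 2 = R * ln 2.
delta_S = R * math.log(2)

print(f"k = {k:.2e} J/K; entropy gain for the doubled volume: {delta_S:.2f} J/K per mole")

The point of the exercise is that more ways for the energy and the molecules to arrange themselves means higher entropy, which is just another way of saying the energy is more spread out.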
