In Pursuit of the Unknown

Fig 47. Carnot cycle. Left: in terms of pressure and volume. Right: in terms of temperature and entropy.

Entropy is like heat: it is defined in terms of a change of state, not a state as such. Suppose that a fluid in some initial state changes to a new state. Then the difference in entropy between the two states is the total change in the quantity ‘heat divided by temperature'. In symbols, for a small step along the path between the two states, entropy S is related to heat q and temperature T by the differential equation dS = dq/T. The change in entropy is the change in heat per unit temperature. A large change of state can be represented as a series of small ones, so we add up all these small changes in entropy to get the overall change of entropy. Calculus tells us that the way to do this is to use an integral.
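
That integral is easy to approximate numerically. Below is a minimal Python sketch, assuming a fluid with constant heat capacity (1 kg of water with c ≈ 4186 J/kg·K is an illustrative choice, not a value from the text): summing dq/T over many small temperature steps reproduces the closed form m·c·ln(T₂/T₁).

```python
import math

def entropy_change(mass_kg, c_per_kg, t1_k, t2_k, steps=100000):
    """Numerically sum dS = dq/T as the fluid heats from t1_k to t2_k."""
    dT = (t2_k - t1_k) / steps
    s = 0.0
    for i in range(steps):
        T = t1_k + (i + 0.5) * dT      # midpoint temperature of this small step
        dq = mass_kg * c_per_kg * dT   # heat added during this step
        s += dq / T
    return s

# 1 kg of water (c ≈ 4186 J/kg·K) heated from 293 K to 373 K
numeric = entropy_change(1.0, 4186.0, 293.0, 373.0)
exact = 1.0 * 4186.0 * math.log(373.0 / 293.0)  # closed form: m·c·ln(T2/T1)
print(round(numeric, 1), round(exact, 1))       # the two values agree closely
```

The sum converges to the integral as the steps shrink, which is exactly the calculus argument in the paragraph above.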

Having defined entropy, the second law of thermodynamics is very simple. It states that in any physically feasible thermodynamic process, the entropy of an isolated system must always increase. In symbols, dS ≥ 0. For example, suppose we divide a room with a movable partition, put oxygen on one side of the partition and nitrogen on the other. Each gas has a particular entropy, relative to some initial reference state. Now remove the partition, allowing the gases to mix. The combined system also has a particular entropy, relative to the same initial reference states. And the entropy of the combined system is always greater than the sum of the entropies of the two separate gases.

Classical thermodynamics is phenomenological: it describes what you can measure, but it's not based on any coherent theory of the processes involved. That step came next with the kinetic theory of gases, pioneered by Daniel Bernoulli in 1738. This theory provides a physical explanation of pressure, temperature, the gas laws, and that mysterious quantity entropy. The basic idea – highly controversial at the time – is that a gas consists of a
large number of identical molecules, which bounce around in space and occasionally collide with each other. Being a gas means that the molecules are not too tightly packed, so any given molecule spends a lot of its time travelling through the vacuum of space at a constant speed in a straight line. (I say ‘vacuum' even though we're discussing a gas, because that's what the space between molecules consists of.) Since molecules, though tiny, have nonzero size, occasionally two of them will collide. Kinetic theory makes the simplifying assumption that they bounce like two colliding billiard balls, and that these balls are perfectly elastic, so no energy is lost in the collision. Among other things, this implies that the molecules keep bouncing forever.

When Bernoulli first suggested the model, the law of conservation of energy was not established and perfect elasticity seemed unlikely. The theory gradually won support from a small number of scientists, who developed their own versions and added various new ideas, but their work was almost universally ignored. The German chemist and physicist August Krönig wrote a book on the topic in 1856, simplifying the physics by not allowing the molecules to rotate. Clausius removed this simplification a year later. He claimed he had arrived at his results independently, and is now ranked as one of the first significant founders of kinetic theory. He proposed one of the key concepts of the theory, the mean free path of a molecule: how far it travels, on average, between successive collisions.

Both Krönig and Clausius deduced the ideal gas law from kinetic theory. The three key variables are volume, pressure, and temperature. Volume is determined by the vessel that contains the gas; it sets ‘boundary conditions' that affect how the gas behaves, but is not a feature of the gas as such. Pressure is the average force (per square unit of area) exerted by the molecules of the gas when they collide with the walls of the vessel. This depends on how many molecules are inside the vessel, and how fast they are moving. (They don't all move at the same speed.) Most interesting is temperature. This also depends on how fast the gas molecules are moving, and it is proportional to the average kinetic energy of the molecules. Deducing Boyle's law, the special case of the ideal gas law for constant temperature, is especially straightforward. At a fixed temperature, the distribution of velocities doesn't change, so pressure is determined by how many molecules hit the wall. If you reduce the volume, the number of molecules per cubic unit of space goes up, and the chance of any molecule hitting the wall goes up as well. Smaller volume means denser gas means more molecules hitting the wall, and this argument can be made quantitative. Similar but more complicated arguments produce the ideal gas law in all its glory, as long as the molecules aren't squashed too tightly together. So now there was a deeper theoretical basis for Boyle's law, based on the theory of molecules.
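
The quantitative version of that argument can be sketched in a few lines. Kinetic theory gives the pressure as P = N·m·⟨v²⟩/3V, so at fixed temperature (fixed ⟨v²⟩) the product P·V stays constant as the volume shrinks. The molecular count, mass, and mean square speed below are illustrative values, not figures from the text:

```python
def kinetic_pressure(n_molecules, mass_kg, mean_sq_speed, volume_m3):
    """Kinetic-theory pressure: P = N * m * <v^2> / (3 * V)."""
    return n_molecules * mass_kg * mean_sq_speed / (3.0 * volume_m3)

# Fixed temperature means a fixed speed distribution, so <v^2> stays put.
N = 2.7e25                  # molecules (roughly 1 cubic metre of gas at STP)
m = 4.65e-26                # kg, about the mass of one nitrogen molecule
v_sq = 2.9e5                # m^2/s^2, a plausible mean square speed near 300 K
for V in (1.0, 0.5, 0.25):  # halve the volume twice
    P = kinetic_pressure(N, m, v_sq, V)
    print(V, round(P * V))  # P*V comes out the same each time: Boyle's law
```

Halving the volume doubles the pressure because twice as many molecules per unit volume hit the walls, exactly as the prose argues.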

Maxwell was inspired by Clausius's work, and in 1859 he placed kinetic theory on mathematical foundations by writing down a formula for the probability that a molecule will travel with a given speed. It is based on the normal distribution or bell curve (Chapter 7). Maxwell's formula seems to have been the first instance of a physical law based on probabilities. He was followed by the Austrian physicist Ludwig Boltzmann, who developed the same formula, now called the Maxwell–Boltzmann distribution. Boltzmann reinterpreted thermodynamics in terms of the kinetic theory of gases, founding what is now called statistical mechanics. In particular, he came up with a new interpretation of entropy, relating the thermodynamic concept to a statistical feature of the molecules in the gas.
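
A quick way to see the connection to the bell curve is to sample it: if each velocity component of a molecule is drawn from a normal distribution with variance kT/m, the resulting speeds follow Maxwell's formula, and the average kinetic energy comes out as (3/2)kT. This is a sketch with an illustrative molecular mass (roughly that of nitrogen), not a calculation from the text:

```python
import math, random

random.seed(1)
k = 1.38e-23    # Boltzmann's constant, joules per kelvin
T = 300.0       # temperature, kelvin
m = 4.65e-26    # kg, roughly one nitrogen molecule (illustrative value)
sigma = math.sqrt(k * T / m)   # standard deviation of each velocity component

# Sample molecular velocities; the speed then follows the
# Maxwell-Boltzmann distribution.
n = 200_000
ke_total = 0.0
for _ in range(n):
    vx, vy, vz = (random.gauss(0.0, sigma) for _ in range(3))
    ke_total += 0.5 * m * (vx * vx + vy * vy + vz * vz)

mean_ke = ke_total / n
print(mean_ke / (1.5 * k * T))   # close to 1: mean KE = (3/2) k T
```

The ratio printed at the end ties the statistical picture back to temperature: temperature is proportional to the average kinetic energy of the molecules.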

The traditional thermodynamic quantities, such as temperature, pressure, heat, and entropy, all refer to large-scale average properties of the gas. However, the fine structure consists of lots of molecules whizzing around and bumping into each other. The same large-scale state can arise from innumerable different small-scale states, because minor differences on the small scale average out. Boltzmann therefore distinguished macrostates and microstates of the system: large-scale averages and the actual states of the molecules. Using this, he showed that entropy, a macroscale quantity, can be interpreted as a statistical feature of microstates. He expressed this in the equation

S = k log W

Here S is the entropy of the system, W is the number of distinct microstates that can give rise to the overall macrostate, and k is a constant. It is now called Boltzmann's constant, and its value is 1.38 × 10⁻²³ joules per kelvin.

It is this formula that motivates the interpretation of entropy as disorder. The idea is that fewer microstates correspond to an ordered macrostate than to a disordered one, and we can understand why by thinking about a pack of cards. For simplicity, suppose that we have just six cards, marked 2, 3, 4, J, Q, K. Put them in two separate piles, with the low-value cards in one pile and the court cards in the other. This is an ordered arrangement. In fact, it retains traces of order if you shuffle each pile, but keep the piles separate, because however you do this, the low-value cards are all in one pile and the court cards are in the other. However, if you shuffle both piles together, the two types of card can become mixed, with
arrangements like 4QK2J3. Intuitively, these mixed-up arrangements are more disordered.

Let's see how this relates to Boltzmann's formula. There are 36 ways to arrange the cards in their two piles: six orderings (3! = 6) for each pile, multiplied together. But there are 720 ways (6! = 1 × 2 × 3 × 4 × 5 × 6) to arrange all six cards in order. The type of ordering of the cards that we allow – two piles or one – is analogous to the macrostate of a thermodynamic system. The exact order is the microstate. The more ordered macrostate has 36 microstates, the less ordered one has 720. So the more microstates there are, the less ordered the corresponding macrostate becomes. Since logarithms get bigger when the numbers do, the greater the logarithm of the number of microstates, the more disordered the macrostate becomes. Here

log 36 = 3.58      log 720 = 6.58

These are effectively the entropies of the two macrostates. Boltzmann's constant just scales the values to fit the thermodynamic formalism when we're dealing with gases.
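
The card arithmetic above is easy to check in a few lines of Python:

```python
import math

two_piles = math.factorial(3) * math.factorial(3)  # 3! orderings per pile
one_pile = math.factorial(6)                       # 6! orderings overall
print(two_piles, one_pile)                # 36 720
print(round(math.log(two_piles), 2))      # 3.58
print(round(math.log(one_pile), 2))       # 6.58
```

Note also that log 36 = log 6 + log 6, which is the additivity of entropy for non-interacting systems discussed below.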

The two piles of cards are like two non-interacting thermodynamic states, such as a box with a partition separating two gases. Their individual entropies are each log 6, so the total entropy is 2 log 6, which equals log 36. So the logarithm makes entropy additive for non-interacting systems: to get the entropy of the combined (but not yet interacting) system, add the separate entropies. If we now let the systems interact (remove the partition) the entropy increases to log 720.

The more cards there are, the more pronounced this effect becomes. Split a standard pack of 52 playing cards into two piles, with all the red cards in one pile and all the black cards in the other. This arrangement can occur in (26!)² ways, which is about 1.62 × 10⁵³. Shuffling both piles together we get 52! microstates, roughly 8.07 × 10⁶⁷. The logarithms are 122.52 and 156.36 respectively, and again the second is larger.
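
These larger counts are just as easy to verify:

```python
import math

split = math.factorial(26) ** 2   # red-pile orderings times black-pile orderings
mixed = math.factorial(52)        # orderings of all 52 cards shuffled together
print(f"{split:.2e}", f"{mixed:.2e}")
print(round(math.log(split), 2), round(math.log(mixed), 2))  # 122.52 156.36
```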

Boltzmann's ideas were not received with great acclaim. At a technical level, thermodynamics was beset with difficult conceptual issues. One was the precise meaning of ‘microstate'. The position and velocity of a molecule are continuous variables, able to take on infinitely many values, but Boltzmann needed a finite number of microstates in order to count how many there were and then take the logarithm. So these variables had to be ‘coarse-grained' in some manner, by splitting the continuum of possible values into finitely many very small intervals.

Another issue, more philosophical in nature, was the arrow of time – an apparent conflict between the time-reversible dynamics of microstates and the one-way time of macrostates, determined by entropy increase. The two issues are related, as we will shortly see.

The biggest obstacle to the theory's acceptance, however, was the idea that matter is made from extremely tiny particles, atoms. This concept, and the word atom, which means ‘indivisible', goes back to ancient Greece, but even around 1900 the majority of physicists did not believe that matter is made from atoms. So they didn't believe in molecules, either, and a theory of gases based on them was obviously nonsense. Maxwell, Boltzmann, and other pioneers of kinetic theory were convinced that molecules and atoms were real, but to the sceptics, atomic theory was just a convenient way to picture matter. No atoms had ever been observed, so there was no scientific evidence that they existed. Molecules, specific combinations of atoms, were similarly controversial. Yes, atomic theory fitted all sorts of experimental data in chemistry, but that was not proof that atoms existed.

One of the things that finally convinced most objectors was the use of kinetic theory to make predictions about Brownian motion. This effect was discovered by a Scottish botanist, Robert Brown.
He pioneered the use of the microscope, discovering, among other things, the existence of the nucleus of a cell, now known to be the repository of its genetic information. In 1827 Brown was looking through his microscope at pollen grains in a fluid, and he spotted even tinier particles that had been ejected by the pollen. These tiny particles jiggled around in a random manner, and at first Brown wondered if they were some diminutive form of life. However, his experiments showed the same effect in particles derived from non-living matter, so whatever caused the jiggling, it didn't have to be alive. At the time, no one knew what caused this effect. We now know that the particles ejected by the pollen were organelles, tiny subsystems of the cell with specific functions; in this case, to manufacture starch and fats. And we interpret their random jiggles as evidence for the theory that matter is made from atoms.

The link to atoms comes from mathematical models of Brownian motion, which first turned up in statistical work of the Danish astronomer and actuary Thorvald Thiele in 1880. The big advance was made by Einstein in 1905 and the Polish scientist Marian Smoluchowski in 1906. They independently proposed a physical explanation of Brownian motion: atoms of the fluid in which the particles were floating were randomly bumping into the particles and giving them tiny kicks. On this basis, Einstein used a mathematical model to make quantitative predictions
about the statistics of the motion, which were confirmed by Jean Baptiste Perrin in 1908–9.
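
Einstein's quantitative prediction is, in essence, that the mean squared displacement of a Brownian particle grows linearly with time. A toy one-dimensional random walk (unit-sized kicks, with parameters chosen purely for illustration) shows the same behaviour:

```python
import random

random.seed(0)

def mean_sq_displacement(n_steps, n_walkers=20000):
    """Average squared distance from the start after n_steps random kicks."""
    total = 0.0
    for _ in range(n_walkers):
        x = 0.0
        for _ in range(n_steps):
            x += random.choice((-1.0, 1.0))  # one random molecular kick
        total += x * x
    return total / n_walkers

# Doubling the number of kicks roughly doubles the mean squared displacement.
for t in (25, 50, 100):
    print(t, round(mean_sq_displacement(t)))
```

Perrin's experiments measured exactly this kind of statistic for real particles, which is why they settled the argument about atoms.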

Boltzmann committed suicide in 1906 – just when the scientific world was starting to appreciate that the basis of his theory was real.

In Boltzmann's formulation of thermodynamics, molecules in a gas are analogous to cards in a pack, and the natural dynamics of the molecules is analogous to shuffling. Suppose that at some moment all the oxygen molecules in a room are concentrated at one end, and all the nitrogen molecules are at the other. This is an ordered thermodynamic state, like two separate piles of cards. After a very short period, however, random collisions will mix all the molecules together, more or less uniformly throughout the room, like shuffling the cards. We've just seen that this process typically causes entropy to increase. This is the orthodox picture of the relentless increase of entropy, and it is the standard interpretation of the second law: ‘the amount of disorder in the universe steadily increases'. I'm pretty sure that this characterisation of the second law would have satisfied Snow if anyone had offered it. In this form, one dramatic consequence of the second law is the scenario of the ‘heat death of the universe', in which the entire universe will eventually become a lukewarm gas with no interesting structure whatsoever.
