Austrian physicist Ludwig Boltzmann established that entropy and the second law are essentially statistical in nature. Molecules of air might be injected into one corner of the container, but they soon discover that this system has many more microscopic physical states available than the small number accessible to them huddled in the corner.*

* Statistically speaking, there are many more states in which the air molecules move through all the space available in the container than there are states in which the molecules group together.

Another way of putting this is that the macroscopic state with a uniform average distribution of molecular positions and speeds is the most probable, simply because there are so many more microscopic states that contribute to the average. The air molecules expand in the container from a less probable to a more probable macroscopic state, and the entropy increases. Boltzmann discovered that the entropy is proportional to the logarithm of the number of possible microscopic states that the system can have. The higher the number of these states, the higher the probability of the macroscopic state that results.
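
In symbols, this is the celebrated relation engraved on Boltzmann's tombstone in Vienna:

S = k log W

where S is the entropy, W is the number of microscopic states available to the system, and k is the constant of proportionality now known as Boltzmann's constant (the logarithm here is the natural logarithm).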

Note that this is all about statistical probabilities. There is in principle nothing preventing all the air molecules in my study from suddenly rushing into one corner of the room, causing me to die from asphyxiation. It's just that this macroscopic state of the air molecules is very, very improbable.

There's yet another way of thinking about all this. Suppose we wanted to keep track of the positions and velocities of molecules in a sample of water. This is obviously a lot easier to do if the water is in the form of ice, as the molecules form a reasonably regular and predictable array with fixed positions. But as we heat the ice and convert it eventually to steam, we lose the ability to keep track of these molecules. The molecules are all still present, and we can use statistics to give us some notion of their average speeds, but we can no longer tell where every molecule is, where it's going or how fast.

Now imagine we could apply a similar logic to one of the great soliloquies from Shakespeare's Macbeth. From Act V, scene v, we have:

She should have died hereafter;
There would have been a time for such a word.
Tomorrow, and tomorrow, and tomorrow,
Creeps in this petty pace from day to day,
To the last syllable of recorded time;
And all our yesterdays have lighted fools
The way to dusty death. Out, out, brief candle!
Life's but a walking shadow, a poor player
That struts and frets his hour upon the stage
And then is heard no more. It is a tale
Told by an idiot, full of sound and fury,
Signifying nothing.
7

Let's suppose we can ‘heat' this soliloquy. At first, the passage melts and the words lose their places in the structure — ‘And syllable but shadow a frets sound all …' Eventually, the words come apart and transform into a soup of individual letters — ‘s', ‘t', ‘A', ‘n', ‘e' … But the letters of the English alphabet can be coded as a series of bit strings (in the ASCII scheme, for example, the letter ‘A' becomes 01000001). With further heating, the bit strings come apart to produce a random ‘steam' of bits: ‘0', ‘0', ‘1', ‘0', ‘1' …

All the resonance and meaning in the soliloquy — all the information it contained — has not exactly been lost in this process. After all, we still have all the bits. But recovering the information has become extremely difficult. Our ignorance has increased. It would take an enormous amount of effort to reconstruct the soliloquy from the now scrambled bits, just as it would take an awful lot of work to reconstruct the cocktail glass from all the shards picked up from the floor. If the information isn't lost, then it has certainly become almost irretrievably ‘hidden' (or, alternatively, our ignorance of the soliloquy has become very stubborn).

What this suggests is that there is a deep relationship between information and entropy.

In 1948, American mathematician and engineer Claude Shannon developed an early but very powerful form of information theory. Shannon worked at Bell Laboratories in New Jersey, the prestigious research establishment of American Telephone and Telegraph (AT&T) and Western Electric (it is now the research and development subsidiary of Alcatel-Lucent). He was interested in the efficiency of information transfer via communications channels such as telegraphy, and he found that ‘information' as a concept could be expressed as the logarithm of the inverse of the probability of the value of a random variable used to communicate the information.

What does this mean? Suppose we agree a simple code to communicate a particular state of affairs, such as ‘my plane took off on time'. We code this event as ‘1'. The alternative, ‘my plane did not take off on time', is coded as ‘0'. Knowing airline punctuality as you do, you expect to receive the message ‘0' with 85 per cent probability. If you do indeed receive the message ‘0', then Shannon's formula implies that its information content is very low. But if you receive the message ‘1', then the information content is high. In this context, information is a measure of the ‘unexpectedness' of the message, or the extent to which you're surprised by it.
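
To put rough numbers on this: Shannon's formula assigns a message received with probability p an information content of log₂(1/p) bits. The expected message ‘0' (p = 0.85) then carries log₂(1/0.85) ≈ 0.23 bits, while the surprising message ‘1' (p = 0.15) carries log₂(1/0.15) ≈ 2.74 bits, more than ten times as much.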

So, the entropy of Macbeth's soliloquy is low, and its information content is high (the particular sequence of words is of low probability and therefore unexpected or ‘surprising', as well as very moving). Heat the soliloquy into a steam of bits. The entropy increases and the information content is, if not lost, then hidden very, very deeply.

In 1961, this kind of logic led IBM physicist Rolf Landauer to declare that ‘information is physical'. He was particularly interested in the processing of information in a computer. He concluded that when information is erased during a computation, it is actually dumped into the environment surrounding the processor, adding to the entropy. This increase in entropy results in an increase in temperature: the environment surrounding the processor heats up. Anyone who has ever run a complex computation on their laptop computer will have noticed how, after a short while, the computer starts to get uncomfortably hot.

Landauer's famous statement requires some careful interpretation, but it's enough for now to note the direct connection between the processing of information and physical quantities such as entropy and temperature. It seems that ‘information' is not an abstract concept invented by the human mind. It is a real, physical thing with real, physical consequences.
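
Landauer's conclusion can be made quantitative. Erasing one bit of information must dissipate at least

kT log 2

of energy as heat into the environment, where T is the temperature of the environment and k is again Boltzmann's constant. At room temperature this amounts to about 3 × 10⁻²¹ joules per bit: a tiny quantity, but a lower bound that no computing technology, however ingenious, can evade.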

Now we're coming to it. The second law of thermodynamics insists that in a spontaneous change, entropy will always increase or information will always be diluted or degraded and so ‘hidden'. So what then happens when we throw stuff — a volume of steam, for example — into a black hole? By definition, when the material crosses the black hole's event horizon, there's no coming back. If we dispose of a lot of high-entropy material in a black hole, this seems to imply that the net entropy of the universe has somehow reduced. Material with high entropy has disappeared from the universe. The change is spontaneous, so this appears to contradict the second law, which says that entropy can never decrease.

And if the entropy of the universe has reduced, this implies that its information content has somehow increased.

It is estimated that the total information in the universe is of the order of 10¹²⁰ bits.8 If we regard information as an elementary, physical constituent of the universe, then this implies that, like energy, it can be neither created nor destroyed. How, then, can throwing high-entropy material into a black hole increase the information content of the universe?

Black holes and the second law

Gravity might be the weakest of nature's forces, but it is ultimately irresistible. Gravity binds together and compresses clouds of gas drifting in space. Compressing clouds of gas with sufficient mass sparks fusion reactions at their cores, and stars are born. The pressure of radiation released by the fusion reactions holds back further compression, and the star enters a period of relative stability.

However, as the fuel is expended, the force of gravity grips tighter. For any mass greater than about 1.4 M⊙ (1.4 solar masses), the force of gravity is ultimately irresistible. It crushes the body of matter into the obscurity of a black hole, a name coined by John Wheeler.

For a time it was thought that black holes would indeed invalidate the second law. The only way to preserve the law would be to ensure that the entropy of the material that was consumed by a black hole was somehow transferred to the black hole itself.

But does it make any sense to think of a black hole as something that has entropy?

In the late 1960s, young Cambridge University physicist Stephen Hawking produced a series of papers on black hole physics in collaboration with mathematician Roger Penrose, then at Birkbeck College in London. General relativity, they claimed, predicted that at the heart of a black hole there beats a singularity, a region of infinite density and spacetime curvature where the laws of physics break down. Of course, what goes on in the region of a singularity is completely hidden from observation by the black hole's event horizon, a fact that Penrose elevated to the status of a principle, which he called the cosmic censorship hypothesis.

Working with Canadian Werner Israel, Australian Brandon Carter and British physicist David Robinson, Hawking demonstrated that, in terms of the mathematics needed to describe them, black holes are surprisingly simple. Their properties and behaviour depend only on their mass, angular momentum and electric charge, a conjecture called the ‘no hair' theorem. In this context, ‘hair' means all other kinds of information apart from mass, angular momentum and electric charge. Beyond these basic properties, a black hole is featureless — it has no hair. All other kinds of information are judged to be lost behind the black hole's event horizon.

In a moment of inspiration one night in November 1970, Hawking realized that the properties of the event horizon meant that it could never shrink — the surface area of a black hole (meaning the area bounded by the event horizon) could in principle never decrease. If a black hole consumes an amount of material, then its surface area increases by a proportional amount.
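
For the simplest case, a non-rotating, uncharged black hole of mass M, it is easy to see why. The event horizon is a sphere whose radius is the Schwarzschild radius, r = 2GM/c², so the surface area is

A = 4πr² = 16πG²M²/c⁴

Anything the black hole consumes adds to M, and so the area can only ever grow.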

We do a mental double-take. Isn't there one other well-known physical property that in a spontaneous change can never decrease? So, could there be a connection between the surface area of a black hole and its entropy?

In his 1972 Princeton PhD thesis, Israeli theorist Jacob Bekenstein (another of Wheeler's students) claimed precisely this. He identified the surface area of a black hole with its entropy. So, when a black hole consumes some high-entropy material, its surface area increases (as Hawking had observed), and this indicates that its entropy increases too, in accordance with the second law.
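
The proportionality Bekenstein proposed, with the numerical factor fixed by Hawking's later calculation, is now known as the Bekenstein-Hawking entropy:

S = kc³A/4Għ

where A is the area of the event horizon, G is Newton's gravitational constant, c is the speed of light, ħ is Planck's constant divided by 2π and k is Boltzmann's constant. Put another way, a black hole carries a quarter of a unit of entropy for every Planck area, Għ/c³, of its horizon.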

He was shouted down from all sides.

Hawking was irritated. While this appeared to offer a neat solution, it dragged along with it a number of implications which he felt Bekenstein hadn't properly addressed. For one thing, a body with entropy also has to have a temperature. And a body with a temperature has to emit radiation. This made no sense at all. How could a black hole, with properties and behaviour determined only by its mass, angular momentum and electric charge, possess a temperature and emit radiation?

A black hole is supposed to be ‘black'. Right?

Hawking radiation

A few years later, Hawking set out to refute Bekenstein's hypothesis. Lacking a fully fledged quantum theory of gravity, he chose to approach the problem using an essentially classical general relativistic description of the black hole itself, and applied quantum field theory to the curved spacetime around the event horizon. What he found was quite shocking.

As he later explained:

However, when I did the calculation, I found, to my surprise and annoyance, that even nonrotating black holes should apparently create and emit particles at a steady rate. At first I thought that this emission indicated that one of the approximations I had used was not valid. I was afraid that if Bekenstein found out about it, he would use it as a further argument to support his ideas about the entropy of black holes, which I still did not like.
9

Hawking found that within the constraints imposed by Heisenberg's uncertainty relation, virtual particle-antiparticle pairs are produced in the curved spacetime near the black hole's event horizon. The particles are produced with zero net energy. This means that one particle in the pair may possess positive energy and the other negative energy.

Under normal circumstances, the pair would quickly annihilate. But if the negative energy particle were to be drawn into the black hole before it can be annihilated, it can acquire positive energy and become a ‘real' particle. Another way of looking at this is to think of the negative energy particle as having negative mass (from E = mc²). As the negative mass particle falls into the black hole, it gains mass and becomes a real particle.
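
The steady emission Hawking found corresponds to a definite temperature. For a non-rotating black hole of mass M it is

T = ħc³/8πGMk

with the constants as before. For a black hole of one solar mass this works out to around 6 × 10⁻⁸ kelvin, and the temperature falls as the mass grows: the bigger the black hole, the colder it is.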
