In Feynman’s senior year, just over a decade after the three-year revolution of Heisenberg, Schrödinger, and Dirac, the applied branches of physics and chemistry had been drawn into an explosion of activity. To outsiders quantum mechanics might have seemed a nuisance, with its philosophical entanglements and computational nightmares. In the hands of those analyzing the structures of metals or chemical reactions, however, the new physics was slicing through puzzles that classical physics found impenetrable. Quantum mechanics was triumphing not because a few leading theorists found it mathematically convincing, but because hundreds of materials scientists found that it worked. It gave them insights into problems that had languished, and it gave them a renewed livelihood. One had only to understand the manipulation of a few equations and one could finally compute the size of an atom or the precise gray sheen of a pewter surface.

Chief in the new handbook was Schrödinger’s wave equation. Quantum mechanics taught that a particle was not a particle but a smudge, a traveling cloud of probabilities, like a wave in that its essence was spread out. The wave equation made it possible to compute with smudges and accommodate the probability that a feature of interest might appear anywhere within a certain range. This was essential. No classical calculation could show how electrons would arrange themselves in a particular atom: classically the negatively charged electrons should seek their state of lowest energy and spiral in toward the positively charged nuclei. Substance itself would vanish. Matter would crumple in on itself. Only in terms of quantum mechanics was that collapse impossible, because it would pin each electron to a definite, pointlike position. Quantum-mechanical uncertainty was the air that saved the bubble from collapse. Schrödinger’s equation showed where the electron clouds would find their minimum energy, and on those clouds depended all that was solid in the world.
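In modern notation (later than the notation Feynman learned from, but equivalent), the time-independent equation reads

\[ \Bigl( -\frac{\hbar^2}{2m}\nabla^2 + V(\mathbf{r}) \Bigr)\,\psi(\mathbf{r}) \;=\; E\,\psi(\mathbf{r}), \]

where the squared magnitude of the wave function, \(|\psi(\mathbf{r})|^2\), gives the probability of finding the electron near the point \(\mathbf{r}\). Solved for the hydrogen atom’s Coulomb potential, it yields a lowest allowed energy of −13.6 electron volts and a cloud whose characteristic radius, the Bohr radius of about half an angstrom, is precisely the computable “size of an atom” promised above.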

Often enough, it became possible to gain an accurate picture of where the electrons’ charge would be distributed in the three-dimensional space of a solid crystal lattice of molecules. That charge distribution in turn held the massive nuclei of the atoms in place—again, in places that kept the overall energy at a minimum. If a researcher wanted to calculate the forces working on a given nucleus, there was a way to do it—a laborious way. He had to calculate the energy, and then calculate it again, this time with the nucleus slightly shifted out of position. Eventually he could draw a curve representing the change in energy. The slope of that curve represented the sharpness of the change—the force. Each varied configuration had to be computed afresh. To Feynman this seemed wasteful and ugly.
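The waste is easy to caricature in a few lines of modern code, a toy sketch only, with a one-line stand-in where the expensive quantum-mechanical energy calculation would go:

    def total_energy(position):
        # Stand-in for a full quantum-mechanical calculation; in reality
        # the electron cloud must be re-solved at every nuclear position.
        return (position - 1.0) ** 2  # toy energy curve with its minimum at 1.0

    def force_by_shifting(position, h=1e-4):
        # The laborious recipe: compute the energy twice, with the nucleus
        # nudged to either side, and take minus the slope of the curve.
        return -(total_energy(position + h) - total_energy(position - h)) / (2 * h)

    print(force_by_shifting(0.8))  # about 0.4, pushing the nucleus toward 1.0

Every new configuration means running total_energy afresh, twice per coordinate, and a real molecule has many coordinates.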

It took him a few pages to demonstrate a better method. He showed that one could calculate the force directly for a given configuration, without having to look at nearby configurations at all. His computational technique led directly to the slope of the energy curve—the force—instead of producing the full curve and deriving the slope secondarily. The result caused a small sensation among MIT’s physics faculty, many of whom had spent enough time working on applied molecular problems to appreciate Feynman’s remark, “It is to be emphasized that this permits a considerable saving of labor of calculations.”
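In modern notation the whole trick is a single identity. For a Hamiltonian \(\hat H(R)\) that depends on a nuclear coordinate \(R\), with a normalized eigenstate \(\psi\) of energy \(E\),

\[ \frac{\partial E}{\partial R} \;=\; \Bigl\langle \psi \Bigm| \frac{\partial \hat H}{\partial R} \Bigm| \psi \Bigr\rangle . \]

The terms involving the shift of the wave function itself, \(\partial\psi/\partial R\), cancel, because the energy is stationary under small variations of a normalized \(\psi\); the force \(-\partial E/\partial R\) therefore falls out of one calculation at one configuration.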

Slater made him rewrite the first version. He complained that Feynman wrote the way he talked, hardly an acceptable style for a scientific paper. Then he advised him to submit a shortened version for publication. The Physical Review accepted it, with the title shortened as well, to “Forces in Molecules.”

Not all computational devices have analogues in the word pictures that scientists use to describe reality, but Feynman’s discovery did. It corresponded to a theorem that was easy to state and almost as easy to visualize: The force on an atom’s nucleus is no more or less than the electrical force from the surrounding field of charged electrons—the electrostatic force. Once the distribution of charge has been calculated quantum mechanically, then from that point forward quantum mechanics disappears from the picture. The problem becomes classical; the nuclei can be treated as static points of mass and charge. Feynman’s approach applies to all chemical bonds. If two nuclei act as though strongly attracted to each other, as the hydrogen nuclei do when they bond to form a hydrogen molecule, it is because the nuclei are each drawn toward the electrical charge concentrated quantum mechanically between them.
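In symbols (Gaussian units, with \(\rho(\mathbf{r})\) the quantum-mechanically computed electron density), the force on a nucleus \(a\) of charge \(Z_a e\) at position \(\mathbf{R}_a\) is nothing but Coulomb’s law:

\[ \mathbf{F}_a \;=\; Z_a e^2 \left( \sum_{b \neq a} Z_b\, \frac{\mathbf{R}_a - \mathbf{R}_b}{|\mathbf{R}_a - \mathbf{R}_b|^3} \;-\; \int \rho(\mathbf{r})\, \frac{\mathbf{R}_a - \mathbf{r}}{|\mathbf{R}_a - \mathbf{r}|^3}\, d^3r \right). \]

The first term is the repulsion of the other nuclei; the second is the pull of the electron cloud.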

That was all. His thesis had strayed from the main line of his thinking about quantum mechanics, and he rarely thought about it again. When he did, he felt embarrassed to have spent so much time on a calculation that now seemed trivial and self-evident. As far as he knew, it was useless. He had never seen a reference to it by another scientist. So he was surprised to hear in 1948 that a controversy had erupted among physical chemists about the discovery, now known as Feynman’s theorem or the Hellmann-Feynman theorem. Some chemists felt it was too simple to be true.

Is He Good Enough?

A few months before graduation, most of the thirty-two brothers of Phi Beta Delta posed for their portrait photograph. Feynman, seated at the left end of the front row, still looked smaller and younger than his classmates. He clenched his jaw, obeyed the photographer’s instruction to rest his hands on his knees, and leaned gravely in toward the center. He went home at the end of the term and returned for the ceremony in June 1939. He had just learned to drive an automobile, and he drove his parents and Arline to Cambridge. On the way he became sick to his stomach—from the tension of driving, he thought. He was hospitalized for a few days, but he recovered in time to graduate. Decades later he remembered the drive. He remembered his friends teasing him when he donned his academic robe—Princeton did not know what a rough guy it was getting. He remembered Arline.

“That’s all I remember of it,” he told a historian. “I remember my sweet girl.”

Slater left MIT not many years after Feynman. By then the urgency of war research had brought I. I. Rabi from Columbia to become the vigorous scientific personality driving a new laboratory, the Radiation Laboratory, set up to develop the use of shorter and shorter radio wavelengths for the detection of aircraft and ships through night and clouds: radar. It seemed to some that Slater, unaccustomed to the shadow of a greater colleague, found Rabi’s presence unbearable. Morse, too, left MIT to take a role in the growing administrative structure of physics. Like so many scientists of the middle rank, both men saw their reputations fade in their lifetimes. Both published small autobiographies. Morse, in his, wrote about the challenges in guiding students toward a career as esoteric as physics. He recalled a visit from the father of a graduating senior named Richard. The father struck Morse as uneducated, nervous merely to be visiting a university. He did not speak well. Morse recalled his having said (“omitting his hesitations and apologies”):

My son Richard is finishing his schooling here next spring. Now he tells me he wants to go on to do more studying, to get still another degree. I guess I can afford to pay his way for another three or four years. But what I want to know is, is it worth it for him? He tells me you’ve been working with him. Is he good enough to deserve the extra schooling?

Morse tried not to laugh. Jobs in physics were hard to get in 1939, but he told the father that Richard would surely do all right.

PRINCETON

The apostle of Niels Bohr at Princeton was a compact, gray-eyed, twenty-eight-year-old assistant professor named John Archibald Wheeler who had arrived the year before Feynman, in 1938. Wheeler had Bohr’s rounded brow and soft features, as well as his way of speaking about physics in oracular undertones. In the years that followed, no physicist surpassed Wheeler in his appreciation for the mysterious or in his command of the Delphic catchphrase:

A black hole has no hair was his. In fact he coined the term “black hole.”

There is no law except the law that there is no law.

I always keep two legs going, with one trying to reach ahead.

In any field find the strangest thing and then explore it.

Individual events. Events beyond law. Events so numerous and so uncoordinated that, flaunting their freedom from formula, they yet fabricate firm form.

He dressed like a businessman, his tie tightly knotted and his white cuffs starched, and he fastidiously pulled out a pocket watch when he began a session with a student (conveying a message: the professor will spare just so much time …). It seemed to one of his Princeton colleagues, Robert R. Wilson, that behind the gentlemanly façade lay a perfect gentleman—and behind that façade another perfect gentleman, and on and on. “However,” Wilson said, “somewhere among those polite façades there was a tiger loose; a reckless buccaneer … who had the courage to look at any crazy problem.” As a lecturer he performed with a magnificent self-assurance, impressing his audience with elegant prose and provocative diagrams. When he was a boy, he spent many hours poring over the drawings in a book called Ingenious Mechanisms and Mechanical Devices. He made adding machines and automatic pistols with gears and levers whittled from wood, and his blackboard illustrations of the most foggy quantum paradoxes retained that ingenious flavor, as though the world were a wonderful silvery machine. Wheeler grew up in Ohio, the son of librarians and the nephew of three mining engineers. He went to college and got his graduate degree at Johns Hopkins University in Baltimore, and then won a National Research Council Fellowship that brought him to Copenhagen in 1934 via freighter (fifty-five dollars one way) to study with Bohr.

He and Bohr worked together again, as colleagues this time, in the first months of 1939. Princeton had hired Wheeler and promoted the distinguished Hungarian physicist Eugene Wigner in a deliberate effort to turn toward nuclear physics. MIT had remained conservative about rushing to board the wagon train; Slater and Compton preferred to emphasize well-roundedness and links to more applied fields. Not so Princeton. Wheeler still remembered the magic of his first vision of radioactivity: how he had sat in a lightless room, staring toward the black of a zinc sulfide screen, counting the intermittent flashes of individual alpha particles sent forth by a radon source. Bohr, meanwhile, had left the growing tumult of Europe to visit Einstein’s institute in Princeton. When Wheeler met his ship at the pier in New York, Bohr was carrying news about what would now rapidly become the most portentous object in physics: the uranium atom.

Compared to the hydrogen atom, stark kernel with which Bohr had begun his quantum revolution, the uranium atom was a monster, the heaviest atom in nature, bulked out with 92 protons and 140-odd neutrons, so scarce in the cosmos that hydrogen atoms outnumber it by seventeen trillion to one, and unstable, given to decaying at quantum mechanically unpredictable moments down a chain of lighter elements or—this was the extraordinary news that kept Bohr at his portable blackboard all through the North Atlantic voyage—splitting, when slugged by a neutron, into odd pairs of smaller atoms, barium and krypton or tellurium and zirconium, plus a bonus of new neutrons and free energy. How was anyone to visualize this bloated nucleus? As a collection of marbles sliding greasily against one another? As a bunch of grapes squeezed together by nuclear rubber bands? Or as a “liquid drop”—the phrase that spread like a virus through the world of physics in 1939—a shimmering, jostling, oscillating globule that pinches into an hourglass and then fissures at its new waist? It was this last image, the liquid drop, that enabled Wheeler and Bohr to produce one of those unreasonably powerful oversimplifications of science, an effective theory of the phenomenon that had been named, only in the past year, fission. (The word was not theirs, and they spent a late night trying to find a better one. They thought about splitting or mitosis and then gave up.)

By any reasonable guess, a liquid drop should have served as a poor approximation for the lumpy, raisin-studded complex at the heart of a heavy atom, with each of two hundred–odd particles bound to each of the others by a strong close-range nuclear force, a force quite different from the electrical forces Feynman had analyzed on the scale of whole molecules. For smaller atoms the liquid-drop metaphor failed, but for large agglomerations like uranium it worked. The shape of the nucleus, like the shape of a liquid drop, depends on a delicate balance between the two opposing forces. Just as surface tension encourages a compact geometry in a drop, so do the forces of nuclear attraction in an atom. The electrical repulsion of the positively charged protons counters the attraction. Bohr and Wheeler recognized the unexpected importance of the slow neutrons that Fermi had found so useful at his laboratory in Rome. They made two remarkable predictions: that only the rarer uranium isotope, uranium 235, would fission explosively; and that neutron bombardment would also spark fission in a new substance, with atomic number 94 and mass 239, not found in nature and not yet created in the laboratory. To this pair of theoretical assertions would shortly be devoted the greatest technological enterprise the world had ever seen.
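The balance can be put in numbers with the liquid-drop binding-energy formula that the Bohr-Wheeler analysis drew on (the coefficients quoted here are typical modern fits, not their 1939 values):

\[ B(A,Z) \;\approx\; a_V A \;-\; a_S A^{2/3} \;-\; a_C \frac{Z^2}{A^{1/3}} \;-\; \cdots, \qquad a_S \approx 18\ \text{MeV}, \quad a_C \approx 0.7\ \text{MeV}. \]

Stretching the drop into an hourglass raises the surface term and lowers the Coulomb term, and the arithmetic says a drop flies apart of its own accord roughly when \(Z^2/A\) exceeds about 48. For uranium, \(Z^2/A = 92^2/235 \approx 36\): stable at rest, but near enough to the brink that the few million electron volts brought in by a captured neutron can push it over.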

The laboratories of nuclear physics were spreading rapidly. Considerable American inventive spirit had gone into the development of an arsenal of machinery designed to accelerate beams of particles, smash them into metal foils or gaseous atoms, and track the collision products through chambers of ionizing gas. Princeton had one of the nation’s first large “cyclotrons”—the name rang proudly of the future—completed in 1936 for the cost of a few automobiles. The university also kept smaller accelerators working daily, manufacturing rare elements and new isotopes and generating volumes of data. Almost any experimental result seemed worthwhile when hardly anything was known. With all the newly cobbled-together equipment came difficulties of measurement and interpretation, often messy and ad hoc. A student of Wheeler’s, Heinz Barschall, came to him in the early fall of 1939 with a typical problem. Like so many new experimenters Barschall was using an accelerator beam to scatter particles through an ionizing chamber, where their energies could be measured. He needed to gauge the different energies that would appear at different angles of recoil. Barschall had realized that his results were distorted by the circumstances of the chamber itself. Some particles would start outside the chamber; others would start inside and run into the chamber’s cylindrical wall, and in neither case would the chamber register the particle’s full energy. The problem was to compensate, to find a way to translate the measured energies into the true energies. It was a problem of awkward probabilities in a complicated geometry. Barschall had no idea where to start. Wheeler said that he was too busy to think about it himself but that he had a very bright new graduate student …
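The flavor of the geometry survives in a brute-force modern sketch (chamber dimensions and track length invented for illustration, and Monte Carlo sampling standing in for the analytical attack the problem actually demanded): start straight tracks at random points inside a cylinder, aim them isotropically, and count how many end outside, the tracks whose measured energies would fall short of the truth.

    import numpy as np

    rng = np.random.default_rng(0)
    RADIUS, HEIGHT, TRACK = 2.0, 10.0, 1.5  # hypothetical dimensions, arbitrary units

    def truncated_fraction(n=100_000):
        # Start points uniform over the cylinder's volume.
        r = RADIUS * np.sqrt(rng.random(n))   # sqrt makes the disk uniform
        phi = 2 * np.pi * rng.random(n)
        x, y, z = r * np.cos(phi), r * np.sin(phi), HEIGHT * rng.random(n)
        # Isotropic directions: uniform cosine of the polar angle.
        u = 2 * rng.random(n) - 1
        s, psi = np.sqrt(1 - u * u), 2 * np.pi * rng.random(n)
        # Endpoints of straight tracks of fixed length TRACK.
        xe = x + TRACK * s * np.cos(psi)
        ye = y + TRACK * s * np.sin(psi)
        ze = z + TRACK * u
        # A track registers its full energy only if it ends inside the chamber.
        ends_inside = (xe**2 + ye**2 < RADIUS**2) & (0 < ze) & (ze < HEIGHT)
        return 1.0 - ends_inside.mean()

    print(f"fraction of wall-truncated tracks: {truncated_fraction():.3f}")

Translating such counts back into a corrected energy spectrum, angle by angle and in closed form, was the problem Wheeler passed along.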
