BOOK: Computing with Quantum Cats

JOHNNY AND THE BOMB

In July 1946, von Neumann received both the US Navy's Distinguished Civilian Service Award and President Harry S. Truman's Medal for Merit. The citation for the latter said that he was “primarily responsible for fundamental research by the United States Navy on the effective use of high explosives, which has resulted in the discovery of a new ordnance principle for offensive action.” The reference was, of course, to the nuclear bomb.

Von Neumann had been called back from England to join the Manhattan Project in 1943, and by the autumn he was in Los Alamos, where he made two key contributions to the project.[5] The first was to point out (and prove mathematically) that such a bomb would be more effective if exploded at altitude rather than at ground level, because both the heat and the blast would affect a wider area. The second contribution was much more profound.

A fission bomb works by bringing together forcefully a sufficient amount (“critical mass”) of a radioactive material such as uranium or plutonium. Under such conditions, particles (neutrons) released by the fission (“splitting”) of one atomic nucleus trigger the fission of more nuclei, in a runaway chain reaction. Each “split” converts a little mass into energy, in line with Einstein's famous equation, and the overall result is the explosive release of a lot of energy as heat, light, and other electromagnetic radiation. But if the critical mass is not tightly confined, most of the neutrons escape and the material simply gets hot, rather than exploding. The first method the Los Alamos team considered for achieving the required result was to prepare a critical mass of uranium in two halves at opposite ends of a tube, and fire conventional explosives to smash one of them (the “bullet”) into the other. For obvious reasons, it was called the gun method, and was used in the Little Boy weapon dropped on Hiroshima. But this method could not be used with plutonium for technical reasons, not least the possibility that the plutonium, being more active than uranium, might “pre-ignite” just before the bullet hit the target, releasing its neutrons too gradually and causing the bomb to fizzle rather than explode. This was where von Neumann came in.
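The arithmetic behind this can be sketched with a toy model: each fission releases a few neutrons, and the fraction that escape without causing further fissions decides whether the population grows explosively or dies away. The numbers below are purely illustrative, not real weapon physics.

```python
# Toy chain-reaction model: each fission releases ~2.5 neutrons, but a
# fraction of them escape an unconfined mass without triggering further
# fissions. The constants are illustrative only.

def neutron_population(generations, neutrons_per_fission=2.5, escape_fraction=0.7):
    """Track the neutron count generation by generation."""
    # Effective multiplication factor: neutrons per fission that go on
    # to cause another fission.
    k = neutrons_per_fission * (1.0 - escape_fraction)
    n = 1.0  # start with a single neutron
    for _ in range(generations):
        n *= k
    return n

# Tightly confined (few escapes, k > 1): runaway growth.
print(neutron_population(20, escape_fraction=0.1))
# Poorly confined (most neutrons escape, k < 1): the reaction fizzles.
print(neutron_population(20, escape_fraction=0.7))
```

The whole difference between an explosion and a fizzle is whether the effective multiplication factor k sits above or below 1, which is why confinement (and, as Teller pointed out, compression) matters so much.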

Before von Neumann arrived in Los Alamos, another member of the team, Seth Neddermeyer, had suggested setting off explosives to produce shock waves which would squeeze a “subcritical” mass of plutonium to the point where it reached critical mass and exploded. But the idea had not been followed up, and nobody had been able to work out exactly how to achieve the desired result. Edward Teller, a member of the Los Alamos team who is remembered as the “father of the hydrogen bomb,” later recalled how the problem was solved.[6]
Von Neumann had calculated the kind of pressure that would be produced in a lump of plutonium if it was squeezed in the grip of a large number of explosive charges surrounding it and going off simultaneously. He discussed his results, which still seemed to fall short of what was necessary to produce a practical bomb, with Teller, who had worked in geophysics and knew that under very high pressures, such as those at the center of the Earth, even a substance like iron is compressed to a higher density than at the planet's surface. He pointed out that this compressibility would make the process von Neumann was describing even more effective, because the more the plutonium atoms were squeezed together, the easier it would be for a chain reaction to take place. Von Neumann redid the calculation, taking account of compressibility, and found that the trick would work. After a great deal more work by many people, including von Neumann, the result was the Fat Man bomb dropped on Nagasaki, in which a hollow shell of plutonium was triggered into explosive fission by the firing of thirty-two opposing pairs of detonators to produce an inward compression. This process owed a great deal, in the days before electronic calculators, to von Neumann's ability to carry out mathematical calculations; but it also highlighted the need for faster methods of carrying out such calculations, which could be used when there wasn't a von Neumann around to do the work. This became of crucial importance when von Neumann, who continued to spend two months each year visiting Los Alamos after the war, became involved in the development of the hydrogen bomb, based on nuclear fusion, not fission: because even von Neumann couldn't do all the calculations on his own.

The calculations for the Manhattan Project had been aided by the use of machines, supplied by the company International Business Machines (IBM), which correlated data using sets of punched cards. These were in no sense computers in the modern meaning of the word, but moronic machines, using mechanical switches, that could be set up to perform basic arithmetical operations. For example, a machine could take two cards punched with holes corresponding to two numbers (say, 7 and 8) and add them up, spitting out a card punched with holes corresponding, in this case, to the number 15. They could alternatively be set up to carry out subtraction or multiplication, and they could handle large numbers of cards. Several machines could be set up so that the output from one machine became the input of the next, and so on. In this way they could carry out tasks along the lines of “take a number, double it, square the result, take away the number you first thought of and punch a card with the answer on.”
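The chaining of single-operation machines described above can be sketched in modern terms as a pipeline of one-operation stages, each stage's output card becoming the next stage's input; the stage functions here are illustrative stand-ins for the card machines.

```python
# A sketch of chained card machines: each "machine" performs one
# arithmetical operation, and the output of one feeds the next.

def chain(*stages):
    """Compose stages so each one's output card feeds the next stage."""
    def run(card):
        for stage in stages:
            card = stage(card)
        return card
    return run

def double(x):
    return 2 * x

def square(x):
    return x * x

def subtract(k):
    """A machine set up to subtract a fixed number k."""
    def stage(x):
        return x - k
    return stage

# "Take a number, double it, square the result, take away the number
# you first thought of."
n = 7
pipeline = chain(double, square, subtract(n))
print(pipeline(n))  # (2*7)**2 - 7 = 189
```

Each function plays the role of one machine on the line; the composition in chain is exactly the arrangement in which one machine's output deck becomes the next machine's input deck.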

In Surely You're Joking, Mr. Feynman?, Richard Feynman describes how machines like this were used to carry out the donkey work of computations for the Manhattan Project. It was a colleague, Stanley Frankel, who realized the potential of the IBM machines, but Feynman ended up being in charge of their operation. The first step was to break down the calculations involved in working out things like the compressibility of plutonium into their individual components—something now familiar to anyone who does computer programming. The instructions for the complex calculations were then passed to a team of “girls” (as Feynman describes them), each armed with a mechanical calculator, like a glorified adding machine, operated by hand. One girl did nothing but multiply the numbers she was given and pass the result on to the next girl; one did nothing but calculate cubes; and so on. The system was tested in this way and “debugged,” in computer jargon, until Frankel and Feynman knew it worked, then put onto a production-line basis using the IBM machines. In fact, after a bit of practice the team of girls was just as fast as the room full of IBM machines. “The only difference,” says Feynman, “is that the IBM machines didn't get tired and could work three shifts. But the girls got tired after a while.” Von Neumann was closely involved with this project, as one of Feynman's “customers,” and learned all about the operation of the system in the spring of 1944. To all intents and purposes, this was computing without a computer, with Feynman as the programmer, and it highlights the point that for all Turing's hopes we do not yet have anything like a mechanical intelligence; we only have machines that can do the same thing as a team of humans, but faster and without tiring. In either case, the team, or the machine, needs a human programmer.

THE AMERICAN HERITAGE

Computers of the kind we use today owe as much to von Neumann as to Turing, and in his post-war work von Neumann built on a heritage of American developments. The prehistory of electronic computing in the United States had two strands, one involving computing and the other involving electronics. The direct line to the IBM card-sorting machines which Feynman used to carry out calculations for von Neumann goes back to the American census of 1890. The tenth US census, of 1880 (and all previous ones), had been tabulated by hand, with hundreds of clerks copying stacks of information from the record sheets into various categories. But with the US population growing so rapidly through immigration, the point was being reached where it would be impossible to tabulate the results of one census fully before the next census was due. John Billings, who was in charge of statistical analysis for both the 1880 and the 1890 censuses, was well aware of the problem, and early in the 1880s this led him into a conversation recalled by his colleague Herman Hollerith in 1919:

One Sunday evening at Dr. Billings' tea table, he said to me there ought to be a machine for doing the purely mechanical work of tabulating population and similar statistics. We talked the matter over and I remember [that] he thought of using cards with the descriptions of the individual shown by notches punched in the edge of the card. [I] said that I thought I could work out a solution for the problem and asked him if he would go in with me. The Doctor said he was not interested any further than to see some solution of the problem worked out.[7]

So it was Hollerith who put flesh on the bones of Billings' idea, and who by the time of the 1890 census had developed a system based on punched cards (which he chose to be the size of a dollar bill) where the pattern of holes punched in the cards indicated characteristics of the individual, such as whether they were born in the United States or not, their sex, whether they were married or not, how many children they had and so on. The cards were read by an electromechanical sorter which could, for example, take a batch of cards from a particular city and select from them all the ones corresponding to, say, married white men born in the United States with at least two children.
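In modern terms, Hollerith's sorter was performing record selection. A minimal sketch of the idea, with field names and sample records invented purely for illustration, might look like this:

```python
# A sketch of Hollerith-style selection: each "card" records one
# person's characteristics, and the "sorter" pulls out every card
# matching a set of criteria. All data here is made up.

cards = [
    {"city": "Chicago", "sex": "M", "married": True,  "us_born": True,  "children": 3},
    {"city": "Chicago", "sex": "F", "married": True,  "us_born": False, "children": 2},
    {"city": "Chicago", "sex": "M", "married": False, "us_born": True,  "children": 0},
    {"city": "Boston",  "sex": "M", "married": True,  "us_born": True,  "children": 2},
]

def sort_cards(batch, **criteria):
    """Select the cards whose fields satisfy every criterion."""
    def matches(card):
        for field, wanted in criteria.items():
            value = card[field]
            # A criterion may be a predicate (e.g. a minimum) or an exact value.
            if callable(wanted):
                if not wanted(value):
                    return False
            elif value != wanted:
                return False
        return True
    return [card for card in batch if matches(card)]

# Married men born in the United States with at least two children, from Chicago:
selected = sort_cards(cards, city="Chicago", sex="M", married=True,
                      us_born=True, children=lambda n: n >= 2)
print(len(selected))  # 1
```

The electromechanical sorter did this with brushes sensing holes in the card rather than with predicates in software, but the logical operation is the same.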

The success of the equipment, used in the 1890 census to process the records of some 63 million people, using 56 million cards, enabled Hollerith to set up the Tabulating Machine Company in 1896. This evolved into the Computer-Tabulating-Recording Company in 1911, and then into the International Business Machines Corporation (IBM) in 1924. By 1936, when Turing published “On Computable Numbers,” the world was using 4 billion cards each year; with hindsight, each card could be regarded as a “cell” in the endless tape of a Turing machine. The reason for this growth in the use of punched cards was that it had been realized that they could be used not just to tabulate statistics, but to carry out arithmetical operations. And the need to mechanize arithmetic had been made clear by the requirements of the military in the First World War, just as the need for high-speed electronic computers would be made clear by the requirements of the military in the Second World War.

The specific military requirement that encouraged the development of punched-card computing was the need to calculate the flight of shells fired from guns, and later the fall of bombs dropped from aircraft. Ballistics would be a simple science on an airless planet, where projectiles would follow parabolic trajectories described beautifully by Newton's laws, under the influence of gravity. Unfortunately, in the real world shells in flight are affected by the density of the air, which changes with altitude, temperature, humidity and other factors, as well as by the initial velocity of the projectile. In order to hit a given target, even assuming there is no wind to deflect the shell in its flight and ignoring the subtleties caused by the rotation of the Earth, the gun has to be elevated at an angle which takes all of these factors into account. As if this were not tricky enough, each gun has its own firing characteristics; before guns could be supplied to the army they had to be tested, with a so-called “firing table” being worked out for each individual weapon. In the field, the gunners would have to refer to these tables in order to determine exactly how to elevate their guns under different conditions. A typical firing table had several thousand entries, corresponding to different trajectories, and it would take several hours for a human armed with a desk calculator to work out a single trajectory. The result was a major bottleneck between the manufacture of guns in factories and their delivery to armies in the field.
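The kind of arithmetic one firing-table entry required can be sketched by numerically integrating the shell's equations of motion, with a simple velocity-squared drag term standing in for the real, altitude-dependent air resistance. All constants below are illustrative, not real ballistics data.

```python
# A rough sketch of one firing-table calculation: step a shell's flight
# forward in small time increments, with gravity plus a crude drag term.
import math

def shell_range(angle_deg, muzzle_velocity=500.0, drag_coeff=0.0001,
                g=9.81, dt=0.01):
    """Return the approximate horizontal distance (m) at which the shell lands."""
    angle = math.radians(angle_deg)
    x, y = 0.0, 0.0
    vx = muzzle_velocity * math.cos(angle)
    vy = muzzle_velocity * math.sin(angle)
    while y >= 0.0:
        speed = math.hypot(vx, vy)
        # Drag opposes the direction of motion, proportional to speed squared.
        vx -= drag_coeff * speed * vx * dt
        vy -= (g + drag_coeff * speed * vy) * dt
        x += vx * dt
        y += vy * dt
    return x

# In a vacuum the optimum elevation is 45 degrees; with drag it is lower,
# and every entry in a table meant repeating a loop like this by hand.
for angle in (30, 40, 45, 50):
    print(angle, round(shell_range(angle)))
```

Each pass through the loop is a few multiplications and additions, which is exactly the kind of repetitive arithmetic a human with a desk calculator took hours over and a machine could grind through mechanically.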

One of the leaders in the field of military ballistics in the United States in the First World War was the mathematician Oswald Veblen, who headed a team at the Army's Aberdeen Proving Ground in Maryland. He would later, as a professor at Princeton University, play a major role in establishing the Institute for Advanced Study, and in particular in ensuring that it started life with a group of eminent mathematicians, including Johnny von Neumann.

In the decade following the war, Hollerith-type punched-card machines began to be used for some scientific purposes, especially the tedious calculation of astronomical tables, and there were parallel developments which have been described by Herman Goldstine, who became von Neumann's right-hand man, in his book The Computer from Pascal to von Neumann. But here I shall focus on the thread that led directly to von Neumann himself. A key step came in 1933, when the Ordnance Department of the US Army established a joint venture with the Moore School of the University of Pennsylvania to develop an improved calculating machine. Another involved Wallace Eckert, working at Columbia University on problems of celestial mechanics, originally using standard IBM machines of the late 1920s; the success of this work encouraged IBM to produce a special machine, called the Difference Tabulator, for the Columbia astronomers. Out of this collaboration grew the Thomas J. Watson Astronomical Computing Bureau,[8] a joint venture of the American Astronomical Society, Columbia University and IBM. This, says Goldstine, “marked the first step in the movement of IBM out of the punch card machine business and into the modern field of electronic computers.” IBM's interest was stimulated further by Howard Aiken, of Harvard University, who proposed developing a system based on punched-card machines to produce an electromechanical computer; a project based on his ideas began in 1939 and achieved success in 1944, although by then it was being overtaken by developments in electronic computing. Another electromechanical device, developed at Bell Laboratories and using telephone switching relays, was also completed in 1944, and suffered the same fate. But the Americans were not the first to do this.

A GERMAN DIVERSION

In the mid-1930s, a German engineer called Konrad Zuse, working in the aircraft industry, developed an electromechanical calculating machine using binary arithmetic. He was completely ignorant of developments in other parts of the world, and worked everything out from scratch. This Z1, completed in 1938, used on/off switches in the form of steel pins that could be positioned to the right or left of a steel lug using electromagnets. A friend of Zuse's, Helmut Schreyer, suggested that vacuum tubes (of which more below) would be more efficient, and the two of them calculated that a machine using 2,000 tubes would be feasible; but, like their counterparts in the United States, they felt that the technology of the time was too unreliable to be put to practical use immediately. With the outbreak of war, Zuse was called up for the army, but after six months he was discharged to work at the Henschel aircraft factory, where he was involved in the development of the V1, the first cruise missile. He offered the authorities his and Schreyer's idea of a 2,000-tube computer to use in directing anti-aircraft fire, but when he said the project would need two years to come to fruition he was told it was not worth funding because the war would be over by then.
