
“The perturbation arising from the resistance of the medium . . . does not, on account of its manifold forms, submit to fixed laws and exact description,” Galileo explained in 1638, “and disturbs [the trajectory] in an infinite variety of ways corresponding to the infinite variety in the form, weight, and velocity of the projectiles.” [14]
Galileo found the behavior of high-velocity cannon fire mathematically impenetrable and limited his services to the preparation of firing tables for low-velocity trajectories, “since shots of this kind are fired from mortars using small charges and imparting no supernatural momentum they follow their prescribed paths very exactly.” [15]
The production of firing tables still demanded equal measures of art and science when von Neumann arrived on the scene at the beginning of World War II. Shells were test-fired down a range equipped with magnetic pickup coils to provide baseline data. Then the influence of as many variables as could be assigned predictable functions was combined to produce a firing table composed of between two thousand and four thousand individual trajectories, each trajectory requiring about 750 multiplications to determine the path of that particular shell for a representative fifty points in time.

A human computer working with a desk calculator took about twelve hours to calculate a single trajectory; the electromechanical differential analyzer at the Ballistic Research Laboratory (a ten-integrator version of the machine that Vannevar Bush had developed at MIT) took ten or twenty minutes. This still amounted to about 750 hours, or a month of uninterrupted operation, to complete one firing table. Even with double shifts and the assistance of a second, fourteen-integrator differential analyzer (constructed at the Moore School of Electrical Engineering at the University of Pennsylvania in Philadelphia), each firing table required about three months of work. The nearly two hundred human computers working at the Ballistic Research Laboratory were falling hopelessly behind. “The number of tables for which work has not been started because of lack of computational facilities far exceeds the number in progress,” reported Herman Goldstine in August 1944. “Requests for the preparation of new tables are being currently received at the rate of six per day.” [16]
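The scale of the bottleneck follows directly from these figures. A back-of-the-envelope sketch in Python, taking three thousand trajectories per table and fifteen minutes per analyzer run as assumed mid-range values:

    # Workload for one firing table, using the figures quoted above
    # (assumed representative mid-range values, not exact).
    TRAJECTORIES_PER_TABLE = 3_000        # "two thousand to four thousand"
    MULTIPLICATIONS_EACH = 750            # per trajectory, ~50 time points
    HUMAN_HOURS_PER_TRAJECTORY = 12       # desk calculator
    ANALYZER_MINUTES_PER_TRAJECTORY = 15  # "ten or twenty minutes"

    total_mults = TRAJECTORIES_PER_TABLE * MULTIPLICATIONS_EACH
    human_hours = TRAJECTORIES_PER_TABLE * HUMAN_HOURS_PER_TRAJECTORY
    analyzer_hours = TRAJECTORIES_PER_TABLE * ANALYZER_MINUTES_PER_TRAJECTORY / 60

    print(f"multiplications per table:   {total_mults:,}")        # 2,250,000
    print(f"human-computer hours:        {human_hours:,}")        # 36,000
    print(f"differential-analyzer hours: {analyzer_hours:,.0f}")  # ~750, a month nonstop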

Electromechanical assistance was not enough. In April 1943, the army initiated a crash program to build an electronic digital computer based on decimal counting circuits made from vacuum tubes. Rings of individual flip-flops (two vacuum tubes each) were formed into circular, ten-stage counters linked to each other and to an array of storage registers, forming, in effect, the electronic equivalent of Leibniz's stepped reckoner, but running at about six million rpm. The ENIAC (Electronic Numerical Integrator and Computer) was constructed by a team that included John W. Mauchly, John Presper Eckert, and (Captain) Herman H. Goldstine, supervised by John G. Brainerd under a contract between the army's Ballistic Research Laboratory and the Moore School. A direct descendant of the electromechanical differential analyzers of the 1930s, the ENIAC represented a brief but fertile intersection between the otherwise diverging destinies of analog and digital computing machines. Incorporating eighteen thousand vacuum tubes operating at 100,000 pulses per second, the ENIAC consumed 150 kilowatts of power and held twenty 10-digit numbers in high-speed storage. With the addition of a magnetic-core memory of 120 numbers in 1953, the working life of the ENIAC was extended until October 1955.
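The counting scheme itself is easy to model. A minimal sketch of a decade ring counter of the kind described above, with exactly one flip-flop set at a time and a carry emitted on rollover (an illustrative model, not a circuit-level description of the ENIAC):

    class DecadeRingCounter:
        def __init__(self):
            self.stage = 0  # index of the single flip-flop currently set

        def pulse(self) -> bool:
            """Advance one stage; return True when a carry is emitted."""
            self.stage = (self.stage + 1) % 10
            return self.stage == 0  # 10-to-0 rollover -> carry to next decade

    # Two cascaded decades count 0-99, just as cascaded wheels do in a
    # stepped reckoner -- only electronically, at ~100,000 pulses per second.
    ones, tens = DecadeRingCounter(), DecadeRingCounter()
    for _ in range(42):
        if ones.pulse():
            tens.pulse()
    print(tens.stage, ones.stage)  # -> 4 2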

Programmed by hand-configured plugboards (like a telephone switchboard) and resistor-matrix function tables (set up as read-only memory, or ROM), the ENIAC was later adapted to a crude form of stored-program control. Input and output were via standard punched-card equipment requisitioned from IBM. It was thus possible for the Los Alamos mathematicians to show up with their own decks of cards and produce intelligible results. Rushed into existence with a single goal in mind, the ENIAC became operational in late 1945 and just missed seeing active service in the war. To celebrate its public dedication in February 1946, the ENIAC computed a shell trajectory in twenty seconds—ten seconds faster than the flight of the shell and a thousand times faster than the methods it replaced. But the ENIAC was born with time on its hands, because the backlog of firing-table calculations vanished when hostilities ceased.
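A function table amounted to what would now be called a lookup table held in read-only memory: dialed in by hand before the run, consulted at electronic speed during it. A minimal sketch of the idea, with hypothetical values standing in for the hand-set resistor matrix:

    # Illustrative only: the real tables held data such as the drag
    # function for the ballistics equations; these values are made up.
    DRAG_TABLE = {                  # argument -> tabulated function value
        0: 0.000, 1: 0.074, 2: 0.148, 3: 0.223,  # set by hand before the run,
        4: 0.301, 5: 0.383,                       # like dialing rows of switches
    }

    def lookup(arg: int) -> float:
        """Read-only access: the running program consults but never rewrites."""
        return DRAG_TABLE[arg]

    print(lookup(3))  # -> 0.223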

Sudden leaps in biological or technological evolution occur when an existing structure or behavior is appropriated by a new function that spreads rapidly across the evolutionary landscape, taking advantage of a head start. Feathers must have had some other purpose before they were used to fly. U-boat commanders appropriated the Enigma machine first developed for use by banks. Charles Babbage envisioned using the existing network of church steeples that rose above the chaos of London as the foundation for a packet-switched communications net. According to neurophysiologist William Calvin, the human mind appropriated areas of our brains that first evolved as buffers for rehearsing and storing the precise timing sequences
required for ballistic motor control. “Throwing rocks at even stationary prey requires great precision in the timing of rock release from an overarm throw, with the ‘launch window’ narrowing eight-fold when the throwing distance is doubled from a beginner's throw,” he observed in 1983. “Paralleled timing neurons can overcome the usual neural noise limitations via the law of large numbers, suggesting that enhanced throwing skill could have produced strong selection pressure for any evolutionary trends that provided additional timing neurons. . . . This emergent property of parallel timing circuits has implications not only for brain size but for brain reorganization, as another way of increasing the numbers of timing neurons is to temporarily borrow them from elsewhere in the brain.” [17]
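Calvin's law-of-large-numbers argument is easily illustrated. In the sketch below (values assumed for illustration), each timing neuron fires with independent Gaussian jitter; averaging N of them shrinks the jitter of the pooled release signal roughly as 1/√N, so hitting a launch window eight times narrower calls for on the order of sixty-four times as many neurons:

    import random, statistics

    def release_jitter(n_neurons: int, sigma_ms: float = 10.0,
                       trials: int = 10_000) -> float:
        """Std. dev. (ms) of the mean firing time of n noisy neurons."""
        means = [
            statistics.fmean(random.gauss(0.0, sigma_ms) for _ in range(n_neurons))
            for _ in range(trials)
        ]
        return statistics.stdev(means)

    for n in (1, 16, 64):
        print(f"{n:3d} neurons -> jitter ~ {release_jitter(n):.2f} ms")
    # Jitter falls roughly as sigma / sqrt(n): about 10.0, 2.5, 1.25 ms.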
According to Calvin's theory, surplus off-hours capacity was appropriated by such abstractions as language, consciousness, and culture, invading the neighborhood much as artists colonize a warehouse district, which then becomes the gallery district as landlords raise the rent. The same thing happened to the ENIAC: a mechanism developed for ballistics was expropriated for something else.

John Mauchly, Presper Eckert, and others involved in the design and construction of the ENIAC had every intention of making wider use of their computer, but it was von Neumann who carried enough clout to preempt the scheduled ballistics calculations and proceed directly with a numerical simulation of the super bomb. The calculation took a brute-force numerical approach to a system of three partial differential equations otherwise resistant to analytical assault. As one million mass points were shuffled one IBM card at a time through the accumulators and registers at the core of the ENIAC's eighty-foot-long vault, the first step was taken toward the explosion of a device that was, as Oppenheimer put it, “singularly proof against any form of experimental approach.” [18]
The test gave a false positive result: the arithmetic was right, but the underlying physics was wrong. Teller and colleagues were led to believe that the super design would work, and the government was led to believe that if the Americans didn't build one the Soviets might build one first. By the time the errors were discovered, the hydrogen-bomb project had acquired the momentum to keep going until an alternative was invented that worked.

The super fizzled, but the ENIAC was hailed as an unqualified success. Because of the secret nature of their problem, the Los Alamos mathematicians had to manage the calculation firsthand. They became intimately familiar with the operation of the computer and suggested improvements to its design. A machine capable of performing such a calculation could, in principle, compute an answer to any problem presented in numerical form. Von Neumann discovered in the ENIAC an instrument through which his virtuoso talents could play themselves out to the full—inventing new forms of mathematics as he went along. “It was his feeling that a mathematician who was pursuing some new field of endeavor or trying to extend the scope of older fields, should be able to obtain clues for his guidance by using an electronic digital machine,” explained Willis Ware in 1953. “It was, therefore, the most natural thing that von Neumann felt that he would like to have at his own disposal such a machine.” [19]

During the war, von Neumann had worked with the bomb designers at Los Alamos as well as with conventional-weapon designers calculating ballistic trajectories, blast and shock-wave effects, and the design of shaped charges for armor-piercing shells. It was his experience with the mathematics of shaped charges, in part, that led to the original success of the implosion-detonated atomic bomb. Bombs drew von Neumann's interest toward computers, and the growing power of computers helped sustain his interest in developing more powerful bombs. “You had an explosion a little above the ground and you wanted to know how the original wave would hit the ground, form a reflected wave, then combine near the ground with the original wave, and have an extra strong blast wave go out near the ground,” recalled Martin Schwarzschild, the Princeton astrophysicist whose numerical simulations of stellar evolution had much in common—and shared computer time—with simulations of hydrogen bombs. “That was a problem involving highly non-linear hydrodynamics. At that time it was only just understood descriptively. And that became a problem that I think von Neumann became very much interested in. He wanted a real problem that you really needed computers for.” [20]

Software—first called “coding” and later “programming”—was invented on the spot to suit the available (or unavailable) machines. Physicist Richard Feynman served on the front lines in developing the computational methods and troubleshooting routines used at Los Alamos in early 1944, when desk calculators and punched-card accounting machines constituted the only hardware at hand. The calculations were executed by dozens of human computers (“girls” in Feynman's terminology) who passed intermediate results back and forth, weaving together a long sequence, or algorithm, of simpler steps. “If we got enough of these machines in a room, we could take the cards and put them through a cycle. Everybody who does numerical calculations now knows exactly what I'm talking about, but this was kind of a new thing then—mass production with machines.”
The problem was that the punched-card machines that Stan Frankel had ordered from IBM were delayed. So to test out Frankel's program, explained Feynman, “we set up this room with girls in it. Each one had a Marchant [mechanical calculator]. . . . We got speed with this system that was the predicted speed for the IBM machine[s]. The only difference is that the IBM machines didn't get tired and could work three shifts.” [21]
By keeping all stages of the computation cycle busy all the time, Feynman invented the pipelining that has maximized the performance of high-speed processors ever since.
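The gain from pipelining is simple to state: with k stages each taking one time step, a batch of n jobs needs n × k steps when each card runs alone, but only n + k − 1 steps when every stage is kept busy on a different card. A minimal sketch, with illustrative values assumed:

    def serial_steps(n_jobs: int, k_stages: int) -> int:
        return n_jobs * k_stages        # one card at a time through all stages

    def pipelined_steps(n_jobs: int, k_stages: int) -> int:
        return n_jobs + k_stages - 1    # fill the pipe once, then finish
                                        # one card per step thereafter

    n, k = 1_000, 5                     # assumed: 1,000 cards, 5 stages
    print(serial_steps(n, k), "vs", pipelined_steps(n, k))  # 5000 vs 1004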

Many thriving computer algorithms are direct descendants of procedures worked out by human computers passing results back and forth by hand. The initial challenge was how to break large problems into smaller, computable parts. Physically distinct phenomena often proved to be computationally alike. A common thread running through many problems of military interest was fluid dynamics—a subject that had drawn von Neumann's attention by its mathematical intractability and its wide-ranging manifestations in the physical world. From the unfolding of the afternoon's weather to the flight of a missile through the atmosphere or the explosion, by implosion, of atomic bombs, common principles are at work. But the behavior of fluids in motion, however optically transparent, remained mathematically opaque. Beginning in the 1930s, von Neumann grew increasingly interested in the phenomenon of turbulence. He puzzled over the nature of the Reynolds number, a nondimensional number that characterizes the transition from laminar to turbulent flow. “The internal motion of water assumes one or other of two broadly distinguishable forms,” reported Osborne Reynolds in 1883, “either the elements of the fluid follow one another along lines of motion which lead in the most direct manner to their destination, or they eddy about in sinuous paths the most indirect possible.” [22]

“It seemed, however, to be certain if the eddies were owing to one particular cause, that integration [of Stokes equations of fluid motion] would show the birth of eddies to depend on some definite value of cρU/μ,” explained Reynolds, introducing the parameter that bears his name. [23]
As the product of length (of an object moving through a fluid or the distance over which a moving fluid is in contact with an object or a wall), density (of the fluid), and velocity (of the fluid or the object) divided by viscosity (of the fluid), the Reynolds number signifies the relative influence of these effects. All instances of fluid motion—water flowing through a pipe, a fish swimming through the sea, a missile flying through the air, or air flowing around the earth—can be compared on the basis of their Reynolds numbers to
predict the general behavior of the flow. A low Reynolds number indicates the predominance of viscosity (owing to molecular forces between individual fluid particles) in defining the character of the flow; a high Reynolds number indicates that inertial forces (due to the mass and velocity of individual particles) prevail. Reynolds identified this pure, dimensionless number as the means of distinguishing between laminar (linear) and turbulent (nonlinear) flow and revealed how (but not why) the development of minute, unstable eddies precipitates self-sustaining turbulence as the transitional value is approached. The critical Reynolds number thus characterizes a transition between an orderly, deterministic regime and a disorderly, probabilistic regime that can be described statistically but not in full detail.
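Written out, the Reynolds number is Re = ρUL/μ. A rough sketch comparing the flows mentioned above, with order-of-magnitude property values assumed purely for illustration:

    # Re = rho * U * L / mu, with rho in kg/m^3, U in m/s, L in m, mu in Pa*s.
    def reynolds(rho: float, U: float, L: float, mu: float) -> float:
        return rho * U * L / mu

    cases = {
        "water in a household pipe": reynolds(1000, 1.0, 0.02, 1.0e-3),
        "fish swimming in the sea":  reynolds(1025, 1.0, 0.3, 1.1e-3),
        "missile in flight":         reynolds(1.2, 300.0, 3.0, 1.8e-5),
        "airflow around the earth":  reynolds(1.2, 10.0, 1.0e6, 1.8e-5),
    }
    for name, re in cases.items():
        print(f"{name:27s} Re ~ {re:.1e}")
    # All lie far above the critical value (~2,300 for pipe flow), i.e.
    # well into the turbulent regime; only very small or very slow flows,
    # where viscosity dominates, stay laminar.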
