“Von Neumann . . . wanted to find an explanation or at least a way to understand this very puzzling large number,” wrote Ulam. “Small numbers like π and e, are of course very frequent in physics, but here is a number of the order of thousands and yet it is a pure number with no dimensions: it does whet our curiosity.”24
Von Neumann later suggested a similar distinction in computational complexity, marking the transition from a relatively small number of units forming an orderly, deterministic system to a probabilistic system composed of a large number of interconnected components whose behavior cannot be described (or predicted) more economically than by a statistical description of the system as a whole. Von Neumann was intrigued by the origins of self-organization in complicated systems: behavior reminiscent of the origins of turbulence but on a different scale. He understood that the boundaries between physics and computational models of physics are imprecise. The behavior of a turbulent hydrodynamic system can be predicted only by accounting for all interactions down to molecular scale. The situation can be modeled in computationally manageable form either by adopting a coarser numerical mesh, following a random sample of elements and drawing statistical conclusions accordingly, or by slowing the computation down in time. To make useful predictions of an ongoing process (say, a forecast of tomorrow's weather), the computation has to be speeded up.
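The trade-off von Neumann identified, exchanging resolution for tractability, is easy to make concrete. In the sketch below (my illustration, not from the book, with a simple diffusion stencil standing in for the real hydrodynamic equations), the same process runs on a fine and a coarse one-dimensional mesh; the coarse mesh does a tenth of the work per step, at the price of detail:

```python
# A minimal sketch of the coarse-mesh trade-off: the same diffusion
# process simulated at two resolutions. (Illustrative only; real
# hydrodynamics involves far more than this stencil.)

def diffuse(u, steps, alpha=0.1):
    """Explicit finite-difference update: each interior cell relaxes
    toward the mean of its two neighbors; boundary cells stay fixed."""
    for _ in range(steps):
        u = [u[0]] + [
            u[i] + alpha * (u[i - 1] - 2 * u[i] + u[i + 1])
            for i in range(1, len(u) - 1)
        ] + [u[-1]]
    return u

fine   = [0.0] * 100 + [1.0] * 100   # 200-cell mesh
coarse = [0.0] * 10  + [1.0] * 10    # 20-cell mesh: a tenth of the work per step

fine_result   = diffuse(fine, steps=500)
coarse_result = diffuse(coarse, steps=500)
print(coarse_result)   # a smoothed version of the same step profile, far cheaper
```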
The goal of weather prediction stimulated the development of electronic computers on several fronts. John Mauchly first conceived the ENIAC as an electronic differential analyzer to assist in recognizing long-term cycles in the weather, before another war came along with its demands for firing tables for bigger guns. Vladimir Zworykin (1889–1982), the brilliant Russian immigrant who brought electronic television into existence with his invention of the iconoscope, or camera tube, in 1923, also foresaw the potential of an electronic computer as a meteorological oracle for the world. Norbert Wiener, patron of cybernetics, who embraced the anti-aircraft gun but shunned the bomb, was a vocal proponent of atmospheric modeling and prophesied growing parallels between computers powerful enough to model nonlinear systems, such as the weather, and “the very complicated study of the nervous system which is itself a sort of cerebral meteorology.”25
A cellular approach to numerically modeling the weather was developed by meteorologist Lewis Fry Richardson (1881–1953), who refined his atmospheric model, calculating entirely by hand and on “a heap of hay in a cold rest billet,” while serving in the Friends' Ambulance Unit attached to the Sixteenth Division of the French infantry in World War I. “During the battle of Champagne in April 1917,” wrote Richardson in the preface to his Weather Prediction by Numerical Process (1922), “the working copy was sent to the rear, where it became lost, to be rediscovered some months later under a heap of coal.”26
Richardson imagined partitioning the earth's surface into several thousand meteorological cells, relaying current observations to the arched galleries and sunken amphitheater of a great hall, where some 64,000 human computers would continuously evaluate the equations governing each cell's relations with its immediate neighbors, constructing a numerical model of the earth's weather in real time. “Perhaps some day in the dim future, it will be possible to advance the computations faster than the weather advances,” hoped Richardson, “and at a cost less than the saving to mankind due to the information gained.”27
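Richardson's scheme is, in modern terms, a stencil computation. The toy version below (my sketch; the update rule is a stand-in, not Richardson's actual equations) shows the bookkeeping each of his 64,000 human computers was to perform for one cell: repeatedly recompute a value from the current values of its immediate neighbors.

```python
# A toy grid of meteorological cells, each updated synchronously from
# its four immediate neighbors. (Illustrative stand-in dynamics, not
# Richardson's equations.)

import random

SIZE = 16
pressure = [[random.uniform(980.0, 1020.0) for _ in range(SIZE)]
            for _ in range(SIZE)]

def step(grid):
    """One synchronous update: every interior cell moves toward the
    average of its four neighbors; boundary cells are held fixed."""
    new = [row[:] for row in grid]
    for i in range(1, SIZE - 1):
        for j in range(1, SIZE - 1):
            neighbors = (grid[i - 1][j] + grid[i + 1][j] +
                         grid[i][j - 1] + grid[i][j + 1]) / 4.0
            new[i][j] += 0.5 * (neighbors - grid[i][j])
    return new

for _ in range(100):   # "advance the computations faster than the weather"
    pressure = step(pressure)
```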
Richardson thereby anticipated massively parallel computing, his 64,000 mathematicians reincarnated seventy years later as the multiple processors of Danny Hillis's Connection Machine. “We had decided to simplify things by starting out with only 64,000 processors,” explained Hillis, recalling how Richard Feynman helped him bring Lewis Richardson's fantasy to life.28
Even without the Connection Machine, Richardson's approach to cellular modeling would be widely adopted once it became possible to assign one high-speed digital computer rather than 64,000 human beings to keep track of numerical values and relations among individual cells. The faint ghost of Lewis Richardson haunts every spreadsheet in use today.
After the war, Richardson settled down as a meteorologist at Benson, Oxfordshire, contributing to the mathematical theory of turbulence and developing a novel method for remote sensing of movement in the upper air. By shooting small steel balls (between the size of a pea and a cherry) at the zenith and observing where they fell, Richardson helped turn swords into plowshares with a system that was faster, more accurate, and more robust than using balloons. When the Meteorological Office was transferred to the jurisdiction of the Air Ministry, Department of War, Richardson felt compelled, as a Quaker, to resign his post. Later still, when he discovered that poison-gas technicians were interested in his methods for predicting atmospheric flow, he ended his meteorological research, launching a mathematical investigation into the causes of war to which he devoted the remainder of his life. His studies were published posthumously in two separate volumes: Arms and Insecurity, an analysis of arms races, and Statistics of Deadly Quarrels, which documents every known category of violent conflict, from murder to strategic bombing, arranged both chronologically and on a scale of magnitude based on the logarithm of the number of victims left dead.29
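Richardson's magnitude scale is simply the base-ten logarithm of the death toll, so each unit of magnitude marks a tenfold increase in deaths. A few lines make the arithmetic concrete:

```python
import math

def magnitude(deaths):
    """Richardson's magnitude of a deadly quarrel: log10 of the death toll."""
    return math.log10(deaths)

print(magnitude(1))           # 0.0 -- a single murder
print(magnitude(1_000))       # 3.0 -- a small war
print(magnitude(10_000_000))  # 7.0 -- a world war
```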
Richardson, having managed an electrical laboratory before World War I, might have contributed more to the development of electronic computers had there been laboratory facilities not directly involved in military research. Working entirely on his own in the late 1920s in Paisley, Scotland, he produced an odd but insightful paper. “The Analogy Between Mental Images and Sparks” includes schematic diagrams of two simple electronic devices that Richardson constructed to illustrate his theories on the nature of synaptic function deep within the brain. One of these circuit diagrams is captioned “Electrical Model illustrating a Mind having a Will but capable of only Two Ideas.”30
Richardson had laid the foundations for massively parallel computing in the absence of any equipment except his own imagination; now, with nothing but a few bits of common electrical hardware, he gave bold hints as to the physical basis of mind. But he had no interest in elaborating on these principles or attempting to embody them on a wider scale. His ideas lay dormant in the pages of the Psychological Review.
Von Neumann saw to it that the powers of the electronic computer brought Richardson's dream (and, with the invention of the atomic bomb, his nightmares) to life. The first public announcement of von Neumann's postwar computer project was made by the New York Times after a meeting between Zworykin, von Neumann, and Francis W. Reichelderfer, chief of the U.S. Weather Bureau in Washington, D.C. The “development of a new electronic calculator, reported to have astounding potentialities . . . might even make it possible to ‘do something about the weather,’” the Times reported. “Atomic energy might provide a means for diverting, by its explosive power, a hurricane before it could strike a populated place.”31 Stan Ulam hinted at the required scale: “To be used in ‘weather control,’ one will have to consider among other problems the interaction between several, perhaps nearly simultaneous, explosions.”32
The ENIAC was still a military secret, leading the Times to conclude that “none of the existing [computing] machines, however, is as pretentious in scope as the von Neumann-Zworykin device.” It was true that no program as ambitious was in the works. Von Neumann and Zworykin were not proposing to build just a computer, but a network of computers that would span the world. “With enough of these machines (100 was mentioned as an arbitrary figure) area stations could be set up which would make it possible to forecast the weather all over the world.”33
Richardson's methods, breaking up a complex problem into a mosaic of computational cells, were equally adaptable to meteorology, fluid dynamics, and the peculiar shock-wave effects that governed both the construction of an atomic bomb and the physical destruction produced when one went off. Von Neumann took care of the bombs first. Later, when he developed his own computing center at the Institute for Advanced Study, he established a numerical meteorological group under Jule Charney that transformed Richardson's proposal into a working operation, leading directly to the system of numerical weather forecasting that models our atmosphere today.
The idea of building a general-purpose electronic digital computer had long been incubating in von Neumann's mind. “Von Neumann was well aware of the fundamental importance of Turing's paper of 1936 ‘On computable numbers’ which describes in principle the ‘Universal Computer’ of which every modern computer (perhaps not ENIAC as first completed but certainly all later ones) is a realization,” recalled Stan Frankel, who supervised numerical computation at Los Alamos during the war. “Von Neumann introduced me to that paper and at his urging I studied it with care. Many people have acclaimed von Neumann as the ‘father of the computer’ (in a modern sense of the term) but I am sure that he would never have made that mistake himself.”34
Separated by personality and style, Turing and von Neumann labored independently to bring digital computers to life. While Turing was at Princeton University in 1937, preparing his doctoral thesis under Alonzo Church, he worked in close proximity to von Neumann. But he declined the offer of a position as von Neumann's assistant, choosing instead to return to England and his destiny as the mastermind of Bletchley Park.
In contrast to the respectful distance that characterized his relations with Turing, von Neumann maintained a close friendship and long correspondence with Rudolf Ortvay, director of the Theoretical Physics Institute at the University of Budapest. There were two complementary approaches to digital computers. The first was to start from the most elementary beginnings, in the style of Leibniz or Turing, using nothing except 1s and 0s, or switches of some kind or another, which are 1s and 0s in physical form. The other approach, advocated by Ortvay, was to proceed in the opposite direction, taking as a starting point that most complicated known computer, the human brain.
“I read through your paper on games, and it gave me hope that you might succeed in formulating the problem of switching of brain cells if I succeed in drawing your attention to it,” wrote Ortvay to von Neumann in 1941. Ortvay's suggestions encouraged von Neumann's efforts to develop a theory of automata general enough to apply both to the construction of digital computers and to the operation of the brain. “The brain can be conceived as a network with brain cells in its nodes. These are connected in a way that every individual cell can receive impulses from more than one other cell and can transmit impulses to several cells. Which of these impulses are received from or passed on to other cells may depend on the state of the cell, which in turn depends on the effects of anything that previously affected this particular cell. . . . The actual state of the cells (which I conceive as being numbered) would characterize the state of the brain. There would be a certain distribution corresponding to every spiritual state. . . . This model may resemble an automatic telephone switch-board; there is, however, a change in the connections after every communication.”35
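Ortvay's picture translates almost directly into a toy simulation. In the sketch below (my construction, not anything Ortvay wrote down), numbered cells each carry a state, whether an impulse is passed on depends on the state of the receiving cell, and the wiring itself is changed after every communication, his amendment to the telephone switchboard:

```python
# A loose sketch of Ortvay's network of numbered, stateful cells.

import random

N = 8
state = [0] * N                                             # the "actual state" of each cell
links = {i: [(i + 1) % N, (i + 2) % N] for i in range(N)}   # initial wiring

def fire(source):
    """Deliver an impulse from one cell along its current connections."""
    for target in links[source]:
        # Whether the impulse is passed on depends on the state of the cell:
        if state[target] == 0:
            state[target] = 1
            fire(target)          # the impulse propagates onward
    # Ortvay's departure from the telephone switchboard: the connections
    # themselves change after every communication.
    links[source] = random.sample(range(N), 2)

fire(0)
print(state)
```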
Von Neumann never brought the link between game theory and a theory of neural nets to fruition, although there are hints that his theory of automata was so inclined.
A Turing machine assembles a complex computation from a sequence of atomistic steps, whereas, as Ortvay suggested, the brain represents a computational process by a network of intercommunicating components, the chain of events being spatially distributed and not necessarily restricted to one computational step at a time. In 1943, neuropsychiatrist Warren S. McCulloch and mathematician Walter Pitts published their “Logical Calculus of the Ideas Immanent in Nervous Activity,” showing that in principle (and for extremely simplified theoretical neurons) the computational behavior of any neural net can be duplicated exactly by an equivalent Turing machine.36
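The McCulloch-Pitts neuron, in its standard textbook idealization, is just a threshold unit over binary inputs. A minimal sketch (mine, not their original notation) shows such units computing the elementary logical functions from which any Boolean circuit, and hence each finite step of a Turing machine's control, can be assembled:

```python
# A McCulloch-Pitts neuron: it fires (outputs 1) when the weighted sum
# of its binary inputs reaches a threshold.

def mp_neuron(inputs, weights, threshold):
    return 1 if sum(i * w for i, w in zip(inputs, weights)) >= threshold else 0

# Logical gates realized as single neurons:
AND = lambda a, b: mp_neuron([a, b], [1, 1], 2)
OR  = lambda a, b: mp_neuron([a, b], [1, 1], 1)
NOT = lambda a:    mp_neuron([a],    [-1],   0)

assert AND(1, 1) == 1 and AND(1, 0) == 0
assert OR(0, 1) == 1 and OR(0, 0) == 0
assert NOT(0) == 1 and NOT(1) == 0
```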
This paper was widely cited in support of analogies between digital computers and brains. When von Neumann compiled the First Draft of a Report on the EDVAC, a document that launched the breed of stored-program computers that surround us today, he adopted the McCulloch-Pitts symbolism in diagramming the logical structure of the proposed computer and introduced terms such as organ, neuron, and memory, which were more common among biologists than among electrical engineers.
The EDVAC (Electronic Discrete Variable Automatic Computer) was conceived while the ENIAC was being built. Its ability to modify its own instructions earned the EDVAC its distinction as the first full-fledged stored-program computer design. Beset by a series of technical and administrative delays, the original EDVAC did not become operational until late 1951, preceded and outperformed by its own more nimble offspring in both England and the United States. The EDVAC was nonetheless immortalized as the conceptual nucleus around which successive generations of computers formed. The project was initiated by Mauchly, Eckert, Goldstine, Arthur Burks, and others at the Moore School, but it was von Neumann's involvement that ignited the chain reaction that spread computers around the world. The EDVAC stored both data and instructions, as binary code, in mercury delay-line memory. As in Turing's universal machine, long strings of bits defined not only numbers to be operated on but the sequence and potentially dynamic structure of the operations to be performed. The EDVAC thus embodied Turing's principle that complexity and adaptability could be more profitably assigned to the coding than to the machine.
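The principle singled out here, instructions held in the same memory as data and therefore modifiable by the running program, can be shown in a few lines. The toy machine below is my illustration of the idea, not the EDVAC's actual order code: the program fetches a word that happens to be an instruction and writes it over one of its own instructions before execution reaches it.

```python
# A toy stored-program machine: instructions and data share one memory,
# so a running program can rewrite its own instructions like any number.

memory = [
    ("LOAD",  4),   # 0: acc <- memory[4]  (fetches a whole word: an instruction)
    ("STORE", 2),   # 1: memory[2] <- acc  (patches the instruction just ahead)
    ("HALT",  0),   # 2: replaced with ("PRINT", 5) before it is reached
    ("HALT",  0),   # 3: stop
    ("PRINT", 5),   # 4: held here as data, yet a perfectly valid instruction
    42,             # 5: an ordinary number
]

pc, acc = 0, None
while True:
    op, addr = memory[pc]   # instructions are read from the same memory as data
    pc += 1
    if   op == "LOAD":  acc = memory[addr]
    elif op == "STORE": memory[addr] = acc
    elif op == "PRINT": print(memory[addr])   # prints 42
    elif op == "HALT":  break
```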