It Began with Babbage
Subrata Dasgupta
Although computing research had enjoyed only a very short life thus far, evolution was in evidence. Phylogenies were created. However, this evolutionary process was not Darwinian, for the latter demands lack of purpose, randomness, chance. Rather, it was evolution driven by purpose. Each member of an evolutionary family was the product of an intended goal or purpose; a design that constituted a theory of the proposed artifact, and a hypothesis that if the artifact was built according to the design, it would satisfy the intended purpose; an implementation that tested the theory, followed by modification or revision of the design as a result of the experiment or of modified purposes; and a new design. Each design became a theory of (or for) a particular computing machine. Each implementation became an extended experiment that tested the associated theory.
Almost as an aside stood Alan Turing's work. His abstract machine was a purely logico-mathematical device, albeit one quite alien to most mathematicians and logicians of the time. At this stage of our story, Turing's machine stands in splendid isolation from the empirical, experimental design-as-theory, implementation-as-experiment work that was going on in Britain, the United States, and Germany before and during the war years. The Turing machine had had no impact on the design of computing machines thus far. Even at Bletchley Park, though Turing worked there and many of his colleagues knew of his 1936 paper, the architecture of the Colossus was quite uninfluenced by Turing's machine.[5]
On the other hand, the inventors and designers of computing machines during the 1930s and throughout the war years, be they mathematicians, physicists, astronomers, or engineers, clearly envisioned their machines as mathematical and scientific (and, in the case of the Colossus, logical) instruments. In this sense, they were mathematical machines.
The data processing, punched-card machines pioneered by Hollerith and evolved by such companies as IBM during the first third of the 20th century were, if not mathematical, certainly number processors. So mathematics in some form or another was the central preoccupation of these designers. Unlike the Turing machine, though, their machines were real artifacts. They had to be built. And machine building was the stuff of engineering.
This is where matters stood circa 1945. If people recognized that a discipline of computing was emerging, they had no name for it, nor was there a firmly established shared framework, a paradigm, in place. At best, these early pioneers may have thought their unnamed craft lay in a kind of no-man's land between a new kind of mathematics and a new kind of engineering.
1. T. S. Kuhn. (1970). The structure of scientific revolutions (2nd ed.). Chicago, IL: University of Chicago Press (original work published 1962).
2. The literature on Kuhn's theory of paradigms is vast. An early set of responses is the collection of essays in I. Lakatos & A. Musgrave. (Eds.). (1970). Criticism and the growth of knowledge. Cambridge, UK: Cambridge University Press. An important later critique is L. Laudan. (1977). Progress and its problems. Los Angeles, CA: University of California Press. See also G. Gutting. (1980). Paradigms and revolutions. Notre Dame, IN: University of Notre Dame Press. For a more recent critical study of Kuhn, see S. Fuller. (2000). Thomas Kuhn: A philosophical history for our times. Chicago, IL: University of Chicago Press. This book also has an extensive bibliography on Kuhn and his theory. Kuhn's own thoughts following the publication of the second edition of Structure, in response to his critics, are published in J. Conant & J. Haugeland. (Eds.). (2000). The road since Structure. Chicago, IL: University of Chicago Press.
3. Kuhn, op. cit., p. 12.
4. Ibid., p. 15.
5. There was an idea that Max Newman, the principal architect of the Colossus, was influenced by Turing's work, but this claim was never substantiated. See B. Randell. (1980). The Colossus. In N. Metropolis, J. Howlett, & G.-C. Rota (Eds.), A history of computing in the twentieth century (pp. 47–92). New York, NY: Academic Press.
ON FEBRUARY 15, 1946, a giant of a machine called the ENIAC, an acronym for Electronic Numerical Integrator And Computer, was commissioned at a ceremony at the Moore School of Electrical Engineering at the University of Pennsylvania, Philadelphia.
The name is noteworthy. We see that the word computer (to mean the machine and not the person) had cautiously entered the emerging vocabulary of computer culture. Bell Laboratories named one of its machines Complex Computer; another, Ballistic Computer (see Chapter 5, Section I). Still, the embryonic world of computing was hesitant; the terms “calculator”, “calculating machine”, “computing machine”, and “computing engine” still prevailed. The ENIAC's full name (which, of course, would never be used after the acronym was established) seemed, at last, to flaunt the fact that this machine had a definite identity, that it was a computer.
The tale of the ENIAC is fascinating in its own right, but it is also a very important tale. Computer scientists and engineers of later times may be ignorant of the Bell Laboratories machines, they may be hazy about the Harvard Mark series, they may have only an inkling about Babbage's dream machines, but they will more than likely have heard of the ENIAC. Why should this be so? What was it about the ENIAC that admits its story into the larger story?
It was not the first electronic computer; the Colossus preceded the ENIAC by 2 years. True, no one outside the Bletchley Park community knew about the Colossus, but from a historical perspective, for historians writing about the state of computing in the 1940s, the Colossus clearly took precedence over the ENIAC. In fact (as we will soon see), there was another electronic computer built in America that preceded the ENIAC. Nor was the ENIAC the first programmable computer. Zuse's Z3 and Aiken's Harvard Mark I, as well as the Colossus, well preceded the ENIAC in this realm.
As for that other Holy Grail, general purposeness, this was, as we have noted, an elusive target (see Chapter 6, Section III). No one would claim that the Colossus was general purpose; it had been described as a “Boolean calculating machine” (see Chapter 6, Section XIII).[1]
But the ENIAC provoked more uncertainty. For one person who was intimately involved in its design and construction, the ENIAC was “a general-purpose scientific computer”[2]; that is, a computer capable of solving, very fast, a wide variety of scientific, mathematical, and engineering problems.[3] For another major participant in the ENIAC project, it was “a mathematical instrument.”[4] A later writer somewhat extravagantly called it a “universal electronic calculator.”[5] A more tempered assessment by a computer scientist and historian of computing spoke of the ENIAC as comparable with the Colossus; both were special-purpose machines: the former specialized for numeric computation, the latter for Boolean calculations.[6]
Perhaps, then, it seems reasonable to claim that the ENIAC was a general-purpose numeric computer, specialized for solving mathematical and scientific problems using the methods of numeric analysis. It was an analytical engine such as Babbage had dreamed of.
However, the ENIAC's historical significance, its originality, lay in other directions. There was, first, its sheer scale: physical, technological, and computational. Physically, the machine was a mammoth, occupying three walls of a 30-foot-by-50-foot room and much of the central space. Technologically, it used 18,000 vacuum tubes of 16 different types.[7] Added to that, it used 70,000 resistors, 10,000 capacitors, 1500 relays, and 6000 manual switches.[8] This was an order of technological complexity far in excess of anything achieved in computing before. And, computationally, because of its electronic technology, it was vastly faster than any previous computing machine: about 1000 times faster than its nearest competitor, the electromechanical Harvard Mark I.[9]
The significance of using 18,000 vacuum tubes from the perspective of reliability is worth noting. The ENIAC was a synchronous machine, pulsed by a clock signal every 10 microseconds, so a malfunction in any one of these vacuum tubes could introduce an error on any pulse. With this many tubes, the reliability of the components was of the essence. Even the failure of a single vacuum tube could cause a digit to be erroneous.[10] By carefully selecting rigidly tested components that were then operated well below their “normal ratings,”[11] the reliability of the computer was maintained at an acceptable level. Writing several months after its commission, Arthur Burks (1915–2008), a mathematician who would later be known as much as a computer theorist and philosopher of science as for being one of the ENIAC's lead engineers, commented that, after the initial phase of testing, the failure rate was about two or three per week. These failures, however, could be identified and corrected quickly by operators thoroughly conversant with the ENIAC design so that, in effect, only a few hours were lost per week as a result of failures.[12]
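Burks's figure of two or three failures a week gives a feel for what component reliability had to mean at this scale. The back-of-the-envelope arithmetic below (a sketch using assumed round numbers, not figures from Burks's report) shows how even a modest failure rate for a machine of 18,000 tubes, any one of which could bring it down, implies an extraordinarily long mean time between failures for each individual tube.

```python
# Illustrative reliability arithmetic; all figures are assumptions for the
# sketch, not data from the ENIAC project's own records.
N_TUBES = 18_000
FAILURES_PER_WEEK = 2.5   # Burks's reported "two or three per week", averaged
HOURS_PER_WEEK = 168

# Treating the machine as 18,000 components in series (any one failure stops
# the computation) and assuming continuous operation:
system_rate = FAILURES_PER_WEEK / HOURS_PER_WEEK   # machine failures per hour
per_tube_rate = system_rate / N_TUBES              # each tube's share of the rate
per_tube_mtbf_hours = 1 / per_tube_rate

print(f"system MTBF ~ {1 / system_rate:.0f} hours")
print(f"implied per-tube MTBF ~ {per_tube_mtbf_hours / 1e6:.1f} million hours")
```

On these assumed numbers, the machine as a whole could run roughly 67 hours between failures only because each individual tube had to survive, on average, over a million hours; hence the obsession with rigid testing and derated operation.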
This, then, was one of the ENIAC's major achievements: it demonstrated the viability of large-scale use of electronic components in digital computers. It heralded the advent of large-scale electronic digital computers.
The other historically significant factor was that the ENIAC had consequences. Experience with its design, especially its organizational principles, and the resulting dissatisfaction showed the way for a new, crucial concept: the stored-program computer (discussed at much greater length later). This concept was like the crucial missing piece of a jigsaw puzzle. It was instrumental in the formation of a style for the logical and functional organization of computers, for computer architecture (in present-centered language). So compelling was this style, so quickly was it accepted by the fledgling computer culture of its time, that it became the foundation of the first genuine paradigm in computing (in Thomas Kuhn's sense; see Chapter 6). As we will see, discontent with the ENIAC was a catalyst that led to the birth of computer science.
For these various reasons (its general-purposeness in the domain of numeric computation; its scale of physical size, technological complexity, and speed of operation; its consequence for the making of the paradigm for a science of computing) the ENIAC has a compelling place in our story. But there is more. The story of the ENIAC, both in terms of the genesis of its principles (the past that fed into it) and the future it helped to shape, forms a tangled web of ideas, concepts, insights, and personalities. We learn much from the story of the ENIAC about the ontogeny of artifacts (to use a biological term), their developmental history.[13]
The ENIAC was, of course, a child of World War II, although it never saw wartime action. The ENIAC project began in June 1943 and the machine was commissioned in February 1946, exactly 6 months after the Japanese surrender and the end of the war. The project began in the Ballistics Research Laboratory (BRL) of the Aberdeen Proving Ground in Maryland. With the American entry into the war in December 1941, this laboratory established a scientific advisory committee of some of the country's leading scientists.[14] One of them was Hungarian-born John von Neumann (1903–1957), mathematician extraordinaire, a professor at the Institute for Advanced Study, Princeton (along with the likes of Einstein and Gödel), and an influential figure in the corridors of power in Washington, DC.
Among the scientists (mathematicians, physicists, chemists, astronomers, and astrophysicists) assembled as BRL's scientific staff for the war effort was the young Herman Goldstine (1913–2004), an assistant professor of mathematics at the University of Michigan until 1942, when he was called into wartime service. His charge at the BRL was ballistic computation.[15]
Ballistic computation, solving differential equations to compute ballistic trajectories, demanded computing machines. The aim of these computations was to prepare firing tables; for a particular shell, up to some 3000 trajectories had to be computed across a range of initial firing conditions, such as muzzle velocity and firing angle.[16]
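To make concrete the kind of calculation a single firing-table entry demanded, the sketch below numerically integrates a projectile's equations of motion with a simple quadratic drag term. Everything here (the drag coefficient, the step size, the particular muzzle velocity and angle) is an illustrative assumption, not the BRL's actual ballistic model, which was far more elaborate; the point is only that each table entry required stepping through thousands of small time increments, and a table required thousands of such entries.

```python
import math

def trajectory_range(v0, angle_deg, k=5e-5, g=9.81, dt=0.01):
    """Horizontal range (m) for muzzle velocity v0 (m/s) at a given firing angle.

    Integrates the planar equations of motion with quadratic air drag
    (coefficient k, an assumed illustrative value) by simple Euler steps.
    """
    theta = math.radians(angle_deg)
    x, y = 0.0, 0.0
    vx, vy = v0 * math.cos(theta), v0 * math.sin(theta)
    while y >= 0.0:                     # step until the shell returns to ground level
        speed = math.hypot(vx, vy)
        ax = -k * speed * vx            # drag opposes the velocity
        ay = -g - k * speed * vy
        vx += ax * dt
        vy += ay * dt
        x += vx * dt
        y += vy * dt
    return x

# One hypothetical table entry: a 500 m/s shell fired at 45 degrees.
print(f"range ~ {trajectory_range(500.0, 45.0):.0f} m")
```

A full table for one shell would repeat this for thousands of (velocity, angle) pairs, which is precisely the drudgery that made automatic computing machines so attractive to the BRL.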
As it happened, the Aberdeen Proving Ground had acquired, in 1935 (well before the BRL was founded), a “copy” of a machine called the differential analyzer that was quite unlike the kind of computers developed in places like Bell Laboratories or Harvard. The differential analyzer was an analog computer and, as the war came to America, was considered the most powerful device for the solution of differential equations, which it was expressly designed to do.
In 1931, an engineer named Vannevar Bush (1890–1974), a professor in the department of electrical engineering at MIT in Cambridge, Massachusetts, had invented the differential analyzer, a machine powered by an electric motor but otherwise a purely mechanical device. It was an analog machine because, rather than transform an analytical expression (such as a differential equation) into a digital computational problem (as had been the tradition ever since Babbage), the mathematical behavior of the “system” of interest (a ballistic trajectory, for example) would be modeled by another physical system whose behavior corresponded exactly (or as closely as possible) to that of the system of interest. The model served as an analog to the problem system. By manipulating the model, the desired computation would be performed.
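The working heart of Bush's machine was the mechanical integrator; an equation was "set up" by connecting integrators so that each one's output shaft turned through the running integral of its input. The following sketch (illustrative software only, in no way a model of Bush's hardware) shows that same chaining idea for the simple equation y'' = -y, whose solution from these initial conditions is cos(t):

```python
def analyze(t_end=3.14159, dt=1e-4):
    """Solve y'' = -y with y(0)=1, y'(0)=0 by chaining two 'integrators'."""
    y, dy = 1.0, 0.0
    t = 0.0
    while t < t_end:
        ddy = -y            # the equation's "wiring": feed -y into the first integrator
        dy += ddy * dt      # integrator 1 accumulates y'' into y'
        y += dy * dt        # integrator 2 accumulates y'  into y
        t += dt
    return y

print(analyze())            # y(pi) for cos(t): the result should be close to -1
```

In the real analyzer those two accumulation steps were wheel-and-disc integrators coupled by shafts and gears; the "program" was the physical interconnection, and the answer was read off a continuously drawn curve rather than printed as digits.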