When Computers Were Human

David Alan Grier

Glover was an applied mathematician, an expert in the mathematics of finance, insurance, and governance. He had been employed as an actuary for Michigan's Teacher Retirement Fund, had held the presidency of Andrew Carnegie's Teachers Insurance and Annuity Association, and, in the early years of the century, had served as a member of the Progressive Party's “brain trust,” the circle of academic advisors to the party leader, Robert La Follette.[3]
Within the University of Michigan, Glover was an advocate for women's education, though he was at least partly motivated by a desire to increase enrollments in mathematics courses. He welcomed women to his classes, encouraged them to study in the department lounge, prepared them for graduate study, and helped them search for jobs. He pushed the women to look beyond the traditional role of schoolteacher and consider careers in business and government. At a time when clerical jobs were still dominated by men, Glover helped his female students find positions as assistant actuaries and human computers, the workers who undertook difficult calculations in the days before electronic computers. At the end of his career, he recorded that he had advised nearly fifty women and that only “one-third have married and have retired from active business life.”[4]

Of the six women who graduated in 1921, only one, my grandmother, never worked outside the home. The remaining five had mathematical careers that lasted into the 1950s. One was a human computer for the United States Army and prepared ballistics trajectories. A second did calculations for the Chemical Rubber Company, a publisher that sold handbooks to engineers and scientists. Another compiled health statistics for the state of Michigan. The fourth worked for the United States Bureau of Labor Statistics and eventually became the assistant director of a field office in Baton Rouge. The last female mathematics major of 1921 became an actuary, moved to New York City, and operated her own business.[5]

Though my grandmother's hidden mathematical career held a special emotional appeal to me, it was the story of the other five women that captured my interest. What kind of world did they inhabit? What were their aspirations? What did they do each day? At the ends of their careers, what had they accomplished? Rather than restrict my scope to the five women who had known my grandmother or even the women mathematics graduates of the University of Michigan, I decided to look at the history of scientific computers, the workers who had done calculations for scientific research.

Scientific computation is not mathematics, though it is closely related to mathematical practice. One eighteenth-century computer remarked that calculation required nothing more than “persevering industry and attention, which are not precisely the qualifications a mathematician is most anxious to be thought to possess.”[6] It might be best described as “blue-collar science,” the hard work of processing data or deriving predictions from scientific theories. “Mental labor” was the term used by the English mathematician Charles Babbage (1791–1871).[7] The origins of scientific calculation can be found in some of the earliest records of human history, the clay tablets of Sumeria, the astronomical records of ancient shepherds who watched over their flocks by night, the land surveys of early China, the knotted cords of the Inca.[8] Its traditions were developed by astronomers and engineers and statisticians. It is kept alive, in a sophisticated form, by those graduate students and laboratory assistants who use electronic calculators and computer spreadsheets to prepare numbers for senior researchers.

Though many human computers toiled alone, the most influential worked in organized groups, which were sometimes called computing offices or computing laboratories. These groups form some of the earliest examples of a phenomenon known informally as “big science,” the combination of labor, capital, and machinery that undertakes the large problems of scientific research.[9] Many commentators identify the start of large-scale scientific research with the coordinated military research of the Second World War or the government-sponsored laboratories of the Cold War, but the roots of these projects can be traced to the computing offices of the eighteenth and nineteenth centuries.[10]

It is possible to begin the story of organized computing long before the eighteenth century by starting with the great heavenly Almagest, the charts of the planets created by Claudius Ptolemy (85–165) in classical Egypt. As the connection between the ancient world and its modern counterpart is sometimes tenuous, we will begin our story just a few years before the opening of the eighteenth century with two events: the invention of calculus and the start of the Industrial Revolution. Both events are difficult to date exactly, but that is of little concern to this narrative. Identifying the inventors of specific ideas is less important than understanding how these ideas developed within the scientific community. Calculus gave scientists new ways of analyzing motion. Most historians of mathematics have concluded that it was invented independently by Isaac Newton (1642–1727) and Gottfried Wilhelm Leibniz (1646–1716) in the 1680s. It was initially used in astronomy, but it also opened new fields for scientific research. The Industrial Revolution, the economic and social change that was driven by the factory system and the invention of large machinery, created new techniques of management, developed public journalism as a means of disseminating ideas, and produced the modern factory.[11]
Most scholars place the start of the Industrial Revolution at the end of the eighteenth century, but this change was deeply influenced by the events of Newton's time. “It is enough to record that by 1700 the foundations of modern technology have been laid,”[12] concluded historian Donald Cardwell.

By starting with the invention of calculus, we will overlook several important computational projects, including the Arithmetica Logarithmica by Henry Briggs (1561–1630), the ballistic trajectories of Galileo Galilei (1564–1642), and the planetary computations in the Rudolphine Tables by Johannes Kepler (1571–1630). Each of these projects contributed to the development of science and mathematics. Briggs gave science one of its most important computing tools, the logarithm table. Galileo and Kepler laid the foundation for calculus. However, none of these projects is an example of organized computation, as we define it. None of these scientists employed a staff of computers. Instead, they did the computations themselves with the occasional assistance of a student or friend.

The story of organized scientific computation shares three themes with the history of labor and the history of factories: the division of labor, the idea of mass production, and the development of professional managers. All of these themes emerge in the first organized computing groups of the eighteenth century and reappear in new forms as the story develops. All three were identified by Charles Babbage in the 1820s, when he was considering problems of computation. These themes are tightly intertwined, as mass production clearly depends upon the division of labor, and the appearance of skilled managers can be seen as a specific example of divided and specialized labor. However, this book separates these ideas and treats them individually in an attempt to clarify and illuminate the different forces that shaped computation.

The first third of this book, which deals with computation from the start of the eighteenth century up to 1880, treats the first theme, the division of labor. During this period, astronomy was the dominant field of scientific research and the discipline that required the greatest amount of calculation. Some of this calculation was done in observatories for astronomers, but most of it was done in practical settings by individuals who used astronomy in their work, most notably navigators and surveyors. It was a period when the borders of scientific practice were not well defined and many a scientist moved easily through the learned disciplines, scanning the sky one night, navigating a ship the next, and perhaps, on the night following, designing a fortification or preparing an insurance table. The great exponent of divided labor, the Scottish philosopher Adam Smith (1723–1790), wrote The Wealth of Nations during this period. Smith discussed the nature of divided labor in scientific work and even commented briefly on the nature of astronomy. The astronomers of the age were familiar with Smith's ideas and cited them as the inspiration for their computing staffs.

The second third of the book covers the period from 1880 to 1930, a time when astronomy was still an important force behind scientific computation but was no longer the only discipline that required large-scale calculations. In particular, electrical engineers and ordnance engineers started building staffs to deal with the demands of computation. The major change during this period came from the mass-produced adding and calculating machines. Such machines have histories that can be traced back to the seventeenth century, but they were not commonly found in computing offices until the start of the twentieth century. While these machines decreased the amount of time required for astronomical calculations, they had a greater impact in the fields of economics and social statistics. They allowed scientists to summarize large amounts of data and to develop mathematical means for analyzing large populations. With the calculating machines came other ideas that we associate with mass production, such as standardized methods, generalized techniques, and tighter managerial control.

The final third of the book discusses computation during the Great Depression, the Second World War, and the early years of the Cold War. It was at this time that human computers attempted to establish their work as an independent discipline, distinct from the different fields of scientific research and even from mathematics itself. This activity required human computers to create a literature of computation, define formal ways of training new computers, and create institutions that could support their work. Historians discuss such subjects under the topic of “professionalization,” a term that suggests independence, societal respect, and control of one's activities. In the case of the human computer, professionalization produced no independence, little respect, and nothing that could be characterized as self-governance. Professionalization came just as human computers were being replaced by computing machines that were built with tubes, powered by electricity, and controlled by a program.

The story of the human computer is connected to the development of the modern electronic computer, but it does not provide the direct antecedent of the machines that were built for scientific and business calculation in the last half of the twentieth century. To be sure, the two stories twist about each other, touching at regular points and sharing ideas with each contact. The developers of electronic computers often borrowed the mathematical techniques of hand calculation and, from time to time, asked human computers to check some number that had been produced by their machines; however, few human computers contributed to the invention of electronic computing equipment, and few computing offices were connected to machine development projects. It is best to view the human computing organizations as the backdrop against which the story of electronic computers unfolds. Human computers plugged away at their calculations with little influence over those engaged in machine design. Most computers were intrigued with the electronic computing machines and looked forward to using these devices, but they would prove to be the secondary characters in the narrative, the Rosencrantz and Guildenstern instead of the Hamlet and Ophelia. The human computers occupied a small corner of the stage, somewhat unsure of their role, as engineers developed electronic replacements for the computing laboratories and their large staffs of workers.

This book attempts to invert the history of scientific computing by narrating the stories of those who actually did the calculations. These stories are often difficult to tell, as the vast majority of computers left no record of their lives beyond a single footnote to a scholarly article or an acknowledgment in the bottom margin of a mathematical table. Furthermore, the few surviving human computers often failed to appreciate the full scope of what they did. As often as not, they would deflect inquiries with remarks like “It was nothing” or “You should have asked my supervisor about that.” The stories unfolded in unusual ways from unlikely sources. There was a bound volume of correspondence in the Library of Congress, the cassette tape that had been carefully guarded by a family, a scrapbook that had been long filed away, the box of records with the confusing label on the shelves of the National Archives, the correspondence from an obscure university official, and the four-hour telephone conversation with a man on a hospital bed. Each of these stories illustrated a different aspect of the human computer, but each, in its own way, returned to the statement of a grandmother, “You know, I took calculus in college.”

PART I

Astronomy and the Division of Labor 1682–1880

If your wish is to become really a man of science and not merely a petty experimentalist, I should advise you to apply to every branch of natural philosophy, including mathematics.

Mary Shelley, Frankenstein (1818)

CHAPTER ONE

The First Anticipated Return: Halley's Comet 1758

When they come to model Heaven
And calculate the stars, how they will wield
The mighty frame …

John Milton, Paradise Lost (1667)

Our story will begin with a comet, a new method of mathematics, and a seemingly intractable problem. The comet is the one that appeared over Europe in August 1682, the comet that has since been named for the English astronomer Edmund Halley (1656–1742). This comet emerged in the late summer sky and, according to observers at Cambridge University, hung like a beacon with a long, shimmering tail above the chapel of King's College. To that age, comets were mysterious visitors, phenomena that appeared at irregular intervals with no obvious explanation. Their origins, substance, and purpose were matters of pure speculation. Some thought that they were wayward stars. Others suggested that they might originate in the atmosphere, each a burning piece of Helios's chariot, perhaps, that had been caught between the earth and the moon.
