Darwin Among the Machines
George B. Dyson

Ampère, an early advocate of the electromagnetic telegraph and mathematical pioneer of both game theory and electrodynamics, thereby anticipated the Cybernetics of Norbert Wiener, who, another century later, reinvented both Ampère's terminology and Hobbes's philosophy in their current, electronic form. “Although the term cybernetics does not date further back than the summer of 1947,” wrote Wiener in 1948, “we shall find it convenient to use in referring to earlier epochs of the development of the field.”[22] Wiener, who was involved in the development of radar-guided anti-aircraft fire control, which marked the beginning of rudimentary perception by electronic machines, was unaware until after the publication of Cybernetics of the coincidence in choosing a name coined by the same Ampère we now honor in measuring the flow of electrons through a circuit. In 1820, by demonstrating that electric currents are able to convey both power and information, Ampère had laid the foundations for Wiener's cybernetic principles of feedback, adaptation, and control.

We live in an age of embodied logic whose beginnings go back to Thomas Hobbes as surely as it remains our destiny to see new Leviathans unfold. Hobbes established that logic and digital computation share common foundations, suggesting a basis in common with mind. “Per ratiocinationem autem intelligo computationem,” declared Hobbes in 1655, or, “by ratiocination, I mean computation. Now to compute, is either to collect the sum of many things that are added together, or to know what remains when one thing is taken out of another. Ratiocination, therefore is the same with Addition or Substraction; and if any man adde Multiplication and Division, I will not be against it, seeing Multiplication is nothing but Addition of equals one to another, and Division nothing but a Substraction of equals one from another, as often as is possible. So that all Ratiocination is comprehended in these two operations of the minde, Addition and Substraction.”[23]
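
Read literally, Hobbes's reduction lends itself to a few lines of modern code. The sketch below is a minimal illustration of my own, not anything from Dyson's text: for non-negative integers, multiplication unwinds into nothing but repeated addition, and division into repeated subtraction, the two operations Hobbes allowed the mind.

    def multiply(a, b):
        """Multiplication as nothing but addition of equals, one to another."""
        total = 0
        for _ in range(b):
            total = total + a
        return total

    def divide(a, b):
        """Division as subtraction of equals, as often as is possible."""
        quotient = 0
        while a >= b:
            a = a - b          # take one thing out of another
            quotient += 1
        return quotient, a     # quotient and what remains

    # Both assume non-negative integers (and b > 0 for divide).
    assert multiply(6, 7) == 42
    assert divide(42, 6) == (7, 0)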

This statement launched an argument far from settled after 340 years: If reasoning can be reduced to arithmetic, which, even in Hobbes's time, could be performed by mechanism, then is mechanism capable of reasoning? Can machines think? (Or, as Marvin Minsky put it, “Why do people think computers can't?”)[24]
Hobbes, the patriarch of artificial intelligence, was succeeded in this line of questioning by the young German lawyer and mathematician Gottfried Wilhelm von Leibniz (1646–1716), who made the first attempt at a system of symbolic logic and the first suggestion of a binary computing machine. The holy grail of capturing intelligence within a formal, mechanical system, however, slipped through Leibniz's grasp.

Or did it? The binary arithmetic and logical calculus of Leibniz and Hobbes's vague notions of reason as a mathematical function are now executed millions of times per second by thumbnail-size machines. Our formalization of logic is embedded microscopically in these devices, and by every available means of digital communication, from fiber optics to circulating floppy disks, the kingdom of the microprocessor is building a collective body of results. Philosophers and mathematicians have made limited progress at deconstructing the firmament of mind from the top down, while a grand, bottom-up experiment at building intelligence from elemental bits of addition and subtraction has been advancing by leaps and bounds. The results have more in common with the diffuse intelligence of Hobbes's Leviathan than with the localized artificial intelligence, or AI, that has now been promised for fifty years.

Is intelligence a formal (or mathematically definable) system? Is life a recursive (or mechanically calculable) function? What happens when you replicate discrete-state microprocessors by the billions and run these questions the other way? (Are formal systems intelligent? Are recursive functions alive?) Life and intelligence have learned to operate on any number of different scales: larger, smaller, slower, and faster than our own. Biology and technology evidence parallel tendencies toward collective, hierarchical processes based on information exchange. As information is distributed, it tends to be represented (encoded) by increasingly economical (meaningful) forms. This evolutionary process, whereby the most economical or meaningful representation wins, leads to a hierarchy of languages, encoding meaning on levels that transcend comprehension by the system's individual components—whether genes, insects, microprocessors, or human minds.

Binary arithmetic is a language held in common by switches of all kinds. The global population of integrated circuits—monolithic networks of microscopic switches that take only billionths of a second to switch between off and on—is growing by more than 100 million units per day.[25] Production of silicon wafer, approximately 2.5 billion square inches for the year 1994, is expected to double by the year 2000—enough raw material, to use an existing benchmark, for 30 billion Pentium microprocessors, of 3.3 million transistors each.[26] Intel's Pentium microprocessors are now manufactured, tested, and packaged at a cost of less than forty dollars each, while 350,000-transistor 486SXL embedded microprocessors cost less than eight dollars to manufacture and sell in quantity for about fifteen dollars each.[27] Microcontrollers—specialized microprocessors embedded in all kinds of things—were produced at a rate of more than 8 million units per day in 1996.[28] Over 200,000 non-embedded 32-bit microprocessors per day were shipped in 1995, and worldwide sales of personal computers exceeded 70 million units for the year. But the distinction between microprocessors and microcontrollers is increasingly obscure. Embedded devices are being integrated into the computational landscape, while computers are reaching beyond the desktop to become more deeply embedded in the control of all aspects of our world.
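
The sense in which binary arithmetic is a language “held in common by switches of all kinds” can be made concrete with a small sketch (illustrative only; the function names are hypothetical): a full adder built from nothing but the AND, OR, and XOR operations an on/off switching network can realize, rippled across the bits of two numbers.

    def full_adder(a, b, carry_in):
        """Add three bits using only the AND, OR, and XOR of a switching network."""
        total = a ^ b ^ carry_in                    # sum bit
        carry_out = (a & b) | (carry_in & (a ^ b))  # carry bit
        return total, carry_out

    def add_binary(x, y, width=8):
        """Ripple-carry addition: one full adder per bit position."""
        result, carry = 0, 0
        for i in range(width):
            bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
            result |= bit << i
        return result

    assert add_binary(45, 27) == 72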

This digital metabolism is held together by telecommunications, spanning distance, and by memory, spanning time. Annual production of dynamic random-access memory (DRAM) now exceeds 25 billion megabits, and the manufacturing cost of 16-megabit memory circuits dropped below $10.00, or $0.62 per megabit, in 1996.[29] More than 100 million hard disk drives—averaging 500 megabytes each—were shipped in 1996. The market for electronic connectors now exceeds 20 billion dollars a year. Long-distance transmission of data has exceeded transmission of voice since 1995, with current telecommunications standards allowing the multiplexing of as many as 64,000 voice-equivalent channels over a single fiber optic pair.

Physicist Donald Keck, who wrote “Eureka!” in his Corning laboratory notebook after testing the first 200 meters of low-loss optical fiber in August 1970, estimated the worldwide installed base of optical fiber at more than 100 million kilometers at the end of 1996.[30] Eight million kilometers of telecommunications fiber were deployed in 1996 in the United States alone.[31] Much of this is “dark fiber” that awaits the growth of high-speed switching elsewhere in the global telecommunications network before it can be used. “The AT&T network is the world's largest computer,” according to Alex Mandl of AT&T. “It is the largest distributed intelligence in the world—perhaps the universe,” he claimed in 1995 (assuming that extraterrestrial civilizations have broken up their telecommunications industries into pieces smaller than AT&T).[32]

The emergence of life and intelligence from less-alive and less-intelligent components has happened at least once. Emergent behavior is that which cannot be predicted through analysis at any level simpler than that of the system as a whole. Explanations of emergence, like simplifications of complexity, are inherently illusory and can only be achieved by sleight of hand. This does not mean that emergence is not real. Emergent behavior, by definition, is what's left after everything else has been explained.

“Emergence offers a way to believe in physical causality while simultaneously maintaining the impossibility of a reductionist explanation of thought,” wrote W. Daniel Hillis, a computer architect who believes that architecture and programming can only go so far, after which intelligence has to be allowed to evolve on its own. “For those who fear mechanistic explanations of the human mind, our ignorance of how local interactions produce emergent behavior offers a reassuring fog in which to hide the soul.”[33]
Although individual computers and individual computer programs are developing the elements of artificial intelligence, it is in the larger networks (or the network at large) that we are developing a more likely medium for the emergence of the Leviathan of artificial mind.

Sixty years ago, English logician Alan Turing constructed a theory of computable numbers by means of an imaginary discrete-state automaton, reading and writing distinguishable but otherwise intrinsically meaningless symbols on an unbounded length of tape. In Turing's universe there are only two objects in existence: Turing machine and tape. Turing's thought experiment was as close to Leibniz's dream of an elemental and universal language as mind, mechanism, or mathematics has been able to get so far. With the arrival of World War II, statistical analysis and the decoding of computable functions became a matter of life and death. Theory became hardware overnight. Turing and his wartime colleagues working for Allied intelligence at Bletchley Park found themselves coercing obstinate lengths of punched paper tape, at speeds of up to thirty miles per hour, through an optical mask linked by an array of photoelectric cells to the logical circuitry of a primitive computer named Colossus. Some fifteen hundred vacuum tubes, configured for parallel Boolean arithmetic, cycled through five thousand states per second, seeking to recognize a meaningful pattern in scrambled strings of code. The age of electronic digital computers was launched, secretively, as ten Colossi were brought on line by the time the war came to an end.
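
Turing's imaginary automaton itself takes only a few lines to sketch. What follows is a minimal illustration, not a reconstruction of Colossus or of Turing's own examples: a finite transition table reads and writes otherwise meaningless symbols on an unbounded tape (modeled here as a dictionary) and halts when no rule applies.

    from collections import defaultdict

    BLANK = '_'

    def run_turing_machine(program, tape_input, start_state, steps=10000):
        """Run a table mapping (state, symbol) -> (write, move, next_state).
        move is -1 (left) or +1 (right); the machine halts when no rule applies."""
        tape = defaultdict(lambda: BLANK, enumerate(tape_input))
        state, head = start_state, 0
        for _ in range(steps):
            key = (state, tape[head])
            if key not in program:      # no rule for this state and symbol: halt
                break
            write, move, state = program[key]
            tape[head] = write
            head += move
        cells = [i for i in sorted(tape) if tape[i] != BLANK]
        return ''.join(tape[i] for i in cells)

    # A two-rule program that flips every bit and halts at the first blank cell.
    flip_bits = {
        ('scan', '0'): ('1', +1, 'scan'),
        ('scan', '1'): ('0', +1, 'scan'),
    }

    assert run_turing_machine(flip_bits, '10110', 'scan') == '01001'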

It has been nothing but Turing machines, in one form or another, ever since. Ours is the age of computable numbers, from the pocket calculator to Mozart on compact disc to the $89.95 operating system containing eleven million lines of code. We inhabit a computational labyrinth infested by billions of Turing machines, each shuffling through millions of internal states per second and set loose, without coordinated instructions, to read and write mutually intelligible strings of symbols on a communally unbounded, self-referential, and infinitely convoluted supply of tape.

Although our attention has been focused on the growth of computer networks as a medium for communication among human beings, beneath the surface lies a far more extensive growth in communication among machines. Everything that human beings are doing to make it easier to operate computer networks is at the same time, but for different reasons, making it easier for computer networks to operate human beings. Symbiosis operates by way of positive rewards. The benefits of telecommunication are so attractive that we are eager to share our world with these machines.

We are, after all, social creatures, formed by our nature into social units, as we ourselves are formed from societies of individual cells. Even H. G. Wells, who warned of a dark future as he approached the close of his life, held out hope for humanity through the globalization of human knowledge, described in his 1938 book World Brain: “In a universal organization and clarification of knowledge and ideas . . . in the evocation, that is, of what I have here called a World Brain . . . a World Brain which will replace our multitude of uncoordinated ganglia . . . in that and in that alone, it is maintained, is there any clear hope of a really Competent Receiver for world affairs. . . . We do not want dictators, we do not want oligarchic parties or class rule, we want a widespread world intelligence conscious of itself.”[34]
As we develop digital models of all things great and small, our models are faced with the puzzle of modeling themselves. As far as we know, this is how consciousness evolves.

Wells acknowledged memory not as an accessory to intelligence, but as the substance from which intelligence is formed. “The whole human memory can be, and probably in a short time will be, made accessible to every individual. . . . This new all-human cerebrum . . . need not be concentrated in any one single place. It need not be vulnerable as a human head or a human heart is vulnerable. It can be reproduced exactly and fully, in Peru, China, Iceland, Central Africa, or wherever else seems to afford an insurance against danger and interruption. It can have at once, the concentration of a craniate animal and the diffused vitality of an amoeba.”[35] Writing from a perspective about midway, technologically, between the diffuse, largely unmechanized nature of Hobbes's Leviathan and the diffuse, highly mechanized information-processing structures of today, Wells held out the hope that this collective intelligence might improve on some of the collective stupidity exhibited by human beings so far. Let us hope that Wells was right.

Not everyone agrees that our great network of networks represents an emerging intelligence, or that it would be in our best interest if it did. Our intuitive association of intelligence with computational complexity has no precedent by which to grasp the combinatorial scale of the computer networks developing today. “Since the complexity is an exponential function of this kind of combinatorics, there is really a gigantic gap between computers and flatworms or any other simple kind of organism,” warned Philip Morrison, considering the prospects for artificial intelligence in 1974. “Computer experts have a long, long way to go. If they work hard, their machines might approach the intelligence of a human. But the human species is not one person, it is 10^10 of them, and that is entirely a different thing. When they tell you about 10^10 computers, then you can start to worry.”[36]
