The Singularity Is Near: When Humans Transcend Biology

Author: Ray Kurzweil

35.
Sherry Turkle, ed., “Evocative Objects: Things We Think With,” forthcoming.

36.
See the “Exponential Growth of Computing” figure in chapter 2 (p. 70). Projecting the double exponential growth of the price-performance of computation to the end of the twenty-first century, one thousand dollars’ worth of computation will provide 10^60 calculations per second (cps). As we will discuss in chapter 2, three different analyses of the amount of computing required to functionally emulate the human brain result in an estimate of 10^15 cps. A more conservative estimate, which assumes that it will be necessary to simulate all of the nonlinearities in every synapse and dendrite, results in an estimate of 10^19 cps for neuromorphic emulation of the human brain. Even taking the more conservative figure, we get a figure of 10^29 cps for the approximately 10^10 humans. Thus, the 10^60 cps that can be purchased for one thousand dollars circa 2099 will represent 10^31 (ten million trillion trillion) human civilizations.
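The arithmetic in this note is easy to verify. The short Python sketch below simply combines the exponents quoted above (all figures are the note’s own estimates, not new data):

```python
# Figures quoted in note 36 (estimates, not measurements).
brain_cps = 10**19        # conservative neuromorphic estimate per brain
population = 10**10       # approximately ten billion humans
civilization_cps = brain_cps * population          # 10**29 cps
budget_cps_2099 = 10**60  # projected $1,000 of computation, circa 2099

# How many human civilizations' worth of computation is that?
equivalent_civilizations = budget_cps_2099 // civilization_cps
print(equivalent_civilizations == 10**31)  # True
```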

37.
The invention of the power loom and the other textile automation machines of the late eighteenth century destroyed the livelihoods of the cottage industry of English weavers, who had passed down stable family businesses for hundreds of years. Economic power passed from the weaving families to the owners of the machines. As legend has it, a young and feebleminded boy named Ned Ludd broke two textile factory machines out of sheer clumsiness. From that point on, whenever factory equipment was found to have mysteriously been damaged, anyone suspected of foul play would say, “But Ned Ludd did it.” In 1812 the desperate weavers formed a secret society, an urban guerrilla army. They made threats and demands of factory owners, many of whom complied. When asked who their leader was, they replied, “Why, General Ned Ludd, of course.” Although the Luddites, as they became known, initially directed most of their violence against the machines, a series of bloody engagements erupted later that year. The tolerance of the Tory government for the Luddites ended, and the movement dissolved with the imprisonment and hanging of prominent members. Although they failed to create a sustained and viable movement, the Luddites have remained a powerful symbol of opposition to automation and technology.

38.
See note 34 above.

Chapter Two: A Theory of Technology Evolution:
The Law of Accelerating Returns

 

1.
John Smart, abstract to “Understanding Evolutionary Development: A Challenge for Futurists,” presentation to the World Future Society annual meeting, Washington, D.C., August 3, 2004.

2.
That epochal events in evolution represent increases in complexity is Theodore Modis’s view. See Theodore Modis, “Forecasting the Growth of Complexity and Change,” Technological Forecasting and Social Change 69.4 (2002), http://ourworld.compuserve.com/homepages/tmodis/TedWEB.htm.

3.
Compressing files is a key aspect of both data transmission (such as a music or text file over the Internet) and data storage. The smaller the file is, the less time it will take to transmit and the less space it will require. The mathematician Claude Shannon, often called the father of information theory, defined the basic theory of data compression in his paper “A Mathematical Theory of Communication,” The Bell System Technical Journal 27 (July–October 1948): 379–423, 623–56. Data compression is possible because of factors such as redundancy (repetition) and probability of appearance of character combinations in data. For example, silence in an audio file could be replaced by a value that indicates the duration of the silence, and letter combinations in a text file could be replaced with coded identifiers in the compressed file.

Redundancy can be removed by lossless compression, as Shannon explained, which means there is no loss of information. There is a limit to lossless compression, defined by what Shannon called the entropy rate (compression increases the “entropy” of the data, which is the amount of actual information in it as opposed to predetermined and thus predictable data structures). Data compression removes redundancy from data; lossless compression does it without losing data (meaning that the exact original data can be restored). Alternatively, lossy compression, which is used for graphics files or streaming video and audio files, does result in information loss, though that loss is often imperceptible to our senses.
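Shannon’s limit on lossless compression can be illustrated in a few lines of Python. This is an illustrative sketch (the function name is ours), estimating bits per symbol under a simple independent-symbol model of the data:

```python
import math
from collections import Counter

def shannon_entropy(data: str) -> float:
    """Bits per symbol, treating each character as an independent draw."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Redundant data has low entropy and compresses well; varied data does not.
print(shannon_entropy("aaaaaaab"))  # about 0.54 bits per symbol
print(shannon_entropy("abcdefgh"))  # 3.0 bits per symbol
```

No lossless scheme can, on average, squeeze such data below its entropy rate; real compressors get close by modeling longer-range structure than this single-character model captures.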

Most data-compression techniques use a code, which is a mapping of the basic units (or symbols) in the source to a code alphabet. For example, all the spaces in a text file could be replaced by a single code word and the number of spaces. A compression algorithm is used to set up the mapping and then create a new file using the code alphabet; the compressed file will be smaller than the original and thus easier to transmit or store. Here are some of the categories into which common lossless-compression techniques fall:

 
  • Run-length compression, which replaces repeating characters with a code and a value representing the number of repetitions of that character (examples: Pack-Bits and PCX).
  • Minimum redundancy coding or simple entropy coding, which assigns codes on the basis of probability, with the most frequent symbols receiving the shortest codes (examples: Huffman coding and arithmetic coding).
  • Dictionary coders, which use a dynamically updated symbol dictionary to represent patterns (examples: Lempel-Ziv, Lempel-Ziv-Welch, and DEFLATE).
  • Block-sorting compression, which reorganizes characters rather than using a code alphabet; run-length compression can then be used to compress the repeating strings (example: Burrows-Wheeler transform).
  • Prediction by partial matching (PPM), which uses a context of preceding symbols in the uncompressed file to estimate the probability of the next symbol.
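As an illustration of the first category above, here is a minimal run-length codec in Python (the function names are ours; real formats such as PackBits add framing and escape rules that this sketch omits):

```python
from itertools import groupby

def rle_encode(s: str) -> list[tuple[str, int]]:
    """Replace each run of a repeated character with a (char, count) pair."""
    return [(ch, len(list(run))) for ch, run in groupby(s)]

def rle_decode(pairs: list[tuple[str, int]]) -> str:
    """Expand (char, count) pairs back into the original string."""
    return "".join(ch * n for ch, n in pairs)

encoded = rle_encode("WWWWBBBWW")
print(encoded)                              # [('W', 4), ('B', 3), ('W', 2)]
print(rle_decode(encoded) == "WWWWBBBWW")   # True
```

The decoder recovers the input exactly, which is what makes this a lossless technique; it pays off only when the data actually contains long runs.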

4.
Murray Gell-Mann, “What Is Complexity?” in Complexity, vol. 1 (New York: John Wiley and Sons, 1995).

5.
The human genetic code has approximately six billion (about 10^10) bits, not considering the possibility of compression. So the 10^27 bits that theoretically can be stored in a one-kilogram rock is greater than the genetic code by a factor of 10^17. See note 57 below for a discussion of genome compression.

6.
Of course, a human, who is also composed of an enormous number of particles, contains an amount of information comparable to a rock of similar weight when we consider the properties of all the particles. As with the rock, the bulk of this information is not needed to characterize the state of the person. On the other hand, much more information is needed to characterize a person than a rock.

7.
See note 175 in chapter 5 for an algorithmic description of genetic algorithms.

8.
Humans, chimpanzees, gorillas, and orangutans are all included in the scientific classification of hominids (family Hominidae). The human lineage is thought to have diverged from its great ape relatives five to seven million years ago. The human genus Homo within the Hominidae includes extinct species such as H. erectus as well as modern man (H. sapiens).

In chimpanzee hands, the fingers are much longer and less straight than in humans, and the thumb is shorter, weaker, and not as mobile. Chimps can flail with a stick but tend to lose their grip. They cannot pinch hard because their thumbs do not overlap their index fingers. In the modern human, the thumb is longer, and the fingers rotate toward a central axis, so you can touch all the tips of your fingers to the tip of your thumb, a quality that is called full opposability. These and other changes gave humans two new grips: the precision and power grips. Even prehuman hominids such as the australopithecine from Ethiopia called Lucy, who is thought to have lived around three million years ago, could throw rocks with speed and accuracy. Since then, scientists claim, continual improvements in the hand’s capacity to throw and club, along with associated changes in other parts of the body, have resulted in distinct advantages over other animals of similar size and weight. See Richard Young, “Evolution of the Human Hand: The Role of Throwing and Clubbing,” Journal of Anatomy 202 (2003): 165–74; Frank Wilson, The Hand: How Its Use Shapes the Brain, Language, and Human Culture (New York: Pantheon, 1998).

9.
The Santa Fe Institute has played a pioneering role in developing concepts and technology related to complexity and emergent systems. One of the principal developers of paradigms associated with chaos and complexity is Stuart Kauffman. Kauffman’s At Home in the Universe: The Search for the Laws of Self-Organization and Complexity (Oxford: Oxford University Press, 1995) looks “at the forces for order that lie at the edge of chaos.”

In his book The Evolution of Complexity by Means of Natural Selection (Princeton: Princeton University Press, 1988), John Tyler Bonner asks the questions “How is it that an egg turns into an elaborate adult? How is it that a bacterium, given many millions of years, could have evolved into an elephant?”

John Holland is another leading thinker from the Santa Fe Institute in the emerging field of complexity. His book Hidden Order: How Adaptation Builds Complexity (Reading, Mass.: Addison-Wesley, 1996) includes a series of lectures that he presented at the Santa Fe Institute in 1994. See also John H. Holland, Emergence: From Chaos to Order (Reading, Mass.: Addison-Wesley, 1998) and Mitchell Waldrop, Complexity: The Emerging Science at the Edge of Order and Chaos (New York: Simon & Schuster, 1992).

10.
The second law of thermodynamics explains why there is no such thing as a perfect engine that uses all the heat (energy) produced by burning fuel to do work: some heat will inevitably be lost to the environment. This same principle of nature holds that heat will flow from a hot pan to cold air rather than in reverse. It also posits that closed (“isolated”) systems will spontaneously become more disordered over time—that is, they tend to move from order to disorder. Molecules in ice chips, for example, are limited in their possible arrangements. So a cup of ice chips has less entropy (disorder) than the cup of water the ice chips become when left at room temperature. There are many more possible molecular arrangements in the glass of water than in the ice; greater freedom of movement equals higher entropy. Another way to think of entropy is as multiplicity. The more ways that a state could be achieved, the higher the multiplicity. Thus, for example, a jumbled pile of bricks has a higher multiplicity (and higher entropy) than a neat stack.
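The multiplicity idea can be made concrete with a coin-flip analogue in Python. A macrostate (say, the number of heads among 100 coins) has higher entropy the more microstates realize it; this is an illustrative sketch, not a physical calculation:

```python
from math import comb, log

# Multiplicity W = number of microstates that realize a given macrostate.
# For 100 coins: "all heads" (perfectly ordered) vs. "50 heads" (jumbled).
ordered_W = comb(100, 0)    # exactly 1 arrangement
jumbled_W = comb(100, 50)   # about 1.01e29 arrangements

# Boltzmann entropy S = k * ln(W), shown here in units of k:
print(log(ordered_W))   # 0.0 -- the fully ordered state has zero entropy
print(log(jumbled_W))   # roughly 66.8 -- vastly higher entropy
```

The jumbled macrostate is overwhelmingly more probable simply because so many more arrangements produce it, which is the statistical content of the second law.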

11.
Max More articulates the view that “advancing technologies are combining and cross-fertilizing to accelerate progress even faster.” Max More, “Track 7 Tech Vectors to Take Advantage of Technological Acceleration,” ManyWorlds, August 1, 2003.

12.
For more information, see J. J. Emerson et al., “Extensive Gene Traffic on the Mammalian X Chromosome,” Science 303.5657 (January 23, 2004): 537–40, http://www3.uta.edu/faculty/betran/science2004.pdf; Nicholas Wade, “Y Chromosome Depends on Itself to Survive,” New York Times, June 19, 2003; and Bruce T. Lahn and David C. Page, “Four Evolutionary Strata on the Human X Chromosome,” Science 286.5441 (October 29, 1999): 964–67, http://inside.wi.mit.edu/page/Site/Page%20PDFs/Lahn_and_Page_strata_1999.pdf.

Interestingly, the second X chromosome in girls is turned off in a process called X inactivation so that the genes on only one X chromosome are expressed. Research has shown that the X chromosome from the father is turned off in some cells and the X chromosome from the mother in other cells.

13.
Human Genome Project, “Insights Learned from the Sequence,” http://www.ornl.gov/sci/techresources/Human_Genome/project/journals/insights.html. Even though the human genome has been sequenced, most of it does not code for proteins (the so-called junk DNA), so researchers are still debating how many genes will be identified among the three billion base pairs in human DNA. Current estimates suggest fewer than thirty thousand, though during the Human Genome Project estimates ranged as high as one hundred thousand. See “How Many Genes Are in the Human Genome?” (http://www.ornl.gov/sci/techresources/Human_Genome/faq/genenumber.shtml) and Elizabeth Pennisi, “A Low Number Wins the GeneSweep Pool,” Science 300.5625 (June 6, 2003): 1484.

14.
Niles Eldredge and the late Stephen Jay Gould proposed this theory in 1972 (N. Eldredge and S. J. Gould, “Punctuated Equilibria: An Alternative to Phyletic Gradualism,” in T. J. M. Schopf, ed., Models in Paleobiology [San Francisco: Freeman, Cooper], pp. 82–115). It has sparked heated discussions among paleontologists and evolutionary biologists ever since, though it has gradually gained acceptance. According to this theory, millions of years may pass with species in relative stability. This stasis is then followed by a burst of change, resulting in new species and the extinction of old (called a “turnover pulse” by Elisabeth Vrba). The effect is ecosystemwide, affecting many unrelated species. Eldredge and Gould’s proposed pattern required a new perspective: “For no bias can be more constricting than invisibility—and stasis, inevitably read as absence of evolution, had always been treated as a non-subject. How odd, though, to define the most common of all palaeontological phenomena as beyond interest or notice!” S. J. Gould and N. Eldredge, “Punctuated Equilibrium Comes of Age,” Nature 366 (November 18, 1993): 223–27.
