You Are Not a Gadget: A Manifesto

I claim that there is one measurable difference between a zombie and a person: a zombie has a different philosophy. Therefore, zombies can only be detected if they happen to be professional philosophers. A philosopher like Daniel Dennett is obviously a zombie.

Zombies and the rest of us do not have a symmetrical relationship. Unfortunately, it is only possible for nonzombies to observe the telltale sign of zombiehood. To zombies, everyone looks the same.

If there are enough zombies recruited into our world, I worry about the potential for a self-fulfilling prophecy. Maybe if people pretend they are not conscious or do not have free will—or that the cloud of online people is a person; if they pretend there is nothing special about the perspective of the individual—then perhaps we have the power to make it so. We might be able to collectively achieve antimagic.

Humans are free. We can commit suicide for the benefit of a Singularity. We can engineer our genes to better support an imaginary hive mind. We can make culture and journalism into second-rate activities and spend centuries remixing the detritus of the 1960s and other eras from before individual creativity went out of fashion.

Or we can believe in ourselves. By chance, it might turn out we are real.

* Chris Anderson, “The End of Theory,” Wired, June 23, 2008 (www.wired.com/science/discoveries/magazine/16-07/pb_theory).

* One extension of the tragedy of Turing’s death is that he didn’t live long enough to articulate all that he probably would have about his own point of view on the Turing test.

Historian George Dyson suggests that Turing might have sided against the cybernetic totalists. For instance, here is an excerpt from a paper Turing wrote in 1939, titled “Systems of Logic Based on Ordinals”: “We have been trying to see how far it is possible to eliminate intuition, and leave only ingenuity. We do not mind how much ingenuity is required, and therefore assume it to be available in unlimited supply.” The implication seems to be that we are wrong to imagine that ingenuity can be infinite, even with computing clouds, so therefore intuition will never be made obsolete.

Turing’s 1950 paper on the test includes this extraordinary passage: “In attempting to construct such machines we should not be irreverently usurping His power of creating souls, any more than we are in the procreation of children: rather we are, in either case, instruments of His will providing mansions for the souls that He creates.”

CHAPTER 3
The Noosphere Is Just Another Name for Everyone’s Inner Troll

SOME OF THE fantasy objects arising from cybernetic totalism (like the noosphere, which is a supposed global brain formed by the sum of all the human brains connected through the internet) happen to motivate infelicitous technological designs. For instance, designs that celebrate the noosphere tend to energize the inner troll, or bad actor, within humans.

The Moral Imperative to Create the Blandest Possible Bible

According to a new creed, we technologists are turning ourselves, the planet, our species, everything, into computer peripherals attached to the great computing clouds. The news is no longer about us but about the big new computational object that is greater than us.

The colleagues I disagree with often conceive our discussions as being a contest between a Luddite (who, me?) and the future. But there is more than one possible technological future, and the debate should be about how to best identify and act on whatever freedoms of choice we still have, not about who’s the Luddite.

Some people say that doubters of the one true path, like myself, are like the shriveled medieval church officials who fought against poor Johannes Gutenberg’s press. We are accused of fearing change, just as the medieval Church feared the printing press. (We might also be told that we are the sort who would have repressed Galileo or Darwin.)

What these critics forget is that printing presses in themselves provide no guarantee of an enlightened outcome. People, not machines, made the Renaissance. The printing that takes place in North Korea today, for instance, is nothing more than propaganda for a personality cult. What is important about printing presses is not the mechanism, but the authors.

An impenetrable tone deafness rules Silicon Valley when it comes to the idea of authorship. This was as clear as ever when John Updike and Kevin Kelly exchanged words on the question of authorship in 2006. Kevin suggested that it was not just a good thing, but a “moral imperative” that all the world’s books would soon become effectively “one book” once they were scanned, searchable, and remixable in the universal computational cloud.

Updike used the metaphor of the edges of the physical paper in a physical book to communicate the importance of enshrining the edges between individual authors. It was no use. Doctrinaire web 2.0 enthusiasts only perceived that Updike was being sentimental about an ancient technology.

The approach to digital culture I abhor would indeed turn all the world’s books into one book, just as Kevin suggested. It might start to happen in the next decade or so. Google and other companies are scanning library books into the cloud in a massive Manhattan Project of cultural digitization. What happens next is what’s important. If the books in the cloud are accessed via user interfaces that encourage mashups of fragments that obscure the context and authorship of each fragment, there will be only one book. This is what happens today with a lot of content; often you don’t know where a quoted fragment from a news story came from, who wrote a comment, or who shot a video. A continuation of the present trend will make us like various medieval religious empires, or like North Korea, a society with a single book.
*

The ethereal, digital replacement technology for the printing press happens to have come of age in a time when the unfortunate ideology I’m criticizing dominates technological culture. Authorship—the very idea of the individual point of view—is not a priority of the new ideology.

The digital flattening of expression into a global mush is not presently enforced from the top down, as it is in the case of a North Korean printing press. Instead, the design of software builds the ideology into those actions that are the easiest to perform on the software designs that are becoming ubiquitous. It is true that by using these tools, individuals can author books or blogs or whatever, but people are encouraged by the economics of free content, crowd dynamics, and lord aggregators to serve up fragments instead of considered whole expressions or arguments. The efforts of authors are appreciated in a manner that erases the boundaries between them.

The one collective book will absolutely not be the same thing as the library of books by individuals it is bankrupting. Some believe it will be better; others, including me, believe it will be disastrously worse. As the famous line goes from Inherit the Wind: “The Bible is a book … but it is not the only book.” Any singular, exclusive book, even the collective one accumulating in the cloud, will become a cruel book if it is the only one available.

Nerd Reductionism

One of the first printed books that wasn’t a bible was 1499’s Hypnerotomachia Poliphili, or “Poliphili’s Strife of Love in a Dream,” an illustrated, erotic, occult adventure through fantastic architectural settings. What is most interesting about this book, which looks and reads like a virtual reality fantasy, is that something fundamental about its approach to life—its intelligence, its worldview—is alien to the Church and the Bible.

It’s easy to imagine an alternate history in which everything that was printed on early presses went through the Church and was conceived as an extension of the Bible. “Strife of Love” might have existed in this alternate world, and might have been quite similar. But the “slight” modifications would have consisted of trimming the alien bits. The book would no longer have been as strange. And that tiny shift, even if it had been minuscule in terms of word count, would have been tragic.

This is what happened when elements of indigenous cultures were preserved but de-alienated by missionaries. We know a little about what Aztec or Inca music sounded like, for instance, but the bits that were trimmed to make the music fit into the European idea of church song were the most precious bits. The alien bits are where the flavor is found. They are the portals to strange philosophies. What a loss to not know how New World music would have sounded alien to us! Some melodies and rhythms survived, but the whole is lost.

Something like missionary reductionism has happened to the internet with the rise of web 2.0. The strangeness is being leached away by the mush-making process. Individual web pages as they first appeared in the early 1990s had the flavor of personhood. MySpace preserved some of that flavor, though a process of regularized formatting had begun. Facebook went further, organizing people into multiple-choice identities, while Wikipedia seeks to erase point of view entirely.

If a church or government were doing these things, it would feel authoritarian, but when technologists are the culprits, we seem hip, fresh, and inventive. People will accept ideas presented in technological form that would be abhorrent in any other form. It is utterly strange to hear my many old friends in the world of digital culture claim to be the true sons of the Renaissance without realizing that using computers to reduce individual expression is a primitive, retrograde activity, no matter how sophisticated your tools are.

Rejection of the Idea of Quality Results in a Loss of Quality

The fragments of human effort that have flooded the internet are perceived by some to form a hive mind, or noosphere. These are some of the terms used to describe what is thought to be a new superintelligence that is emerging on a global basis on the net. Some people, like Larry Page, one of the Google founders, expect the internet to come alive at some point, while others, like science historian George Dyson, think that might already have happened. Popular derivative terms like “blogosphere” have become commonplace.

A fashionable idea in technical circles is that quantity not only turns into quality at some extreme of scale, but also does so according to principles we already understand. Some of my colleagues think a million, or perhaps a billion, fragmentary insults will eventually yield wisdom that surpasses that of any well-thought-out essay, so long as sophisticated secret statistical algorithms recombine the fragments. I disagree. A trope from the early days of computer science comes to mind: garbage in, garbage out.

There are so many examples of disdain for the idea of quality within the culture of web 2.0 enthusiasts that it’s hard to choose an example. I’ll choose hive enthusiast Clay Shirky’s idea that there is a vast cognitive surplus waiting to be harnessed.

Certainly there is broad agreement that there are huge numbers of people who are undereducated. Of those who are well educated, many are underemployed. If we want to talk about unmet human potential, we might also mention the huge number of people who are desperately poor. The waste of human potential is overwhelming. But these are not the problems that Shirky is talking about.

What he means is that quantity can overwhelm quality in human expression. Here’s a quote, from a speech Shirky gave in April 2008:

And this is the other thing about the size of the cognitive surplus we’re talking about. It’s so large that even a small change could have huge ramifications. Let’s say that everything stays 99 percent the same, that people watch 99 percent as much television as they used to, but 1 percent of that is carved out for producing and for sharing. The Internet-connected population watches roughly a trillion hours of TV a year … One percent of that is 98 Wikipedia projects per year worth of participation.
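Shirky’s arithmetic can be checked directly. A minimal sketch, assuming (as his talk did) roughly a trillion hours of annual TV viewing and roughly 100 million cumulative person-hours behind Wikipedia — the Wikipedia figure is an estimate, not a measured quantity:

```python
# Check the arithmetic behind the "cognitive surplus" quote above.
# Assumptions (estimates from Shirky's 2008 talk, not exact figures):
#   - ~1 trillion hours of TV watched per year by the connected population
#   - ~100 million cumulative person-hours invested in Wikipedia

tv_hours_per_year = 1e12       # total annual TV viewing
carved_out_fraction = 0.01     # the hypothetical 1 percent
wikipedia_hours = 1e8          # estimated total effort behind Wikipedia

surplus_hours = tv_hours_per_year * carved_out_fraction
wikipedia_equivalents = surplus_hours / wikipedia_hours

print(wikipedia_equivalents)  # → 100.0, in line with Shirky's figure of 98
```

The point of the passage, of course, is that this multiplication is exactly what the author disputes: hours are a measure of quantity, not quality.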

So how many seconds of salvaged erstwhile television time would need to be harnessed to replicate the achievements of, say, Albert Einstein? It seems to me that even if we could network all the potential aliens in the galaxy—quadrillions of them, perhaps—and get each of them to contribute some seconds to a physics wiki, we would not replicate the achievements of even one mediocre physicist, much less a great one.

Absent Intellectual Modesty

There are at least two ways to believe in the idea of quality. You can believe there’s something ineffable going on within the human mind, or you can believe we just don’t understand what quality in a mind is yet, even though we might someday. Either of those opinions allows one to distinguish quantity and quality. In order to confuse quantity and quality, you have to reject both possibilities.

The mere possibility of there being something ineffable about personhood is what drives many technologists to reject the notion of quality. They want to live in an airtight reality that resembles an idealized computer program, in which everything is understood and there are no fundamental mysteries. They recoil from even the hint of a potential zone of mystery or an unresolved seam in one’s worldview.

This desire for absolute order usually leads to tears in human affairs, so there is a historical reason to distrust it. Materialist extremists have long seemed determined to win a race with religious fanatics: Who can do the most damage to the most people?

At any rate, there is no evidence that quantity becomes quality in matters of human expression or achievement. What matters instead, I believe, is a sense of focus, a mind in effective concentration, and an adventurous individual imagination that is distinct from the crowd.

Of course, I can’t describe what it is that a mind does, because no one can. We don’t understand how brains work. We understand a lot about how parts of brains work, but there are fundamental questions that have not even been fully articulated yet, much less answered.

For instance, how does reason work? How does meaning work? The usual ideas currently in play are variations on the notion that pseudo-Darwinian selection goes on within the brain. The brain tries out different thought patterns, and the ones that work best are reinforced. That’s awfully vague. But there’s no reason that Darwinian evolution could not have given rise to processes within the human brain that jumped out of the Darwinian progression. While the physical brain is a product of evolution as we are coming to understand it, the cultural brain might be a way of transforming the evolved brain according to principles that cannot be explained in evolutionary terms.
