The Shallows
Nicholas Carr

I was an English major and went to great lengths to avoid math and science classes, but Kiewit occupied a strategic location on campus, midway between my dorm and Fraternity Row, and on weekend evenings I’d often spend an hour or two at a terminal in the public teletype room while waiting for the keg parties to get rolling. Usually, I’d fritter away the time playing one of the goofily primitive multiplayer games that the undergraduate programmers—“sysprogs,” they called themselves—had hacked together. But I did manage to teach myself how to use the system’s cumbersome word-processing program and even learned a few BASIC commands.

That was just a digital dalliance. For every hour I passed in Kiewit, I must have spent two dozen next door in Baker. I crammed for exams in the library’s cavernous reading room, looked up facts in the weighty volumes on the reference shelves, and worked part-time checking books in and out at the circulation desk. Most of my library time, though, went to wandering the long, narrow corridors of the stacks. Despite being surrounded by tens of thousands of books, I don’t remember feeling the anxiety that’s symptomatic of what we today call “information overload.” There was something calming in the reticence of all those books, their willingness to wait years, decades even, for the right reader to come along and pull them from their appointed slots. Take your time, the books whispered to me in their dusty voices. We’re not going anywhere.

It was in 1986, five years after I left Dartmouth, that computers entered my life in earnest. To my wife’s dismay, I spent nearly our entire savings, some $2,000, on one of Apple’s earliest Macintoshes—a Mac Plus decked out with a single megabyte of RAM, a 20-megabyte hard drive, and a tiny black-and-white screen. I still recall the excitement I felt as I unpacked the little beige machine. I set it on my desk, plugged in the keyboard and mouse, and flipped the power switch. It lit up, sounded a welcoming chime, and smiled at me as it went through the mysterious routines that brought it to life. I was smitten.

The Plus did double duty as both a home and a business computer. Every day, I lugged it into the offices of the management consulting firm where I worked as an editor. I used Microsoft Word to revise proposals, reports, and presentations, and sometimes I’d launch Excel to key in revisions to a consultant’s spreadsheet. Every evening, I carted it back home, where I used it to keep track of the family finances, write letters, play games (still goofy, but less primitive), and—most diverting of all—cobble together simple databases using the ingenious HyperCard application that back then came with every Mac. Created by Bill Atkinson, one of Apple’s most inventive programmers, HyperCard incorporated a hypertext system that anticipated the look and feel of the World Wide Web. Where on the Web you click links on pages, on HyperCard you clicked buttons on cards—but the idea, and its seductiveness, was the same.

The computer, I began to sense, was more than just a simple tool that did what you told it to do. It was a machine that, in subtle but unmistakable ways, exerted an influence over you. The more I used it, the more it altered the way I worked. At first I had found it impossible to edit anything on-screen. I’d print out a document, mark it up with a pencil, and type the revisions back into the digital version. Then I’d print it out again and take another pass with the pencil. Sometimes I’d go through the cycle a dozen times a day. But at some point—and abruptly—my editing routine changed. I found I could no longer write or revise anything on paper. I felt lost without the Delete key, the scrollbar, the cut and paste functions, the Undo command. I had to do all my editing on-screen. In using the word processor, I had become something of a word processor myself.

Bigger changes came after I bought a modem, sometime around 1990. Up to then, the Plus had been a self-contained machine, its functions limited to whatever software I installed on its hard drive. When hooked up to other computers through the modem, it took on a new identity and a new role. It was no longer just a high-tech Swiss Army knife. It was a communications medium, a device for finding, organizing, and sharing information. I tried all the online services—CompuServe, Prodigy, even Apple’s short-lived eWorld—but the one I stuck with was America Online. My original AOL subscription limited me to five hours online a week, and I would painstakingly parcel out the precious minutes to exchange e-mails with a small group of friends who also had AOL accounts, to follow the conversations on a few bulletin boards, and to read articles reprinted from newspapers and magazines. I actually grew fond of the sound of my modem connecting through the phone lines to the AOL servers. Listening to the bleeps and clangs was like overhearing a friendly argument between a couple of robots.

By the mid-nineties, I had become trapped, not unhappily, in the “upgrade cycle.” I retired the aging Plus in 1994, replacing it with a Macintosh Performa 550 with a color screen, a CD-ROM drive, a 500-megabyte hard drive, and what seemed at the time a miraculously fast 33-megahertz processor. The new computer required updated versions of most of the programs I used, and it let me run all sorts of new applications with the latest multimedia features. By the time I had installed all the new software, my hard drive was full. I had to go out and buy an external drive as a supplement. I added a Zip drive too—and then a CD burner. Within a couple of years, I’d bought another new desktop, with a much larger monitor and a much faster chip, as well as a portable model that I could use while traveling. My employer had, in the meantime, banished Macs in favor of Windows PCs, so I was using two different systems, one at work and one at home.

It was around this same time that I started hearing talk of something called the Internet, a mysterious “network of networks” that promised, according to people in the know, to “change everything.” A 1994 article in Wired declared my beloved AOL “suddenly obsolete.” A new invention, the “graphical browser,” promised a far more exciting digital experience: “By following the links—click, and the linked document appears—you can travel through the online world along paths of whim and intuition.”13 I was intrigued, and then I was hooked. By the end of 1995 I had installed the new Netscape browser on my work computer and was using it to explore the seemingly infinite pages of the World Wide Web. Soon I had an ISP account at home as well—and a much faster modem to go with it. I canceled my AOL service.

You know the rest of the story because it’s probably your story too. Ever-faster chips. Ever-quicker modems. DVDs and DVD burners. Gigabyte-sized hard drives. Yahoo and Amazon and eBay. MP3s. Streaming video. Broadband. Napster and Google. BlackBerrys and iPods. Wi-fi networks. YouTube and Wikipedia. Blogging and microblogging. Smartphones, thumb drives, netbooks. Who could resist? Certainly not I.

When the Web went 2.0 around 2005, I went 2.0 with it. I became a social networker and a content generator. I registered a domain, roughtype.com, and launched a blog. It was exhilarating, at least for the first couple of years. I had been working as a freelance writer since the start of the decade, writing mainly about technology, and I knew that publishing an article or a book was a slow, involved, and often frustrating business. You slaved over a manuscript, sent it off to a publisher, and, assuming it wasn’t sent back with a rejection slip, went through rounds of editing, fact checking, and proofreading. The finished product wouldn’t appear until weeks or months later. If it was a book, you might have to wait more than a year to see it in print. Blogging junked the traditional publishing apparatus. You’d type something up, code a few links, hit the Publish button, and your work would be out there, immediately, for all the world to see. You’d also get something you rarely got with more formal writing: direct responses from readers, in the form of comments or, if the readers had their own blogs, links. It felt new and liberating.

Reading online felt new and liberating too. Hyperlinks and search engines delivered an endless supply of words to my screen, alongside pictures, sounds, and videos. As publishers tore down their paywalls, the flood of free content turned into a tidal wave. Headlines streamed around the clock through my Yahoo home page and my RSS feed reader. One click on a link led to a dozen or a hundred more. New e-mails popped into my in-box every minute or two. I registered for accounts with MySpace and Facebook, Digg and Twitter. I started letting my newspaper and magazine subscriptions lapse. Who needed them? By the time the print editions arrived, dew-dampened or otherwise, I felt like I’d already seen all the stories.

Sometime in 2007, a serpent of doubt slithered into my infoparadise. I began to notice that the Net was exerting a much stronger and broader influence over me than my old stand-alone PC ever had. It wasn’t just that I was spending so much time staring into a computer screen. It wasn’t just that so many of my habits and routines were changing as I became more accustomed to and dependent on the sites and services of the Net. The very way my brain worked seemed to be changing. It was then that I began worrying about my inability to pay attention to one thing for more than a couple of minutes. At first I’d figured that the problem was a symptom of middle-age mind rot. But my brain, I realized, wasn’t just drifting. It was hungry. It was demanding to be fed the way the Net fed it—and the more it was fed, the hungrier it became. Even when I was away from my computer, I yearned to check e-mail, click links, do some Googling. I wanted to be connected. Just as Microsoft Word had turned me into a flesh-and-blood word processor, the Internet, I sensed, was turning me into something like a high-speed data-processing machine, a human HAL.

I missed my old brain.

The Vital Paths

Friedrich Nietzsche was desperate. Sickly as a child, he had never fully recovered from injuries he suffered in his early twenties when he fell from a horse while serving in a mounted artillery unit in the Prussian army. In 1879, his health problems worsening, he’d been forced to resign his post as a professor of philology at the University of Basel. Just thirty-four years old, he began to wander through Europe, seeking relief from his many ailments. He would head south to the shores of the Mediterranean when the weather turned cool in the fall, then north again, to the Swiss Alps or his mother’s home near Leipzig, in the spring. Late in 1881, he rented a garret apartment in the Italian port city of Genoa. His vision was failing, and keeping his eyes focused on a page had become exhausting and painful, often bringing on crushing headaches and fits of vomiting. He’d been forced to curtail his writing, and he feared he would soon have to give it up.

At wit’s end, he ordered a typewriter—a Danish-made Malling-Hansen Writing Ball—and it was delivered to his lodgings during the first weeks of 1882. Invented a few years earlier by Hans Rasmus Johann Malling-Hansen, the principal of the Royal Institute for the Deaf-Mute in Copenhagen, the writing ball was an oddly beautiful instrument. It resembled an ornate golden pincushion. Fifty-two keys, for capital and lowercase letters as well as numerals and punctuation marks, protruded from the top of the ball in a concentric arrangement scientifically designed to enable the most efficient typing possible. Directly below the keys lay a curved plate that held a sheet of typing paper. Using an ingenious gearing system, the plate advanced like clockwork with each stroke of a key. Given enough practice, a person could type as many as eight hundred characters a minute with the machine, making it the fastest typewriter that had ever been built.1

The writing ball rescued Nietzsche, at least for a time. Once he had learned touch typing, he was able to write with his eyes closed, using only the tips of his fingers. Words could again pass from his mind to the page. He was so taken with Malling-Hansen’s creation that he typed up a little ode to it:

The writing ball is a thing like me: made of iron
Yet easily twisted on journeys.
Patience and tact are required in abundance,
As well as fine fingers, to use us.

In March, a Berlin newspaper reported that Nietzsche “feels better than ever” and, thanks to his typewriter, “has resumed his writing activities.”

But the device had a subtler effect on his work. One of Nietzsche’s closest friends, the writer and composer Heinrich Köselitz, noticed a change in the style of his writing. Nietzsche’s prose had become tighter, more telegraphic. There was a new forcefulness to it, too, as though the machine’s power—its “iron”—was, through some mysterious metaphysical mechanism, being transferred into the words it pressed into the page. “Perhaps you will through this instrument even take to a new idiom,” Köselitz wrote in a letter, noting that, in his own work, “my ‘thoughts’ in music and language often depend on the quality of pen and paper.”

“You are right,” Nietzsche replied. “Our writing equipment takes part in the forming of our thoughts.”2

 

WHILE NIETZSCHE WAS learning to type on his writing ball in Genoa, five hundred miles to the northeast a young medical student named Sigmund Freud was working as a neurophysiology researcher in a Vienna laboratory. His specialty was dissecting the nervous systems of fish and crustaceans. Through his experiments, he came to surmise that the brain, like other bodily organs, is made up of many separate cells. He later extended his theory to suggest that the gaps between the cells—the “contact barriers,” as he termed them—play an essential role in governing the functions of the mind, shaping our memories and our thoughts. At the time, Freud’s conclusions lay outside the mainstream of scientific opinion. Most doctors and researchers believed that the brain was not cellular in construction but rather consisted of a single, continuous fabric of nerve fibers. And even among those who shared Freud’s view that the brain was made of cells, few paid any attention to what might be going on in the spaces between those cells.3

Engaged to be wed and in need of a more substantial income, Freud soon abandoned his career as a researcher and went into private practice as a psychoanalyst. But subsequent studies bore out his youthful speculations. Armed with ever more powerful microscopes, scientists confirmed the existence of discrete nerve cells. They also discovered that those cells—our neurons—are both like and unlike the other cells in our bodies. Neurons have central cores, or somas, which carry out the functions common to all cells, but they also have two kinds of tentacle-like appendages—axons and dendrites—that transmit and receive electric pulses. When a neuron is active, a pulse flows from the soma to the tip of the axon, where it triggers the release of chemicals called neurotransmitters. The neurotransmitters flow across Freud’s contact barrier—the synapse, we now call it—and attach themselves to a dendrite of a neighboring neuron, triggering (or suppressing) a new electric pulse in that cell. It’s through the flow of neurotransmitters across synapses that neurons communicate with one another, directing the transmission of electrical signals along complex cellular pathways. Thoughts, memories, emotions—all emerge from the electrochemical interactions of neurons, mediated by synapses.

During the twentieth century, neuroscientists and psychologists also came to more fully appreciate the astounding complexity of the human brain. Inside our skulls, they discovered, are some 100 billion neurons, which take many different shapes and range in length from a few tenths of a millimeter to a few feet.4 A single neuron typically has many dendrites (though only one axon), and dendrites and axons can have a multitude of branches and synaptic terminals. The average neuron makes about a thousand synaptic connections, and some neurons can make a hundred times that number. The millions of billions of synapses inside our skulls tie our neurons together into a dense mesh of circuits that, in ways that are still far from understood, give rise to what we think, how we feel, and who we are.

Even as our knowledge of the physical workings of the brain advanced during the last century, one old assumption remained firmly in place: most biologists and neurologists continued to believe, as they had for hundreds of years, that the structure of the adult brain never changed. Our neurons would connect into circuits during childhood, when our brains were malleable, and as we reached maturity the circuitry would become fixed. The brain, in the prevailing view, was something like a concrete structure. After being poured and shaped in our youth, it hardened quickly into its final form. Once we hit our twenties, no new neurons were created, no new circuits forged. We would, of course, continue to store new memories throughout our lives (and lose some old ones), but the only structural change the brain would go through in adulthood was a slow process of decay as the body aged and nerve cells died.

Although the belief in the adult brain’s immutability was deeply and widely held, there were a few heretics. A handful of biologists and psychologists saw in the rapidly growing body of brain research indications that even the adult brain was malleable, or “plastic.” New neural circuits could form throughout our lives, they suggested, and old ones might grow stronger or weaker or wither away entirely. The British biologist J. Z. Young, in a series of lectures broadcast by the BBC in 1950, argued that the structure of the brain might in fact be in a constant state of flux, adapting to whatever task it’s called on to perform. “There is evidence that the cells of our brains literally develop and grow bigger with use, and atrophy or waste away with disuse,” he said. “It may be therefore that every action leaves some permanent print upon the nervous tissue.”5

Young was not the first to propose such an idea. Seventy years earlier, the American psychologist William James had expressed a similar intuition about the brain’s adaptability. The “nervous tissue,” he wrote in his landmark Principles of Psychology, “seems endowed with a very extraordinary degree of plasticity.” As with any other physical compound, “either outward forces or inward tensions can, from one hour to another, turn that structure into something different from what it was.” James quoted, approvingly, an analogy that the French scientist Léon Dumont had drawn, in an earlier essay about the biological consequences of habit, between the actions of water on land and the effects of experience on the brain: “Flowing water hollows out a channel for itself which grows broader and deeper; and when it later flows again, it follows the path traced by itself before. Just so, the impressions of outer objects fashion for themselves more and more appropriate paths in the nervous system, and these vital paths recur under similar external stimulation, even if they have been interrupted for some time.”6 Freud, too, ended up taking the contrarian position. In “Project for a Scientific Psychology,” a manuscript he wrote in 1895 but never published, he argued that the brain, and in particular the contact barriers between neurons, could change in response to a person’s experiences.7

Such speculations were dismissed, often contemptuously, by most brain scientists and physicians. They remained convinced that the brain’s plasticity ended with childhood, that the “vital paths,” once laid, could not be widened or narrowed, much less rerouted. They stood with Santiago Ramón y Cajal, the eminent Spanish physician, neuroanatomist, and Nobel laureate, who in 1913 declared, with a tone that left little room for debate, “In the adult [brain] centres, the nerve paths are something fixed, ended, immutable. Everything may die, nothing may be regenerated.”8 In his younger days, Ramón y Cajal had himself expressed doubts about the orthodox view—he had suggested, in 1894, that the “organ of thought is, within certain limits, malleable, and perfectible by well-directed mental exercise”9—but in the end he embraced the conventional wisdom and became one of its most eloquent and authoritative defenders.

The conception of the adult brain as an unchanging physical apparatus grew out of, and was buttressed by, an Industrial Age metaphor that represented the brain as a mechanical contraption. Like a steam engine or an electric dynamo, the nervous system was made up of many parts, and each had a specific and set purpose that contributed in some essential way to the successful operation of the whole. The parts could not change, in shape or function, because that would lead, immediately and inexorably, to the breakdown of the machine. Different regions of the brain, and even individual circuits, played precisely defined roles in processing sensory inputs, directing the movements of muscles, and forming memories and thoughts; and those roles, established in childhood, were not susceptible to alteration. When it came to the brain, the child was indeed, as Wordsworth had written, the father to the man.

The mechanical conception of the brain both reflected and refuted the famous theory of dualism that René Descartes had laid out in his Meditations of 1641. Descartes claimed that the brain and the mind existed in two separate spheres: one material, one ethereal. The physical brain, like the rest of the body, was a purely mechanical instrument that, like a clock or a pump, acted according to the movements of its component parts. But the workings of the brain, argued Descartes, did not explain the workings of the conscious mind. As the essence of the self, the mind existed outside of space, beyond the laws of matter. Mind and brain could influence each other (through, as Descartes saw it, some mysterious action of the pineal gland), but they remained entirely separate substances. At a time of rapid scientific advance and social upheaval, Descartes’ dualism came as a comfort. Reality had a material side, which was the realm of science, but it also had a spiritual side, which was the realm of theology—and never the twain shall meet.

As reason became the new religion of the Enlightenment, the notion of an immaterial mind lying outside the reach of observation and experiment seemed increasingly tenuous. Scientists rejected the “mind” half of Cartesian dualism even as they embraced Descartes’ idea of the brain as a machine. Thought, memory, and emotion, rather than being the emanations of a spirit world, came to be seen as the logical and predetermined outputs of the physical operations of the brain. Consciousness was simply a by-product of those operations. “The word Mind is obsolete,” one prominent neurophysiologist ultimately declared.10 The machine metaphor was extended, and further reinforced, by the arrival of the digital computer—a “thinking machine”—in the middle of the twentieth century. That’s when scientists and philosophers began referring to our brain circuits, and even our behavior, as being “hardwired,” just like the microscopic circuits etched into the silicon substrate of a computer chip.

As the idea of the unchangeable adult brain hardened into dogma, it turned into a kind of “neurological nihilism,” according to the research psychiatrist Norman Doidge. Because it created “a sense that treatment for many brain problems was ineffective or unwarranted,” Doidge explains, it left those with mental illnesses or brain injuries little hope of treatment, much less cure. And as the idea “spread through our culture,” it ended up “stunting our overall view of human nature. Since the brain could not change, human nature, which emerges from it, seemed necessarily fixed and unalterable as well.”11 There was no regeneration; there was only decay. We, too, were stuck in the frozen concrete of our brain cells—or at least in the frozen concrete of received wisdom.
