The Shallows

by Nicholas Carr

Kobi Rosenblum, who heads the Department of Neurobiology and Ethology at the University of Haifa in Israel, has, like Eric Kandel, done extensive research on memory consolidation. One of the salient lessons to emerge from his work is how different biological memory is from computer memory. “The process of long-term memory creation in the human brain,” he says, “is one of the incredible processes which is so clearly different than ‘artificial brains’ like those in a computer. While an artificial brain absorbs information and immediately saves it in its memory, the human brain continues to process information long after it is received, and the quality of memories depends on how the information is processed.”28

Biological memory is alive. Computer memory is not.

Those who celebrate the “outsourcing” of memory to the Web have been misled by a metaphor. They overlook the fundamentally organic nature of biological memory. What gives real memory its richness and its character, not to mention its mystery and fragility, is its contingency. It exists in time, changing as the body changes. Indeed, the very act of recalling a memory appears to restart the entire process of consolidation, including the generation of proteins to form new synaptic terminals.29
Once we bring an explicit long-term memory back into working memory, it becomes a short-term memory again. When we reconsolidate it, it gains a new set of connections—a new context. As Joseph LeDoux explains, “The brain that does the remembering is not the brain that formed the initial memory. In order for the old memory to make sense in the current brain, the memory has to be updated.”30
Biological memory is in a perpetual state of renewal. The memory stored in a computer, by contrast, takes the form of distinct and static bits; you can move the bits from one storage drive to another as many times as you like, and they will always remain precisely as they were.

The proponents of the outsourcing idea also confuse working memory with long-term memory. When a person fails to consolidate a fact, an idea, or an experience in long-term memory, he’s not “freeing up” space in his brain for other functions. In contrast to working memory, with its constrained capacity, long-term memory expands and contracts with almost unlimited elasticity, thanks to the brain’s ability to grow and prune synaptic terminals and continually adjust the strength of synaptic connections. “Unlike a computer,” writes Nelson Cowan, an expert on memory who teaches at the University of Missouri, “the normal human brain never reaches a point at which experiences can no longer be committed to memory; the brain cannot be full.”31
Says Torkel Klingberg, “The amount of information that can be stored in long-term memory is virtually boundless.”32
Evidence suggests, moreover, that as we build up our personal store of memories, our minds become sharper. The very act of remembering, explains clinical psychologist Sheila Crowell in The Neurobiology of Learning, appears to modify the brain in a way that can make it easier to learn ideas and skills in the future.33

We don’t constrain our mental powers when we store new long-term memories. We strengthen them. With each expansion of our memory comes an enlargement of our intelligence. The Web provides a convenient and compelling supplement to personal memory, but when we start using the Web as a substitute for personal memory, bypassing the inner processes of consolidation, we risk emptying our minds of their riches.

In the 1970s, when schools began allowing students to use portable calculators, many parents objected. They worried that a reliance on the machines would weaken their children’s grasp of mathematical concepts. The fears, subsequent studies showed, were largely unwarranted.
34
No longer forced to spend a lot of time on routine calculations, many students gained a deeper understanding of the principles underlying their exercises. Today, the story of the calculator is often used to support the argument that our growing dependence on online databases is benign, even liberating. In freeing us from the work of remembering, it’s said, the Web allows us to devote more time to creative thought. But the parallel is flawed. The pocket calculator relieved the pressure on our working memory, letting us deploy that critical short-term store for more abstract reasoning. As the experience of math students has shown, the calculator made it easier for the brain to transfer ideas from working memory to long-term memory and encode them in the conceptual schemas that are so important to building knowledge. The Web has a very different effect. It places more pressure on our working memory, not only diverting resources from our higher reasoning faculties but obstructing the consolidation of long-term memories and the development of schemas. The calculator, a powerful but highly specialized tool, turned out to be an aid to memory. The Web is a technology of forgetfulness.

 

WHAT DETERMINES WHAT we remember and what we forget? The key to memory consolidation is attentiveness. Storing explicit memories and, equally important, forming connections between them requires strong mental concentration, amplified by repetition or by intense intellectual or emotional engagement. The sharper the attention, the sharper the memory. “For a memory to persist,” writes Kandel, “the incoming information must be thoroughly and deeply processed. This is accomplished by attending to the information and associating it meaningfully and systematically with knowledge already well established in memory.”35 If we’re unable to attend to the information in our working memory, the information lasts only as long as the neurons that hold it maintain their electric charge—a few seconds at best. Then it’s gone, leaving little or no trace in the mind.

Attention may seem ethereal—a “ghost inside the head,” as the developmental psychologist Bruce McCandliss says36—but it’s a genuine physical state, and it produces material effects throughout the brain. Recent experiments with mice indicate that the act of paying attention to an idea or an experience sets off a chain reaction that crisscrosses the brain. Conscious attention begins in the frontal lobes of the cerebral cortex, with the imposition of top-down, executive control over the mind’s focus. The establishment of attention leads the neurons of the cortex to send signals to neurons in the midbrain that produce the powerful neurotransmitter dopamine. The axons of these neurons reach all the way into the hippocampus, providing a distribution channel for the neurotransmitter. Once the dopamine is funneled into the synapses of the hippocampus, it jump-starts the consolidation of explicit memory, probably by activating genes that spur the synthesis of new proteins.37

The influx of competing messages that we receive whenever we go online not only overloads our working memory; it makes it much harder for our frontal lobes to concentrate our attention on any one thing. The process of memory consolidation can’t even get started. And, thanks once again to the plasticity of our neuronal pathways, the more we use the Web, the more we train our brain to be distracted—to process information very quickly and very efficiently but without sustained attention. That helps explain why many of us find it hard to concentrate even when we’re away from our computers. Our brains become adept at forgetting, inept at remembering. Our growing dependence on the Web’s information stores may in fact be the product of a self-perpetuating, self-amplifying loop. As our use of the Web makes it harder for us to lock information into our biological memory, we’re forced to rely more and more on the Net’s capacious and easily searchable artificial memory, even if it makes us shallower thinkers.

The changes in our brains happen automatically, outside the narrow compass of our consciousness, but that doesn’t absolve us from responsibility for the choices we make. One thing that sets us apart from other animals is the command we have been granted over our attention. “‘Learning how to think’ really means learning how to exercise some control over how and what you think,” said the novelist David Foster Wallace in a commencement address at Kenyon College in 2005. “It means being conscious and aware enough to choose what you pay attention to and to choose how you construct meaning from experience.” To give up that control is to be left with “the constant gnawing sense of having had and lost some infinite thing.”38 A mentally troubled man—he would hang himself two and a half years after the speech—Wallace knew with special urgency the stakes involved in how we choose, or fail to choose, to focus our mind. We cede control over our attention at our own peril. Everything that neuroscientists have discovered about the cellular and molecular workings of the human brain underscores that point.

Socrates may have been mistaken about the effects of writing, but he was wise to warn us against taking memory’s treasures for granted. His prophecy of a tool that would “implant forgetfulness” in the mind, providing “a recipe not for memory, but for reminder,” has gained new currency with the coming of the Web. The prediction may turn out to have been merely premature, not wrong. Of all the sacrifices we make when we devote ourselves to the Internet as our universal medium, the greatest is likely to be the wealth of connections within our own minds. It’s true that the Web is itself a network of connections, but the hyperlinks that associate bits of online data are nothing like the synapses in our brain. The Web’s links are just addresses, simple software tags that direct a browser to load another discrete page of information. They have none of the organic richness or sensitivity of our synapses. The brain’s connections, writes Ari Schulman, “don’t merely provide access to a memory; they in many ways constitute memories.”39 The Web’s connections are not our connections—and no matter how many hours we spend searching and surfing, they will never become our connections. When we outsource our memory to a machine, we also outsource a very important part of our intellect and even our identity. William James, in concluding his 1892 lecture on memory, said, “The connecting is the thinking.” To which could be added, “The connecting is the self.”

 

“I PROJECT THE history of the future,” wrote Walt Whitman in one of the opening verses of Leaves of Grass. It has long been known that the culture a person is brought up in influences the content and character of that person’s memory. People born into societies that celebrate individual achievement, like the United States, tend, for example, to be able to remember events from earlier in their lives than do people raised in societies that stress communal achievement, such as Korea.40 Psychologists and anthropologists are now discovering that, as Whitman intuited, the influence goes both ways. Personal memory shapes and sustains the “collective memory” that underpins culture. What’s stored in the individual mind—events, facts, concepts, skills—is more than the “representation of distinctive personhood” that constitutes the self, writes the anthropologist Pascal Boyer. It’s also “the crux of cultural transmission.”41 Each of us carries and projects the history of the future. Culture is sustained in our synapses.

The offloading of memory to external data banks doesn’t just threaten the depth and distinctiveness of the self. It threatens the depth and distinctiveness of the culture we all share. In a recent essay, the playwright Richard Foreman eloquently described what’s at stake. “I come from a tradition of Western culture,” he wrote, “in which the ideal (my ideal) was the complex, dense and ‘cathedral-like’ structure of the highly educated and articulate personality—a man or woman who carried inside themselves a personally constructed and unique version of the entire heritage of the West.” But now, he continued, “I see within us all (myself included) the replacement of complex inner density with a new kind of self—evolving under the pressure of information overload and the technology of the ‘instantly available.’” As we are drained of our “inner repertory of dense cultural inheritance,” Foreman concluded, we risk turning into “pancake people—spread wide and thin as we connect with that vast network of information accessed by the mere touch of a button.”42

Culture is more than the aggregate of what Google describes as “the world’s information.” It’s more than what can be reduced to binary code and uploaded onto the Net. To remain vital, culture must be renewed in the minds of the members of every generation. Outsource memory, and culture withers.

A Digression on the Writing of This Book

I KNOW WHAT you’re thinking. The very existence of this book would seem to contradict its thesis. If I’m finding it so hard to concentrate, to stay focused on a line of thought, how in the world did I manage to write a few hundred pages of at least semicoherent prose?

It wasn’t easy. When I began writing The Shallows, toward the end of 2007, I struggled in vain to keep my mind fixed on the task. The Net provided, as always, a bounty of useful information and research tools, but its constant interruptions scattered my thoughts and words. I tended to write in disconnected spurts, the same way I wrote when blogging. It was clear that big changes were in order. In the summer of the following year, I moved with my wife from a highly connected suburb of Boston to the mountains of Colorado. There was no cell phone service at our new home, and the Internet arrived through a relatively poky DSL connection. I canceled my Twitter account, put my Facebook membership on hiatus, and mothballed my blog. I shut down my RSS reader and curtailed my skyping and instant messaging. Most important, I throttled back my e-mail application. It had long been set to check for new messages every minute. I reset it to check only once an hour, and when that still created too much of a distraction, I began keeping the program closed much of the day.

The dismantling of my online life was far from painless. For months, my synapses howled for their Net fix. I found myself sneaking clicks on the “check for new mail” button. Occasionally, I’d go on a daylong Web binge. But in time the cravings subsided, and I found myself able to type at my keyboard for hours on end or to read through a dense academic paper without my mind wandering. Some old, disused neural circuits were springing back to life, it seemed, and some of the newer, Web-wired ones were quieting down. I started to feel generally calmer and more in control of my thoughts—less like a lab rat pressing a lever and more like, well, a human being. My brain could breathe again.

My case, I realize, isn’t typical. Being self-employed and of a fairly solitary nature, I have the option of disconnecting. Most people today don’t. The Web is so essential to their work and social lives that even if they wanted to escape the network they could not. In a recent essay, the young novelist Benjamin Kunkel mulled over the Net’s expanding hold on his waking hours: “The internet, as its proponents rightly remind us, makes for variety and convenience; it does not force anything on you. Only it turns out it doesn’t feel like that at all. We don’t feel as if we had freely chosen our online practices. We feel instead that they are habits we have helplessly picked up or that history has enforced, that we are not distributing our attention as we intend or even like to.”1

The question, really, isn’t whether people can still read or write the occasional book. Of course they can. When we begin using a new intellectual technology, we don’t immediately switch from one mental mode to another. The brain isn’t binary. An intellectual technology exerts its influence by shifting the emphasis of our thought. Although even the initial users of the technology can often sense the changes in their patterns of attention, cognition, and memory as their brains adapt to the new medium, the most profound shifts play out more slowly, over several generations, as the technology becomes ever more embedded in work, leisure, and education—in all the norms and practices that define a society and its culture. How is the way we read changing? How is the way we write changing? How is the way we think changing? Those are the questions we should be asking, both of ourselves and of our children.

As for me, I’m already backsliding. With the end of this book in sight, I’ve gone back to keeping my e-mail running all the time and I’ve jacked into my RSS feed again. I’ve been playing around with a few new social-networking services and have been posting some new entries to my blog. I recently broke down and bought a Blu-ray player with a built-in Wi-Fi connection. It lets me stream music from Pandora, movies from Netflix, and videos from YouTube through my television and stereo. I have to confess: it’s cool. I’m not sure I could live without it.
