How We Learn

Benedict Carey


The Power of Forgetting

A New Theory of Learning

Memory contests are misleading spectacles, especially in the final rounds.

At that point, there are only a handful of people left onstage and their faces reflect all varieties of exhaustion, terror, and concentration. The stakes are high, they’ve come a long way already, and any mistake can end it all. In a particularly tough-to-watch scene from the documentary Spellbound, about the Scripps National Spelling Bee, one twelve-year-old trips over the word “opsimath.” He appears to be familiar with the word, he’s digging deep, there’s a moment when he seems to have it—but then he inserts an “o” where it doesn’t belong.

Clang!

A bell rings—meaning: wrong answer—and the boy’s eyes bulge in stunned disbelief. A gasp sweeps through the crowd, followed by clapping, consolation applause for effort. He slinks offstage, numb. Variations of this scene repeat, as other well-prepped contestants miss a word. They slump at the microphone, or blink without seeing, before being bathed in the same lukewarm applause. In contrast, those who move to the next round seem confident, locked in. The winner smiles when she hears her final word—“logorrhea”—and nails it.

These competitions tend to leave us with two impressions. One is that the contestants, and especially the winners, must be extra-human. How on earth are they doing that? Their brains must be not only bigger and faster but also different from the standard-issue version (i.e., ours). Maybe they even have “photographic” memories.

Not so. Yes, it’s true that some people are born with genetic advantages, in memory capacity and processing speed (though no one has yet identified an “intelligence gene” or knows with any certainty how one would function). It’s true, too, that these kinds of contests tend to draw from the higher end of the spectrum, from people who take a nerdy interest in stockpiling facts. Still, a brain is a brain is a brain, and the healthy ones all work the same way. With enough preparation and devotion, each is capable of seemingly wizardlike feats of memory. And photographic memories, as far as scientists can tell, don’t exist, at least not in the way that we imagine.

The other impression is more insidious, because it reinforces a common, self-defeating assumption: To forget is to fail. This appears self-evident. The world is so full of absentmindedness, tuned-out teenagers, misplaced keys, and fear of creeping dementia that forgetting feels dysfunctional, or ominous. If learning is building up skills and knowledge, then forgetting is losing some of what was gained. It seems like the enemy of learning.

It’s not. The truth is nearly the opposite.

Of course it can be a disaster to space out on a daughter’s birthday, to forget which trail leads back to the cabin, or to draw a blank at test time. Yet there are large upsides to forgetting, too. One is that it is nature’s most sophisticated spam filter. It’s what allows the brain to focus, enabling sought-after facts to pop to mind.

One way to dramatize this would be to parade all those spelling prodigies back onstage again for another kind of competition, a fast-paced tournament of the obvious. Quick: Name the last book you read. The last movie you saw. The local drugstore. The secretary of state. The World Series champions. And then faster still: your Gmail password, your sister’s middle name, the vice president of the United States.

In this hypothetical contest, each of those highly concentrated minds would be drawing a lot of blanks. Why? Not due to mere absentmindedness or preoccupation. No, these kids are alert and highly focused. So focused, in fact, that they’re blocking out trivial information.

Think about it: To hold so many obscure words in mind and keep the spellings straight, the brain must apply a filter. To say it another way, the brain must suppress—forget—competing information, so that “apathetic” doesn’t leak into “apothecary,” or “penumbra” into “penultimate,” and it must keep any distracting trivia from bubbling to the surface, whether song lyrics, book titles, or names of movie actors.

We engage in this kind of focused forgetting all the time, without giving it much thought. To lock in a new computer password, for example, we must block the old one from coming to mind; to absorb a new language, we must hold off the corresponding words in our native tongue. When thoroughly immersed in a topic or novel or computation, it’s natural to blank on even common nouns—“could you pass me the whatyoucallit, the thing you eat with?”

Fork.

As the nineteenth-century American psychologist William James observed, “If we remembered everything, we should on most occasions be as ill off as if we remembered nothing.”

The study of forgetting has, in the past few decades, forced a fundamental reconsideration of how learning works. In a way, it has also altered what the words “remember” and “forget” mean. “The relationship between learning and forgetting is not so simple and in certain important respects is quite the opposite of what people assume,” Robert Bjork, a psychologist at the University of California, Los Angeles, told me. “We assume it’s all bad, a failure of the system. But more often, forgetting is a friend to learning.”

The “losers” in memory competitions, this research suggests, stumble not because they remember too little. They have studied tens, perhaps hundreds of thousands of words, and often they are familiar with the word they ultimately misspell. In many cases, they stumble because they remember too much. If recollecting is just that—a recollection of perceptions, facts, and ideas scattered in intertwining neural networks in the dark storm of the brain—then forgetting acts to block the background noise, the static, so that the right signals stand out. The sharpness of the one depends on the strength of the other.

Another large upside of forgetting has nothing to do with its active filtering property. Normal forgetting—that passive decay we so often bemoan—is also helpful for subsequent learning. I think of this as the muscle-building property of forgetting: Some “breakdown” must occur for us to strengthen learning when we revisit the material. Without a little forgetting, you get no benefit from further study. It is what allows learning to build, like an exercised muscle.

This system is far from perfect. We have instantaneous and flawless recall of many isolated facts, it’s true: Seoul is the capital of South Korea, 3 is the square root of 9, and J. K. Rowling is the author of the Harry Potter books. Yet no complex memory comes back exactly the same way twice, in part because the forgetting filter blocks some relevant details along with many irrelevant ones. Features that previously were blocked or forgotten often reemerge. This drift in memory is perhaps most obvious when it comes to the sort of childhood tales we all tell and embellish. The time we borrowed the family car at age fourteen; the time we got lost on the metro the first time we visited the city. After rolling out those yarns enough times, it can be tough to tell what’s true and what’s not.

The point is not that memory is nothing more than a pile of loose facts and a catalog of tall tales. It’s that retrieving any memory alters its accessibility, and often its content.

There is an emerging theory that accounts for these and related ideas. It’s called the New Theory of Disuse, to distinguish it from an older, outdated principle stating, simply, that memories evaporate entirely from the brain over time if they’re not used. The new theory is far more than an updating, though. It’s an overhaul, recasting forgetting as the best friend of learning, rather than its rival.

A better name for it, then, might be the Forget to Learn theory. That phrase captures its literal implications and its general spirit, its reassuring voice. One implication, for instance, is that forgetting a huge chunk of what we’ve just learned, especially when it’s a brand-new topic, is not necessarily evidence of laziness, attention deficits, or a faulty character. On the contrary, it is a sign that the brain is working as it should.

No one knows why we should be such poor judges of forgetting or other mental skills that are so indispensable, so automatic, that they feel deeply familiar. Yet we are. And it helps to count the ways.

• • •

Let’s go back to the beginning, then. Let’s go back to the first learning laboratory of them all, to its sole occupant, and his most important contribution—the Forgetting Curve. The Forgetting Curve is exactly what it sounds like, a graph of memory loss over time. In particular, it charts the rate at which newly learned information fades from memory. It’s a learning curve, turned upside-down.
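The curve is often approximated, in later treatments, by a simple exponential decay in retention, on the order of R(t) = e^(-t/s), where s is a strength or stability parameter. That formula is a later convention rather than anything stated here, and the numbers below come from made-up parameters, not measured data; it is only a minimal sketch of the shape, a steep early drop that gradually levels off:

    import math

    def retention(hours_since_study, strength=20.0):
        """Fraction of new material still recallable, under an assumed
        exponential-decay model R = exp(-t / strength). The strength value
        is arbitrary; a stronger memory decays more slowly."""
        return math.exp(-hours_since_study / strength)

    # Retention at a few intervals after a single study session:
    for label, hours in [("20 minutes", 1 / 3), ("1 hour", 1),
                         ("1 day", 24), ("1 week", 24 * 7)]:
        print(f"{label:>10}: {retention(hours):.0%} still recallable")

Raising the strength parameter in this toy model flattens the curve, which mirrors the finding described later in the chapter: more practice slows the rate of forgetting.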

This curve, first published in the 1880s, falls well short of breathtaking. It’s what anyone might draw if asked to guess how memory changes with time. Yet its creator, Hermann Ebbinghaus, wasn’t one for idle guesswork. He was exacting by nature, compulsive about evidence. He had to be, given his ambitions. In the late 1870s, as a young philosophy Ph.D., he zigzagged through Europe, thinking big. He longed to bridge philosophy and science, to apply rigorous measurement to some aspect of human nature or psychology. The only problem was, he didn’t know where to start. He was poking around in a secondhand Paris bookstall one afternoon when he pulled from the shelf a volume called Elements of Psychophysics by Gustav Fechner. A scientist with a mystical bent, Fechner saw a unifying mathematical connection between the inner, mental world and the outer, natural one. He argued that every human experience, even one as ephemeral as memory, should be reducible to measurable units that could be plugged into an equation of some sort. Fechner’s reputation as a scientist—he’d done elegant experiments on the sensation of touch—lent his more grandiose ideas some weight.

As he read, Ebbinghaus felt something inside him shift—a sensation he would describe, years later, to a student. He must have glimpsed his future as well, right then and there, because he later dedicated his greatest work, Memory: A Contribution to Experimental Psychology, to Fechner.

The memory equation. Did it even exist? If so, could it be written down?

Memories come in so many shapes and sizes. There are the hour-long and the lifelong; there are dates and numbers, recipes and recitals; not to mention stories, emotional perceptions, the look on a child’s face when he’s dropped at the bus stop on the first day of school, the knowing smile shared between two friends who think no one is looking: the tapestry of hijinks and heartbreaks that make up a life. Our ability to recall specific facts also varies widely. Some people are good with names and faces; others are much better at retrieving numbers, dates, formulas. How on earth do you measure such a shape-shifting ghost, much less study it?

A generation of scientists before Ebbinghaus had essentially stood down, taking a pass on the question. It was too much. The variables were overwhelming.

Yet where some saw a justified caution, Ebbinghaus saw a lack of nerve. “At the very worst we should prefer to see resignation arise from the failure of earnest investigations rather than from the persistent, helpless astonishment in the face of the difficulties,” he wrote, in explaining his motives for pursuing the memory equation. He would take the dare if there was no one else. He reasoned from first principles. To study how the brain stores new information, he needed information that was, in fact, new. A list of nouns or names or numbers simply wouldn’t do; people walk around with an enormous storehouse of associations for all of these things. Even abstract sketches have a Rorschach-like, evocative quality. Stare long enough at a cloud and it begins to look like a dog’s head, which in turn activates hundreds of dog-related circuits in the brain. Our brain can impute meaning to almost anything.

How Ebbinghaus arrived at his solution remains a mystery. “Was it an invention in the commonly accepted sense of the term, that is to say, deliberate?” wrote the American psychologist David Shakow, much later, in a biographical essay. “Or was it largely a discovery? What part did the gurgle of an infant, a transient regression to infancy, the reading of Jabberwocky, the expletives of the Paris coachman for the London cabbie, play?”

What Ebbinghaus created was a catalog of nonsense sounds. These were single syllables, formed by sticking a vowel between two consonants. RUR, HAL, MEK, BES, SOK, DUS. By and large, they were meaningless.

Ebbinghaus had found his generic memory “units.”

He created about 2,300 of them—a pool of all possible syllables, or at least as many as he could think of. He put together lists of the syllables, random groupings of between seven and thirty-six each. Then he began to memorize one list at a time, reading the syllables out loud, pacing himself with a metronome, keeping track of how many repetitions he needed to produce a perfect score.
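For a rough sense of the scale of that pool, here is a minimal sketch of how a set of consonant-vowel-consonant syllables might be generated today. The letter sets are an arbitrary choice of mine and yield about 2,000 combinations; Ebbinghaus, working from German sounds, counted roughly 2,300.

    import itertools
    import random

    VOWELS = "AEIOU"
    CONSONANTS = "BCDFGHJKLMNPQRSTVWXZ"  # an arbitrary 20-consonant set

    # Every consonant-vowel-consonant combination: "RUR", "HAL", "MEK", ...
    pool = ["".join(s) for s in itertools.product(CONSONANTS, VOWELS, CONSONANTS)]

    def study_list(length):
        """A random grouping of syllables, like the lists of 7 to 36 he memorized."""
        return random.sample(pool, length)

    print(len(pool))        # 20 * 5 * 20 = 2,000 with this alphabet
    print(study_list(7))    # seven randomly drawn nonsense syllables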

By the time he landed a job as an instructor, at the University of Berlin in 1880, he’d logged more than eight hundred hours of practice with his nonsense sounds. He continued the work in his small office, pacing the floor, a compact, bushy-bearded man in Ben Franklin spectacles, spitting out the syllables at a rate of as many as 150 a minute. (In another era or another country, he might have been hauled off and fitted with a lunatic suit.) He tested himself at various intervals: Twenty minutes after studying. An hour. A day later, then a week. He varied the duration of his practice sessions, too, and found (surprise) that more practice sessions generally resulted in higher test scores and a slower rate of forgetting.

