The Winter of Our Disconnect
I continued. “He’s a bright boy, and he does his work, but he doesn’t know anything.”
Bill’s adviser put her pen down. “Ah,” she said. “That.”
 
 
In any discussion of the impact of media on thinking and learning, it’s vital to distinguish between aptitude (our cognitive capacity, or “intelligence”) on the one hand and cognitive style (the habitual ways we use our intelligence) on the other. That said, in recent times, both have undergone significant change. Among other things, this means that your kids really are smarter than you are—just as you suspected and they keep telling you. At the same time, they really are more impaired—just as you suspected.
Hang on. Is this why they can do complex file conversions in their sleep, or edit a YouTube video with one hand tied behind their Facebook account, but still can’t remember where Antarctica is? (“Wait—down south, right?”)
Well, it could be. Because, hard as it is to believe when watching an episode of Project Runway, as I’ve said, studies show that across-the-board intelligence is increasing, at least insofar as IQ scores can be said to measure it. A score of 100 on an IQ test is still “average.” But that’s only because the tests are being constantly recalibrated. Raw IQ scores show an average increase of about three points per decade since 1920. And it’s not just the better fed, better educated, more affluent segment of the population who are growing sharper. Those in the middle part of the demographic curve are too—including “the people who have supposedly been suffering from a deteriorating public-school system and a steady diet of lowest-common-denominator television and mindless pop music,” as journalist Malcolm Gladwell puts it.[8]
The trend is not in spite of our increasing reliance on electronic media, but because of it, argues Steven Johnson. He believes that TV, computer games, and social media all place greater cognitive demands on users than earlier forms of leisure. This doesn’t seem to make sense if we compare, say, watching Australian Idol with reading The Brothers Karamazov. But how about sitting on the back step chewing tobacco? Or darning socks? Or falling asleep at sundown after a twelve-hour day on the assembly line? Most people were never reading fat, complex Russian novels. Most people were staring into space. And compared with that, Today Tonight is brain food. Or so, at least, the argument runs. At the lowest level, more time assimilating content—however puerile—means less cognitive downtime, means more neurons firing, means increased capacity. Theoretically.
If all the time Anni is now spending on Facebook were devoted instead to, say, debating the carbon emissions trading scheme or acquiring the rudiments of Katakana, the brain benefits would be clear. But if instead she were embroidering a table runner ...? Hard to say.
When Bill rediscovered the joy of sax, he started out playing maybe twenty minutes a day. By month four of The Experiment, he was practicing for up to three hours a day. Basically, he’d swapped Grand Theft Auto for the Charlie Parker songbook—and he knew it. “What if I could take back all those hours I’d spent on The Beast,” he mused, “and used them for practice? I’d be amazing now.” Pretending that the same thought hadn’t occurred to me twelve thousand times was one of the toughest tests I’ve ever faced as a parent. I swallowed hard and tried to nod spontaneously. But in truth, it was still a big “if.” Giving up one activity does not guarantee you will take up a more worthy substitute. The possibility that you might quit smoking and become hooked on nicotine patches instead is a real one.
 
 
Sussy’s experience was a good case in point. Under the new regimen, her screen time dropped from around six hours a day to about one. (She still used her laptop at school, ostensibly for work.) But her talking-on-the-landline time ballooned to fill perhaps three-quarters of the gap. This had some interesting repercussions for her friendships (not to mention our phone bill). But the difference between IM-ing her friends on her laptop and talking to them on the landline was arguably a toss-up, cognitively speaking.
Anni was more diverse in the way she approached her newly freed-up free time: reading more, seeing friends more, cooking more. But she also spent long periods in bed, leafing idly—let’s not say slothfully—through glossy magazines and listening to crappy commercial radio. Was this in any sense “better” than her pre-Experimental binges of eBay window shopping to the accompaniment of an iTunes playlist set to terminal shuffle? It’s hard to see how.
Back when I was a kid listening to crappy commercial radio, we were taught that intelligence was a fixed entity. Like the bowl of waxed fruit on our dining room table—or, for that matter, like matter itself—intelligence could be neither created nor destroyed. Smart kids were born smart, and they would stay that way. Dumb kids sat at a special table where they belonged. Today we know the whole intelligence question is much more complicated. For one thing, we recognize that there are kinds of intelligence. Even our primary-school kids learn that now, and high time too. But we have also discovered that cognitive capacity can be cultivated. We know now that brains are “plastic”: not in the Barbie doll sense, but in the plasticine sense of being moldable. Brain structures—neurons and neural pathways—can and do change significantly with use. Like a catcher’s mitt or a nursing mother’s breasts, they morph to fit the use to which they are habitually put.
To an important degree, we really do become what we behold—and so, it turns out, do our brains. It’s not so much the content we absorb that makes the difference. It’s the way that content is packaged and transmitted via symbols (like an alphabet or semaphore) and media (like the printing press or a Bluetooth headset). When Marshall McLuhan famously observed, “The medium is the message,” that’s what he was getting at. Reading Harry Potter may be “better” than seeing the movie, or it may be “worse” ... but it is an entirely different story, quite literally, as far as brain function is concerned. In the same way, readers of ideograms, like the Chinese, develop a neural circuitry demonstrably different from that of alphabetic readers—and the differences are discernible across many regions of the brain, from the way memory is stored to the way visual and auditory data are processed. Ditto when you attempt to improve your tennis by playing a video game, or to cook macaroni and cheese using a tutorial on your handheld game (as Bill once attempted to do, reducing the entire household to a figurative puddle of cheese sauce).
It would be peculiar if People of the Book, as Digital Immigrants are by definition, did not develop along different cognitive lines to People of the Screen. The question is, how different—and different how? Watching my kids juggle e. e. cummings, video uploads from the latest social event (“It’s not a party, Mum—it’s a ‘gathering’”), instant messaging conversations with forty-seven of their closest friends, and the odd spot of extreme Googling, I used to wonder all the time how they did it. Their preferred explanation—that the multitasking teen brain simply has powers and abilities far beyond those of mortal monotaskers—seemed so logical. I mean, seriously: I couldn’t do what they do.
Imagine my surprise when it turned out that they couldn’t either.
 
 
Many morbidly obese individuals eat three meals a day. It’s not how often they eat that creates problems. It’s what they put on their plate. Maybe we shouldn’t be surprised that the epidemic John Naish calls “infobesity” works in much the same way.
The Kaiser Family Foundation’s latest study of multitasking teenagers found that Digital Natives weren’t necessarily spending more time with media than their parents were in the seventies—they were just packing more into it. American teenagers today spend an average 7.5 hours a day with media. But because using more than one device has become their new default setting, the figure for total daily media exposure clocks in at a horrifyingly Huxleyan 10 hours and 45 minutes. That’s an increase of more than two hours in the last five years.
For Generation M, multitasking acts as a kind of cognitive wind chill factor, intensifying engagement, fragmenting attention, and transforming entirely the experience that used to be called “tuning in.”[9]
A Los Angeles Times/Bloomberg poll conducted way back in 2006 among 1,650 teens found that, while doing homework, 84 percent listened to music, 47 percent watched TV, and 21 percent did three tasks or more.[10] (And no, I’m not sure when watching extreme Jackass stunts on YouTube got reclassified as a “task” either.) The 2010 Kaiser research found that over 58 percent of seventh to twelfth graders say they multitask “most” of the time. Among eight- to eighteen-year-olds, one in three admit to multitasking “most of the time” while doing their homework.
The good news is that neuroscientific research in this area is accelerating almost as fast as your fourteen-year-old’s status updates. The bad news is that findings remain pretty basic, and only a few things are known for sure. One of them is that there is actually no such thing as multitasking.
Truly. Unlike your mother, your brain really can only do one thing at a time. Or, more accurately, it can only process information one task at a time. What may look like simultaneous engagements are actually sequential ones. The mind boggles, in other words, but the brain toggles—sometimes quite rapidly—from one task to the next. It’s an action that occurs in the region behind the forehead, the anterior prefrontal cortex, aka Brodmann area 10—a region that is one of the last to ripen (and one of the first to rot) with age. Not surprisingly, therefore, young children are actually worse at task switching than adults are.
David E. Meyer, director of the University of Michigan’s Brain, Cognition, and Action Laboratory, minces no words. Multitasking, he says, “is a myth.” And it always will be, thanks to the brain’s inherent limitations for information processing. “It just can’t be, any more than the best of all humans will ever be able to run a one-minute mile.”[11] Different brains, maybe. But that different? Uh-uh.
Meyer’s research shows that Sussy has been paying a heavy price for the privilege of Skyping her bestie while “simultaneously” studying for tomorrow’s science test. Digital Natives tested in Meyer’s lab took double the time or more to complete tasks while multitasking. Even more worryingly, their errors went way up. The “mystique” (as Meyer calls it) that this generation has sought to perpetuate about itself is a hopeful, if not downright arrogant, delusion.
When you think about it, you realize that of course this would be the case. Bill was absolutely and sincerely convinced that keeping one eye glued to the action-packed anime feature playing in the background made no difference to the quality of the essay on racism he was slowly but uncertainly pecking away at in the foreground. But then he would, wouldn’t he? Before The Experiment, he’d never really tried it any other way. What basis for comparison did he—do any of them—really have? The most recent research makes the point even more strongly.
A series of experiments carried out at Stanford’s Communication Between Humans and Interactive Media Lab and published in the Proceedings of the National Academy of Sciences in August 2009 tested two sets of college students—those who self-identified as heavy media multitaskers and those who said they were light multitaskers—on a range of problem-solving tasks. The main finding?
“Multitaskers were just lousy at everything,” is how researcher Clifford Nass summed it up. Nass and his fellow researchers had designed the study expressly to identify the cognitive advantages of multitasking. What they discovered was so contrary to expectations that Nass admitted, “It keeps me up late at night.”[12]
Weirdly, the heavy multitaskers were especially disabled when it came to . . . well, multitasking. Compared with their peers, they were terrible at filtering out distractions. They were also less efficient at task switching, routinely paying a much higher cognitive “switch cost” in pace and accuracy.
Researchers were surprised to find that the experienced multitaskers had problems with working memory too—basically, they were less selective about what data they paid attention to, and this made them more vulnerable to distraction.
“I was sure they had some secret ability,” Nass commented later. “But it turns out high multitaskers are suckers for irrelevancy.”
“We kept looking for multitaskers’ advantages in this study,” added principal researcher Eyal Ophir. “But we kept finding only disadvantages. We thought multitaskers were very much in control of information. It turns out, they were just getting it all confused.”[13]
They’re not the only ones. “The core of the problem,” Nass muses, is that multitaskers “think they’re great at what they do; and they’ve convinced everybody else they’re good at it too.”[14]
 
 
When the kids started back to school and university in February 2009, I was more nervous about the homework question than they were. By that point, I’d had a month of trying to write in the old-fashioned way, and the results did not exactly inspire confidence. Completing my weekly column, a task that normally consumed an hour or two of online research and half a day of writing and rewriting, was now taking two full days to pull together.
Switching from research-driven topics to more reflective ones was easy. It was fun being less tied to data, using newspaper or magazine articles as a jumping-off point for my own ruminations. It was a method I’d employed intermittently throughout the ten-plus years I’d been churning out weekly copy, and I found it often simplified my writing and made it more relatable. This time was no exception. Without recourse to a big fat Google blitz, I was forced to think through my topics more rigorously. For laughs, I took fewer cheap shots and told more stories.
