The blogs give us a chance to communicate between us and motivate us to write more. When we publish on our blog, people from the entire world can respond by using the comments link. This way, they can ask questions or simply tell us what they like. We can then know if people like what we write and this indicate[s to] us what to do better. By reading these comments, we can know our weaknesses and our talents. Blogging is an opportunity to exchange our point of view with the rest of the world not just people in our immediate environment. (See Downes.)
The remarks sound scripted, as if a bright 11-year-old had hearkened to the blog hype and learned to talk the talk. But the vision is so compelling, the expectations so upbeat, that we can hardly blame her. To an eager eighth-grader just discovering the vastness of the planet, developing an ego, and hoping to make a mark, the blog offers what so many other things in her fresh preteen life hinder: freedom and reach. People 3,000 miles away may be watching and listening, and the students’ receptivity to feedback grants it an air of conscience and progress. Even if we contend that the vision stands on a set of callow and self-centered illusions—for instance, teenagers aiming to become the next lonelygirl15—it still compels students to read and write, reflect and criticize. The nineties Web, “Web 1.0,” kept users in passive mode, simply delivering content in faster, easier ways. The twenty-first-century Web, Web 2.0, makes users participants in critical, intellectual, artistic enterprises, and their own actions direct the growth of the virtual sphere in the way, say, links in blogs groove further usage patterns. Nielsen//NetRatings found in August 2006 that “User-Generated Content Drives Half of U.S. Top 10 Fastest Growing Web Brands,” indicating the living and breathing nature of interactive usage. As tech guru Tim O’Reilly put it in 2005, “Much as synapses form in the brain, with associations becoming stronger through repetition or intensity, the web of connections grows organically as an output of the collective activity of all web users.” The brain metaphor is telling. Blogs, wikis, and the rest swell the public intelligence. Adolescents adept at them advance the collective mind and expand the storehouse of knowledge. They engage in creativity and criticism, dialogue and competition, kindling thought, not deadening it. Why, then, should bibliophiles and traditionalists carp so much?
BECAUSE THAT GLORIOUS creation of youth intelligence hasn’t materialized. The Web expands nonstop, absorbing everything it can, and more knowledge and beauty settle there by the hour. But no such enhancement has touched its most creative and frequent users, the digital natives. There is no reciprocal effect. Digital enthusiasts witness faithfully the miraculous evolution of the digital sphere, but they also assume a parallel ascent by its consumers, an assumption with no evidence behind it. In 2007, Pew Research compared current affairs knowledge with information and news technology and concluded:
Since the late 1980s, the emergence of 24-hour cable news as a dominant news source and the explosive growth of the internet have led to major changes in the American public’s news habits. But a new nationwide survey finds that the coaxial and digital revolutions and attendant changes in news audience behaviors have had little impact on how much Americans know about national and international affairs. (“Public Knowledge of Current Affairs Little Changed by News and Information Revolutions, What Americans Know: 1989-2007”)
The youngest age group in the survey, 18- to 29-year-olds, scored the lowest, with only 15 percent reaching “high knowledge.” By 2005, high school students had inhabited a Web world for most of their lives, and the participatory capabilities of the screen (the “Read/Write Web”) had existed for years. Nonetheless, when the 2005 NAEP test in reading was administered to twelfth-graders, the outcomes marked a significant decrease from 1992, before cell phones came along and back when few classrooms had a computer.
The latest NAEP figures are but another entry in the ongoing catalogue of knowledge and skill deficits among the Web’s most dedicated partakers. When we look at the front end of digital usage, at the materials and transactions available online, we discover a mega-world of great books, beautiful artworks, historical information, statistical data, intelligent magazines, informative conversations, challenging games, and civic deeds. But when we go to the back end of digital usage, to the minds of those who’ve spent their formative years online, we draw a contrary conclusion. Whatever their other virtues, these minds know far too little, and they read and write and calculate and reflect way too poorly. However many hours they pass at the screen from age 11 to 25, however many blog comments they compose, intricate games they play, videos they create, personal profiles they craft, and gadgets they master, the transfer doesn’t happen. The Web grows, and the young adult mind stalls.
As we’ve seen, it isn’t for lack of surfing and playing time, and the materials for sturdy mental growth are all there to be downloaded and experienced. Enough years have passed for us to expect the intellectual payoff promised by digital enthusiasts to have happened. Blogs aren’t new anymore, and neither is MySpace, The Sims, or text messaging. Students consult Wikipedia all the time. If the Web did constitute such a rich learning encounter, we would have seen its effects by now. An article on Wikipedia in Reason magazine by Katherine Mangu-Ward announces, “as with Amazon, Google, and eBay, it is almost impossible to remember how much more circumscribed our world was before it existed” (June 2007). But what evidence do we have that the world has dilated, that the human mind reaches so much further than it did just a decade or two ago? The visionary rhetoric goes on, but with knowledge surveys producing one embarrassing finding after another, with reading scores flat, employers complaining about the writing skills of new hires as loudly as ever, college students majoring in math a rarity, remedial course attendance on the rise, and young people worrying less and less about not knowing the basics of history, civics, science, and the arts, the evidence against it can no longer be ignored. We should heed informed skeptics such as Bill Joy, described by Wired magazine as “software god, hero programmer, cofounder of Sun Microsystems,” who listened to fellow panelists at Aspen Institute’s 2006 festival gushing over the learning potential of blogging and games, and finally exclaimed, “I’m skeptical that any of this has anything to do with learning. It sounds like it’s a lot of encapsulated entertainment. . . . This all, for me, for high school students sounds like a gigantic waste of time. If I was competing with the United States, I would love to have the students I’m competing with spending their time on this kind of crap.”
In the education and hi-tech worlds, Joy is a countercultural, minority voice, but the outcomes support his contention. In an average young person’s online experience, the senses may be stimulated and the ego touched, but vocabulary doesn’t expand, memory doesn’t improve, analytic talents don’t develop, and erudition doesn’t ensue. Some young users excel, of course, and the Web does spark their intellects with fresh challenges, but that’s the most we can say right now about digital technology’s intellectual consequences. Digital enthusiasts and reporters looking for a neat story can always spotlight a bright young sophomore here and there doing dazzling, ingenious acts online, but they rarely ask whether this clever intellect would do equally inventive things with pencil and paper, paint and canvas, or needle-nose pliers and soldering iron if the Web weren’t routinely at hand. Game researcher James Gee reports, “We have interviewed kids who have redesigned the family computer, designed new maps and even made mods, designed Web sites, written guides, and contracted relationships with people across the world,” but he doesn’t state how representative such extraordinary cases are. Large-scale surveys and test scores do, and the portrait they draw gainsays the bouncy profiles of young Web genius. For most young users, it is clear, the Web hasn’t made them better writers and readers, sharper interpreters and more discerning critics, more knowledgeable citizens and tasteful consumers. In ACT’s National Curriculum Survey, released in April 2007, 35 percent of college teachers agreed that the college readiness of entering students had declined in the last several years, and only 13 percent stated that it had improved. Furthermore, college teachers found that the most important prerequisites for success lay not in higher-order talents such as critical thinking, which enthusiasts of technology often underscore, but in lower-order thinking skills, that is, the basic mechanics of spelling, punctuation, and arithmetic. One month later, in May 2007, ACT reported in Rigor at Risk: Reaffirming Quality in the High School Core Curriculum that “three out of four ACT-tested 2006 high school graduates who take a core curriculum are not prepared to take credit-bearing entry-level college courses with a reasonable chance of succeeding in those courses.” Moreover, their momentum toward college readiness remains stable from eighth to tenth grade, and slips only in the last two years of high school, when those higher-order thinking skills supposedly blossom in their schoolwork and online hours.
In light of the outcomes, the energetic, mind-expanding communitarian/individualist dynamic of Web participation described by digital enthusiasts sounds like rosy oratory and false prophecy. A foundation hosts symposia on digital learning, a science group affirms the benefits of video games, humanities leaders insist that we respect the resourceful new literacies of the young, reading researchers insist that Web reading extends standard literacy skills to hypermedia comprehension, school districts unveil renovated hi-tech classrooms, and popular writers hail the artistry of today’s TV shows. All of them foretell a more intelligent and empowered generation on the rise. The years have passed, though, and we’re still waiting.
CHAPTER FOUR
ONLINE LEARNING AND NON-LEARNING
In November 2006, Educational Testing Service (developer of the SAT) released the findings of a survey of high school and college students and their digital research skills. The impetus for the study came from librarians and teachers who noticed that, for all their adroitness with technology, students don’t seek, find, and manage information very well. They play complex games and hit the social networking sites for hours, the educators said, but they don’t always cite pertinent sources and compose organized responses to complete class assignments. They’re comfortable with the tools, but indiscriminate in their applications. ETS terms the missing aptitude Information and Communications Technology (ICT) literacy, and it includes the ability to conduct research, evaluate sources, communicate data, and understand ethical/legal issues of access and use. To measure ICT literacy, ETS gathered 6,300 students and administered a 75-minute test containing 15 tasks. They included determining a Web site’s objectivity, ranking Web pages on given criteria, and categorizing emails and files into folders.
The first conclusion of the report: “Few test takers demonstrated key ICT literacy skills” (www.ets.org/ictliteracy.org). Academic and workplace performance increasingly depends on the ability to identify trustworthy sources, select relevant information, and render it in clear, useful form to others, tasks that exceeded the talents of most students. While the majority of them knew that .edu and .gov sites are less biased than .com sites, only 35 percent of them performed the correct revision when asked to “narrow an overly broad search.” When searching the Web on an assigned task, only 40 percent of the test takers entered several terms in order to tailor the ensuing listing of sources. Only 52 percent correctly judged the objectivity of certain Web sites, and when “selecting a research statement for a class assignment,” less than half of them (44 percent) found an adequate one. Asked to construct a persuasive slide for a presentation, 8 percent of them “used entirely irrelevant points” and 80 percent of them mixed relevant with irrelevant points, leaving only 12 percent who stuck to the argument.
A story on the report brandished the headline “Are College Students Techno Idiots?” (see Thacker). An official at the American Library Association remarked of the report, “It doesn’t surprise me,” and Susan Metros, a design technology professor at Ohio State, thought that it reflected a worrisome habit of online research: less than 1 percent of Google searches ever extend to the second page of search results. Because the sample wasn’t gathered in a consistent manner from school to school, ETS warned against generalizing too firmly from the results, but even with that caution, its drift against the claims of digital enthusiasts is arresting. This was not—to use the idiom of anti-testing groups—another standardized, multiple-choice exam focused on decontextualized facts and rewarding simple memorization. The exam solicited precisely the decision-making power and new literacies that techno-enthusiasts claim will follow from long hours at the game console. The major finding: “More than half the students failed to sort the information to clarify related material.” It graded the very communications skills Web 2.0, the Read/Write Web, supposedly instills, and “only a few test takers could accurately adapt material for a new audience.”