It’s true, of course, that communication tools can bring people closer. Olympic broadcasts help unify countries with pride. In the week after I got on Facebook, I happily connected with friends from the third grade. But these are examples of people using technology to do more of what they already want to do, not making friends with old enemies.

How Not to Bridge the Digital Divide

The target of many social causes is some kind of inequality – of wealth, education, political voice, social status. Another is the “digital divide,” a phrase coined in the 1990s to describe unequal technology access between rich and poor Americans. The term was quickly extended to global disparities, and soon bridging the digital divide became a rallying cry. One response was to develop low-cost technologies – to make things only rich people could own affordable for everyone. This was the idea behind One Laptop Per Child. Its early media buzz was based on a projected $100 price tag.[21]
The Indian government rejected OLPC and proposed instead its own low-cost tablet, the Aakash, for $35.[22]
And as early as 1999, there was Free-PC in the United States. The company offered PCs for $0; they were paid for by on-screen advertising.

Free-PC was discontinued, and the other products never hit their target prices. But bad business models aren’t the real problem with these efforts. The problem lies in the concept itself. Some people speak of low-cost access to goods as a kind of “democratization,” but in a real democracy, it’s one person, one vote. In a free market, it’s one dollar, one vote, which is a totally different beast. Richer people can always afford more technology. It’s not as if new technologies stop appearing while existing ones are made cheaper. By the time there are low-cost PCs, there are high-cost smartphones. By the time there are low-cost smartphones, there are high-cost phablets.[23] And by the time there are low-cost phablets, there will be high-cost digital glasses. There is no technological keeping up with the Joneses.

But suppose that an even distribution of technology were actually possible. What then? To answer this question, consider the following situation. Imagine the poorest person you can think of who is involuntarily poor. (The involuntary part is important – I’m not asking you to imagine a contented monk.) It might be a homeless person in your city, or a poor migrant worker in a remote area. Now imagine that you and that person were asked to raise as much money as you could for a charity of your choice, using nothing other than unlimited access to email for one week. Who would be able to raise more money? For most readers, it will be you. Because you have richer friends. You probably have more education and can write more persuasive emails. You likely have better organizational skills and could rally more people to the cause. And depending on the poor person you imagined, you might also be far ahead in basic skills such as literacy.

In this thought experiment, the technology is identical, but the outcome is different because of what you each started with. The differences are all about people – who you are, whom you know, and what you’re capable of. These are the same factors, incidentally, that allow you to be richer in the first place. Imagine repeating the same experiment, but not with someone who’s poor. Do the experiment with Bill Clinton or Bill Gates. Who would be able to raise more money, you or one of them? One of the Bills would, for the same reasons.

You could repeat these experiments with different information technologies (e.g., mobile phone calls, Twitter) and with different tasks (e.g., finding a job for your friend, seeking investment advice), and, for the most part, the results would be the same. In each case the technology is fixed, but the outcomes differ in proportion to the underlying advantages. Low-cost technology is just not an effective way to fight inequality, because the digital divide is much more a symptom than a cause of other divides.[24] Under the Law of Amplification, technology – even when it’s equally distributed – isn’t a bridge, but a jack. It widens existing disparities.[25]

The Chinese Elephant

Harvard political scientist Gary King, who studies, among other things, the Chinese Internet, says it is the site of the “most extensive effort to selectively censor human expression ever implemented.” King has uncovered exactly what the Chinese government censors on its country’s social media platforms, and what he has found has unexpected lessons far beyond the digital realm.[26]

According to King, “the Chinese Internet police force employs an estimated 50,000 censors who collaborate with about 300,000 Communist Party members. In addition, private firms are required by law to review the content on their own sites,” and for this they hire staff. King has reported that the overall censorship effort is so large that “it’s like an elephant walking through a room.” To track and measure its footprints, he conducted two subversive studies with colleagues Jennifer Pan and Margaret Roberts that offer new insights into the Chinese Leviathan.

In the first study, the team built a network of computers that watched 1,382 Chinese websites, monitoring new posts to see if and when they were censored. Eleven million posts covering eighty-five topics were chosen for investigation. The subjects ranged in political sensitivity from popular video games to the dissident artist Ai Weiwei. The researchers included online chatter resulting from real-world events.[27]
In the second study, King and his team went undercover. They created fake accounts on over one hundred sites. They submitted posts to see which ones were censored. They even set up their own social media company in China.[28]
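
The book does not reproduce the researchers’ measurement pipeline, but the first study’s core test reduces to observing a post twice and noting whether it disappeared in between. As a rough illustration of that idea only (a minimal Python sketch; the delay, URLs, and function names are hypothetical placeholders, not the actual system), the detection step might look like this:

```python
# Hypothetical sketch: flag posts that disappear between two observations.
# The requests library is real; the URLs, delay, and function names here are
# illustrative placeholders, not the researchers' actual system.

import time
import requests

RECHECK_DELAY_SECONDS = 24 * 60 * 60  # re-check roughly a day after first sighting


def post_is_live(url: str) -> bool:
    """Return True if the post still loads; treat errors and 404s as removed."""
    try:
        return requests.get(url, timeout=10).status_code == 200
    except requests.RequestException:
        return False


def find_censored_posts(post_urls: list[str]) -> list[str]:
    """Return the URLs that were visible at first check but gone at the second."""
    visible_at_first = [url for url in post_urls if post_is_live(url)]
    time.sleep(RECHECK_DELAY_SECONDS)
    return [url for url in visible_at_first if not post_is_live(url)]
```

At the scale King describes, a check like this would run continuously across many sites and millions of posts rather than as a single blocking pass, but the underlying signal is the same: a post that was there, and then was not.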

Two of their findings stand out. First, China’s online censorship mechanisms are panoptic and efficient. Objectionable items are removed with a near-perfect elimination rate, typically within twenty-four hours of their posting. The researchers wrote, “This is a remarkable organizational accomplishment, requiring large scale military-like precision.”

Second, King and his team found what Chinese censors don’t like. They’re quick to act on anything that refers to, instigates, or otherwise links to grassroots collective action. Posts about protests, demonstrations, and even apolitical mass activities vanish quickly.[29] But the regime is comparatively comfortable with criticism of the government. For example, this passage was not censored:

The Chinese Communist Party made a promise of democratic, constitutional government at the beginning of the war of resistance against Japan. But after 60 years that promise has yet to be honored. China today lacks integrity, and accountability should be traced to Mao. . . . [I]ntra-party democracy espoused today is just an excuse to perpetuate one-party rule.

Meanwhile the following post, which refers to a man who responded to the demolition of his home by carrying out a suicide bombing, was nixed:

Even if we can verify what Qian Mingqi said on Weibo that the building demolition caused a great deal of personal damage, we should still condemn his extreme act of retribution. . . . The government has continually put forth measures and laws to protect the interests of citizens in building demolition.

This comment was supportive of the government, but it was censored because it referred to a known source of public agitation. The distinction contradicts conventional ideas about totalitarian states. In George Orwell’s 1984, Big Brother dealt quickly with any expressed disloyalty. But King’s findings reinforce more subtle theories of autocratic power, like that of his colleague Martin Dimitrov, who has argued that “regimes collapse when its [sic] people stop bringing grievances to the state.”[30] The real danger to a state comes when its citizens no longer complain in the open.

In fact, as King noted, a certain amount of public criticism may serve the Communist Party’s interests. It mollifies citizens who want to blow off steam, and it alerts the central government to issues requiring attention. It’s when the criticism spills over into calls for action that the censorship machine – and sometimes also the police – kicks in. The government is continually calibrating its tactics. In October 2013, a man in Shaanxi Province was detained for having a critical comment re-tweeted 500 times on Sina Weibo, China’s version of Twitter.[31] You can almost hear bureaucrats debating where the line should be: How many shares pose a collective-action threat – 250, 500, 1,000?

King’s study of Chinese social media censorship, then, reveals a lot more than just a strategy for online speech suppression. It provides clues to the Communist Party’s deepest fears and its sophisticated program of control. As we’ve seen with its heavy-handed response to uprisings in Xinjiang and Tibet, China is serious about suppressing physical protest. That intention carries over online, where censors are sensitive even to seemingly innocuous posts if they contain a seed for mass action. In a phone conversation, King told me that “political actors in any country use whatever means of communication they have to advance their goals. If technology allows them to do it faster, they’ll use technology.”

“In some ways, it’s the same in America,” King continued. Indeed, large technology companies in the United States are legally required to monitor and censor illegal content such as child pornography. And we know from recent revelations about the National Security Agency that our government is willing to strong-arm firms for the purposes of digital surveillance. “Functionally, that’s the same as what happens in China, though I won’t say it’s morally the same,” King said. In both countries, technology acts like a lens, magnifying and amplifying how governments act on their gravest concerns. By examining large-scale technology, you can ferret out hidden motivations.

Predicting Is Believing

The Law of Amplification enables us to make certain types of predictions. Under some conditions, it’s possible to gauge the future of a technology that doesn’t even exist yet. For example, imagine that scientists come up with the following inventions. In each pair, which one do you think would be more popular?

a) A robot that cleans up after you, washes your dishes, and does all of your laundry.

b) A robot that follows you around and verbally points out each of your personal flaws.

a) A holographic device that projects the realistic illusion that your house is bigger than it is, outfitted with expensive furniture, and decorated by a professional interior decorator.

b) A holographic device that projects the realistic illusion that your house is smaller than it is, outfitted with used furniture, and decorated by a college student.

a) A novel device you wear on your belt buckle that guarantees a slim, fit figure, regardless of what you eat or how much you exercise.

b) A novel device you wear on your belt buckle that guarantees an overweight figure, regardless of what you eat or how much you exercise.

None of these devices exists today, but you will have no trouble picking which of each pair would sell better. That’s because you already have a good sense of what most people want. Your ability to predict a technology’s success is based on an intuitive grasp of the human condition. Consistent with amplification, human preferences, more than technological design, decide which products succeed. Or, to put it another way, good design is the art of catering to our psyches.

You might quibble about which way these options would go. You might say that the outcomes depend on culture or the moment in history in which they occur. And you’d be right. What many Americans now consider an undesirable weight has been in other times and places a sign of wealth and status – for example, in the time of Peter Paul Rubens, who painted what we now call Rubenesque women.[32] Back then, device (b) would have done better than device (a). But that again proves that the technology doesn’t decide its outcome.

Similarly, we can predict that in future revolutions, all sides will use or abuse the communication technologies at their disposal. In the nineteenth century, rebels distributed pamphlets, autocrats closed printing presses, and the world heard about it months later by word of mouth. Here in the twenty-first century, rebels organize on Facebook; autocrats shut down the Internet; and the world watches events unfold on YouTube. Perhaps in the twenty-third century, rebels will rally on brain-to-brain transmitters; autocrats will scramble neuro-signals; and the world will watch it all through their synaptically projected awareness modules (known in the future as “SPAM”). The digital world is undoubtedly different from the analog and the postdigital, yet for so much of the social order . . . plus la technologie change, plus c’est la même chose (the more the technology changes, the more it stays the same).

Most importantly, amplification provides a guide as to whether social-change dollars should be spent on undeveloped technologies or on something else. We’ve seen how struggling schools aren’t turned around by digital technologies, but tech utopians will insist that the right technology just hasn’t been invented yet. So let’s entertain their reverie for a moment and imagine a world with a powerful teaching machine like that from The Matrix.

I plug myself in, and, within seconds, “I know kung fu,” just like Keanu Reeves’s character in the movie. It’s an amazing technology that could teach just about anything, but will it eliminate inequities in education? In any world politically like ours, wealthy, influential parents will secure the best hardware for their own children, while the children of poor, marginalized households will have access to older models in need of repair. Rich kids will effortlessly learn quantum physics. Poor kids might come out quacking like a duck. Yet again, the technology will amplify the intentions (both explicit and implicit) of the larger society. And the same will be true of gamified e-textbooks, humanoid teaching robots, or any other novel technology. So, one prediction is this: If you’re interested in contributing to a fair, universal educational system, novel technology isn’t what will do the trick.
