Liars and Outliers
by Bruce Schneier

At some scope of defection, stocks will be so depleted that everyone's catch in future years will be jeopardized. There's more at stake than whether Alice gets her fair share. In game theory, this is called a non-zero-sum game because wins and losses don't add up to zero: there are outcomes where everyone loses, and loses big.
A fishery is non-zero-sum. Other societal dilemmas might seem like zero-sum games with a finite resource: if one person takes more, others get less. But even in these instances, there is a potential for catastrophe in widespread defection. If a community can't share a common water resource, everyone's crops will die because farmers can't plan on water use. If a few people constantly hog the exercise equipment, others won't come to the gym, which will lose membership and close. If someone consistently takes all the cookies, Mother will stop baking them. Remember: it's a bad parasite that kills its host.

The non-zero-sum property is an essential aspect of a societal dilemma. The group result barely depends on any single person's actions. Alice's cooperation or defection doesn't appreciably change the number of overfishers, nor is it likely to collapse the fishing stocks. It's the actions of the group that determine the overall result; at some point, the effects of the overfishers on the group will change from nothing to irreversible damage.

It's also possible that the group will not reach that point, even if all the members take as much fish as they want. There might not be enough fishers in the waters, or fishing technology might not be efficient enough. All the members of the group might be able to fish as much as they possibly can without affecting each other or future fishing stocks. But at some point, either the waters will get crowded enough or the fishers will get technologically advanced enough that the Tragedy of the Commons dilemma will occur.

The disconnect between Alice's individual actions and the effect of the group's actions as a whole makes societal dilemmas even harder to solve in larger groups. Under a rational economic analysis, it makes no sense for Alice to cooperate. The group will do whatever it does, regardless of her actions, and her individual cooperation or defection won't change that. All she's really deciding is whether to seize or forgo the short-term benefits of defecting.

Societal Dilemma: Tragedy of the Commons.
Society: Some group of people, either a society of interest or a society of circumstance.
Group interest: That the common resource not run out, and be available for all.
Competing interest: Get as much of that resource as possible in the short term.
Group norm: Cooperate and share that resource within its sustainability limits.
Corresponding defection: Take as much of that shared resource as you can.

In a Tragedy of the Commons, people acting in their self-interest harm the group interest. There's another type of societal dilemma, where people can receive the benefit of those who act in the group interest without having to act in the group interest themselves. It's called the free-rider problem.

Whooping cough (otherwise known as pertussis) is a good example. It's both almost entirely preventable and almost entirely untreatable. Early in the 20th century, before the establishment of widespread vaccination programs, it was one of the most feared illnesses, and it remains a significant cause of death in developing countries. Compared to other vaccines, the pertussis vaccine isn't actually very effective at conferring immunity to any one individual. The standard infant schedule calls for four shots. After the first shot, about 30% become immune; after two, 50%; and even after all four shots have been administered, only about 90% of individuals have enough antibodies to fight off the disease.

What's more, vaccination is not without risk. The original pertussis vaccine carried a small risk of neurological damage. It has since been replaced with a safer vaccine, but a minuscule risk of adverse reactions still persists, as it does with any vaccine. In the late 1960s and early 1970s, adverse vaccine reactions received a lot of attention in the media, most notably in Sweden, Japan, and the UK. Parents began to refuse vaccinations for their children, and doctors were often powerless to persuade them that the benefits outweighed the risks.

One of the primary benefits of vaccination is herd immunity. If almost everyone is vaccinated against a particular disease, there's no way for it to take hold in the community. Even if someone gets the disease, it's unlikely he will be able to infect others. Parents who refuse to have their children vaccinated do not only endanger their own children; they increase the risk of infection for everyone in the community. This increases, of course, as more parents opt out of vaccination programs. And while this is true for any vaccinated disease, the danger is particularly acute for whooping cough because the vaccine doesn't confer complete immunity and isn't recommended for the youngest infants or for those who are immune-compromised.

Between 1974 and 1979, the rate of pertussis vaccination among Swedish infants dropped precipitously, from 90% to 12%. Correspondingly, the incidence of whooping cough in Swedish children under four skyrocketed from 0.05% in 1975—effectively zero—to 3.4% by 1983. Sweden went from a country that had all but eradicated whooping cough to a country with a 1 in 30 infection rate.

When parents decide whether or not to immunize their child, they are faced with a societal dilemma. They can choose to cooperate and vaccinate their child, or they can choose to defect and refuse. As long as most children are vaccinated, a child is better off not being immunized: he avoids the chance of adverse effects, but reaps the benefit of herd immunity. But if there are too many defectors, everyone suffers the increased risk of epidemics. And it's a non-zero-sum game; there's a point where epidemics suddenly become much more likely.

Societal Dilemma: Vaccination.
Society: Society as a whole.
Group interest: No epidemics.
Competing interest: Avoid the small risk of adverse side effects (encephalopathy, allergic or autoimmune reactions, or—in extreme cases—contracting the disease from the vaccination).
Group norm: Vaccinate.
Corresponding defection: Avoid vaccination.

A free rider receives the benefit of everyone else's cooperation without having to cooperate himself. Think of a single person in the community who doesn't pay his taxes; he gets all the benefits of the public institutions those taxes pay for—police and fire departments, road construction and maintenance, regulations to keep his food and workplace safe, a military—without having to actually pay for them.

But as more and more people stop paying their taxes, the government can provide fewer and fewer of those services—services that would be much more expensive or impossible for individuals to provide on their own—and the benefit of free riding is reduced. In the extreme, the whole system collapses.

Imagine a condominium without smoke detectors. The first tenant to install one is a sucker, because even though he pays for his detector, the building can burn down from a fire started elsewhere. The last tenant to install one is a fool, because he already receives the benefits of everyone else's detectors without having to pay anything.

It's easy to dismiss those original two-person examples as the responsibility of the two people alone. Alice and Bob can decide whether to rat on each other in jail, or whether to cheat each other when they buy and sell sealed bags. No one else needs to get involved. There's certainly no reason for society to get involved. Let the buyer and seller beware.

Society becomes involved because a broader societal dilemma emerges from Alice's and Bob's decisions. Let's look at the sealed bag exchange, focusing on customer Alice. She can either cooperate by paying for her purchase, or defect by defrauding merchant Bob. Yes, that decision most directly affects Bob, but—thinking more broadly about theft and society—it affects everyone.

Societal Dilemma: Defrauding merchants.
Society: Those who buy and sell goods.
Group interest: For commerce to operate smoothly.
Competing interest: Get stuff without having to pay for it.
Group norm: Don't defraud merchants.
Corresponding defection: Defraud merchants.

It's not that society cares about any particular thief; rather, society wants property rights to be respected. Note that it doesn't matter what sort of property rights deserve respect. There could be communal property, there could be personal property, and there could be everything in-between. What's important for society is for everyone to respect what society decides are the property rules that make collective life work, and then for everybody to be able to trust that those rules will be followed.

Similarly, if we focus on merchant Bob, we can see that he is in a corresponding societal dilemma with the society composed of all the other merchants: he can either treat his customers fairly or he can defraud them. Society doesn't want dishonest merchants; not only because we don't want to be defrauded, but also because we know that our entire system of commerce hinges on trust.

The alternative just wouldn't work. Merchants would stop doing transactions with all customers, not just with Alice. And customers would stop doing transactions with all merchants. Or they could both implement expensive and time-consuming bag-checking procedures that require them to each hire someone to help them perform transactions. And so on. Without trust, commerce collapses.

Even prisoners can have a broader community with a stake in whether or not prisoners confess. A criminal organization won't be concerned with Alice or Bob personally, but with members' loyalty to the organization. The organization as a whole benefits if it is viewed by individual members as an association in which they can trust others to keep their secrets, even at great personal cost.

Societal Dilemma: Criminals testifying against each other.
Society: The criminal organization.
Group interest: To minimize the amount of jail time for the society.
Competing interest: To minimize personal jail time.
Group norm: Don't testify against each other.
Corresponding defection: Testify against each other in exchange for reduced jail time.

The interesting thing about these dilemmas is that, looking at them in isolation, there's no logical solution. Thinking back to the prisoners, there is no analysis by which cooperation makes sense. Because they can't trust each other, they both end up confessing. This is the fundamental problem with cooperation: trust is unnatural, and it's not in the individual's short-term self-interest. This problem is why cooperation is so rare in the natural world, why it took so long to develop in humans, and why we have developed societal pressures as a way to enforce cooperation and hold society together.

Part II

A Model of Trust

Chapter 6

Societal Pressures

In game theory, Prisoner's Dilemmas have no solution. Because the two prisoners, or the merchant and customer, can't trust each other, they both end up defecting. The larger societal dilemmas—the arms race, the Tragedy of the Commons, and the free-rider problem—are similarly unsolvable. Defecting is the only course that makes logical sense, even though the end result will be disastrous for the entire group.
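The prisoners' reasoning can be made concrete with a toy payoff table. The jail terms below are illustrative numbers I've chosen to satisfy the Prisoner's Dilemma ordering, not figures from the book: defecting is the better move whatever the partner does, yet mutual defection leaves both worse off than mutual cooperation.

```python
# Illustrative Prisoner's Dilemma payoffs: years of jail, negated so that
# higher is better. The specific numbers are hypothetical.
# (my_move, their_move) -> my payoff
payoff = {
    ("cooperate", "cooperate"): -1,   # both stay silent: light sentences
    ("cooperate", "defect"):    -6,   # I stay silent, partner testifies
    ("defect",    "cooperate"):  0,   # I testify, partner stays silent
    ("defect",    "defect"):    -4,   # both testify
}

def best_response(their_move):
    """Return my payoff-maximizing move, given the partner's move."""
    return max(("cooperate", "defect"),
               key=lambda my_move: payoff[(my_move, their_move)])

# Defecting is better no matter what the partner does...
assert best_response("cooperate") == "defect"
assert best_response("defect") == "defect"
# ...yet mutual defection is worse for both than mutual cooperation.
assert payoff[("defect", "defect")] < payoff[("cooperate", "cooperate")]
```

That dominance of "defect" for each individual, combined with the collectively bad outcome, is what makes the dilemma logically unsolvable from inside the game.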

But that's not how people generally operate. We cooperate all the time. We engage in honest commerce, although Enron and AIG and Countrywide are some pretty spectacular exceptions. Most of us don't overfish, even though the few of us who do have depleted the ocean's stocks. We mostly vaccinate our children, despite the minor risk of an adverse reaction. Sometimes, even, we don't rat on each other in prison.

Prisoner's Dilemmas involve a risk trade-off between group interest and self-interest, but it's generally only a dilemma if you look at it very narrowly. For most people, most of the time, there's no actual dilemma. We don't stand at the checkout line at a store thinking: “Either the merchant is selling me a big screen TV, or this box is secretly filled with rocks. If it's rocks, I'm better off giving him counterfeit money. And if it has a TV, I'm still better off giving him counterfeit money.” Generally, we just pay for the TV, put it in our car, and drive home. And if we're professional check forgers, we don't think through the dilemma, either. We pay for the TV with a bad check, put it in our car—I suppose it's a getaway car this time—and drive back to our lair.

The problem isn't with people; the problem is with the dilemma.
Societal dilemmas are choices between group interest and some competing individual interest. The analysis assumes the individuals are only trying to minimize their jail time, or maximize their fishing catch or short-term profits. But in the real world, people are more complicated than that. Our competing interests are more nuanced and varied, and they're subjective and situational. We try to maximize things other than our selfish self-interest. And our societal dilemmas are part of our ongoing relationships with other people.

Society solves societal dilemmas by making it in people's best interest to act in the group interest. We do this so naturally and so easily that we don't even notice the dilemma. Because of laws and police, it's not obviously better to steal a big screen TV than go without. Because of how everyone will react, it's not obviously smarter to betray a friend. Sure, no jail time is better than risking six years in jail, and catching more fish is better than catching fewer fish, but even those assessments fail to capture the richness of human emotion. Is no jail time but a reputation as a stool pigeon better than six years in jail? Is catching more fish but contributing to the degradation of the oceans better than catching fewer fish, even if everyone else is catching more than you? It depends. It depends on who you are. It depends on what you are. It depends on where you are.

Another famous dilemma illustrates this. The Stag Hunt was first formulated by Jean-Jacques Rousseau in 1754. In his scenario, a small group of hunters—it could be two and it could be more; it doesn't matter—are hunting a stag together. As would be obvious to readers of his day, everyone needs to work together in order to pull this off.

If it was a matter of hunting deer, everyone well realized that he must remain at his post; but if a rabbit happened to pass within reach of one of them, we cannot doubt that he would have gone off in pursuit of it without scruple and, having caught his own prey, he would have cared very little about having caused his companions to lose theirs.

What makes this different from the Prisoner's Dilemma is that the benefit of cooperation is more than the benefit of defection: a stag is much more food, even divided a few ways, than a rabbit. It would seem there's no actual dilemma; for all players, cooperate–cooperate is better than any other option. In the real world, however, defections happen in this sort of cooperative game all the time. It seems to make no sense.

Rousseau, too, ignored the variety and subjectivity of the hunters' competing interests. It's not obvious—for all people all the time—that a share of a stag is better than a whole rabbit. Sure, it's more meat, but that's not the only consideration. First of all, the stag isn't a done deal. The choice is between a guaranteed rabbit—they're small and easy to catch—and the possibility, maybe even the probability, of a share of a stag. Is our intrepid hunter Alice an optimist or a pessimist? Does she want to stalk stag for hours, or does she want to snare her rabbit, go home, and do something she really enjoys with the rest of her day? Maybe she's tired. Maybe she's bored. Maybe she doesn't even like the taste of stag, and has a great rabbit stew recipe she's been dying to try. (Me, I like the one in Julia Child's The Way to Cook.) Maybe she is happy to forgo the rabbit for a stag, but doesn't trust that her fellow hunters will do the same. The point is that it's not for Rousseau to conclude which of these considerations matter to the hunters; the hunters get to decide for themselves. And they're all going to decide differently.

Another dilemma is called the Snowdrift Dilemma, sometimes called Chicken. Two drivers are trapped by a snowdrift; each can either cooperate by shoveling or defect by remaining in his own car. If both remain in their cars, both remain stuck. If at least one of them shovels, both are freed; and two shovelers will get the job done much faster and more reliably than one. But unlike a Prisoner's Dilemma, it's in each driver's best interest to cooperate, even if the other defects.
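A toy payoff table (the numbers are again hypothetical: being freed is worth 6 units, and the full shoveling effort costs 4, shared if both shovel) shows how the Snowdrift differs from the Prisoner's Dilemma: free-riding pays if the other driver shovels, but shoveling alone still beats staying stuck.

```python
# Illustrative Snowdrift payoffs (hypothetical numbers).
# (my_move, their_move) -> my payoff
payoff = {
    ("shovel", "shovel"): 4,   # freed, half the effort each: 6 - 2
    ("shovel", "sit"):    2,   # freed, but I did all the work: 6 - 4
    ("sit",    "shovel"): 6,   # freed at no cost to me
    ("sit",    "sit"):    0,   # both stay stuck
}

def best_response(their_move):
    """Return my payoff-maximizing move, given the other driver's move."""
    return max(("shovel", "sit"),
               key=lambda my_move: payoff[(my_move, their_move)])

# If the other driver shovels, free-riding pays; but if he sits,
# shoveling alone still beats staying stuck -- unlike a Prisoner's
# Dilemma, where defection is best no matter what the other player does.
assert best_response("shovel") == "sit"
assert best_response("sit") == "shovel"
```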

It turns out there are several different dilemmas—generally called social dilemmas or coordination games—whose differences depend on the relative value of the various outcomes. Those nuances make a huge difference to game theorists, but are less important to everyday people. We make trade-offs based on what we want to do.

When you look at the details of players' competing interests, motivations, and priorities, you often realize they might not be playing the same game. What might be a Prisoner's Dilemma for Alice could be a Snowdrift for Bob. What might be a Snowdrift for Alice might be a Stag Hunt for Bob. For Alice, cooperating might be the obviously smart thing to do. She might feel bad about herself if she defected. She might be afraid of what her friends would think if she defected. There might be a law against defecting, and she might not want to risk the jail time. She'll have her own particular trade-off: her own subjective values about cooperating and defecting. Bob might cooperate or defect for completely different reasons. And even if Bob and Alice are playing the same game today, they might each play a different game tomorrow. These societal dilemmas are much more complicated than simple game-theory models.

Think back to the baboon story at the start of Chapter 4. Notice the societal dilemma:

Societal Dilemma: Gazelle hunting.
Society: Society of baboons.
Group interest: Tasty gazelle meat for everyone.
Competing interest: Gaining an advantage over a fellow baboon.
Group norm: Hunt cooperatively.
Corresponding defection: Attack a fellow baboon during the hunt.

One of the great achievements of our species is our ability to solve societal dilemmas. In a way, we solve them by cheating. That is, we don't solve them within the parameters of the game. Instead, we modify the game to eliminate the dilemma. Recall the two drivers stuck behind a fallen tree that neither one can move by himself. They're not in a Prisoner's Dilemma. They're not even in a Snowdrift Dilemma. In their situation, their selfish interest coincides with the group interest—they're going to move the tree and get on with their lives. The trick to solving societal dilemmas is to make them look like that. That's what societal pressures do: they're how society puts its thumb on the scales.

Solving societal dilemmas often means considering the people involved and their situations more broadly. The sealed-bag exchange is no longer a Prisoner's Dilemma if we assume the people involved have a sufficiently strong conscience.

Alice might be thinking: “If I assume Bob will cooperate, I have two choices. If I cooperate, I'll get my purchase and feel good about cooperating with Bob. If I defect, I'll get my purchase for free but I'll feel guilty about cheating Bob. That guilty feeling is worse than giving up the money, so it makes sense for me to cooperate. On the other hand, if I assume Bob will cheat me, my two choices look like this: If I cooperate, Bob will take my money and I'll feel stupid and angry for cooperating with a cheat. If I defect, I won't get my purchase and will feel guilty for trying to cheat Bob. That stupid feeling for being cheated is a little worse than the guilty feeling for trying to cheat Bob—who turned out to be a cheat himself. But Bob is making this same analysis, and he doesn't want to feel guilty about cheating me, either. So he's not going to defect.”

And indeed, Bob makes the same analysis and also cooperates, although—most likely—they both don't consciously decide anything and both just behave honestly and trust each other to do the same. Maybe I have the emotions wrong—they could be motivated by a moral compass, by a sense of fairness, or by altruism towards the other person. In any case, dilemma solved.

Those guilty feelings come from inside our heads. Feelings of guilt are a societal pressure, one that works to varying degrees in each of us.

Moral pressure isn't the only thing we use to solve societal dilemmas. All of the considerations that make cooperation more attractive and defection less attractive are societal pressures. These include the rewards society directs towards cooperators and the penalties it directs towards defectors, the legal punishments society metes out to defectors, and the security measures that make defecting difficult to pull off and even more difficult to get away with.

Societal Dilemma: Stealing.
Society: Society as a whole.
Group interest: Respect property rights.
Competing interest: Get stuff without having to pay for it.
Group norm: Don't steal.
Corresponding defection: Steal.
To encourage people to act in the group interest, society implements these societal pressures:

Moral: People feel good about being honest and bad about stealing. People have been taught religious admonitions like “Thou shalt not steal.”

Reputational: Society shuns people who have a reputation for being thieves.

Institutional: Stealing is illegal, and society punishes thieves.

Security: Door locks, burglar alarms, and so on.

Of course, there's a lot more going on, and I'll discuss that in later chapters. The real world isn't this simplistic; any analysis of human interaction must take circumstances into account. If Alice is a tourist in a foreign country, Bob might cheat her anyway. If the dollar value of cheating is high enough, either Alice or Bob might decide that cheating is worth more than the negative feelings that result from cheating. In Chapter 3, I said that trust is contextual; all of that analysis applies here.

For most of us, it is more worthwhile to cooperate than to defect. It can be a better strategy for us, given what we know about the people who share in our dilemma.
And, for different and equally valid reasons, some of us find defection to be more valuable than cooperation. Not universally, not all of the time, but at that moment for that person and that particular trade-off. There are no actual dilemmas; there are just individual subjective risk trade-offs.

Here are six different ways societal pressures can reduce the scope of defection—which I'll illustrate using the example of Alice potentially cheating a merchant.

  • Pressures that increase the actual or perceived difficulty of defecting.
    Actual commerce usually doesn't happen inside sealed bags. Bob takes various additional security precautions to minimize the risk that Alice might cheat. Bob requires her to pay with hard-to-forge currency, or runs her credit card through a third-party authentication system. Window bars and burglar alarms make it harder for Alice to steal from Bob.
  • Pressures that raise the consequences of defecting.
    These would be largely implemented after the fact; think prison terms, fines, cutting off a thief's hand, and social ostracism. Even if they never catch anyone, the police can make it difficult and expensive to commit a crime; every heist movie demonstrates this entertainingly.
  • Pressures that reduce the actual or expected benefits of defecting.
    Exploding ink cartridges can make stolen garments less useful to thieves, and daily ATM withdrawal limits restrict how much a thief can steal.
  • Pressures that limit the damage caused by the defections that happen.
    Bob won't keep a lot of cash in his store. He might even store some of his expensive inventory elsewhere. He'll also have an insurance policy that will help him resume normal business quickly after a theft.
  • Pressures that increase the benefits of cooperating.
    Reputation serves this function; Alice derives value from being known in society as honest and honorable in her business dealings, more so if she is part of the same society as the merchant she patronizes. Certainly Alice's credit rating is a part of her reputation. We also have a powerful need to conform to the group.
  • Pressures that lower the costs of cooperating.
    Society makes it easy to cooperate. Stores make check-out stands easy to find. Unforgeable paper money and credit cards make it easy to conduct commerce, as opposed to a barter system, or needing to lug around a sackful of gold and silver. Or think of the iTunes store, which makes it easy to buy music legitimately online.
There's a lot of overlap here, and many of these techniques are tightly coupled. When you reduce the benefits of defecting, you almost certainly reduce the frequency of defecting.
