Decisions about whether to implement a new societal pressure require careful consideration of the trade-off between its costs and benefits, which are extremely difficult to predict.
Security systems are often what economists call an experience good: something whose value you don't understand until you've already bought, installed, and experienced it.
This holds true for other forms of societal pressure as well. If you're knowledgeable and experienced and perform a good analysis, you can make some good guesses, but it can be impossible to know the actual effects—or unintended consequences—of a particular societal pressure until you've already implemented it. This means that implementing societal pressures is always an iterative process. We try something, see how well it works, then fine-tune.
Any society—a family, a business, a government—is constantly balancing its need for security with the side effects, unintended consequences, and other considerations. Can we afford this particular societal pressure system? Are our fundamental freedoms and liberties more important than more security?
More onerous ATM security will result in fewer ATM transactions, costing a bank more in lost business than it saves in reduced fraud. A retail store that installs security cameras in its dressing rooms will probably have fewer customers as a result, losing more in revenue than it saves from the decrease in shoplifting. Online retailers face similar choices, since complicated security measures reduce purchases. In Chapter 9, we talked specifically about how hard it is to get the security effects of laws right. It's hard for all categories of societal pressure.
What all of this means is that it's easy to get societal pressures wrong. We implement more or less societal pressure than the risk warrants. We implement suboptimal, ineffective, or the wrong kind of security systems. Then, when we try to fix them, we get it wrong again. Many of the excesses in the War on Terror can be chalked up to overcompensation for the security failures that led to the terrorist attacks of 9/11.
In Chapters 7 through 10 we talked about how specific types of societal pressure fail. Here, I am going to talk more generally about societal pressure failures. These failures can be broken into several broad categories. These categories aren't hard and fast, and there's going to be some overlap. The goal here is just to give a feeling for how societal pressures can go wrong.
Misunderstanding the actor.
Potential defectors have many competing interests, ranging from selfish to moral; if you misunderstand them, you're likely to get security wrong. Defectors also have different characteristics, such as motivation, skill, money, risk aversion, and so on.
It makes no sense to spend $2 to forge an ID card worth $1, right? That's true if the defector is in it for the money. But if he's a security researcher analyzing weaknesses in the production process, a competing company trying to damage the business, or a hacker just trying to understand how the stuff works, spending $2 to forge a $1 card might make perfect sense. Similarly, if you think terrorists are all foreigners, you'll miss the homegrown ones.
We've also touched on the problem of organized defectors. Organization is common in crime—well-funded criminal organizations are far more effective than lone criminals—and in terrorism.
It's also common among reform-minded defectors: abolitionists, animal rights activists, and so on. When defectors organize, societal pressures that worked in the past might not work as well. We talked about both of these problems in Chapter 11. A common misunderstanding is to assume that defectors are unorganized when they're actually organized, as often happens with crime, or to assume they're organized when they're not, as happened with al Qaeda.
Misunderstanding the security incentives.
Sometimes societal pressure can fail because it creates an incentive for the wrong competing norm. An example will help make this clear.
Convincing people to reduce their trash is a societal dilemma. Moral pressure only goes so far, and reputational pressure against having a lot of trash is generally pretty weak. By far the easiest institutional pressure is to charge people by the amount of trash they generate: by the bag, by the bin, by the pound. The idea is to tax marginal defection and encourage people to reduce their trash.
Societal Dilemma: Limiting personal trash.

Society: Society as a whole.

Group interest: Limit the use of landfills. | Competing interest: Laziness or apathy.
--- | ---
Group norm: Limit trash. | Corresponding defection: Throw away as much trash as you want.
| Competing interest: Minimize cost.
| Corresponding defection: Overstuff the trash can.

To encourage people to act in the group interest, the society implements a variety of societal pressures. Moral: awareness campaigns that emphasize the immorality of polluting. Reputational: social pressure against people who put out a lot of trash. Institutional: charge residents extra, based on how much trash they produce. Security: garbage monitoring.
However, a resident who wants to avoid the extra charges has several other options. He can stuff his trash more tightly into his bin. He can burn his trash to reduce the volume. He can dump his trash on the side of the road, or in the bin of a neighbor down the block. These options were always available to him, but before the extra trash collection fee, there was no reason to bother. As soon as you add societal pressures, some people will look for ways to get around them without having to cooperate in the original dilemma.
This isn't just theoretical. A study of nine municipalities showed exactly this sort of behavior, with increases in trash burning and dumping, when unit pricing was implemented. Stuffing more trash in the bins, known as the “Seattle stomp” after the municipality where it was first noticed, is very common.
The failure here is the assumption that there is only one competing norm. In this case, there are a variety of ways to defect. And if the societal pressures only raise the cost of one competing norm, they could make the others more attractive. In this example, the trash fee didn't increase the cost of generating more trash; it merely increased the cost of generating more trash and putting that trash in trash cans. Directly targeting trash creation would be a better institutional pressure, but I can't think of any way a municipality could possibly make that work. On a larger scale, a disposal tax could be assessed when someone purchases a product. This would motivate product manufacturers to reduce packaging, or otherwise make their products more disposal-friendly, depending on the particulars of the tax. Of course, administering that would be difficult, and society would have to balance that cost with the benefit.
Misunderstanding the risk.
We don't make risk trade-offs based on actual risk; as the table below shows, we make them based on perceived risk. If we believe the scope of defection is higher or lower than it really is, we're not going to implement optimal societal pressures. And there are lots of ways we get risk wrong.
Natural Biases in Risk Perception

We exaggerate risks that are… | We downplay risks that are…
--- | ---
Spectacular | Pedestrian |
Rare | Common |
Personified | Anonymous |
Beyond our control | More under our control |
Externally imposed | Taken willingly |
Talked about | Not discussed |
Intentional or man-made | Natural |
Immediate | Long-term or diffuse |
Sudden | Evolving slowly over time |
Affecting us personally | Affecting others |
New and unfamiliar | Familiar |
Uncertain | Well understood |
Directed against children | Directed against adults |
Morally offensive | Morally desirable |
Entirely without redeeming features | Associated with some ancillary benefit |
This is all well-studied by psychologists. Current U.S. counterterrorism policy demonstrates these biases. Political scientist John Mueller wrote:
Until 2001, far fewer Americans were killed in any grouping of years by all forms of international terrorism than were killed by lightning, and almost none of those terrorist deaths occurred within the United States itself. Even with the September 11 attacks included in the count, the number of Americans killed by international terrorism since the late 1960s (which is when the State Department began counting) is about the same as the number of Americans killed over the same period by lightning, accident-causing deer, or severe allergic reaction to peanuts.
But that's not the way people think. Terrorism is rare, spectacular, beyond our control, externally imposed, sudden, new and unfamiliar, uncertain, potentially directed against our children, offensive, and entirely without redeeming features. For these and other reasons, we exaggerate the risk and end up spending much too much on security to mitigate it.
Another example is computer crime. It's pedestrian, common, slowly evolving, affecting others, increasingly familiar, and (at least by techies) well-understood. So it makes sense that we understate the risks and underfund security.
There are cultural biases to risk as well. According to one study conducted in 23 countries, people have a higher risk tolerance in cultures that avoid uncertainty or are individualistic, and a lower risk tolerance in cultures that are egalitarian and harmonious. Also, and this is particularly interesting, the wealthier a country is, the lower its citizens' tolerance for risk. Along similar lines, the greater the income inequality a society has, the less trusting its citizens are.
Creating a dilemma that encourages deception.
Think back to the two prisoners for a minute. Throughout this entire book, we've assumed that Alice and Bob are both actually guilty. What if they're not? Now, what is Alice's best strategy?
Disturbingly, it may still be in her best interest to confess and testify against Bob. Follow me here: if Bob lies and testifies against Alice, she is looking at either six or ten years in jail. Lying and testifying against Bob is the better choice for Alice: six years is better than ten. And if Bob remains silent, she's looking at either freedom or one year in jail. Again, lying is the better choice for Alice: freedom is better than one year in jail. By this analysis, both Alice and Bob fare best if they confess to crimes they did not commit in an attempt to get leniency for themselves while falsely accusing the other. To make matters worse, assume that Bob is innocent and Alice is guilty. It's still in Alice's interest to falsely testify against Bob.
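One way to check the dominance argument is to write out Alice's payoffs explicitly. Here is a minimal sketch using the sentence lengths from the scenario above; it verifies that testifying is better for Alice no matter what Bob does.

```python
# Alice's jail time in years, keyed by (Alice's choice, Bob's choice),
# using the sentences from the scenario in the text.
years = {
    ("testify", "testify"): 6,   # both falsely confess and accuse each other
    ("testify", "silent"): 0,    # Alice's testimony buys her freedom
    ("silent", "testify"): 10,   # Alice takes the full sentence
    ("silent", "silent"): 1,     # minor charge only
}

# Testifying strictly dominates staying silent for Alice.
for bob, label in (("testify", "Bob testifies"), ("silent", "Bob stays silent")):
    assert years[("testify", bob)] < years[("silent", bob)]
    print(f"{label}: Alice gets {years[('testify', bob)]} years by testifying, "
          f"{years[('silent', bob)]} by staying silent")
```

Since the testify row wins in both columns, the deal pressures Alice to confess even though she's innocent, and the same arithmetic applies to Bob.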
Of course, the risk trade-off is more complicated than that. Alice and Bob have to assess the prosecutor's case, and weigh the trade-off between their false confession and the hope that justice will prevail in the end. But as soon as the police offer Alice and Bob this deal, they increase the likelihood that one or both of them will confess to a crime they didn't commit. This is the reason that plea bargaining is illegal in many countries: it sets up perverse incentives. This can only be exacerbated by the surprising tendency of people to make false confessions.
Generalizing, we find that all sorts of unsavory people try to align themselves with the police in exchange for leniency for their own actions. This kind of thing can happen whenever people cooperate with a norm they don't believe in.
Accidentally making the costs of cooperation too high.
Recall Chapter 11, where we talked about people assisting the police. One of Alice's potential competing interests is that cooperating with the police is too difficult, time-consuming, or dangerous. So even if Alice wants to cooperate, the cost is too high and she's forced to defect. This is the reason laws requiring the police to enforce immigration laws are a bad idea. The last thing you want is for someone to be afraid to assist the police out of fear that he will be deported. Another example is rape: if the emotional cost of reporting a rape and helping prosecute the rapist is too high, women will not come forward. In general, there is a cost associated with cooperating. If we want to limit defections, we need to limit the costs of cooperation, and/or increase its benefits.
Accidentally increasing the incentive to defect.
The point of societal pressure is to induce cooperation. Sometimes the results are backwards, and societal pressure induces defection instead. Again, an example will explain this. Currently in the United States, standardized student testing has incredible influence over the future fates of students, teachers, and schools. Under a law called the No Child Left Behind Act, students have to pass certain tests; if they don't pass, their schools are penalized. In the District of Columbia, the school system offered teachers $8,000 bonuses for improving test scores, and threatened them with termination for failing to do so. Scores did increase significantly during that period, and the schools were held up as examples of how incentives affect teachers' behavior.