How We Know What Isn't So
Thomas Gilovich


4
Seeing What We Expect to See
 
The Biased Evaluation of Ambiguous and Inconsistent Data
 

I’ll see it when I believe it.

Slip of the tongue by psychologist Thane Pittman

 

Life is a series of trade-offs. For every benefit gained, there is usually some cost. If we increase our speed on most tasks, we generally lose accuracy; to increase precision, we must slow down. If a successful business expands, it is likely to suffer a decline in the informality and access to the boss that may have been a large part of its initial success. Human beings are blessed with unsurpassed intelligence, but biologists tell us that getting the large brains responsible for that intelligence through the narrow birth canal requires that we be born prematurely and that we suffer an unusually long infancy of uncommon helplessness as a result.[1]

Trade-offs are apparent in everyday judgment and reasoning as well. When making judgments and decisions, we employ a variety of informal rules and strategies that simplify fundamentally difficult problems and allow us to solve them without excessive effort and stress. These strategies are generally effective, but the benefit of simplification is paid for at the cost of occasional systematic error. There is, in other words, an ease/accuracy trade-off in human judgment.

The tendency to make judgments by “representativeness” that was described in Chapter 2 is a good example. Among other things, to reiterate, representativeness leads to the belief that causes resemble their effects: Big effects should have big causes, complex effects should have complex causes, and so on. This assumption contains some truth, and so it generally facilitates causal reasoning by narrowing the number of potential causes to consider. But not all causes resemble their effects (again, tiny viruses cause enormous epidemics), and an over-reliance on this assumption can lead people to ignore important causal relations and to “detect” some that are not there. Thus, the very same principle that permits us to make judgments with apparent ease and considerable success can also be responsible for some of our systematic errors.

No feature of human judgment and reasoning illustrates this trade-off of advantage and disadvantage better than the tendency for our expectations, preconceptions, and prior beliefs to influence our interpretation of new information. When examining evidence relevant to a given belief, people are inclined to see what they expect to see, and conclude what they expect to conclude. Information that is consistent with our pre-existing beliefs is often accepted at face value, whereas evidence that contradicts them is critically scrutinized and discounted. Our beliefs may thus be less responsive than they should be to the implications of new information.

APPROPRIATE AND INAPPROPRIATE BIAS
 

At first blush, such uneven treatment of new information strikes most people as completely unjustified and potentially pernicious. It conjures up images, for example, of closed-minded people disregarding a person’s individual characteristics in deference to some invalid ethnic, gender, or occupational stereotype; it brings to mind examples of individuals and groups blindly adhering to outmoded dogma. To be sure, the tendency to evaluate evidence in a biased manner can have deleterious consequences and, as we shall see, it serves to bolster a great many questionable and erroneous beliefs. On closer inspection, however, the question of how impartial we should be in evaluating information that confirms or refutes our preconceptions is far more subtle and complicated than most people realize.

The issue is complex because it is also inappropriate and misguided to go through life weighing all facts equally and reconsidering one’s beliefs anew each time an antagonistic fact is encountered. If a belief has received a lifetime of support, one is perfectly justified in being skeptical of an observation or report that calls the belief into question, and in readily accepting evidence that supports its validity. The skepticism of scientists who doubted the reports of cold fusion was entirely appropriate because it was based upon a solid theoretical foundation that specified what events are likely and unlikely, possible and impossible. Each of us is equally justified in looking askance at claims about UFOs, levitations, and miracle cancer cures. Events that challenge a broadly based and time-tested body of knowledge should be treated cautiously; those that fit with pre-existing knowledge can be accepted more freely. To clarify with a rather extreme example, consider two headlines: “Soviet Republic Votes for Secession” and “Statue of Elvis Found on Mars.” Surely we need not treat the two reports with equal seriousness.

As soon as we accept the legitimacy of treating new information unevenly, however, we worry about it being taken too far. How do we distinguish between the legitimate skepticism of those who scoffed at cold fusion, and the stifling dogma of the seventeenth-century clergymen who, doubting Galileo’s claim that the earth was not the center of the solar system, put him under house arrest for the last eight years of his life? In part, the answer lies in the distinction between skepticism and closed-mindedness. Many scientists who were skeptical about cold fusion nevertheless tried to replicate the reported phenomenon in their own labs; Galileo’s critics refused to look at the pertinent data. Equally important, however, is the foundation on which a person’s pre-existing beliefs and theories rest. We are justified in allowing our beliefs and theories to influence our assessments of new information in direct proportion to how plausible and well-substantiated they are in the first place. One need not feel concerned about quickly dismissing a purported levitation because our faith in the inexorable effect of gravity has been built up by a lifetime of consistent experience. Well-supported beliefs and theories have earned a bit of inertia, and should not be easily modified or abandoned because of isolated antagonistic “facts.” In marked contrast, many ethnic, gender, and occupational stereotypes are particularly troublesome because they often rest on such flimsy or non-existent evidence to begin with.
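The logic of proportioning skepticism to prior plausibility can be made concrete with a back-of-the-envelope Bayesian calculation. The framing and every number below are my illustration, not the author's: the very same supporting report that all but settles a mundane question barely moves a claim that a lifetime of experience has made vanishingly improbable.

    def posterior(prior, p_report_if_true, p_report_if_false):
        # Bayes' rule: probability the claim is true, given one supporting report.
        evidence = prior * p_report_if_true + (1 - prior) * p_report_if_false
        return prior * p_report_if_true / evidence

    # A hypothetical reporter: notices real events 80% of the time, but
    # also errs or embellishes 5% of the time when nothing happened.
    print(posterior(0.5, 0.80, 0.05))    # mundane claim, prior 0.5 -> ~0.94
    print(posterior(1e-6, 0.80, 0.05))   # levitation, one-in-a-million prior -> ~0.000016

On these assumed numbers, identical testimony raises a coin-flip prior to near certainty but leaves the levitation claim at roughly sixteen in a million, which is just the "earned inertia" of well-supported beliefs expressed in arithmetic.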

All of this is to say that the question of how even-handed we should be in evaluating evidence is rather complex. Not all bias is a bad thing; indeed, a certain amount is absolutely essential. The power and flexibility with which we reason depends upon our ability to use context, generic knowledge, and pre-existing information to disambiguate and extract meaning from new information—and, to some degree, to bias our interpretation of evidence. Consider, for example, the newspaper headline, “Mondale’s offensive looks hard to beat.”[2] Nothing in the words themselves allows us to determine whether it is referring to Mondale’s campaign strategy or to his physical appearance. Nevertheless, our pre-existing knowledge of what is and is not plausible allows us to quickly and effortlessly draw the correct conclusion.

Note, however, that it has proven extremely difficult to program even the most advanced computers to make such “simple” inferences.[3] Thus, without this ability to use context and expectations to “go beyond the information given,”[4] we would be unintelligent in the same way that computers with superior computational capacity are unintelligent. As dysfunctional as they may be on occasion, our theories, preconceptions, and “biases” are what make us smart.

THE PATH OF BIAS
 

Ambiguous Information. Our expectations can bias our evaluation of new information in two ways, depending largely on whether or not the information is ambiguous. Truly ambiguous information is often simply perceived in a way that fits our preconceptions. Consider how the stimulus “13” is differently perceived in the context of “12, 13, 14” versus “A, 13, C”: flanked by numbers, the figure reads as the number thirteen; flanked by letters, the very same shape is readily seen as the letter B. Similarly, the same smile can look warm and friendly when it is worn by someone we like, but smug or sinister when worn by someone we consider untrustworthy.
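A toy sketch (mine, not the book's) makes the mechanism concrete: the glyph itself never changes, but the category established by its neighbors determines how it is read.

    def read_glyph(left, right):
        # The ambiguous B/13 glyph carries no answer by itself; the
        # neighbors set up the expectation that does the disambiguating.
        if left.isdigit() and right.isdigit():
            return "13"   # numeric context: seen as a number
        if left.isalpha() and right.isalpha():
            return "B"    # alphabetic context: seen as a letter
        return "ambiguous"

    print(read_glyph("12", "14"))   # -> 13
    print(read_glyph("A", "C"))     # -> B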

A particularly interesting example of how our expectations can influence what we see involves people’s negative associations to the color black and how they can influence the perceived aggressiveness of someone wearing black clothing. The “bad guys” have worn black hats since the invention of motion pictures, and psychological research has shown that film directors who employ this tactic are capitalizing on a very basic psychological phenomenon: Surveys conducted in a wide range of cultures reveal that black is seen as the color of evil and death in virtually all corners of the world.

This negative association leads to several interesting results in the domain of professional sports. When my colleague Mark Frank and I asked a group of respondents to rate the appearance of professional football and hockey uniforms, they judged those that were at least half black to be the most “bad,” “mean,” and “aggressive” looking. These perceptions influence, in turn, how specific actions performed by black-uniformed teams are viewed. We showed groups of trained referees one of two videotapes of the same aggressive play in a football scrimmage, one with the aggressive team wearing white and one with it wearing black. The referees who saw the black-uniformed version rated the play as much more aggressive and more deserving of a penalty than those who saw the white-uniformed version. The referees “saw” what this common negative association led them to expect to see. As a result of this bias, it is not surprising to learn that teams that wear black uniforms in these two sports have been penalized significantly more than average during the last two decades.[5]

Unambiguous Information. Our expectations can also slant our evaluations of unambiguous information, but in a rather different manner. In evaluating more clear-cut information, our perceptions are rarely so distorted that information that completely contradicts our expectations is seen as supportive. Nor do we simply ignore contradictory information and pay attention only to that which supports our preconceptions. Rather, our expectations have their effects through the way we subject inconsistent information to more critical scrutiny than consistent information; through the way we seek out additional information only when the initial outcomes are inconsistent with our expectations; and—more generally—through the way we assign meaning to new information. People place a premium on being rational and cognitively consistent, and so they are reluctant to simply disregard pertinent evidence in order to see what they expect to see and believe what they expect to believe. Instead, people subtly and carefully “massage” the evidence to make it consistent with their expectations. (A similar argument is made in Chapter 5 about the biasing effects of our motivations.)

This point is effectively illustrated by a study in which proponents and opponents of the death penalty were exposed to evidence concerning the deterrent efficacy of capital punishment.[6] Both groups read summaries of the procedures, results, and critiques of two relevant studies. One study provided evidence supporting the deterrent efficacy of capital punishment and the other provided evidence against it. For half the participants, the study supporting capital punishment compared homicide rates in the same state before and after capital punishment, and the study refuting its deterrent efficacy compared homicide rates in different states, some with capital punishment and others without. For the other participants, the types of studies supporting and refuting capital punishment were reversed. Thus, for both proponents and opponents of capital punishment, half had their expectations supported by one type of study and opposed by the other, and the other half were exposed to the opposite pattern of data.

The results of this experiment were striking. The participants considered the study that provided evidence consistent with their prior beliefs—regardless of what type of study that was—to be a well-conducted piece of research that provided important evidence concerning the effectiveness of capital punishment. In contrast, they uncovered numerous flaws in the research that contradicted their initial beliefs. The net effect of these two results was that the participants’ attitudes became polarized: Exposure to a mixed body of evidence made both sides even more convinced of the fundamental soundness of their original beliefs.

Now consider what the participants in this experiment did not do. They did not misconstrue the evidence against their position as being more favorable than it really was. They correctly saw hostile findings as hostile findings. Nor did the participants simply ignore these negative results. Instead, they carefully scrutinized the studies that produced these unwanted and unexpected findings, and came up with criticisms that were largely appropriate. Rather than ignoring outright the evidence at variance with their expectations, the participants cognitively transformed it into evidence that was considered relatively uninformative and could be assigned little weight. Thus, the participants’ expectations had their effect not through a simple process of ignoring inconsistent results, but through a more complicated process that involved a fair amount of cognitive effort.
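A toy simulation (my illustration, not part of the original study) shows how this “massaging” produces polarization: two agents with opposite priors see the identical mixed evidence, but each takes the congenial study at face value and assigns the uncongenial one little weight, so both drift further apart. The weights and step sizes are assumptions chosen only to exhibit the effect.

    def update(belief, evidence, congenial):
        # Congenial evidence is taken at face value; uncongenial evidence
        # is scrutinized and assigned little weight (0.2 here, by assumption).
        weight = 1.0 if congenial else 0.2
        return max(-1.0, min(1.0, belief + weight * evidence))

    for start in (+0.3, -0.3):               # a proponent and an opponent
        belief = start
        for evidence in (+0.2, -0.2):        # one pro study, one anti study
            congenial = (evidence > 0) == (belief > 0)
            belief = update(belief, evidence, congenial)
        print(f"start {start:+.1f} -> end {belief:+.2f}")
    # start +0.3 -> end +0.46; start -0.3 -> end -0.46: the same mixed
    # evidence leaves both sides more extreme than they began.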

This point is illustrated even more directly by research conducted in my own laboratory on the tendency of gamblers to evaluate outcomes in a biased manner.[7] This research began with the question of why gamblers persist in such an unrewarding enterprise. Why do gamblers believe, despite all their previous losses, that success is just around the corner? One might have predicted that they do so by remembering their successes and forgetting or repressing their failures. However, the actual state of affairs is more complicated. Gamblers do revise their personal histories of success and failure, but they do so in a way that is more subtle, and rather interesting.
