How We Know What Isn't So
Thomas Gilovich

Our motivations thus influence our beliefs through the subtle ways we choose a comforting pattern from the fabric of evidence. One of the simplest and yet most powerful ways we do so lies in how we frame the very question we ask of the evidence. When we prefer to believe something, we may approach the relevant evidence by asking ourselves, “what evidence is there to support this belief?” If we prefer to believe that a political assassination was not the work of a lone gunman, we may ask ourselves about the evidence that supports a conspiracy theory. Note that this question is not unbiased: It directs our attention to supportive evidence and away from information that might contradict the desired conclusion. Because it is almost always possible to uncover *some* supportive evidence, the asymmetrical way we frame the question makes us overly likely to become convinced of what we hope to be true.

Kunda and her students have collected evidence indicating that our preferences lead us to test hypotheses that are slanted toward confirmation in precisely this way. In one study, participants were led to believe that either introversion or extroversion was related to academic success.[24] Not surprisingly, those who were led to believe that introversion was predictive of success thought of themselves as more introverted than those who were led to believe that extroversion was associated with success. More important, when asked to recall autobiographical events relevant to introversion/extroversion, those who were led to believe in the importance of introversion recalled more incidents of introversion, and they did so with greater speed. Those who were led to believe in the value of extroversion, in contrast, recalled more incidents of extroversion, and they did so more quickly. Establishing a preference for one of these traits thus made it easier to generate evidence consistent with that trait. It seems that the preference led participants to formulate and test an asymmetrical hypothesis that was biased toward confirmation.

A second way in which our motives influence the kind of evidence we entertain involves whose opinions, expert or otherwise, we consult. We can often anticipate other people’s general beliefs and overall orientations, and thus can predict with some accuracy their views on a particular question. By judiciously choosing the right people to consult, we can increase our chances of hearing what we want to hear. Smokers can discuss their habit’s health risks with other smokers; Nixon fans can explore the “real meaning” of the Watergate scandal with those of similar ideological bent. There are a number of physiologists at Cornell who differ in their assessments of the importance of dietary fat as a determinant of serum cholesterol and arteriosclerosis. This variability in expert opinion gives members of the Cornell community an opportunity to find support for whatever eating practices they wish. Those who need to justify the lost opportunities brought on by an austere diet can talk with someone willing and able to describe the latest studies testifying to the evils of dietary fat; those with an appetite for Continental cuisine can talk with someone eager to discuss the critical flaws of those very same studies. We seek opinions that are likely to support what we want to be true.

People’s preferences influence not only the *kind* of information they consider, but also the *amount* they examine. When the initial evidence supports our preferences, we are generally satisfied and terminate our search; when the initial evidence is hostile, however, we often dig deeper, hoping to find more comforting information, or to uncover reasons to believe that the original evidence was flawed. By taking advantage of “optional stopping” in this way, we dramatically increase our chances of finding satisfactory support for what we wish to be true.[25]

Consider a student who has performed poorly on an exam and wants desperately to believe that the test was unfair. The student may initially seek support for this interpretation by trying to recall specific questions that were ambiguous. If examples of ambiguity can be found, the student rests his case: the exam was unfair. If no such examples can be recalled, however, the search for supportive evidence continues. Maybe other students thought it was unfair! Again, if a number of like-minded others can be found, the test is deemed to be unfair; if not, then still further evidence is sought. Perhaps the student will think of all the things he learned in the course that were *not* tested, and therefore conclude that the test was unfair because it did not adequately cover all the course material. By considering a number of different sources of evidence and declaring victory whenever supportive data are obtained, the person is likely to end up spuriously believing that his or her suspicion is valid.

To illustrate further, consider a discussion I recently heard between two prominent psychologists concerning the severity of the AIDS risk among the heterosexual, non-drug-using population. One was arguing that the risks were overstated, whereas the other thought they were indeed so severe that they would soon bring about widespread changes in social life as we know it. Their opinions, furthermore, mirrored their preferences. One fervently wanted the sexual revolution to continue, and the other, someone who has lived a happy, monogamous life for some time, would just as soon see this era pass (in his words, “AIDS is not God’s punishment for licentiousness, but His way of reducing dissonance for sexual monogamy”). How did their divergent preferences influence how they arrived at, and how they justified, their ultimate beliefs? It is doubtful that their predilections led them simply to see things their way, with little attention to the relevant evidence. The consequences of ignoring reality are too great (indeed, in this case potentially fatal) for such a cavalier regard for the way things really are. However, their preferences did influence the kind of evidence each considered, as well as the *amount* they considered.

The person worried about the end of the sexual revolution began the discussion by noting the small number of drug-free heterosexuals in the United States who have contracted AIDS, and assumed that this was decisive. Jarred out of premature security, however, by the other person’s statistics regarding AIDS transmission among heterosexuals in central and east Africa, he was momentarily concerned. But only momentarily. He proceeded to dig deeper into the matter, eventually finding solace in the fact that the state of public health in central Africa is so different from that in the United States that such information is not terribly informative. (“So many people there have open sores due to untreated venereal disease that of course AIDS is readily transmitted heterosexually.”)

The important point here is that although evidence and reality constrain our beliefs, they do not do so completely. For nearly all complex issues, the evidence is fraught with ambiguity and open to alternative interpretation. One way that our desires or preferences serve to resolve these ambiguities in our favor is by keeping our investigative engines running until we uncover information that permits a conclusion that we find comforting.

More generally, it is clear that we tend to use different criteria to evaluate propositions or conclusions we desire and those we abhor. For propositions we want to believe, we ask only that the evidence not force us to believe otherwise—a rather easy standard to meet, given the equivocal nature of much information. For propositions we want to resist, however, we ask whether the evidence compels such a distasteful conclusion—a much more difficult standard to achieve. For desired conclusions, in other words, it is as if we ask ourselves, “*Can* I believe this?”, but for unpalatable conclusions we ask, “*Must* I believe this?” The evidence required for affirmative answers to these two questions is enormously different. By framing the question in such ways, however, we can often believe what we prefer to believe, and satisfy ourselves that we have an objective basis for doing so.

*Optimistic Self-assessments and Self-based Definitions of Ability.* To consider a particularly intriguing example of how we juggle criteria to arrive at comforting conclusions, let us return to the previously discussed tendency for people to make unduly favorable assessments of their own abilities. Recall that, on average, people think of themselves as being much better than average. Part of the reason, it seems, is that different people use different criteria to evaluate their standing on a given trait—criteria that work to their own advantage. As economist Thomas Schelling explains, “… everybody ranks himself high in qualities he values: careful drivers give weight to care, skillful drivers give weight to skill, and those who think that, whatever else they are not, at least they are polite, give weight to courtesy, and come out high on their own scale. This is the way that every child has the best dog on the block.”[26] By basing our definitions of what constitutes being, say, athletic, intelligent, or generous on our own idiosyncratic strengths on these dimensions, almost all of us can think of ourselves as better than average and have some “objective” justification for doing so.

Several recent experiments indicate that such self-based definitions of ability are largely responsible for this “Lake Wobegon effect.” First, it has been shown that people are particularly inclined to think of themselves as above average on ambiguous traits—those for which the definition of what constitutes excellence can most readily be construed in self-serving ways. People rate themselves more favorably on amorphous traits like sensitivity and idealism (at the 73rd percentile, on average) than on relatively straightforward traits like thriftiness and being well-read (48th percentile). Further evidence was obtained in an experiment in which a group of university students was asked to rate how important a variety of academic skills (e.g., public speaking, math) and personal characteristics (e.g., creativity, meticulousness) are in determining success in college. The students were also asked to rate their own standing on these characteristics. As expected, the students tended to think that the characteristics at which they excelled were most important in determining what constitutes a successful college student. Finally, it has been shown that the tendency for people to think of themselves as above average is reduced—even for ambiguous traits—when people are required to use specific definitions of each trait in their judgments.[27]

This research effectively illustrates how we juggle different criteria to arrive at conclusions we favor.* As strong as our wishes or motives may sometimes be, they rarely lead us simply to see the world the way we would like to see it. To do so would invite pathology. It would require that we pay an excessively high price in cognitive inconsistency and in the ability to get along effectively in the world. Instead, we accomplish the same motivational goals more subtly by skewing the meaning we assign to the information we take in from the world. There are alternative ways of interpreting or “framing” what we encounter around us, and we seem to be fairly adept at finding a frame that is comforting. (Indeed, some evidence has accumulated that people who habitually fail to put the most favorable cast on their circumstances run the risk of depression.[28]) It is in these relatively subtle shifts of criteria and interpretation that many of the most significant effects of the wish to believe can be found.

EPILOGUE: BELIEFS AS POSSESSIONS
 

A supplementary perspective on how our preferences influence what we believe can be obtained by considering a useful metaphor offered by psychologist Robert Abelson, who argues that “beliefs are like possessions.”[29] We acquire and retain material possessions because of the functions they serve and the value they offer. To some extent, the same can be said of our beliefs: We may be particularly inclined to acquire and retain beliefs that make us feel good.

As Abelson notes, the similarity between beliefs and possessions is captured in our language. First of all, a person is said to “have” a belief, and this ownership connotation is maintained throughout a belief’s history, from the time it is “obtained” to the time it is “discarded.” We describe the formation of beliefs with numerous references to possession, as when we say that “I *adopted* the belief,” “he *inherited* the view,” “she *acquired* her conviction,” or, if a potential belief is rejected, “I don’t *buy* that.” When someone believes in something, we refer to the fact that “she *holds* a belief,” or “he *clings* to his belief.” When a belief is “given up,” we state that “he *lost* his belief,” “she *abandoned* her convictions,” or “I *disown* my earlier stand.”

This metaphor sharpens our understanding of the formation and maintenance of beliefs in a number of ways. First, we are quite possessive and protective of our beliefs, as we are of our material possessions. When someone challenges our beliefs, it is as if someone criticized our possessions. We might no longer associate with that person, or we might seek solace and confirmation from others with similar beliefs. As with possessions, in other words, “one shows off one’s beliefs to people one thinks will appreciate them, not to those who are likely to be critical.”[30] Alternatively, we might respond to a challenge or criticism by thinking of compensatory features (“True, it is not very stylish, but I bought it for the gas mileage.”/“True, the raw statistics might seem to contradict me, but if you look at the intangibles…”); or by shielding it from public view (“Maybe we should move the watercolor from the living room to the upstairs bedroom.”/“My beliefs work for me, why should I have to justify them to those people?”).
