How We Know What Isn’t So

The Fallibility of Human Reason in Everyday Life

Thomas Gilovich

THE FREE PRESS

A Division of Simon & Schuster Inc.
1230 Avenue of the Americas
New York, NY 10020
www.SimonandSchuster.com

 

Copyright © 1991 by Thomas Gilovich

All rights reserved,
including the right of reproduction
in whole or in part in any form.

THE FREE PRESS and colophon are trademarks
of Simon & Schuster Inc.

First Free Press Paperback Edition 1993

Manufactured in the United States of America

20 19 18 17

Library of Congress Cataloging-in-Publication Data

Gilovich, Thomas.

How we know what isn’t so: the fallibility of human reason in everyday life / Thomas Gilovich.

p.  cm.

Includes bibliographical references and index.

ISBN 0-02-911706-2

ISBN-13: 978-0-0291-1706-4

 

eISBN-13: 978-1-4391-0674-7

 

1. Reasoning (Psychology)  2. Judgment.  3. Evidence.  4. Error.  5. Critical thinking.  6. Fallacies (Logic)  I. Title.

BF442.G55  1991

153.4′3-dc20    90-26727    CIP

To Karen and Ilana

 
Contents
 

Acknowledgments

 

1. Introduction

 

PART ONE
Cognitive Determinants of Questionable Beliefs

2. Something Out of Nothing: The Misperception and Misinterpretation of Random Data

 

3. Too Much from Too Little: The Misinterpretation of Incomplete and Unrepresentative Data

 

4. Seeing What We Expect to See: The Biased Evaluation of Ambiguous and Inconsistent Data

 

PART TWO
Motivational and Social Determinants of Questionable Beliefs

5. Seeing What We Want to See: Motivational Determinants of Belief

 

6. Believing What We Are Told: The Biasing Effects of Secondhand Information

 

7. The Imagined Agreement of Others: Exaggerated Impressions of Social Support

 

PART THREE
Examples of Questionable and Erroneous Beliefs

8. Belief in Ineffective “Alternative” Health Practices

 

9. Belief in the Effectiveness of Questionable Interpersonal Strategies

 

10. Belief in ESP

 

PART FOUR
Where Do We Go from Here?

11. Challenging Dubious Beliefs: The Role of Social Science

 

Notes

 

Index

 
Acknowledgments
 

Four people made unusually significant contributions to this work and deserve special thanks. Lee Ross commented on drafts of many of the chapters and provided a number of his uniquely illuminating insights on the phenomena at hand. Beyond that, I would like to thank Lee simply for being Lee—for being the most interesting “intuitive psychologist” I know, and for making the discussion of people and their commerce through everyday life so enjoyable. Karen Dashiff Gilovich read every word of this book and at times seemed to have something to say about nearly every one. She was in many respects my most challenging critic, but, as always, she delivered her critiques in the most loving, disarming, and helpful ways. I owe Dennis Regan and Daryl Bem a great debt for the helpful feedback they provided on earlier drafts and for their encouragement throughout the project.

Various chapters were improved by the comments of numerous people, and I would like to express my sincere thanks to all: Robert Frank, Mark Frank, David Hamilton, Robert Johnston, David Myers, James Pennebaker, Barbara Strupp, Richard Thaler, and Elaine Wethington. To protect them from blame for any wrong-headed ideas presented in this book, the usual disclaimers about ultimate responsibility apply.

Finally, I would like to thank the National Institute of Mental Health for the generous financial support that made possible much of my own research that is reported in this book, and Susan Milmoe of The Free Press for her enthusiasm and assistance during the past eighteen months.

1
Introduction
 

It ain’t so much the things we don’t know that get us into trouble. It’s the things we know that just ain’t so.

Artemus Ward

 

It is widely believed that infertile couples who adopt a child are subsequently more likely to conceive than similar couples who do not. The usual explanation for this remarkable phenomenon involves the alleviation of stress. Couples who adopt, it is said, become less obsessed with their reproductive failure, and their new-found peace of mind boosts their chances for success.

On closer inspection, however, it becomes clear that the remarkable phenomenon we need to explain is not why adoption increases a couple’s fertility; clinical research has shown that it does not.[1] What needs explanation is why so many people hold this belief when it is not true.

People who are charged with deciding who is to be admitted to a distinguished undergraduate institution, a prestigious graduate school, or a select executive training program all think they can make more effective admissions decisions if each candidate is seen in a brief, personal interview. They cannot. Research indicates that decisions based on objective criteria alone are at least as effective as those influenced by subjective impressions formed in an interview.[2] But then why do people believe the interview to be informative?

Nurses who work on maternity wards believe that more babies are born when the moon is full. They are mistaken.[3] Again, why do they believe it if it “just ain’t so”?

This book seeks to answer these questions. It examines how questionable and erroneous beliefs are formed, and how they are maintained. As the examples above make clear, the strength and resiliency of certain beliefs cry out for explanation. Today, more people believe in ESP than in evolution,[4] and in this country there are 20 times as many astrologers as there are astronomers.[5] Both formal opinion polls and informal conversation reveal widespread acceptance of the reality of astral projection, of the authenticity of “channeling,” and of the spiritual and psychic value of crystals. This book attempts to increase our understanding of such beliefs and practices, and, in so doing, to shed some light on various broader issues in the study of human judgment and reasoning.

Several things are clear at the outset. First, people do not hold questionable beliefs simply because they have not been exposed to the relevant evidence. Erroneous beliefs plague experienced professionals and less informed laypeople alike. In this respect, the admissions officials and maternity ward nurses should “know better.” They are professionals. They are in regular contact with the data. But they are mistaken.

Nor do people hold questionable beliefs simply because they are stupid or gullible. Quite the contrary. Evolution has given us powerful intellectual tools for processing vast amounts of information with accuracy and dispatch, and our questionable beliefs derive primarily from the misapplication or overutilization of generally valid and effective strategies for knowing. Just as we are subject to perceptual illusions in spite of, and largely because of, our extraordinary perceptual capacities, so too are many of our cognitive shortcomings “closely related to, or even an unavoidable cost of, [our] greatest strengths.”[6] And just as the study of perceptual illusions has illuminated general principles of perception, and the study of psychopathology has enhanced our knowledge of personality, so too should the study of erroneous beliefs enlarge our understanding of human judgment and reasoning. By design, then, this book dwells on beliefs that are wrong, but in doing so we must not lose sight of how often we are right.

As these remarks suggest, many questionable and erroneous beliefs have purely cognitive origins, and can be traced to imperfections in our capacities to process information and draw conclusions. We hold many dubious beliefs, in other words, not because they satisfy some important psychological need, but because they seem to be the most sensible conclusions consistent with the available evidence. People hold such beliefs because they seem, in the words of Robert Merton, to be the “irresistible products of their own experience.”[7] They are the products, not of irrationality, but of flawed rationality.

So it is with the erroneous belief that infertile couples who adopt are subsequently more likely to conceive. Our attention is automatically drawn to couples who conceive after adopting, but not to those who adopt but do not conceive, or those who conceive without adopting. Thus, to many people, the increased fertility of couples who adopt a child is a “fact” of everyday experience. People do not hold this belief because they have much of an emotional stake in doing so; they do so because it seems to be the only sensible conclusion consistent with the information that is most available to them.
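The 2 x 2 logic of this mistake can be made concrete with a short simulation. The sketch below is purely illustrative and not from the book; the 20 percent conception rate and the fifty-fifty adoption split are invented numbers. It generates couples whose chance of conceiving is identical whether or not they adopt, prints the full table (which shows no association), and then counts only the salient “adopted, then conceived” cases that everyday attention tends to register.

```python
import random

# Hypothetical illustration (invented rates, not data from the book):
# conception is equally likely (20%) whether or not a couple adopts,
# so adoption has no real effect on fertility.
random.seed(1)
CONCEIVE_RATE = 0.20

couples = []
for _ in range(10_000):
    adopted = random.random() < 0.5               # half the couples adopt
    conceived = random.random() < CONCEIVE_RATE   # same rate either way
    couples.append((adopted, conceived))

# The full 2 x 2 table shows no association between adopting and conceiving...
for group in (True, False):
    n = sum(1 for a, c in couples if a == group)
    k = sum(1 for a, c in couples if a == group and c)
    print(f"adopted={group}: {k}/{n} conceived ({k / n:.1%})")

# ...but everyday attention and word of mouth mostly register a single cell:
# the memorable "they adopted, then conceived" stories.
memorable = sum(1 for a, c in couples if a and c)
print(f"adopted-then-conceived stories people hear about: {memorable}")
```

Run as written, the table comes out flat (roughly 20 percent conceive in both rows), yet there are still about a thousand memorable adopted-then-conceived cases, which is the raw material from which the false “fact” gets built.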

Many of these imperfections in our cognitive and inferential tools might never surface under ideal conditions (just as many perceptual illusions are confined to impoverished settings). But the world does not play fair. Instead of providing us with clear information that would enable us to “know” better, it presents us with messy data that are random, incomplete, unrepresentative, ambiguous, inconsistent, unpalatable, or secondhand. As we shall see, it is often our flawed attempts to cope with precisely these difficulties that lay bare our inferential shortcomings and produce the facts we know that just ain’t so.

Returning to the infertility example once again, we can readily see how the world does not play fair. Couples who conceive after adopting are noteworthy. Their good fortune is reported by the media, transmitted by friends and neighbors, and therefore is more likely to come to our attention than the fate of couples who adopt but do not conceive, or those who conceive without adopting. Thus, even putting our own cognitive and inferential limitations aside, there are inherent biases in the data upon which we base our beliefs, biases that must be recognized and overcome if we are to arrive at sound judgments and valid beliefs.

In tackling this subject of questionable and erroneous beliefs, I continue the efforts of many social and cognitive psychologists who in the past several years have sought to understand the bounded rationality of human information processing. Part I of this book, “Cognitive determinants of questionable beliefs,” contains three chapters that analyze our imperfect strategies for dealing with the often messy data of the real world. Chapter 2 concerns random data and our tendency to see regularity and order where only the vagaries of chance are operating. Chapter 3 deals with incomplete and unrepresentative data and our limited ability to detect and correct for these biases. Chapter 4 discusses our eagerness to interpret ambiguous and inconsistent data in light of our pet theories and a priori expectations.
