
At the outset the likelihood you will choose the incorrect door is 2/3. Let’s suppose the prize is behind door 3, and you chose door 1. If the host opens door 2 to reveal a goat and asks you if you’d like to switch, it would behoove you to do so, because the only remaining option is the correct door (3). The same logic holds if you had initially chosen door 2 and the host opened door 1. So 2/3 of the time switching doors guarantees the prize. In the other 1/3 of the cases, when the correct door was chosen at the outset, changing doors will lead you to a goat. But, clearly, switching is the way to go because it leads to a prize 66.7 percent of the time and a goat 33.3 percent of the time. One reason the problem is counterintuitive is that it seems that since our choice is random, switching should not matter; with only two doors left our chances should be 50/50, whether we switch or not, which would be true if not for Mr. Hall. The catch is that Monty Hall, in opening a door, has broken one of the cardinal assumptions about probabilities: he did something that was not at random. While we choose doors at random, he does not reveal them at random; he always shows us one with a goat, never the one with the prize. In doing so he has injected new information into the game, without altering our belief that the rules have stayed the same.
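
If the reasoning still feels slippery, a quick simulation makes the 2/3 figure concrete. The short Python sketch below is my own illustration, not something from the book; it simply plays the game many times under the rules described above, with the host always opening an unchosen door that hides a goat.

    import random

    def play(switch, trials=100_000):
        # Play many rounds and return the fraction of wins for the given strategy.
        wins = 0
        for _ in range(trials):
            prize = random.randrange(3)    # door hiding the prize
            choice = random.randrange(3)   # contestant's initial pick
            # The host opens a door that is neither the pick nor the prize.
            # (When the pick IS the prize door, which of the two goat doors
            # he opens makes no difference to the outcome.)
            opened = next(d for d in range(3) if d != choice and d != prize)
            if switch:
                # Switch to the one remaining closed door.
                choice = next(d for d in range(3) if d != choice and d != opened)
            wins += (choice == prize)
        return wins / trials

    print("stay:  ", play(switch=False))   # comes out near 1/3
    print("switch:", play(switch=True))    # comes out near 2/3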

Even though Marilyn vos Savant gave the correct answer in her column, she was inundated with letters, including about a thousand from people with PhDs (many in mathematics and statistics), chastising her for propagating innumeracy. The debate received a surprising amount of national attention and a front-page article in The New York Times.[22]

Probability seems to be in a class by itself when it comes to mental blind spots and cognitive biases. Our intuition simply appears to run perpendicular to the rules of probability theory. This may be in part because the assumptions on which probability theory is built were unlikely to be met in natural settings.[23]
Consider the gambler’s fallacy: our intuition tells us that if the roulette wheel landed on a red for the last five consecutive plays, we might want to place a bet on black, since it’s “due.” But then we did not evolve to optimize bets in casinos. Steven Pinker points out that “in any world but a casino, the gambler’s fallacy is rarely a fallacy. Indeed, calling our intuitive predictions fallacious because they fail on gambling devices is backwards. A gambling device is, by definition, a machine designed to defeat our intuitive predictions. It is like calling our hands badly designed because they make it hard to get out of handcuffs.”[24]

Determining the probability that the roulette wheel will turn up black or red requires spinning it many times. Similarly, calculating the probability that a coin will land heads-up requires flipping it many times. But there is an additional implicit assumption: the properties of the roulette wheel or coin will not change over time. Coins don’t adapt or learn; they satisfy the condition of stationarity. We can safely assume that the “behavior” of the coin will be the same tomorrow as it is today. But in the natural world things are always changing. If my enemy shot 10 arrows at me and all were way off target, I’d be ill-advised to assume that the next 10 arrows will be equally harmless. Nature changes, and people and animals learn; the assumptions that are valid today are not necessarily valid tomorrow. Furthermore, in many ecologically realistic conditions we are not interested in the probability something will happen; what we care about is whether or not it will happen this one time. Will I survive if I swim across a crocodile-infested river? Will I survive if I’m bitten by the snake that just crossed my path? There are many things we don’t want to try again and again simply to establish a realistic estimate of the probability.

Perhaps one of the most famous examples of probability biases comes from cases in which people are asked to estimate or calculate an unknown probability based on other known probabilities. In one of the many different versions of these studies, the German cognitive psychologist Gerd Gigerenzer presented the following pieces of information to 160 gynecologists:[25]

1. The probability that a woman has breast cancer is 1%.

2. If a woman has breast cancer, the probability is 90% that she will have a positive mammogram.

3. If a woman does not have breast cancer, there is a 9% chance she will have a positive mammogram (the false-positive rate).

Next he asked: if a woman has tested positive, what is the likelihood she actually has breast cancer? This is not an academic scenario; one can easily understand why it is important for both physicians and patients to grasp the answer to this question. Gigerenzer gave the doctors four possible options to choose from:

Only 20 percent of the physicians chose the correct option, C (10%); 14 percent chose option A, 47 percent chose option B, and 19 percent chose option D. So more than half of physicians assumed that there was more than an 80 percent chance that the patient had breast cancer. Gigerenzer points out the undue anxiety that would result from patients’ false belief that their chances of having breast cancer were so high.

Where does the correct answer come from? In a sample of 1000, the great majority of women (990) do not have breast cancer; but because of the 9 percent false-positive rate (which is quite high for a medical test), of these 990 women, 89 (9 percent of 990) will have a positive mammogram even though they do not have cancer. That’s a lot of positive tests, particularly because only 10 women (1 percent of 1000) would be expected to have the disease, and of these, 9 to have a positive mammogram. Therefore, there will be a total of 98 positive tests, of which only 9 would truthfully indicate the disease—close to 10 percent. Gigerenzer went on to show that when he expressed the entire scenario in a more naturalistic manner, accuracy improved dramatically. For example, when the conditions were presented in terms of frequencies (statement 1 was reworded to read, “10 out of a population of 1000 women would be expected to have breast cancer”), most physicians (87 percent) chose the correct answer. In other words, the format used to present a problem is of fundamental importance. Gigerenzer argues that our awkward relationship with probability theory is not necessarily rooted in poor reasoning skills, but is because probabilities are not often encountered in ecologically realistic settings and thus do not represent a natural input format for the brain. Nevertheless, the fact remains: we are inept at making probability judgments.
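
For readers who want to check that arithmetic, here is a minimal Python sketch of the same natural-frequency calculation; it is my own illustration, not Gigerenzer's or the book's, and it simply encodes the three numbers given to the physicians.

    population  = 1000
    with_cancer = population * 0.01                   # 10 women have the disease
    true_pos    = with_cancer * 0.90                  # 9 of them test positive
    false_pos   = (population - with_cancer) * 0.09   # ~89 healthy women also test positive

    # Probability of cancer given a positive mammogram: true positives over all positives.
    p_cancer_given_positive = true_pos / (true_pos + false_pos)
    print(round(p_cancer_given_positive, 2))          # prints 0.09, i.e., roughly 10 percent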

NEUROSCIENCE OF BIASES

Now that we have sampled from a selection of cognitive biases, we can ask the question psychologists and economists have been asking for decades, and philosophers have been pondering for centuries: Are humans predominantly rational or irrational beings? In this oversimplified form, the question makes as much sense as asking whether humans are violent or peaceful creatures. We are both violent and peaceful, and rational and irrational. But why? How is it that on the one hand we have made it to the moon and back, cracked atoms, and uncoiled the mysteries of life itself; yet, on the other hand, we allow our decisions to be influenced by arbitrary and irrelevant factors and are inherently ill-equipped to decide whether we should switch doors in a game show? One answer to this paradox is that many of the tasks the brain performs are not accomplished by a single dedicated system but by the interplay between multiple systems—so our decisions are the beneficiaries, and victims, of the brain’s internal committee work.

Find the oddball (unique) item in each of the panels in Figure 6.2 as quickly as possible.

Figure 6.2 Serial and parallel search.

Most people are able to spot the oddball much more quickly in the left than in the right panel. Why would this be, given that the panel on the right is simply the one on the left rotated by 90 degrees? The symbols on the left take the familiar shape of the numbers 2 and 5. You have had a lifetime of experience with these symbols, but primarily in the upright position. This experience has led to neurons in your visual system specializing in “2” and “5” detection, accounting for an automatic and rapid ability to spot the standout. The task on the right, however, relies on attention and an effortful search among the less familiar symbols.[26]

You have two strategies, or systems, at your disposal to find objects in a visual scene: an automatic one, referred to as a parallel search; and a conscious strategy, referred to as a serial search. Loosely speaking, there are also two independent yet interacting systems responsible for the decisions we make. These systems have been called the automatic (or associative) and the reflective (or rule-based) systems.[27]
The automatic system is related to what we think of as our intuition, and it is unconscious, rapid, associative, and effortless. It is very sensitive to context and emotions, eager to jump to conclusions, and possesses a number of biases and preconceived assumptions. But the automatic system is precisely the one we need to understand what the people around us are saying and what their intentions are. It allows us to quickly decide if it is most prudent to stop or proceed through a yellow light. In his book Blink, Malcolm Gladwell examined the wisdom and folly of the automatic system, and the fact that training can make it the keystone of expert judgments.[28]
Through extensive experience, art dealers, coaches, soldiers, and doctors learn to quickly evaluate situations overflowing with information and arrive at an effective assessment.

In contrast to the automatic system, the reflective system is slow, effortful, and requires conscious thought. It can adapt quickly to mistakes, and it is flexible and deliberative. This is the system we want to engage when we are problem solving, such as when we are trying to decide which mortgage plan is the best. It is the system Semmelweis used to figure out why there were so many more deaths in the First Obstetric Clinic. The reflective system ultimately grasps why we should switch doors when Monty Hall gives us the opportunity.

What do cows drink? Any initial urge to blurt out “milk” is a consequence of the automatic system, which associates cows with milk. But if you resisted that initial urge, the reflective system offered “water.” Here is another: A plastic baseball bat and a ball cost $1.10 in total. The bat costs $1 more than the ball. How much does the bat cost?[29] Most of us almost reflexively want to blurt out $1. Presumably our automatic system captures the fact that $1 + $0.10 matches the total of $1.10, but totally ignores the stipulation that the bat costs $1 more than the ball. The reflective system must come to the rescue and point out that $0.05 + $1.05 also sums to the correct total and satisfies the condition that the bat costs $1 more than the ball.
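
For completeness, here is a tiny Python sketch, again my own illustration rather than the book's, that spells out the algebra: if the bat costs $1.00 more than the ball and the pair costs $1.10, then twice the ball's price plus $1.00 must equal $1.10.

    ball = (1.10 - 1.00) / 2    # 2 * ball + 1.00 = 1.10, so ball = 0.05
    bat  = ball + 1.00          # 1.05
    print(f"ball = ${ball:.2f}, bat = ${bat:.2f}, total = ${ball + bat:.2f}")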

We should not envision the automatic and reflective systems as distinct nonoverlapping parts of the brain, like two chips in a computer. Nevertheless, evolutionarily older parts of the brain are key players in the automatic system, and cortical areas that have recently undergone more dramatic expansion are likely pivotal to the reflective system.

The automatic system is the source of many of our cognitive biases. Does this mean the automatic system is inherently flawed, a failure of evolutionary design? No. First, the bugs in our automatic system are not a reflection of the fact that it was poorly designed, but once again, that it was designed for a time and place very different from the world we now inhabit. In this light we are “ecologically rational”—we generally make good and near-optimal decisions in evolutionarily realistic contexts.[30]
Second, sometimes a powerful feature is also a bug. For instance, word processors and texting devices have “autocorrect” and “autocomplete” features, which can correct misspelled words or complete a couple of letters with the most common word. But it is inevitable that the wrong words will be inserted from time to time, and our messages will be garbled if we are not on our toes. By analogy, some cognitive biases are simply the flipside of some of the brain’s most important features.
