The Illusion of Conscious Will

Daniel M. Wegner

Intentions that occur just prior to action, in contrast, do seem to compel the action. This is the basic phenomenon of the experience of will. In everyday language, of course, an action is often described as intended either when it is consciously planned (and we need not be conscious of it when we do it) or when we are consciously aware of doing it at the time we do it (and it might not have been planned in advance). Both are understood as acting on the basis of conscious will. However, if there is a conflict between them—as when we plan to stick to a diet at dinner but then end up (alas, all too consciously) splurging on dessert—the conscious idea of action that occurs just as the action starts is the one we will identify as our “true” intention. Prior plans that we fail to follow are then relegated to the recycling bin of false or mistaken intentions. The thoughts about the action at the time of action are the ones that prompt the strongest belief that we are causal agents who have made the act occur.

All these mental contents that seem to accompany and support causal agency in human beings need not be conscious at the moment of action. Instead, it seems that only the intention needs to appear in consciousness just as we act, whereas the beliefs, desires, and plans that may serve as the scaffolding for the intention need not be in consciousness. These other thoughts do seem to add substance to the idea that there is a conscious will that causes the action, and each of these kinds of thoughts seems to play a role in causing the action. But the conscious intention is, in a way, the mind’s “call” for the action it will do, and so the intention seems to be most immediately involved in the causation of the action.

Causal agency, in sum, is an important way in which people understand action, particularly human action. In the process of understanding actions performed by oneself or others, the person will appreciate information about intentions, beliefs, desires, and plans, and will use this information in discerning just what the agent is doing. The intuitive appeal of the idea of conscious will can be traced in part to the embedding of the experience of will, and of the notion that will has a force, in the larger conception of causal agency. People appear to be goal-seeking agents who have the special ability to envision their goals consciously in advance of action. The experience of conscious will feels like being a causal agent.

Mechanisms and Minds

We all know a lot about agents and goals, desires, and intentions, and use these concepts all the time. These concepts are only useful, however, for understanding a limited range of our experience. The movements of clock hands and raindrops and electric trains, for instance, can be understood in terms of causal relations that have no consciousness or will at all. They are mechanisms. Extending the notion of causal agency to these items—to say these things have the ability to cause themselves to behave—doesn’t fit very well with the physical causal relations we perceive all around us. Imagine a spoon, knife, and fork deciding to go for a walk to the far end of the dinner table (“We’re off to see the salad”), and you can see the problem. Things don’t usually will themselves to move, whereas people seem to do this all the time.

This rudimentary observation suggests that people have at hand two radically different systems of explanation, one for minds and one for everything else. Mentalistic explanation works wonders for understanding minds, but it doesn’t work elsewhere unless we want to start thinking that everything from people to rocks to beer cans to the whole universe actually does what it consciously wants.[9] Mechanistic explanation, in turn, is just splendid for understanding rocks and beer cans, and the movements of the planets, but it leaves much wanting in understanding minds.

Each of us is quite comfortable with using these two very different ways of thinking about and explaining events—a physical, mechanical way and a psychological, mental way. In the mechanical explanatory system, people apply intuitive versions of physics to questions of causality, and so they think about causes and effects as events in the world. In the mental explanatory system, people apply implicit psychological theories to questions of causality, focusing on issues of conscious thoughts and the experience of will as they try to explain actions. In the mechanical way of thinking, all the psychological trappings are unnecessary: A physical system such as a clock, for instance, doesn’t have to intend to keep time or to experience doing so. The essence of the mental explanatory system, in contrast, is the occurrence of the relevant thoughts and feelings about the action, and in this system the objects and events of physical causality are not particularly important: A person might experience having willed the death of an enemy and become wracked with guilt, for instance, even though there was no mechanism for this to have happened.

9. This odd possibility is the extreme consequence of attributing minds to things that can’t talk. Chalmers (1996) makes the case for this theory, such as it is.

These two explanatory systems fall into place as children develop ways of understanding both the physical and psychological worlds. The first inklings that mind perception and mechanistic explanation might develop separately in children came from the juxtaposition of two findings by Jean Piaget: Children often neglect intention in making moral judgments, and yet they sometimes overattribute intention to inanimate objects. In the case of moral judgment, Piaget (1932) found that children before the age of seven or eight who are asked to decide whether a person has done something wrong don’t concern themselves very much with the person’s intention and focus instead on the damage caused by the action. For instance, in deciding how bad Haley was when she pushed Kelsey into the creek, a young child (say, aged six) might focus not on whether the pushing was done on purpose but rather on whether Kelsey’s shoes were ruined by mud.[10] This is a bit odd because focusing on intentions could be very useful to children, particularly when claiming good intentions might reduce their punishment (“I pushed her in the creek to prevent her from getting heatstroke”). And while children do pay a bit more regard to their own intentions than to those of others (Keasey 1977), they still focus mostly on the damage. This lack of interest in intentions in moral judgments leads one to suspect that young children also may not appreciate minds in the same way grownups do.

In looking at how children judge inanimate objects, Piaget (1929) noted that they sometimes ascribe the properties of living beings, including the property of intention, to nonliving things. Based on the discussion of animism in anthropology (the tendency to ascribe living properties to nonliving things; Lévy-Bruhl 1910; Mead 1932), Piaget discovered that children could fall prey to the same thing—overattributing mental properties to systems that are better understood as mechanical. His attribution of animism to children has turned out to be controversial because he overstated the case for older children in contemporary cultures (Looft and Bartz 1969). Still, there is compelling documentation of animistic thinking in young children around the world. An interview with one four-year-old boy about why a toy boat floats, for instance, went like this:


Why does it not go to the bottom?
— Can I make it go down to the bottom?

Try it and you will see.
— It comes up again!

Why does it not stay at the bottom?
— Because the man who is under this [under the roof] doesn’t want to go down.

Here’s a nail.
— It will go to the bottom.

Why?
— Because there is a man in here [in the nail] and he likes to go to the bottom. (Laurendeau and Pinard 1962, 209)

10. Research following up Piaget’s initial suggestions has pointed out some problems with his approach and has suggested that the use of intention information in moral judgment comes somewhat earlier than Piaget had suggested, but it has also generally substantiated the conclusion that young children underplay the importance of intention in moral judgment (Schultz 1980).

At first blush, Piaget’s pair of insights suggests paradoxically that children underuse intention (in moral judgment) and also overuse it (in perceiving inanimate causality). This makes sense, though, when we realize that the child’s notion of a mind is under construction. Without a fully developed idea of mental processes, children can fail to attribute intent when they should (in judging human beings) and attribute it too often when they shouldn’t (in judging objects). Children are faced with the problem of building a picture of their own minds and the minds of others, and of achieving an understanding of what it is not to have a mind as well. Early in life, they guess that things without minds might have mind-like properties of intention and that things they will later learn have minds might not possess such intention.

Piaget’s perspective has culminated in the contemporary literature on the development of theory of mind in animals (Premack and Woodruff 1978) and in children (Astington 1993; Perner 1991b; Wellman 1990), and in work that contrasts how children develop an understanding of agency, intention, and will with how they develop an understanding of causality, motion, and the principles of physics (Astington, Harris, and Olson 1988; Carey 1996; Gelman 1990; Gelman, Durgin, and Kaufman 1995; Wellman and Gelman 1992). Neither the perception of the physical world nor the perception of the mental world is a “given” to the human newborn. Although the neonate has rudimentary abilities in both areas, both systems must be developed over time and experience as ways of understanding what is going on.

The field of psychology itself has noticed that different systems of thinking seemed to be necessary for understanding mind and matter. The main preoccupation of much of psychology in the twentieth century was translating mind talk into mechanism talk on the assumption that the two were entirely interchangeable. A telling quote from Donald Hebb (1946) on how psychologists should understand chimpanzees highlights what happened as a result:

A thoroughgoing attempt to avoid anthropomorphic description in the study of temperament was made over a two-year period at the Yerkes laboratories. . . . All that resulted was an almost endless series of specific acts in which no order or meaning could be found. On the other hand, by the use of frankly anthropomorphic concepts of emotion and attitude one could quickly and easily describe the peculiarities of individual animals, and with this information a newcomer to the staff could handle the animals as he could not safely otherwise. Whatever the anthropomorphic terminology may seem to imply about conscious states in the chimpanzee, it provides an intelligible and practical guide to behavior. (88)

This realization suggested to Hebb and others that the earnest project of eliminating mind entirely from the scientific explanation of behavior (Bentley 1944; Werner 1940) was misguided. You have to think about the animals’ minds in order to keep from getting mugged by them. A mental system for understanding even chimp behavior seems highly preferable to a mechanical system.

Perceiving mind and causal agency is a significant human ability. It is possible that this achievement is accomplished by a fairly narrow mental module, a special-skill unit of mind that does only this, and that in different individuals this module can thus be particularly healthy, damaged, or even nonfunctional. Leslie (1994) has called this set of skills a Theory-of-Mind-Mechanism (ToMM), and Baron-Cohen (1995) has proposed that such a mechanism may be injured or missing in some forms of autism. He suggests that each of us has an “intentionality detector” that does the job of looking for actions that seem to be willed, in both self and others. The absence of this detector leaves us looking for physical or mechanistic explanations when psychological ones would really be better. Baron-Cohen has documented the “mindblindness” of autistic individuals in some detail, suggesting just how difficult life can be if one doesn’t have a quick and natural ability to comprehend other people’s minds. An example comes from Kanner’s (1943, 232) description of an autistic child: “On a crowded beach he would walk straight toward his goal irrespective of whether this involved walking over newspapers, hands, feet, or torsos, much to the discomfiture of their owners. His mother was careful to point out that he did not intentionally deviate from his course in order to walk on others, but neither did he make the slightest attempt to avoid them. It was as if he did not distinguish people from things, or at least did not concern himself about the distinction.”[11]

The idea that mind perception is variable has also been noted by philosophers. Daniel Dennett (1987; 1996) has captured this observation in suggesting that people take an “intentional stance” in perceiving minds that they do not take in perceiving most of the physical world. The degree to which we perceive mindedness in phenomena can change, so that under some circumstances we might see our pet pooch as fully conscious and masterfully deciding just where it would be good to scratch himself, whereas under other circumstances we might have difficulty extending the luxury of presumed conscious thought and human agency even to ourselves. It is probably the case, too, that the degree of mechanical causality we perceive is something that varies over time and circumstance. Viewing any particular event as mentally or mechanically caused, then, can depend on a host of factors and can influence dramatically how we go about making sense of the event. And making sense of our own minds as mentally causal systems—conscious agents—includes accepting our feelings of conscious will as authentic.
