The Language Instinct: How the Mind Creates Language

If Chomsky maintains that grammar shows signs of complex design but is skeptical that natural selection manufactured it, what alternative does he have in mind? What he repeatedly mentions is physical law. Just as the flying fish is compelled to return to the water and calcium-filled bones are compelled to be white, human brains might, for all we know, be compelled to contain circuits for Universal Grammar. He writes:

These skills [for example, learning a grammar] may well have arisen as a concomitant of structural properties of the brain that developed for other reasons. Suppose that there was selection for bigger brains, more cortical surface, hemispheric specialization for analytic processing, or many other structural properties that can be imagined. The brain that evolved might well have all sorts of special properties that are not individually selected; there would be no miracle in this, but only the normal workings of evolution. We have no idea, at present, how physical laws apply when 10¹⁰ neurons are placed in an object the size of a basketball, under the special conditions that arose during human evolution.

 

We may not, just as we don’t know how physical laws apply under the special conditions of hurricanes sweeping through junkyards, but the possibility that there is an undiscovered corollary of the laws of physics that causes brains of human size and shape to develop the circuitry for Universal Grammar seems unlikely for many reasons.

At the microscopic level, what set of physical laws could cause a surface molecule guiding an axon along a thicket of glial cells to cooperate with millions of other such molecules to solder together just the kinds of circuits that would compute something as useful to an intelligent social species as grammatical language? The vast majority of the astronomical number of ways of wiring together a large neural network would surely lead to something else: bat sonar, or nest-building, or go-go dancing, or, most likely of all, random neural noise.

At the level of the whole brain, the remark that there has been selection for bigger brains is, to be sure, common in writings about human evolution (especially from paleoanthropologists). Given that premise, one might naturally think that all kinds of computational abilities might come as a by-product. But if you think about it for a minute, you should quickly see that the premise has it backwards. Why would evolution ever have selected for sheer bigness of brain, that bulbous, metabolically greedy organ? A large-brained creature is sentenced to a life that combines all the disadvantages of balancing a watermelon on a broomstick, running in place in a down jacket, and, for women, passing a large kidney stone every few years. Any selection on brain size itself would surely have favored the pinhead. Selection for more powerful computational abilities (language, perception, reasoning, and so on) must have given us a big brain as a by-product, not the other way around!

But even given a big brain, language does not fall out the way that flying fish fall out of the air. We see language in dwarfs whose heads are much smaller than a basketball. We also see it in hydrocephalics whose cerebral hemispheres have been squashed into grotesque shapes, sometimes a thin layer lining the skull like the flesh of a coconut, but who are intellectually and linguistically normal. Conversely, there are Specific Language Impairment victims with brains of normal size and shape and with intact analytic processing (recall that one of Gopnik’s subjects was fine with math and computers). All the evidence suggests that it is the precise wiring of the brain’s microcircuitry that makes language happen, not gross size, shape, or neuron packing. The pitiless laws of physics are unlikely to have done us the favor of hooking up that circuitry so that we could communicate with one another in words.

Incidentally, to attribute the basic design of the language instinct to natural selection is not to indulge in just-so storytelling that can spuriously “explain” any trait. The neuroscientist William Calvin, in his book The Throwing Madonna, explains the left-brain specialization for hand control, and consequently for language, as follows. Female hominids held their baby on their left side so the baby would be calmed by their heartbeat. This forced the mothers to use their right arm for throwing stones at small prey. Therefore the race became right-handed and left-brained. Now, this really is a just-so story. In all human societies that hunt, it is the men who do the hunting, not the women. Moreover, as a former boy I can attest that hitting an animal with a rock is not so easy. Calvin’s throwing madonna is about as likely as Roger Clemens hurling split-fingered fastballs over the plate with a squirming infant on his lap. In the second edition of his book Calvin had to explain to readers that he only meant it as a joke; he was trying to show that such stories are no less plausible than serious adaptationist explanations. But such blunt-edged satire misses the point almost as much as if it had been intended as serious. The throwing madonna is qualitatively different from genuine adaptationist explanations, for not only is it instantly falsified by empirical and engineering considerations, but it is a nonstarter for a key theoretical reason: natural selection is an explanation for the extremely improbable. If brains are lateralized at all, lateralization on the left is not extremely improbable—its chances are exactly fifty percent! We do not need a circuitous tracing of left brains to anything else, for here the alternatives to selection are perfectly satisfying. It is a good illustration of how the logic of natural selection allows us to distinguish legitimate selectionist accounts from just-so stories.

 

 

To be fair, there are genuine problems in reconstructing how the language faculty might have evolved by natural selection, though the psychologist Paul Bloom and I have argued that the problems are all resolvable. As P. B. Medawar noted, language could not have begun in the form it supposedly took in the first recorded utterance of the infant Lord Macaulay, who after having been scalded with hot tea allegedly said to his hostess, “Thank you, madam, the agony is sensibly abated.” If language evolved gradually, there must have been a sequence of intermediate forms, each useful to its possessor, and this raises several questions.

First, if language involves, for its true expression, another individual, who did the first grammar mutant talk to? One answer might be: the fifty percent of the brothers and sisters and sons and daughters who shared the new gene by common inheritance. But a more general answer is that the neighbors could have partly understood what the mutant was saying even if they lacked the new-fangled circuitry, just using overall intelligence. Though we cannot parse strings like skid crash hospital, we can figure out what they probably mean, and English speakers can often do a reasonably good job understanding Italian newspaper stories based on similar words and background knowledge. If a grammar mutant is making important distinctions that can be decoded by others only with uncertainty and great mental effort, it could set up a pressure for them to evolve the matching system that allows those distinctions to be recovered reliably by an automatic, unconscious parsing process. As I mentioned in Chapter 8, natural selection can take skills that are acquired with effort and uncertainty and hardwire them into the brain. Selection could have ratcheted up language abilities by favoring the speakers in each generation that the hearers could best decode, and the hearers who could best decode the speakers.
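
To make the ratchet concrete, here is a minimal toy simulation. The population size, payoffs, and mutation rate are my illustrative assumptions rather than anything argued in the text; the only point it captures is that a speaker trait and a hearer trait that reward each other tend to spread together.

```python
import random

# Toy coevolutionary ratchet: a "grammar" trait makes speakers easier to
# decode, a "parser" trait lets hearers decode them reliably. All numbers
# below are illustrative assumptions, not values from the text.
POP, GENS, PAIRINGS, MUT = 500, 300, 10, 0.01

def decode_payoff(speaker_grammar, hearer_parser):
    if speaker_grammar and hearer_parser:
        return 1.0          # reliable, automatic parsing
    if speaker_grammar or hearer_parser:
        return 0.5          # partial understanding, with effort
    return 0.2              # guesswork from general intelligence alone

def next_generation(pop):
    fitness = [1.0] * len(pop)
    for i, (gram, parse) in enumerate(pop):
        for _ in range(PAIRINGS):
            gram2, parse2 = random.choice(pop)
            # Benefit both from being understood as a speaker and from
            # understanding as a hearer.
            fitness[i] += 0.1 * (decode_payoff(gram, parse2) +
                                 decode_payoff(gram2, parse))
    parents = random.choices(pop, weights=fitness, k=len(pop))
    # Reproduce in proportion to fitness, with rare mutation of each trait.
    return [(g ^ (random.random() < MUT), p ^ (random.random() < MUT))
            for g, p in parents]

pop = [(False, False)] * POP
for _ in range(GENS):
    pop = next_generation(pop)
print("grammar frequency:", sum(g for g, _ in pop) / POP)
print("parser frequency: ", sum(p for _, p in pop) / POP)
```

In typical runs the two frequencies climb together, which is all the ratchet claim needs: each trait raises the payoff of the other.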

A second problem is what an intermediate grammar would have looked like. Bates asks:

What protoform can we possibly envision that could have given birth to constraints on the extraction of noun phrases from an embedded clause? What could it conceivably mean for an organism to possess half a symbol, or three quarters of a rule?…monadic symbols, absolute rules and modular systems must be acquired as a whole, on a yes-or-no basis—a process that cries out for a Creationist explanation.

 

The question is rather odd, because it assumes that Darwin literally meant that organs must evolve in successively larger fractions (half, three quarters, and so on). Bates’ rhetorical question is like asking what it could conceivably mean for an organism to possess half a head or three quarters of an elbow. Darwin’s real claim, of course, is that organs evolve in successively more complex forms. Grammars of intermediate complexity are easy to imagine; they could have symbols with a narrower range, rules that are less reliably applied, modules with fewer rules, and so on. In a recent book Derek Bickerton answers Bates even more concretely. He gives the term “protolanguage” to chimp signing, pidgins, child language in the two-word stage, and the unsuccessful partial language acquired after the critical period by Genie and other wolf-children. Bickerton suggests that Homo erectus spoke in protolanguage. Obviously there is still a huge gulf between these relatively crude systems and the modern adult language instinct, and here Bickerton makes the jaw-dropping additional suggestion that a single mutation in a single woman, African Eve, simultaneously wired in syntax, resized and reshaped the skull, and reworked the vocal tract. But we can extend the first half of Bickerton’s argument without accepting the second half, which is reminiscent of hurricanes assembling jetliners. The languages of children, pidgin speakers, immigrants, tourists, aphasics, telegrams, and headlines show that there is a vast continuum of viable language systems varying in efficiency and expressive power, exactly what the theory of natural selection requires.

A third problem is that each step in the evolution of a language instinct, up to and including the most recent ones, must enhance fitness. David Premack writes:

I challenge the reader to reconstruct the scenario that would confer selective fitness on recursiveness. Language evolved, it is conjectured, at a time when humans or protohumans were hunting mastodons…. Would it be a great advantage for one of our ancestors squatting alongside the embers, to be able to remark: “Beware of the short beast whose front hoof Bob cracked when, having forgotten his own spear back at camp, he got in a glancing blow with the dull spear he borrowed from Jack”?

Human language is an embarrassment for evolutionary theory because it is vastly more powerful than one can account for in terms of selective fitness. A semantic language with simple mapping rules, of a kind one might suppose that the chimpanzee would have, appears to confer all the advantages one normally associates with discussions of mastodon hunting or the like. For discussions of that kind, syntactic classes, structure-dependent rules, recursion and the rest, are overly powerful devices, absurdly so.

 

I am reminded of a Yiddish expression, “What’s the matter, is the bride too beautiful?” The objection is a bit like saying that the cheetah is much faster than it has to be, or that the eagle does not need such good vision, or that the elephant’s trunk is an overly powerful device, absurdly so. But it is worth taking up the challenge.

First, bear in mind that selection does not need great advantages. Given the vastness of time, tiny advantages will do. Imagine a mouse that was subjected to a minuscule selection pressure for increased size—say, a one percent reproductive advantage for offspring that were one percent bigger. Some arithmetic shows that the mouse’s descendants would evolve to the size of an elephant in a few thousand generations, an evolutionary eyeblink.
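
The arithmetic is easy to check. Here is a minimal sketch of the compounding; the body-mass figures (roughly 40 grams for a mouse, 5 tonnes for an elephant) and the assumption that mean size simply grows one percent per generation are mine, chosen only to show the order of magnitude.

```python
import math

# Compound growth: how many generations of 1%-per-generation size increase
# turn a mouse into something elephant-sized? The masses are rough
# illustrative figures, not values from the text.
mouse_g = 40.0
elephant_g = 5_000_000.0          # about 5 tonnes
growth = 1.01                     # 1% bigger each generation

generations = math.log(elephant_g / mouse_g) / math.log(growth)
print(round(generations))         # on the order of a thousand generations
```

Under these assumptions the answer comes out around twelve hundred generations, comfortably inside the few thousand the argument allows.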

Second, if contemporary hunter-gatherers are any guide, our ancestors were not grunting cave men with little more to talk about than which mastodon to avoid. Hunter-gatherers are accomplished toolmakers and superb amateur biologists with detailed knowledge of the life cycles, ecology, and behavior of the plants and animals they depend on. Language would surely have been useful in anything resembling such a lifestyle. It is possible to imagine a superintelligent species whose isolated members cleverly negotiated their environment without communicating with one another, but what a waste! There is a fantastic payoff in trading hard-won knowledge with kin and friends, and language is obviously a major means of doing so.

And grammatical devices designed for communicating precise information about time, space, objects, and who did what to whom are not like the proverbial thermonuclear fly-swatter. Recursion in particular is extremely useful; it is not, as Premack implies, confined to phrases with tortuous syntax. Without recursion you can’t say the man’s hat or I think he left. Recall that all you need for recursion is an ability to embed a noun phrase inside another noun phrase or a clause within a clause, which falls out of rules as simple as “NP → det N PP” and “PP → P NP.” With this ability a speaker can pick out an object to an arbitrarily fine level of precision. These abilities can make a big difference. It makes a difference whether a far-off region is reached by taking the trail that is in front of the large tree or the trail that the large tree is in front of. It makes a difference whether that region has animals that you can eat or animals that can eat you. It makes a difference whether it has fruit that is ripe or fruit that was ripe or fruit that will be ripe. It makes a difference whether you can get there if you walk for three days or whether you can get there and walk for three days.
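
Since everything hangs on those two rewrite rules referring to each other, a small sketch can show the embedding they license. The toy lexicon and the depth cutoff below are my additions for illustration; only the two rules themselves come from the text.

```python
import random

# Toy phrase-structure grammar built from the two rules in the text,
#   NP -> det N PP      PP -> P NP
# plus a base case (NP -> det N) so expansion can stop. The word lists are
# illustrative assumptions, not the book's.
GRAMMAR = {"NP": [["det", "N", "PP"], ["det", "N"]],
           "PP": [["P", "NP"]]}
WORDS = {"det": ["the"],
         "N": ["trail", "tree", "rock", "river"],
         "P": ["in front of", "near", "behind"]}

def expand(symbol, depth=0, max_depth=3):
    """Rewrite a symbol recursively, falling back to the non-recursive
    NP rule once the nesting gets deep enough."""
    if symbol in WORDS:
        return random.choice(WORDS[symbol])
    rules = GRAMMAR[symbol]
    rule = rules[-1] if depth >= max_depth and len(rules) > 1 else random.choice(rules)
    return " ".join(expand(part, depth + 1, max_depth) for part in rule)

print(expand("NP"))   # e.g. "the trail in front of the tree near the river"
```

Because an NP can contain a PP and a PP contains another NP, the speaker can keep narrowing the reference (the trail in front of the tree near the river) to whatever depth is needed.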
