
The researchers took the experiment further. Did the baboons memorize the dominance ranking of all ninety members of their troop? Or was it possible they were factoring in family relationships as well? Were they using some kind of mental shortcut to collapse across the ranked list of almost one hundred individuals, just as humans do when they are dealing with a large list or set of discrete elements?

In order to find out, they played recordings that simulated rank violations in interactions between families. The researchers found that an apparent rank reversal between families produced a much more dramatic response from listeners than a rank reversal between individuals within a family. This implied that baboons not only recognize individual rank and family rank but integrate them into an even higher order of hierarchy.
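One way to picture such an integrated hierarchy, purely as an analogy and not as the researchers' own model, is a lexicographic ordering: rank every animal first by its family's rank, then by its rank within the family. A minimal Python sketch with invented names and ranks:

    # Hypothetical troop members: (name, family_rank, rank_within_family).
    troop = [("Ada", 2, 1), ("Bo", 1, 3), ("Cy", 1, 1), ("Di", 2, 2)]

    # The "higher order of hierarchy": sort by family rank first,
    # then by rank within the family (a lexicographic order).
    order = sorted(troop, key=lambda member: (member[1], member[2]))
    print([name for name, _, _ in order])  # ['Cy', 'Bo', 'Ada', 'Di']

On this picture, a reversal between families scrambles the first sort key and disturbs the whole ordering, while a swap within one family perturbs only the second key, which fits the listeners' stronger reaction to between-family violations.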

The baboon findings confirm what Fitch and other researchers mean when they refer to the big gap that exists for all animals between what is comprehended and what is produced. Despite the baboons’ limited vocal set, they have what Seyfarth calls an almost open-ended ability to learn the sound-meaning pairs of their own species and of other species. When they hear a vocalization, he says, they form a mental representation of each call’s meaning. This response seems to be instantaneous and subconscious, and it also appears to be an innate property of the baboon mind. Seyfarth suggests that if you are looking for a cognitive foundation that may serve as a precursor to syntax, it’s much more likely to be found on the interpretation side than on the production side of animal communication.

It may be that before our ancestors became adept at understanding and producing the computations of modern grammar, they learned to compute social relationships in the way baboons compute social rank, which is based on discrete values—individual rank and family rank—and their combination. Seyfarth stresses that this “language of thought” is not the same as human language, but he adds that it is adaptive in its own right and is a possible foundation for something that might turn into language.

 

 

 

Even if animals can understand structural rules where words or cries are joined one after the other, as with the Diana monkeys, human language uses a variety of syntactic mechanisms to build meaning. Thus, while some research has turned up evidence of rudimentary structural abilities in other animals, evidence has also been gathering regarding grammatical rules they are unable to use. Tecumseh Fitch and Marc Hauser tested the ability of tamarins, with whom we last had a common ancestor forty-five million years ago, to understand two different types of grammar.

Fitch and Hauser played recordings of different sequences of sounds to the monkeys. The sequences generated by the first type of grammar could be described by the grammatical rule (AB)ⁿ, where a syllable (A) was always followed by another syllable (B) for some number (n) of repetitions. The sequences generated by the second type could be described by AⁿBⁿ, where a number of A syllables was always followed by the same number of B syllables. Understanding how the sounds were arranged, according to Fitch and Hauser, required the ability to process two different kinds of grammar, a finite state grammar and a phrase structure grammar. The latter has more expressive power than the former, and it’s thought that you can’t generate all the structures in human language without at least a phrase structure grammar.
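The difference between the two grammars can be made concrete in code. A recognizer for (AB)ⁿ needs only a fixed amount of memory, the hallmark of a finite state grammar, while a recognizer for AⁿBⁿ must keep an unbounded count. The following Python sketch is illustrative only; it is not the stimuli or procedure Fitch and Hauser used:

    def matches_ab_n(syllables):
        """(AB)^n: A and B simply alternate. Only the current
        expectation must be remembered, so two states suffice."""
        expecting = "A"
        for s in syllables:
            if s != expecting:
                return False
            expecting = "B" if expecting == "A" else "A"
        return expecting == "A" and len(syllables) > 0

    def matches_a_n_b_n(syllables):
        """A^n B^n: n A's followed by exactly n B's. The count n is
        unbounded, which is what pushes this beyond finite state."""
        n = len(syllables) // 2
        return (len(syllables) == 2 * n and n > 0
                and all(s == "A" for s in syllables[:n])
                and all(s == "B" for s in syllables[n:]))

    print(matches_ab_n(["A", "B", "A", "B"]))     # True
    print(matches_a_n_b_n(["A", "A", "B", "B"]))  # True
    print(matches_a_n_b_n(["A", "B", "A", "B"]))  # False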

The researchers found that after the tamarins were played the recordings of the first rule, they would react if then played recordings that violated that syntactic rule—suggesting that they had an expectation about how the sounds would be arranged. However, when the animals were played the sound sequences generated by the second rule, they didn’t show any sign that they could distinguish examples of correct syntax from sequences that violated the structural rule—it was all the same to them. Human subjects, in contrast, noticed violations of both the finite state grammar and the phrase structure grammar.

In an interesting demonstration of the tangles created by homology and analogy, Timothy Gentner, an assistant professor of psychology at the University of California, San Diego, and colleagues showed in 2006 that starlings can actually distinguish correct instances of the grammar based on Fitch and Hauser’s example, AⁿBⁿ. The researchers used natural starling sounds to test the birds, exposing their subjects to many more examples than Fitch and Hauser exposed the monkeys to. Gentner and colleagues suggest that these results show that the comparative syntactic abilities of monkeys, humans, and birds may differ more in quantity than in quality. So rather than a singular syntactic ability that is a key foundation for human language, there may be a fundamental set of structural mechanisms that we use—some of which other animals also possess.

The Gentner paper received a lot of public attention. Many researchers were surprised by the results, and some welcomed the findings as proof that the syntax underlying human language is not a monolithic ability that only we possess. But the experiment was not universally acclaimed; in a New York Times interview Chomsky said that what the starlings did has nothing to do with language at all.

Certainly, the Fitch and Hauser and the Gentner experiments raised many interesting issues about methodology, as well as the capacity for understanding different kinds of grammar. Ray Jackendoff and colleagues published a letter noting that what the starlings are habituating to depends on how they encode the signal. They also questioned whether the starlings were really doing syntax as opposed to basically counting the strings of A’s and B’s (echoing Chomsky’s comment). Recall that many animals have some number ability. Indeed, it’s possible that the humans in the original experiment may have been counting the experimental stimuli rather than processing them as samples of a phrase structure grammar. Jackendoff explained: “If I imagine the stimuli and how I would think of them or remember them, it would be by counting or some equivalent rhythmic procedure: matching the first A with the first B, the second A with the second B, and so on. It would not be by relating the first A to the last B, the second A to the next to last B, which is what the syntactic hypothesis entails.”
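Jackendoff’s objection can be restated as two different ways of pairing up the A’s and B’s in the very same string. A hypothetical sketch of the two strategies, with names of my own choosing:

    def serial_pairing(n):
        """The 'counting' strategy: first A with first B, second A
        with second B, and so on."""
        return [(f"A{i}", f"B{i}") for i in range(1, n + 1)]

    def nested_pairing(n):
        """What the syntactic hypothesis entails: first A with the
        last B, second A with the next-to-last B (center embedding)."""
        return [(f"A{i}", f"B{n + 1 - i}") for i in range(1, n + 1)]

    # For A A A B B B, both strategies accept the string but impose
    # different hidden structure on it:
    print(serial_pairing(3))  # [('A1', 'B1'), ('A2', 'B2'), ('A3', 'B3')]
    print(nested_pairing(3))  # [('A1', 'B3'), ('A2', 'B2'), ('A3', 'B1')]

Because both strategies accept exactly the same strings, a habituation experiment cannot distinguish them from the outside, which is the nub of the objection.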

Despite the complications, these experiments inaugurate a potentially rewarding endeavor that seeks to map which syntactic strategies are available to some species and not others.

 

 

 

It’s hard to overestimate the intricacy and power of each language’s syntax, let alone all of the syntactic strategies that human languages deploy. The complexities of linguistic structure that, so far, do not seem to have an analog in any kind of animal communication are myriad, including many different mechanisms for combining words and parts of words into a larger phrase, and larger phrases into even larger ones. For instance, a phrase of any length may be created by conjoining elements (He knows and she knows) or by arranging them recursively (He knows that she knows).
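The two mechanisms can be mimicked in a few lines of Python; the sketch is purely illustrative:

    def conjoin(clauses):
        """Flat coordination: clauses strung together with 'and'."""
        return " and ".join(clauses)

    def embed(clauses):
        """Recursive complementation: each clause embedded inside
        the previous one with 'that'."""
        if len(clauses) == 1:
            return clauses[0]
        return clauses[0] + " that " + embed(clauses[1:])

    print(conjoin(["he knows", "she knows"]))
    # he knows and she knows
    print(embed(["he knows", "she knows", "it happened"]))
    # he knows that she knows that it happened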

Parts of a phrase—for instance, the subject and the verb—may agree in number and gender. A verb may be intransitive, taking a subject but not a direct object (She slept) or transitive, taking a subject and direct object (She kicked it). A language may be ergative, marking the object of a transitive verb in the same way that it marks the subject of an intransitive verb, while the subject of a transitive verb is marked differently. Or a language may be nominative-accusative, marking the subject of a transitive and intransitive verb in the same way, distinct from the object of the transitive verb. Different languages mark these relationships in very different ways, using strategies like word order or lexical marking. And often the way a particular language deploys one kind of syntactic rule affects how it fulfills another. For instance, languages with free word order have many syntactically meaningful affixes.
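The two alignment systems amount to different mappings from grammatical roles to case marks. A schematic Python sketch, using the conventional role labels S (intransitive subject), A (transitive subject), and O (object), with purely illustrative case labels:

    ERGATIVE = {"S": "ABS", "O": "ABS", "A": "ERG"}
    ACCUSATIVE = {"S": "NOM", "A": "NOM", "O": "ACC"}

    def mark(alignment, role, noun):
        """Attach a schematic case label to a noun for its role."""
        return f"{noun}-{alignment[role]}"

    # 'She slept' vs. 'She kicked it' under each system:
    print(mark(ERGATIVE, "S", "she"), "|",
          mark(ERGATIVE, "A", "she"), mark(ERGATIVE, "O", "it"))
    # she-ABS | she-ERG it-ABS
    print(mark(ACCUSATIVE, "S", "she"), "|",
          mark(ACCUSATIVE, "A", "she"), mark(ACCUSATIVE, "O", "it"))
    # she-NOM | she-NOM it-ACC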

Human syntax is also characterized by countless idioms and phrases, some of which are complete sentences (The jig is up), while others allow single words to be inserted into “slots” to create different meanings, such as “Take X for granted.” In yet another type of English idiom (also called a “syntactic nut”),⁴ the phrase that complements the verb isn’t actually determined by the object of the verb; for example, “He sang/drank/slept/laughed his head off” or “Sarah slept/drank/sang/laughed the whole afternoon away.”⁵

In contemporary syntax there are two main approaches to accounting for all the structural rules that human languages use to build meaning: the Chomskyan approach and the “parallel architecture” approach. In the Chomskyan approach, the list of words in a language—its lexicon—and the syntactic rules that arrange those words are treated as separate entities. Syntactic rules are considered the core computational device with which most of language is generated. Accordingly, people still talk about universal grammar, or UG, as a language-specific set of rules or parameters belonging to a language-specific mental mechanism.

Mark Baker’s book The Atoms of Language (2001) is a good example of the mainstream approach to syntax. Baker’s goal was to show that apparently very different languages, like English and Mohawk, are different only in the way that a finite set of universal rules is applied to create them. Baker deduced a hierarchical list of fourteen parameters that he believes reflect rules that are hardwired into the human brain. He thinks there may be about thirty rules overall. English and Mohawk differ only, he says, in the way one single rule is applied at the top of the hierarchy.

Jackendoff calls this kind of approach “syntactocentrism,” meaning that syntax is regarded as the fundamental element of language. In contrast, he says, “in a number of different quarters, another approach has been emerging in distinction to Chomsky’s.” In this new way of accounting for structure in language, words and phrases are as important as the rules that combine them, and the idea of pure syntax is downplayed.

Instead of being objects, words are best thought of as interfaces. A word lies at the intersection of a number of systems—the sound of the word (phonology), syntactic structure (the structures that the word can license or appear in), and meaning (some of which may be specific to language, and some of which may be a more general kind of meaning).⁶ The more general component of a word’s meaning may have some equivalence to the common cognitive platform that humans share with other species.

Jackendoff may be the only longtime generative linguist who willingly concedes that we may share this component of a word with a number of other species. As he explains: “An account of the extraordinarily complex behavior of primates, especially apes and especially in the social domain, leads inexorably to the conclusion that they are genuinely thinking thoughts of rich combinatorial structure, not as rich as human thought to be sure, but still combinatorial.”⁷

It’s significant that Jackendoff now proposes that it’s time to move away from the pure focus on syntactic structure and the idea of a syntactic core to language. While he believes that language is as complicated and ramified as Chomsky does, he is now convinced there is a different way to account for that richness.

Rather than think of syntax as a set of computational algorithms, Jackendoff and Pinker call it a “sophisticated accounting system” for tracking the various layers of relationship and meaning that can be encoded into speech and then decoded by a listener. To their mind syntax is “a solution to the basic problem of language,” which is that meaning is multidimensional but can be expressed only in a linear fashion, because speech unfolds sequentially, as does writing. This way of looking at language and syntax is more consistent with the idea of language evolution and the view of evolution as a “tinkerer.”
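This “basic problem” can be made vivid with a toy example: meaning with hierarchical structure has to be flattened into a single line of words before it can be spoken. A minimal sketch, with a made-up tree format:

    def linearize(meaning):
        """Flatten a nested tuple of words into the linear string
        that speech forces on it; the hierarchy itself is lost and
        must be reconstructed by the listener."""
        if isinstance(meaning, str):
            return meaning
        return " ".join(linearize(part) for part in meaning)

    # 'he knows that she knows' as nested structure:
    thought = ("he", ("knows", ("that", ("she", "knows"))))
    print(linearize(thought))  # he knows that she knows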

Coming from a slightly different viewpoint, John McWhorter, a former professor of linguistics at the University of California, Berkeley, and senior fellow at the Manhattan Institute, emphasizes the way that, like biological evolution, language change results from accretions or accumulations of structure. In this sense language is an artifact of the collective mind of history. It has imperfections and odd quirks, and makes peculiar demands of its speakers. Its textures and patterns have been created over a long period of time as it has been dragged through millions of mouths, expressing their individual agendas.

McWhorter argues that a lot of syntactic structure is sludge and is not shaped by logical necessity or innate mental rules. He talks about the “benign overgrowth” of language as a corrective to the idea that languages are a lens onto the human mind. He wrote: “There are few better examples than Fula of West Africa of how astoundingly baroque, arbitrary and utterly useless to communication a language’s grammar can become over the millennia and yet still be passed on intact to innocent children.” Fula, McWhorter points out, has sixteen genders, and each noun gender marker varies according to the noun. Moreover, any adjectives modifying a noun also must carry a different gender marker in order to agree with the noun.⁸

In Simpler Syntax, a book coauthored with Peter Culicover, Jackendoff writes that while it is important to ask how optimal or perfect a language is, it is also necessary to recognize that language doesn’t operate like a physical system, say, “a galaxy or a molecule…It has to be carried in the brain, and its structure has to come from somewhere.”

Jackendoff and Culicover conclude by noting that they have heard it said in certain circles that if their ideas about language are true, then it means language is “not interesting.” But interestingness, they reply, is in the eyes of the observer. “Baseball cards and poodles interest some people and not others,” they write, “and the same is true of simpler syntax.”⁹

 

 

 

What can the structure of language itself tell us about the way language changes over time? Linguists have developed a number of ways of investigating this topic.
