The First Word: The Search for the Origins of Language

The other force that affects how a new version of a gene fares and how widely it is passed on is selection. Negative selection removes deleterious genetic variants. In contrast, if a genetic mutation results in a trait that helps its carrier have more offspring (compared with individuals who do not have that genetic mutation), it will spread through a population much more quickly than the casual infiltration of mutations by genetic drift. This is positive selection.
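The contrast between drift and positive selection can be made concrete with a toy Wright-Fisher simulation. The population size, selection coefficient, and number of trials below are arbitrary choices for illustration, not figures from the text: a new neutral mutation is usually lost to drift, while a mutation that improves reproductive success fixes far more often.

```python
import random

def generations_to_fixation(pop_size, fitness_advantage, seed):
    """Track one new mutation in a haploid Wright-Fisher population.
    Returns the generations until it fixes, or None if it is lost."""
    random.seed(seed)
    count = 1  # one mutant copy to start
    gen = 0
    while 0 < count < pop_size:
        # Probability an offspring inherits the mutant allele, weighted
        # by its relative fitness (1 + s versus 1 for everyone else).
        p = count * (1 + fitness_advantage) / (
            count * (1 + fitness_advantage) + (pop_size - count))
        count = sum(random.random() < p for _ in range(pop_size))
        gen += 1
    return gen if count == pop_size else None

# Pure drift (s = 0) versus strong positive selection (s = 0.5),
# 300 independent trials each.
neutral = [generations_to_fixation(200, 0.0, s) for s in range(300)]
selected = [generations_to_fixation(200, 0.5, s) for s in range(300)]
print("neutral fixations: ", sum(x is not None for x in neutral))
print("selected fixations:", sum(x is not None for x in selected))
```

Running this, the neutral allele fixes only rarely (its chance is roughly one in the population size), while the advantageous allele sweeps to fixation in a large fraction of trials, and much faster, which is the signature the statistical tests described below look for.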

Back in 1990, when Steven Pinker and Paul Bloom championed the investigation of language evolution, they proposed a scenario that could explain the natural selection of language. They based their hypothesis on knowledge from fields like anthropology and psychology, as well as logic, arguing that the complicated design of language, as with the eye, could not have arisen without selection. Recall that they presented this argument in the face of the ubiquitous criticism that because we can never really know if a trait was directly selected or if it arose by accident, let alone why it was selected, throwing out theories to explain adaptation is just an exercise in fiction.

Their endeavor has been vindicated by new statistical techniques that can reveal if a gene was selected or not. Once you know whether a gene has been selected, you can begin to look in more detail at its impact on an organism and substantially narrow down what trait was most likely selected for. The data about genetic changes in general, and about FOXP2 in particular, mark a huge shift in the kind of evidence available for language evolution.

Knowledge of the way that genes work and the ability to determine what’s been selected and what has merely drifted have been applied to a comparison of the human and chimpanzee genomes, with especially interesting results. These pertain to another important genetic difference between the species: in addition to the different DNA sequence that each has (that famous 2 percent), there are a variety of ways that the same gene can be expressed in a particular organism.10

It’s clear that in the human lineage, some expression levels have been elevated while others have been significantly reduced. A group of geneticists led by Svante Pääbo (who led the team that sequenced Neanderthal mtDNA and who headed the FOXP2 research) found that the evolutionary change in the expression of genes that shape the heart, liver, and kidneys of humans and chimpanzees is similar, and what differences there are in expression evolution have mostly been shaped by negative selection and drift. There is not as much difference between the species in expression in the brain. Said Pääbo, “In the brain, there is a lot of negative selection accounting for the small amount of differences we find, but of the few differences we see, more have occurred on the human lineage than the chimp lineage, suggesting that positive selection may have played a role in human brain evolution.”

Another study recently confirmed that many of the differences in gene expression between humans and chimps resulted from much higher levels of expression in genes in the human brain. Like the Pääbo team, this group concluded that changes in many other tissues of the body were random and probably not the result of positive selection.11 Overall, genetic drift is a much more common process than selection, which makes finding a selected gene especially exciting. It’s been estimated that natural selection has had a significant effect on only 9 percent of genes in the human genome.

 

 

 

Because scientists are now able to zoom in on the way a gene version changes and spreads through a species, the group that discovered FOXP2 started asking questions about how that gene has changed over time. In Leipzig, Wolfgang Enard, who works with Svante Pääbo, presented the history of FOXP2 in the context of the entire human genome.

Showing a slide of President George W. Bush and a chimpanzee, he clarified that there is, in fact, only a 1.2 percent genetic difference between humans and their closest relatives. Between humans and gorillas, there is a 1.7 percent difference, and between us and orangutans there is a 3 percent difference.12 Moreover, said Enard, most of the differences between us and other animals lie in parts of the genome that are not particularly significant, the junk DNA. Nevertheless, these genomes, which look overwhelmingly similar, produce very different animals: humans have language, and chimpanzees, bonobos, gorillas, and orangutans don’t. The differences are not just cognitive, noted Enard. We get AIDS and other apes don’t. We get malaria and they don’t. We have a doubled maximal life span. We have bipedal walking. And we have a larger and differently proportioned brain.

As for FOXP2, the gene encodes a protein 715 amino acids long. Our common ancestor with mice lived more than seventy million years ago, and our FOXP2 differs from theirs by only three amino acids. Surprisingly, the chimpanzee version of the gene differs from that of mice by only one amino acid, which means that two amino acid changes have occurred in the six million years since humans and chimpanzees split.
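The arithmetic behind that inference can be spelled out in a few lines. The sequences below are invented stand-ins, not the real FOXP2 sequences; only the counts of differences mirror the pattern in the text (mouse versus chimp: one change; mouse versus human: three; so by parsimony, two changes fall on the human lineage after the human-chimp split).

```python
def aa_differences(seq_a, seq_b):
    """Count positions where two aligned protein sequences differ."""
    assert len(seq_a) == len(seq_b)
    return sum(a != b for a, b in zip(seq_a, seq_b))

# Toy aligned fragments standing in for mouse, chimp, and human FOXP2.
mouse = "MTANVKQELS"
chimp = "MTANVKQQLS"   # one substitution relative to mouse
human = "MTSNVKQQLN"   # two further substitutions on the human lineage

print(aa_differences(mouse, chimp))  # 1
print(aa_differences(mouse, human))  # 3
# Changes shared by chimp and human predate the split; the extra
# human-specific differences must have happened after it.
print(aa_differences(chimp, human))  # 2
```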

The rate of change on FOXP2 is significantly higher in our species than in others. “You rarely get this much change in this amount of time,” said Enard. The high rate of turnover suggests that the human form of FOXP2 resulted from positive selection rather than random drift.13 “It is a rare event to find a selected gene,” he said.

Now, crucially, all humans have these two changes, and the stretch of DNA that carries them is younger than other parts of the human genome. If this part of our genome is significantly younger and present in all humans today, then it must have spread faster than other parts of the genome. This is equivalent, said Enard, to saying that it must have had an advantage. Enard and colleagues estimated that by somewhere between fifty thousand and two hundred thousand years ago, all living humans had the advantageous version of FOXP2.

Other researchers suspect that FOXP2 is crucial to language evolution partly because of the time frame of its rapid spread through the human population. If FOXP2 mutated to the human version within the last two hundred thousand to fifty thousand years, the mutation coincides perfectly with the acceleration of culture and with the migration that spread modern Homo sapiens from Africa out across the world, the period in which technology, instead of taking a million years to upgrade, began changing every decade or so. Did the mutations, or at least one of them, significantly refine our ability to speak and make complicated syntactic distinctions—resulting in a major change of pace for cultural evolution?

We don’t yet know. The correlation between the FOXP2 changes and the blossoming of human culture may be coincidental (or less than direct). As we now know, genes have many different effects. Because the evidence of the KE family shows that FOXP2 is extremely important to language, it’s not unreasonable to suspect that the human mutations of FOXP2 were selected for their effect on language. But the changes in the human FOXP2 may have occurred because of their beneficial effect on heart tissue or lung development.

How can the effects that gave the first carriers of the modern FOXP2 such a profound advantage over their peers be identified? In order to take this next step in ancient forensics, Enard and his colleagues will be tinkering with the genome of the mouse. Unlike the team at Mount Sinai that knocked out one working copy of the mouse FoxP2 gene to see how it would affect the animals, Enard and his colleagues are building a mouse with knock-in genes—altering the mouse genome so that it is artificially wound forward along the human line. This involves changing the two amino acid positions from the mouse setting to the human one. What Enard and his colleagues will then do is look at how the changed gene affects gene expression, neuroanatomy, and behavior in the mouse. “What we will not find,” said Enard, “is a gene that affects only grammar.”

Naturally, it is tempting to see FOXP2 as solely responsible for human language and the last fifty thousand years of rapid cultural rollover. But heralding the new mutations to this ancient gene as some kind of genetic big bang would be tantamount to reviving the old tendency to view language as a singular and discrete “thing” that came about all at once.

Keep in mind that our ancestors used tools for millions of years before this genetic innovation. For these reasons, Gary Marcus advises caution:

 

We just have no idea how FOXP2 fits into the space of language-relevant genes as a whole. We know neither which other genes are relevant for language nor how those fit into the oodles of other genes that have been sculpted by positive or negative selection. All told, there are something like thirty-five million base pair differences between human and chimp, and we know some are important for language, some for our physical appearance, others for our immune system, et cetera. We also know some are simply irrelevant. For the most part we simply don’t know which are which. I suspect that most of the differences that are essential for language have been subject to strong selective pressures, but the details very much remain a mystery.

 

Instead, it makes more sense to look at the gene’s effects as part of the whole language suite. For Marcus, as for many other scholars, this means treating language as an aggregation of many abilities:

 

I think language is probably a patchwork of a dozen or more capacities borrowed from ancestors, ranging from tools for imitation and social understanding to tools for analyzing sounds and sequencing information. Many or even all of those subcomponents probably got further tuned over the course of language evolution, but I would argue language could not have evolved so quickly (or with so little genetic change) without this sort of broad inherited base. FOXP2 fits nicely with this perspective, since, whatever function it has for humans, it seems to build on a gene that’s had something to do with vocal learning for several hundred million years, long before language as such evolved.14

 

As we find more and more evidence of the shared foundations of language, there is much less motivation to search for some crucial single genetic mutation that turned a loose potential for language into language. With this kind of understanding, Steven Pinker made some predictions about research on genes and language in the next decade or two:

 

We’ve found one gene. We’ve found two other markers for language impairment. We have inheritability studies that suggest that many other forms of language delay—stuttering, dyslexia, and so on—are also inheritable. So let’s say in ten years’ time, say, ten or fifteen genes have been isolated. Then you apply these statistical tests, and if you have…Well, in fact, even if you have one gene that’s been the target of selection, that establishes the case, as we already have. But the more genes that have been targeted for selection, the stronger the case would be that language is an adaptation. So I think that’s already strong evidence for language being an adaptation, and I predict that’s where the debate will eventually be settled.

 

Ultimately, said Pinker:

 

In the case of syntax, it’s very unlikely that it’s due to a single gene. One of the reasons we know this is that no one has found a case of language impairment where the faculty of language in the narrow sense is completely wiped out in terms of being nullified. We know that at least three genes or genetic markers have been identified for language disorder, and I think most people who work on the genetics of language believe that that’s just the tip of the iceberg, and that there are a lot of genes, each with a fairly small effect. That is exactly what you’d expect of an ability that is polygenic, and that required many evolutionary events to put into place.

 

One of the most exciting projects in genetic research and human origins was announced in 2006. Svante Pääbo and an American research team plan to sequence the entire Neanderthal genome. Not only will this provide an incredible framework against which to compare modern-day humans, opening the door even further back in time to our common ancestor with Neanderthals from about 500,000 years ago, but it may also help clear up an old but still intense debate. Even among those scholars who believe that Neanderthals may have had some form of language, there is much disagreement about whether their physiology would have permitted speech. These arguments generally center on the shape of the Neanderthal skull and neck and on the remains of the hyoid bone. When the Neanderthal genome is sequenced, we should be able to look more closely at their version of FOXP2 and see how closely it corresponds to ours.

 

 

 

The synthesis of genetic data with what we know about evolutionary change gives us one window into the way that genes and traits have accumulated in evolutionary time. But the replication of DNA isn’t the only kind of information exchange in which humans and other animals engage. Philip Lieberman argues, “Our genetic capacity won’t be manifested unless it intersects with culture.

“Look at all the behaviors around now,” he says. “Think of what a pilot has to do to land a commercial airliner. The pilot cannot see the wheels, but he puts them down nonetheless. Material culture and technologies are the aggregations of lots of minds. Before the horse was domesticated ten thousand years ago, people moved at about two to three miles an hour; then with the steam engine, people started to go thirty miles an hour; now it’s hundreds of miles an hour.” Lieberman isn’t talking merely about minds getting together in space to come up with new ideas, but the fact that today’s culture arises from the accumulation of minds through time. We could never have developed the jet engine in the age of horse travel; we only got there step by step, propeller by propeller.

13. Culture evolves
 

Some orangutan groups blow raspberries like goodnight kisses to one another before they bed down on leaf nests that are constructed anew each evening. Dolphin groups off the coast of Australia use sponges to forage. Certain Japanese macaques have invented effective potato-washing techniques that other macaques do not employ. Many chimpanzee groups use tools. Different groups favor different tools—some prefer rock hammers, others wood—as well as different hammering techniques. Chimpanzees will pound their hammers on anvils for up to two hours a day. Some use a fishing technique to get termites with sticks, while chimpanzees in Guinea, western Africa, are the only ones that stand on the top of palm trees and repeatedly beat the center of the tree crown with a branch to make a pulpy soup. In fact, chimpanzees use what amounts to a tool kit. One wild chimpanzee was seen deploying four different implements to get honey from a bee’s nest.1

Animal groups also vary in the way they organize themselves socially. In 2006 a chimpanzee group in Guinea was filmed crossing a road in a highly organized fashion—the alpha males led, acting as crossing guards, while other males brought up the rear so as to get the whole group safely across.2 It was only fifty years ago that we knew virtually nothing about apes in the wild, let alone dolphins and other animals, but in the years of observation clocked by Jane Goodall and her intellectual descendants, particular animal groups have been shown to have many unique customs.3

While the capacity for these preferences is genetically prescribed, the behaviors themselves are not—the chimpanzees that use hammer and anvil are genetically identical to those that do not. As knowledge about behavioral differences between groups of the same species has flourished, scholars have started to regard those differences as essentially cultural.

At its most basic level, culture is merely a group preference for doing things a particular way. As preferences accumulate over time, they become traditions, and these traditions are passed down by a group to its descendants. Just as some human groups prefer spaghetti to rice or high-rise apartments to ranch houses, different animal groups also have different material culture.

The recognition that apes have their own culture has even opened the door to a kind of ape archaeology. In 2002 a group of archaeologists and primatologists announced the discovery of a chimpanzee tool site.4 Because these apes had a rudimentary material culture, they left evidence of their small-scale civilization from the past. The analysis of the site was the first time that archaeological methods had been applied to a nonhuman culture. In early 2006, members of the same team, along with other colleagues, announced that they had unearthed a 4,300-year-old chimpanzee tool site.5 The scientists discovered modified stones with food residue attached to them in the Taï National Park, Côte d’Ivoire. The stones predate the settling of human farmers in the area, and suggest that the different varieties of chimpanzee and human tool use originate with our common ancestor (if not before it). The researchers point out that the chimpanzee stone technology is contemporaneous with a local human “Later Stone Age,” and therefore indicates a “Chimpanzee Stone Age” (one that apparently continues).

That these animals use tools and develop traditions demonstrates that it is possible for simple culture and technology to arise in the absence of language. Carel van Schaik, who observes orangutans in the Kluet swamp of Sumatra, believes that culture and intelligence are inextricably linked. Not only does culture reveal intelligence, he argues; it bootstraps individual animals into greater intelligence. One of the orangutan groups that van Schaik watches is particularly skilled at extracting the rich, nutritious seeds of the Neesia tree. The seeds are encased in a tough pod and protected inside by sharp spikes. Van Schaik’s orangutans spend a lot of time inserting twigs into cracks in the husk to release the seeds, and they then tip back their heads and shake the seeds into their mouths.

Only this one group of orangutans employs the Neesia tool technique, and in the Neesia season van Schaik watches them grow fatter by the day. Other groups have access to the same pods, but their seed retrieval skills are nowhere near as effective. Van Schaik attributes much of his group’s success to the fact that they have a particularly high population density, and therefore lots of opportunities for observing the tool use, in addition to which individuals in this group are particularly tolerant of being observed and copied. It’s this kind of process, according to van Schaik, that allows animals to stand on the shoulders of previous generations and develop smarter solutions to problems in their world, basically creating new minds out of old brains.6

Van Schaik echoes Frans de Waal’s wariness about the limitations of lab methods in tapping animal minds: “Our work in the wild shows us that most learning in nature, aside from simple conditioning, may have a social component, at least in primates. In contrast, most laboratory experiments that investigate how animals learn are aimed at revealing the subject’s ability for individual learning. Indeed, if the lab psychologist’s puzzle were presented under natural conditions, where myriad stimuli compete for attention, the subject might never realize that a problem was waiting to be solved. In the wild, the actions of knowledgeable members of the community serve to focus the attention of the naive animal.”

Human culture is an intensely complicated accumulation of techniques and tools. In the same way that an animal’s physical development is constrained by its genome, and therefore the genome of its parents, human culture constantly produces new forms of technology and material design by building on what came before. The way we live now is determined not solely by our genes but also by the course of cultural history. Even though the apparent gap between animal and human minds shrinks with each year, there is at this stage little evidence that the social and material traditions of other animals ever move beyond a simple level, in contrast with our own constantly churning culture. Researchers like Simon Kirby at the University of Edinburgh look at the ways in which language is a product of culture as well as biology, asking not just how we evolved to have language but how language itself might have evolved.

 

 

 

Kirby, who completed undergraduate and graduate degrees in the study of language evolution, was appointed lecturer in language evolution at the University of Edinburgh at thirty-three. This was the first appointment of its kind in the world. Indeed, Kirby is still probably the only academic with language evolution in his job title. Each morning he heads off to his office in the linguistics department, and as he goes through his day, he talks to staff, other lecturers, and students. In lectures, tutorials, and simple hellos in the corridor, Kirby and his interlocutors exchange a certain number of words. If you could zoom out on the department, you would see Kirby and everyone he spoke to zipping around, stopping to connect with one another, and moving off again. Imagine these interactions in fast-forward, the days accelerating into weeks and then years, and all the while see how Kirby and his colleagues talk incessantly. Watch language bubble, build, and evaporate.

Let’s assume that as Kirby and his interlocutors get older, they have children, and eventually the children replace them in all the running around and constant talking. Then their children have children. And their children follow in their footsteps. As the talk continues, the language starts to grow and change. Kirby himself may have disappeared relatively early in the process, but the people he spoke to live on, influenced by their conversations with him, and even though they, too, eventually die, the people they spoke to are influenced by them, and indirectly influenced by what Kirby said. Imagine if you could watch this process unfold from the dawn of humanity, watch the first speakers speak and the first listeners listen, and see how meaning and structure develop. Over time, words proliferate and begin to cluster in particular ways, regularities appear, and structural patterns begin to emerge. This grand view of the history of language is a little like what Kirby seeks in his research. His specialty is computer modeling of the evolution of language.

Until the 1990s, changes within and between languages could be tracked only by using the comparative method of linguistic reconstruction. But that technique has limitations. No single ancestral language from which all the world’s languages are known to have descended has been reconstructed. The comparative method can unearth traces of language from as early as six thousand years ago, but not much further back than that. Computer modeling starts from the opposite end of the language chain. Instead of beginning with contemporary language and reconstructing past versions from it, Kirby creates populations of digital individuals called agents. He hands them some small amount of meaning, maybe a few rules, and then steps back and watches what they do with it.

Jim Hurford, Kirby’s supervisor, kicked off the digital modeling of language in the late 1980s. “Jim had read The Selfish Gene by Richard Dawkins,” said Kirby,

 

and in that Dawkins describes a computational model, where these things called biomorphs evolve, you know, bodies and things. Jim read that, and thought, Wow, I wonder if I could do that for language. So he started running these simulations on the VAX, an old-fashioned mainframe computer that we had back in the ’80s. He would tie up so much of the computing power, the whole department would be paralyzed, and they wouldn’t be able to read their e-mail or anything. It was groundbreaking stuff, and he did it really out of a vacuum.

Jim modeled various things, like speech sounds. He built a model about vocabulary and numeral systems, and he did one on the critical period for language learning, which is this idea that we can learn language very easily when we’re young, but after a certain age we stop, and our language-learning ability kind of switches off. The question he was trying to understand was: Why on earth did something like that evolve? Why not have the ability to learn language all through your life? And his computational model showed that a critical period did evolve in his agents.

 

As an undergraduate, Kirby had been deeply inspired by Hurford’s lectures. “His ideas about computational modeling really seemed fantastic, and it was just what I wanted to do.” So when Kirby finished his undergraduate degree, he enrolled as a Ph.D. student under Hurford.

At around this time, Steven Pinker published The Language Instinct, in which he describes Hurford’s “critical period” model and refers to Hurford as the world’s only computational evolutionary linguist. Since then, the computer modeling of language has boomed. “Every year,” said Kirby, “there are more people using the computational approach to language evolution.” Today, less than twenty years since Hurford periodically disabled the University of Edinburgh’s linguistics department, the school is offering the first degree specifically in the subject, an M.S. on the evolution of language and cognition, and hundreds of researchers are working on computer modeling all over the world.

 

 

 

Even though science has been getting better and better at tracking the elusive clues to our biological language suite, we still don’t know how language itself got here in the first place. Computer modeling promises to be a most useful tool in this quest. In addition to the godlike allure of creating populations and then watching them evolve into different kinds of creatures, this technique became so popular so quickly because modeling proposes to answer such questions as: How did the wordlike items that our ancestors used proliferate to become many tens of thousands of words with many rules about how they can be combined today? Why does language have structure, and why does it have its particular structure? How is it that the meaning of a sentence arises from the way it’s put together, not just from the meaning of the words alone?

In just a few years computer modeling of language evolution has produced a plethora of findings that are counterintuitive to a traditional view of language. The most fundamental idea driving this research is that there are at least two different kinds of evolution—biological and linguistic, meaning that as we evolved, language evolved on its own path.

Kirby starts his models by building a single individual, and then creating a whole population of them. “I’ll have them communicating with each other and transmitting their knowledge culturally over thousands or tens of thousands of generations and very long periods of time. In some extensions of the model, I allow those agents to evolve biologically as well.” What he and other researchers in the field have found is that from little things, big things grow. In these accelerated models, from the smallest beginning—agents with the ability to make sound but not words, agents who start out not knowing what other speakers mean—comes incredible structural complexity that looks a lot like language.

This cultural evolution, said Kirby, is simply the repeated learning by individuals of other individuals’ behavior:

 

The idea is that you’ve got iterated learning whenever your behavior is the result of observing another agent’s particular behavior. Language is the perfect example of this. The reason I speak in the way I do is because when I was younger I was around people who spoke and I tried to speak like that. And what we’ve been finding in our models is, to some extent, that is all you need. It’s very surprising. But if you make some very, very simple assumptions like that, you can get linguistic structure to emerge out of nothing—just from the assumption that the agents basically learn to speak on the basis of having seen other populations speak before them.

 

Strangely enough, the most languagelike structures arise from beginnings that are constrained or not full of information. When Kirby built a model where agents were allowed a lot of exposure to one another’s behavior and able to learn all at once pretty much anything they would ever want to say, he found that nothing would actually happen. No linguistic structure emerged from the primordial word soup. In fact, the resultant system of communication looked more like simple animal communication. Kirby discovered that if the agents had only limited access to one another’s utterances—either because he made the language so big that they could observe only a small part of it at any one time or because he made sure they listened to only a few sentences at a time—then a lot of syntactic structure would eventually arise over the generations of agents. “It’s a kind of irony that you get this complex and structured language precisely when you make it difficult for the agents to learn,” he said. “If you make it easy for them, then nothing interesting happens.”
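Kirby’s own models are far more sophisticated, but the bottleneck effect he describes can be sketched in a toy iterated-learning simulation. Everything here is an illustrative assumption rather than his actual setup: sixteen meanings (a shape paired with a color), two-letter signals, and a learner that memorizes whatever pairs it observes and generalizes compositionally to the rest. When each learner sees the whole language, the arbitrary starting code is simply copied forever; when each learner sees only six of the sixteen pairs, structure emerges down the chain of generations.

```python
import random
from collections import Counter

SHAPES = range(4)
COLORS = range(4)
LETTERS = "abcd"
MEANINGS = [(s, c) for s in SHAPES for c in COLORS]

def random_language(rng):
    """A holistic starting language: every meaning gets an arbitrary
    two-letter signal with no internal structure."""
    return {m: rng.choice(LETTERS) + rng.choice(LETTERS) for m in MEANINGS}

def learn(observed, rng):
    """Memorize the observed pairs verbatim; for unseen meanings,
    generalize compositionally (first letter from the shape, second
    from the color, by majority vote over what was observed)."""
    first, second = {}, {}
    for (s, c), sig in observed.items():
        first.setdefault(s, []).append(sig[0])
        second.setdefault(c, []).append(sig[1])
    f = {s: Counter(chars).most_common(1)[0][0] for s, chars in first.items()}
    g = {c: Counter(chars).most_common(1)[0][0] for c, chars in second.items()}
    lang = {}
    for s, c in MEANINGS:
        if (s, c) in observed:
            lang[(s, c)] = observed[(s, c)]  # rote memory
        else:
            lang[(s, c)] = (f.get(s) or rng.choice(LETTERS)) + \
                           (g.get(c) or rng.choice(LETTERS))
    return lang

def transmit(bottleneck, generations, seed):
    """Pass the language down a chain of learners, each seeing only
    `bottleneck` of the 16 meaning-signal pairs."""
    rng = random.Random(seed)
    lang = random_language(rng)
    for _ in range(generations):
        shown = rng.sample(MEANINGS, bottleneck)
        lang = learn({m: lang[m] for m in shown}, rng)
    return lang

def compositionality(lang):
    """Fraction of meanings whose signal matches the best 'first letter
    encodes shape, second letter encodes color' description."""
    f = {s: Counter(lang[(s, c)][0] for c in COLORS).most_common(1)[0][0]
         for s in SHAPES}
    g = {c: Counter(lang[(s, c)][1] for s in SHAPES).most_common(1)[0][0]
         for c in COLORS}
    return sum(lang[(s, c)] == f[s] + g[c] for s, c in MEANINGS) / len(MEANINGS)

easy = transmit(bottleneck=16, generations=30, seed=1)  # sees everything
hard = transmit(bottleneck=6, generations=30, seed=1)   # limited exposure
print("full exposure:", compositionality(easy))
print("bottleneck:   ", compositionality(hard))
```

Printing the two scores typically shows the bottleneck chain converging toward a fully compositional code while the full-exposure chain stays stuck at its unstructured starting point: Kirby’s irony that making learning harder is what makes structure appear.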
