
The study of bonobos began much later than that of chimpanzees, whose societies were documented starting with Jane Goodall in 1960, and some researchers—for example, Adrienne Zihlman[43]—have wondered whether our view of human nature might have been different had this order been reversed, with the pacifist bonobos used as models for our ancient selves. In such a scenario we could have formed a picture of early humans as naturally conciliatory, rather than pugnacious, and hence seen modern warfare as an aberration rather than a natural outcome of a violent heritage. Alternatively, when the discovery of the skeletons of Ardipithecus, that purported hominin from 4.4 million years ago, was published in 2009,[44] other anthropologists worried that the idea of studying an actual human relative, even a fossilized one, would sideline efforts to use modern species of apes to draw conclusions about early human lives, and hence needlessly limit the sources of information about our past.

So, can we use modern primates to understand our own evolution? The answer is yes, but not because the apes are time machines or living fossils or any of the other clichés often employed. Because chimpanzees, gorillas, and bonobos are so like us in appearance, and they have been since we all split from our common ancestor, it is reasonable to suppose that selection has acted on them as it has on other apelike primates, our closer relatives the hominins. For example, unlike many other mammals, we and the other great apes rely on vision and hearing, not smell, so our communication has evolved via sight and sound instead of odors, and the same was almost certainly true of our ancestors. But that is not the same as saying that the more recently we shared an ancestor with a species, the more similar we are to it in all respects. And it also doesn’t mean that the apes themselves have remained identical to their 5-million-year-old ancestors; they have been evolving in their own environments just as we have.

One complication of assuming that the modern great ape is a frozen record of our shared past is that recent shared ancestry doesn’t necessarily mean genetic similarity in any particular trait. It all depends on how quickly evolution occurred, and different traits evolve at different rates. In other words, suppose that our common ancestor was warlike, and for that reason chimps now are warlike. Even though modern human genes are very similar to those of modern chimps, if natural selection in humans has favored cooperation instead of conflict, we can retain that genetic similarity but acquire new traits of our own. How long a characteristic has been around doesn’t argue for, or against, its existence in the first place. As Wrangham and Pilbeam put it, “But suppose, against all odds, that convincing evidence emerges to show that lethal raiding is 6 million years old in humans, whereas concealed ovulation [the absence of clear signs of periodic sexual receptivity, like heat in dogs and cats, a trait linked to monogamy and longer pair bonds] is ‘only’ 1.9 million years old. Should this matter to our sense of ourselves, that violence is 4.1 million years older than peace? Not at all.”[45] Asking which traits changed quickly, and why, is a more compelling pursuit than trying to establish a baseline for our essential chimp—or gorilla, or bonobo—nature.

Were the universal people real people?

One of the many pleasures of travel to foreign lands is making connections with the locals, people who may be extremely different from us in their dress, eating habits, or religion but who often turn out to share some basic qualities—perhaps a love of children or a fear of snakes. Global conflicts over land or politics notwithstanding, people are people.

Anthropologist Donald Brown was sufficiently intrigued by this similarity around the world, noted not just by tourists but also by scholars, that he began to catalog what he called “human universals.” Despite ethnographers’ emphasis on cultural differences, Brown claimed in his 1991 book that “nowhere in the ethnographic literature is there any description of what real people really did that is not shot through with the signs of a universal human nature.”[46] Brown was arguing against a purely cultural interpretation of what people do, suggesting instead that biology and evolution, interacting with the environment, have produced common behaviors in all human beings. Some of the universals include incest avoidance, the rough structure of language, a male-dominated political life, use of mind- or mood-altering substances, and the aforementioned fear of snakes. Brown’s vision of these universals was remarkably detailed, not only about what people did but about how they felt: “Universal People . . . may not know how to make fire, but they know how to use it . . . Tools and fire do much to make them more comfortable and secure.”[47]

If these universals are real, does that mean they reflect our ancestral behavior, so that we can use them as a way to extrapolate to our “most natural” modern behavior? Not really. Although I agree that the similarity among human groups around the world is striking, and I also concur that human behaviors are the result of evolution, it does not follow that we can use these similarities to construct, well, a paleofantasy of what our ancestors were like. Rosemary Joyce says that her archaeology students often want to support arguments about human nature and our own past with traits that are found in a diversity of modern societies. “But,” she notes, “we could find that every human society living today had a particular social practice, and that won’t tell us how human ancestors in the deep past acted.”[48] The catch is that evolution is continuous, and we might all show similar patterns of behavior because we’ve all been subject to similar selection pressures.

How, then, did the cavemen live? It is not as clear-cut as the paleo proponents would like to believe. True, we know that early humans lived as hunter-gatherers, used stone tools to butcher their prey, produced art, and had a number of other attributes. And we can learn a great deal about how selection acts on social behavior by observing our close primate relatives. But there was no single Paleo Lifestyle, any more than there is a single Modern Lifestyle. Early humans trapped or fished, relied on large game or small, or collected a large proportion of their food, depending on where in the world they lived and the time period in which they were living. At any given moment, humans were doing some things that primates had been doing for 10 million years, such as using alliances among individuals to gain social status, and some things that were relatively recent evolutionary developments, such as making stone tools that could be attached to a handle instead of simply thrown or held in the hand. Neither one is more “authentic” than the other. Whether that is of interest to the Manhattanites trying to install meat lockers in their tiny apartments is another matter.

2

Are We Stuck?

Although we can argue about exactly how our ancestors from 50,000 years ago or more might have lived, it’s undeniable that the time since people started living in settled groups larger than the size of a few extended families is extremely short. The ubiquitous clock metaphors for the history of the Earth have all of humanity shoved into the last few minutes, making agriculture and subsequent developments measurable only in nanoseconds. (In the Neanderthal Museum in Croatia, the time line of evolution is portrayed in a twenty-four-hour day, with “mankind’s relations” appearing at just eight minutes before midnight.[1]) Even within the span of human evolution, the relative proportion of that period we have spent in agrarian settlements rather than as foragers or pastoralists is even more minuscule. And evolution is usually billed as a ponderous process, requiring thousands and thousands of generations before its effects are realized. It can seem logical, then, to assume that we have, as evolutionary psychologists and others are fond of stating, Stone Age genes ill suited to our Space Age lives and environment, and that we suffer the consequences. Or, as the Web page “Evolutionary Psychology: A Primer” puts it, “Our modern skulls house a stone age mind.”[2]
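To see how drastic that compression is, here is a minimal back-of-envelope sketch in Python; the figures (an Earth age of about 4.5 billion years, “mankind’s relations” at roughly 25 million years ago, and agriculture at roughly 10,000 years ago) are round-number assumptions for illustration, not the museum’s own data.

# Map "years ago" onto a twenty-four-hour clock for the history of the Earth.
# All inputs are rough, round-number assumptions, for illustration only.

EARTH_AGE_YEARS = 4.5e9           # assumed age of the Earth
SECONDS_PER_DAY = 24 * 60 * 60    # the metaphorical day

def seconds_before_midnight(years_ago):
    """Where an event lands on the clock, in seconds before midnight."""
    return years_ago / EARTH_AGE_YEARS * SECONDS_PER_DAY

# ~25 million years ago lands 480 seconds, i.e., eight minutes, before
# midnight, consistent with the museum's figure if its day spans the
# whole history of the Earth.
print(seconds_before_midnight(25e6))    # 480.0

# Agriculture, at roughly 10,000 years ago, occupies only the last
# fifth of a second of the day.
print(seconds_before_midnight(10_000))  # ~0.19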

Reasonable though this conclusion may seem, however, it is wrong, or at least it is correct only in such a broad sense as to be nearly useless. It is also often conflated with a different, though related, point, frequently made by paleo-diet proponents like Loren Cordain, that adopting agriculture sent humanity spinning down a starch-choked path of doom.[3] But in reality, we have two questions, both of which I will examine in this chapter. First, why, exactly, was this particular shift so momentous in the first place? In other words, what did agriculture do that was so revolutionary, and were all of the resulting changes bad? Second, given the consequences of humans giving up a less settled existence a mere handful of millennia ago, are we therefore stuck with the bodies—and minds—that we had before the transition to agriculture? Saying that agriculture and its concomitant changes to our diet and politics were bad for us, as Jared Diamond[4] and others do, isn’t the same thing as saying we are trapped in an agriculture-induced cage—and an obese, sickly, socially stratified cage at that.

The curse or blessing—or both—of agriculture

Once the human species had spread out of Africa, people probably lived as hunter-gatherers in small groups until the rise of agriculture, which anthropologists Gregory Cochran and Henry Harpending call the Big Change.[5] No one denies that it was a major milestone, but several scientists go further and claim that it was the beginning of a downward spiral. In 1987, Jared Diamond, who later wrote such best-selling and influential books about the history of humans on Earth as Guns, Germs, and Steel, titled an article on the establishment of agriculture “The Worst Mistake in the History of the Human Race.” In it he says, “With agriculture came the gross social and sexual inequality, the disease and despotism, that curse our existence.”[6] An article in the British newspaper the Telegraph about Diamond’s and others’ work is similarly gloomily headlined “Is Farming the Root of All Evil?”[7]

Spencer Wells of the National Geographic Society goes even further: “Ultimately, nearly every single major disease affecting modern human populations—whether bacterial, viral, parasitic or noncommunicable—has its roots in the mismatch between our biology and the world we have created since the advent of agriculture.”[8] And environmental writer and activist John Feeney pulls out all the stops with, “As hunter-gatherers, we blended gracefully into Earth’s ecosystems. Then everything changed. Civilization is made possible by agriculture. Agriculture is unsustainable. If it weren’t obvious already, you can see where this is going.”[9]

The first point to clear up before we tackle all this pessimism is one of definition. Agriculture can be informally defined as growing one’s crops and domesticating or at least keeping animals, rather than simply picking up what nature provides.[10] But anthropologists distinguish three kinds of such food production: horticulture, pastoralism, and intensive agriculture. People probably started out with horticulture, in which relatively unmodified crops are grown and cultivated with simple tools such as digging sticks. Modern-day horticultural societies include the Yanomami of South America, who combine growing manioc, taro, and some medicinal plants with foraging and hunting in the forest for the remainder of their food. Some horticulturalists today (and probably many in the past) spend part of their time as nomads, rather than living in permanent settlements. When they do form relatively sedentary groups, those groups are small, not likely to cluster in towns or cities.

Pastoralists, who rely on domesticated herds of animals that feed on natural pasture rather than on food provided by their keepers, have probably always been less common than crop-cultivating people, though even today a few groups, such as the Saami (known also as Lapps) of Scandinavia, who herd reindeer, persist. The animals are sometimes kept in one place for a few months at a time, as when the Saami keep female reindeer in corrals for milking during the summer. Although the reindeer, like other animals kept by pastoralists, provide the bulk of the Saami people’s livelihood, the Saami and other pastoralists also trade with agricultural groups for other products, like plant foods.

Intensive agriculture is more like the form of growing food most often practiced today, though it is still seen in societies we would probably classify as “traditional,” such as the rice-farming cultures of Southeast Asia. Fields are more permanent than those used by the horticulturalists, who may “slash and burn” the areas they cultivate, leaving them in between growing periods, sometimes for years, to regain nutrients in the soil. In contrast, intensive agricultural societies actively manage their fields with fertilizers and use them full-time. They also use more sophisticated tools, though these may simply be animal-drawn plows, not engine-powered cultivators. Crops are raised not only for eating by those who cultivate them, but for sale, which means that people can live in larger groups, with a division of labor between those who do the growing and those who buy or trade for the produce. That division of labor in turn means that resources—food itself or the means to purchase it—are not always divided equally, and society can become stratified.

Other than providing points of discussion to anthropologists, why do these distinctions matter? They matter because it is easier to accuse Monsanto-like agribusiness of causing widespread obesity and hypertension than it is to do the same thing to a few dozen people scrabbling in the ground for tubers using pointed sticks. And small-scale agriculture may have been around a great deal longer than people think; we are only now discovering that even the manly Neandertals had grain fragments between their teeth, and that early humans ground grains into flour, as I discuss in more detail in Chapter 5.

All of these gradations make it difficult to determine exactly when the woes associated with the shift to agriculture—increased levels of infectious diseases, reliance on one or a few food sources—first appeared. As archaeologist Tim Denham and his colleagues point out, “Early agriculture is not a demarcated ‘all or nothing’ lifestyle that can be clearly mapped across space and tracked through time.”[11] As with all other processes in evolution, the move to a different way of obtaining food came about in fits and starts, with some human traits adapting well to the changes and others not so much. This irregular but realistic progression of events makes Cordain’s contention that “the Paleo Diet is the one and only diet that ideally fits our genetic makeup. Just 500 generations ago—and for 2.5 million years before that—every human on Earth ate this way”[12] a little suspect.

This is not to deny the changes that took place as agriculture—intensive or otherwise—became established. Most obviously, the human diet changed to include and eventually depend on crops such as wheat, rice, and other grains, which meant that larger populations could be supported in one place. It also meant that the relative proportions of carbohydrates and proteins in the diet shifted toward the more reliable starches, though exactly how much is uncertain. Recent evidence from Neandertals and other fossils suggests, for example, that early humans may have eaten, and even processed, grain foods much earlier than had been supposed. Nevertheless, postagricultural diets not only relied more on carbohydrates, but were far less variable than the diets of hunter-gatherers. Estimates of the number of different kinds of plants eaten by many hunter-gatherer groups range from 50 to over 100, depending on the location of the population. Nowadays, in contrast, according to David Harris of the Institute of Archaeology at University College London, “a mere 30 crops account for 95% of plant-derived energy in the human food supply, over half of which is provided by maize, rice and wheat.”[13]

Why might reducing the number of foods we eat be a bad thing? Eating a varied diet is not necessarily inherently virtuous, though certain micronutrients are probably best obtained from a variety of foods. But a varied set of crops does provide a cushion against some kinds of food shortages, in a not-putting-all-your-grains-in-one-basket way. The Irish potato famine, for example, came about because a fungal disease wiped out the potato crop that the peasants of Ireland relied on for most of their caloric needs. The disease, in turn, was able to have such devastating effects because almost all the potatoes had been selected to be genetically uniform, with the size, shape, and flavor that made them tasty and easy to grow. If one potato plant was susceptible, that meant they all were, and thus the entire crop could be decimated in one fell swoop. Reliance on just a few food plants makes us vulnerable to similar calamities, and it is an ongoing concern among scientists and farmers today. It is debatable, however, whether a return to a hunter-gatherer existence—even if feasible—is the best, or only, solution to this problem.

Working harder than a chimpanzee

One of the biggest bones of contention, so to speak, about hunter-gatherers versus agriculturalists is that the latter work too hard, in terms of both the time spent on subsistence and the intensity of the labor required, or at least they work harder than people who do not farm. Wells puts it this way: “As hunter-gatherers, we were a species that lived in much the same way as any other, relying on the whims of nature to provide us with our food and water.”[14] And the whims of nature are presumably easier to cajole than the rocky soil or recalcitrant cattle of the farm. Agriculture, then, is sometimes seen as bad because it is just plain too difficult.

It is true that at least some hunter-gatherers spend less of their day “working,” defined as engaging in activities necessary for subsistence, than do many farmers. Richard Lee’s classic 1960s studies of the Kalahari desert people found that they needed two and a half days per week to collect enough food; adding activities such as toolmaking and other “housework” brought the total to an enviable forty-two hours per week.[15] Jared Diamond notes that the Hadza of Tanzania managed to keep their weekly work time down to fourteen hours or less.[16] Other estimates vary, and many of the calculations have been criticized by some anthropologists, who claim that the societies cited are not typical hunter-gatherers. But it seems reasonable to conclude that farmers, particularly those engaged in intensive agriculture, do indeed work harder than most foraging peoples.

The problem is that those foraging peoples are themselves still working pretty hard, at least compared to many other species. Anthropologist Hillard Kaplan and colleagues suggest that a hallmark of more modern humans was the ability to get hard-to-acquire foods.[17] They classify foods as collected, such as fruit; extracted, such as termites that are in protected underground nests or tubers that have to be dug from the ground; and hunted, which are foods such as deer or other prey that are caught or trapped.

Other primates, including chimpanzees, also eat foods that require some of the same kind of processing, and the chimps even hunt from time to time. But only humans focus on the extracted and hunted types rather than collecting what nature’s whim provides. And we humans—even those in hunter-gatherer societies—need long years of training before we have the skills to net a fish or bring down an ungulate. Men of the Aché of South America, one of the best-studied contemporary foraging societies, do not peak in hunting ability, measured in the amount of meat collected per unit effort, until they are thirty-five years old. Collecting tubers is also no walk in the park; women of the Hiwi people of Venezuela become maximally efficient at foraging for roots between thirty-five and forty-five years of age. Acquiring these skills takes time, and lots of it.[18]

We can draw two conclusions from these statistics. The obvious one is that hunting and gathering is more than lolling around waiting for grapes to fall into your mouth or meeting up with your mates for an occasional fun-filled hunting trip. It may not be the workweek of a Wall Street shark or a nineteenth-century sweatshop laborer, but it is not the idyllic life we might have imagined. Less obvious, though, is that the amount of time one spends making a living is a continuum among animals, humans included. Why do we have a paleofantasy about the ancestral hunter-gatherer, when our even earlier relations, the apes, spend even less time foraging? Should we be yearning for the days before tool use? And how do we balance time against effort? Is it better to mindlessly munch grass, which requires little effort but takes a lot of time to down, one determined mouthful after another, or to spend less of the day fashioning a complex fish trap that may yield no catch? Choosing agriculture as the point at which we all started to go downhill because we began to work too hard is simply not defensible.
