Paleofantasy: What Evolution Really Tells Us about Sex, Diet, and How We Live

Interestingly, McKenna mentions at least one study examining the relationship between sleeping in a room away from the parents and later childhood behavior in which the children who had slept with their parents as infants actually demanded less, not more, of their parents’ attention while playing.40
In addition, Lee Gettler, mentioned earlier, coauthored a study with McKenna examining the sleep patterns of mothers and babies who slept in different rooms versus those who shared a surface at night or at least slept in the same room (cosleepers).41
The two groups of mothers and their infants were then observed as they slept in a sleep laboratory for three nights. The experienced cosleepers breast-fed more often and tended to do so at shorter intervals—both attributes that are associated with optimal weight gain.

Crying is another infant quality that worries parents. Entire philosophies seem to be constructed around whether it is better to let children “cry it out” or to pick them up, and if so, after how long an interval, with concomitant worry over the eventual spoiling or emotional insecurity of the child. Here, Small flatly states, “There is extensive scientific evidence that the accepted Western caretaking style repeatedly, and perhaps dangerously, violates the adaptive system called crying that evolved to help babies communicate with adults.”42
She and other anthropologists and psychologists note that babies in non-Western cultures seem to cry far less than those from Western cultures. It’s not that Western babies are simply unhappy. All babies seem to begin to fuss about the same number of times during a day or night, regardless of where they live. Scientists have even documented a so-called crying curve, in which babies seem to be hardwired to cry increasingly often until about two months of age, after which the amount of crying plateaus. But Western babies continue to cry in longer bouts, and their whimpering accelerates to real wailing more often, than do babies from non-Western cultures.

Kruger and Konner reviewed crying, and who responded to it, in groups of !Kung San people from Africa.43
The !Kung babies cried for a maximum of one minute per hour (usually far less), and someone—usually, but by no means always, the mother—responded to 88 percent of all the crying bouts. Kruger and Konner characterize the society as “sensitive and indulgent” toward infants. They suggest that the !Kung’s high responsiveness may make it less likely that babies escalate their crying to Western levels.

None of the researchers imply that Western parents are bad parents, that non-Western children live in an idyllic manner we should all emulate (for one thing, there is no single “non-Western society”), or that all babies that cry for longer than a minute are irrevocably scarred for life or have something wrong with them. All of the researchers stress the plasticity of human behavior, and the inevitable variation among infants and their parents that makes general prescriptions for child care risky. But babies do seem to have evolved in an atmosphere of immediate and frequent tending by multiple individuals, which suggests that large deviations from that environment may be hard on modern infants.

The findings also suggest that the baby wearers who promote carrying babies in slings or similar devices might not be too far off the mark. Of course, people will always interpret these suggestions from their own perspective; one website with many enthusiastic testimonials about slings had a comment about baby wearing in church.44
The commenter was pleased that keeping her baby in a carrier had the benefits of “protecting your little one from germs” and “no playing pass the baby,” noting with approval that “no one touched him”—needless to say, not exactly replicating hunter-gatherer life. I suppose you can lead a person to evolution but you cannot make her embrace its effects.

The conclusion that babies evolved with immediate care, and might not thrive if that attentiveness was missing, differs from accepting that babies are the way they are because they evolved in the Pleistocene that way, and that change since that time would be bad for them and for us—a claim that I find wanting. Instead of assuming that babies from foraging societies are more natural and more accurately reflect our evolutionary past, McKenna and others like him tested their ideas on real infants under modern-day circumstances. They found, for example, that babies cosleeping with their mothers breast-fed at shorter intervals than did babies sleeping on their own. We have obviously already altered child-rearing practices in many different ways around the world, and most children grow up just fine. So rather than concluding that we must adhere to a paleo way of child care, the question becomes which of those deviations is too great and which is an acceptable variant. The only way to address that question is with data. The way we think humans might have evolved is a starting point for asking questions such as, Does baby wearing keep infants from crying during the night? It is not, however, a prescription.

9

Paleofantasy, in Sickness and in Health

Let’s face it—the reason people are trying to eat like cavemen or exercise barefoot is not because it looks (or tastes, or feels) good; it is so that they can live healthier, and preferably longer, lives. On the blog Mark’s Daily Apple, for example, a commenter optimistically proclaims, “A diet that promotes high immune function—such as the high animal fat, low carb Paleo/traditional whole food diet—will protect against all cancers because it make [sic] the body strong.”1
Another contributor is even bolder, albeit with a rather wistful conclusion: “People need to realize they have total control of their health. Otherwise what kind of life is that destined to be sick by our own genes?”2

What kind of life is it, indeed? Do our genes, along with our modern lifestyles, inevitably make us sick? Or have the genes that can render us susceptible to illness also changed as we have evolved, making us better able to fight the diseases that always surround us? It turns out that some of the most exciting new developments in understanding human evolution come from medical research—studies not of cells in test tubes or mice in laboratory cages, but of the way our genes have responded to selection by pathogens.

Furthermore, one of the best places to detect the signature of recent evolution is in the diseases we get, mainly because it is easy to see the difference between winners and losers. After all, nothing says natural selection like a brisk round of the plague. But while the devastating effects of epidemics are well known, the accompanying evolutionary change that has occurred in human populations, some of it within the last few hundred generations, is not.

Understanding how resistance to disease evolves, and the exact nature of the genetic changes that enable that resistance, also has practical implications. From an evolutionary biology perspective, it is all very well and good to notice how gene pools change over centuries, but this information is not exactly helpful to medical researchers hoping to find cures within a patient’s lifetime. If, however, we can determine exactly how the ability to withstand disease works by examining the recent changes in the DNA of the resistant individuals, we can try to mimic such alterations in developing treatments, without having to simply wait for a selective purge of more susceptible genes, and the people carrying them, from the population. And as I will detail later, that is exactly what scientists are doing in an attempt to treat AIDS, one of humanity’s most recent plagues.

First, let’s consider how the adoption of agriculture has affected disease, as it has affected so many other aspects of our lives. Much has been written about the changes in patterns of human ailments since the dawn of agriculture, most of it gloomy. As I mentioned in Chapter 2, becoming sedentary, working the soil, and keeping domestic animals all provide ample opportunities for the rise and spread of infectious agents like bacteria, viruses, and parasitic worms. Settled populations tend to be larger, making it easier to spread pathogens around; and keeping cattle, goats, or pigs means being in rather close proximity to their dung—a situation that both increases the diseases in the domesticated animals themselves and facilitates the sharing of parasites with them.

What’s more, the influence of farming on the evolution of parasites was not a onetime event—our domesticated animals continue to evolve with their pathogens. A group of scientists from Norway and Switzerland led by Adèle Mennerat recently modeled the effects of intensive farming, defined as raising many animals in a small area with heavy management by people, on the evolution of parasite transmission.3
Modern-day farming, whether of cows, chickens, or salmon, is a far different affair from the keeping of a few backyard animals; the creatures are kept in large numbers and are often cheek by jowl, in some cases literally, rather than in small flocks or herds. Such crowding selects for highly virulent parasites—those that do serious harm to their hosts—rather than for more benign forms that allow the host to survive and possibly recover.

These more deadly parasites evolve because, from the parasite’s perspective, keeping a host alive longer is of little consequence, since another host is close at hand in the cramped quarters of the barn or fish hatchery. Likewise, such crowded conditions put parasites that develop more quickly and can move to a new host faster at an advantage; if the host population is spread out, a parasite that develops quickly might be too small or weak to make it to a new host. Such crowding is thought to have contributed to the high virulence of bird flu; the virus could easily move from one chicken or duck to another in the facilities housing many thousands of birds that were the norm in the parts of Asia where the virus developed. The poor hygiene and close quarters of the trenches of World War I may also have increased selection for the version of influenza that caused the deadly 1918 epidemic. Urbanization, with its dense populations all buying lattes or attending theater performances in the same places, is similar to both large-scale farming and the teeming hordes of war, at least in its ability to spread disease.

Mennerat and colleagues suggest that fish farming might be particularly vulnerable to such conditions, and they recommend that agricultural managers and evolutionary biologists collaborate to avoid the evolution of such deadly diseases.4
For example, keeping smaller populations of fish in more ponds, rather than as a single megaschool, reduces the advantage for a parasite to become more virulent.

In addition to potentially fostering more harmful diseases, agriculture is sometimes thought to cause illnesses more directly, by producing less healthful food. If the grain-is-evil crowd is to be believed, our nutrition became worse once we stopped being hunter-gatherers, leading to increases in noninfectious ailments like diabetes and obesity that further compromise our health. In Pandora’s Seed, Spencer Wells rather despairingly notes, “It’s as though agriculture were a virus, expanding in influence despite its negative effects on human health.”5

I have been emphasizing how much our genes, as well as our lives, have changed in the 10,000 years since agriculture, making us different in many ways from our Paleolithic ancestors. How many of those changes have involved our health, and were they indeed always negative, as the doomsayers claim? Furthermore, as new diseases emerge, will our genes be able to meet the challenge?

Everyone dies of something

It is obviously reasonable to assume that everyone who lived before us also died, and it’s often possible to determine the approximate age at which those deaths occurred, either from remains or, for more recent cases, from records kept over the last few centuries. Less clear is the cause of each demise. Any attempt to determine the cause of death for ancient people is complicated by at least two issues. First, evidence of many diseases simply disappears as bodies decompose; mummies from ancient Egypt and the occasional “bog person” preserved in permafrost or peat can provide more information than skeletons, but these complete specimens are relatively few and far between. And bodies will not bear the evidence of infectious diseases such as cholera or influenza, because those diseases tend to kill too quickly to leave signs of their presence on the skeleton.

Second, using written records of the cause of death is tricky because our ability to diagnose illness has improved so much over time. Someone who was noted to have died of “fever” may have suffered from malaria, pneumonia, or sepsis—or none of these. Children were sometimes said to die of hives, a skin condition that accompanies a variety of diseases but that is rarely fatal in itself, or from teething; the latter may have been due to lancing swollen gums with dirty instruments, to subsequent infection, or to the teething child’s being weaned and switched to fluids contaminated with pathogens.

With these caveats in mind, anthropologist Timothy Gage calculated mortality rates and causes of death for humans from different places and times using a variety of sources.6
He focused on the last century and a half, but also included some prehistoric data. Although mortality rates were clearly higher in prehistoric times than they are now, contrary to what is suggested by Wells and others despairing about the effects of human settlement, people did not seem to die at an earlier age once agriculture was adopted. Note, too, that average life expectancy, as I discussed in the Introduction, is just that—an average; high childhood mortality from infectious diseases such as diarrhea can make it seem as if people in ancient times keeled over at age thirty-five, when in fact, if they made it past seven or eight their chances of living past sixty were reasonably good.

Gage suggests that the reason other studies found lower life expectancies for populations with agriculture than for those before settlement is that agricultural populations usually grow at a much more rapid rate than nonagricultural populations. This means a lot of babies being produced. More babies mean more opportunity to succumb to the aforementioned diseases. This high childhood death rate makes the average life expectancy seem lower than it actually is. And while we may bemoan the so-called diseases of affluence, Gage also points out that in most parts of the world, mortality rates have decreased substantially over the last 300 years.
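The averaging effect Gage describes can be made concrete with a toy calculation. The ages below are invented purely for illustration (they are not Gage’s data): in a population where half of all children die very young and the rest live to old age, the mean age at death lands near thirty-five even though no one actually dies at that age.

```python
# Hypothetical ages at death for ten individuals in a population
# with very high childhood mortality (illustrative numbers only).
ages_at_death = [3, 3, 3, 3, 3, 67, 67, 67, 67, 67]

# "Life expectancy" as usually reported: the mean age at death.
mean_age = sum(ages_at_death) / len(ages_at_death)
print(mean_age)  # 35.0 -- yet nobody in this population dies at 35

# Conditioning on surviving childhood (here, past age eight)
# gives a very different picture.
adult_ages = [a for a in ages_at_death if a > 8]
print(sum(adult_ages) / len(adult_ages))  # 67.0
```

The point is that a burst of infant deaths drags the mean down without telling us anything about how long survivors of childhood could expect to live.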

Is that decline simply a result of fewer infectious diseases in modern populations, with degenerative ailments like cardiovascular disease still increasing as part of the price we pay for agriculture and sedentary living? Gage thinks not, for two reasons. First, some of the deaths may have been mistakenly attributed to degenerative diseases, when in fact they were due to other causes, from infections to accident. Second—well, we all have to die of something. Infectious diseases are declining, so other causes of death will take their place. What’s more, Gage is skeptical of the reports of the low incidence of degenerative disease in contemporary hunter-gatherer societies. Virtually all such groups studied by anthropologists are very small, making the total number of deaths that would occur during a given field study similarly tiny and the drawing of conclusions about their cause risky at best.

Some solutions to the controversy may be forthcoming, as the results of the European Global History of Health Project become available. This ambitious effort brings together data on skeletal remains from thousands of years ago from across the continent, attempting to evaluate evidence of injury, joint damage from osteoarthritis, dental status (a good indicator of diet), and other health markers.7 By pooling the efforts of seventy-two researchers, the project will be able to overcome earlier difficulties of small sample sizes from isolated locations. Initial results suggest that people living in early urban settlements were indeed of poorer health than their hunter-gatherer ancestors, but that health improved later, after trade networks allowed the exchange of goods and food became more diverse. Newly agricultural societies may suffer, but the steady source of food, and the potential to trade for the foods not locally available, seems to pay off later.

Sickness old, sickness new, and sickness borrowed too

We can blame animals for some of our diseases, but in a larger sense our own genes are also at fault. Diseases can be divided into several categories, and our genetic heritage—and evolution—can influence all of them, though some of the effects are more direct and others more subtle. First and most obvious are infectious diseases—those caused by living things: viruses, bacteria, or larger parasites such as worms. Here our genes act to make us more or less susceptible to infection; the immune system varies in its efficacy, and we can inherit a vulnerability to particular types of infections.

Second, some ailments are due to defects or vulnerabilities in our genes themselves. Huntington’s disease, for example, is a neurological disorder that is caused by a defect on one of the chromosomes; people who suffer from the disease have a segment of DNA that is repeated too many times, and they will inevitably develop a variety of debilitating symptoms. Of course, even in genetically based diseases, the environment can play a role. Diabetes has a genetic component and runs in families, but its manifestation can be drastically influenced by the lifestyle of the person with that genetic predisposition.

Finally, degenerative diseases such as cancer or cardiovascular malfunction can occur because of a complex mixture of genetic and lifestyle factors. Some scientists have even suggested that infectious agents, especially viruses, can play a role in heart disease or other illnesses previously thought to be solely due to deterioration of the body’s systems. Regardless of whether this idea pans out, it is clear that genes associated with disease are both important and numerous: from actual disease-causing genes, to genes that influence the operation of the immune system, to those that affect the strength of our tissues and hence their vulnerability to failure, much of our genome is occupied with keeping us healthy. Therefore, such genes are particularly good places to look for evidence of recent evolution.

Humans, of course, are not unique in our ability to get sick; not only do big fleas have little fleas upon them, as the doggerel goes, but animals, plants, and even bacteria are subject to various ailments. The genes associated with defense against those diseases are passed along when new species evolve. In trying, then, to understand how genes related to health evolved, we need to look at where they originated in our evolutionary history. Modern genomic techniques now allow scientists to compare the DNA sequences of different organisms and calculate the time at which various genes of two species, or higher-level groupings like birds and reptiles, diverged. So, for example, we can see which of our DNA sequences we retain from our common ancestry with fish, how many from our ancestry with our fellow primates, and so on.

Evolutionary geneticists Tomislav Domazet-Lošo and Diethard Tautz did just that, using an enormous database that catalogs human genes associated with diseases.8
The database contains over 4,000 chromosome regions that are associated with a genetic disease, meaning that people who have the disease are more likely than the population at large to have that particular gene variant on their chromosomes. After “cleaning up” the data by removing questionable sequences or other potential errors, the researchers ended up with 1,760 disease genes based on Morbid Map, the delightfully named guide to the location on the chromosomes of genetic disorders that was developed at Johns Hopkins University. They then compared the genes to those of other creatures on Earth, from bacteria to primates.
