Critics of Cairns's experiments on bacteria and yeast grown under starvation conditions, which gave rise to directed mutations, charged that the British scientist's conclusions were unjustified: even in the Cairns model, they said, the mutations could have been due to random events.
108
The arguments heated up as researchers found evidence of seemingly strange behaviors in microbes. For example, some transposons seemed to be able to sense when it was a good time to pop out of bacterial DNA and go their separate ways in search of a safer genome. How did they “know” that the bacterium was under fatal attack? Or was it possible the transposons didn't “know” anything and scientists were simply witnessing the results of successful, though utterly random, gene jumping?
109
In fungi, it was noted, environmental stress could induce a process called “ripping,” in which a massive number of single point mutations were suddenly made. Again, was the fungus responding to a stress by mutating in a specific, directed manner, or was it simply randomly mutating at a feverish pace?
110
On an even more basic level, many scientists argued that utterly random mutation and absorption and use of mobile DNA would be prohibitively expensive for microbes. It cost chemical energy to scavenge plasmids and transposons, to sexually conjugate, or to move pieces of DNA around inside cells. It seemed inconceivable that stressed organisms, in particular, would waste energy soaking up all sorts of DNA completely at random. Several genes had to be switched on and membrane changes had to be made in order, for example, for E. coli to absorb useful antibiotic-resistance factors from another species, Bacteroides fragilis.
111
And though such horizontal transfers of genes between entirely different species of organisms were costly, they clearly occurred, spreading advantageous traits for resistance and virulence among microbes.
112
In some cases the plasmids themselves seemed to improve as they moved about between species, recombining and adding new pieces of DNA as they went.
113
Chemicals such as anesthetics, detergents, and environmental carcinogens seemed, for example, to influence bacterial sexual conjugation.
114
In 1994 the Cairnsian view of directed mutation got a boost from experiments performed at Rockefeller University and the University of Alberta, Canada. Researchers first confirmed Cairns's initial experiments, showing that there was a specialized pathway of mutations that was switched on during E. coli starvation. Further, they showed that genetic recombination and resultant adaptive mutation occurred in the absence of bacterial reproduction. In other words, bacteria altered themselves not just through a process of random, error-prone reproduction that eventually yielded a surviving strain—the classic Darwinian view. In addition, they changed themselves, in some concerted manner, without reproducing.
115
The differences in the Darwinian and Cairnsian views were not trivial. If, for example, an E. coli bacterium residing in the human gut were suddenly exposed to a flood of tetracycline, would it occasionally mutate and perhaps become resistant after generations of bacterial reproduction? Or could it acquire instant resistance via some directed recombination or transposon mechanism?
As issues of emerging diseases drew greater attention within the scientific community, theoretical debates centered on key questions: How likely was it that a previously unknown microbe would suddenly appear out of some stressed ecosphere? What were the odds that a fundamentally new pathogenic organism would emerge, the result either of recombination among other microbes or of large-scale mutation? Was it likely that old, well-understood microbes might successfully mutate into more dangerous forms? The first two questions were the subjects of mathematical models and extensive theoretical discussion, though the number of unknowns involved in such computations was enormous and significantly impeded conclusive analysis. Most scientists involved in such exercises felt that further basic research on microbial ecology and human behavior was needed in order to obtain enough data points to solve these quandaries.
116
As to the question of virulence, it was considered axiomatic that all pathogenic microbes would seek a state of moderate virulence in which they didn't kill off their unwitting hosts too rapidly, giving themselves plenty of time to reproduce many times over and spread to other would-be hosts.
117
Over time, even a rapid killer such as the 1918–19 Swine Flu would evolve toward lower virulence. Or so it was thought.
But in the 1990s the world saw two viral cousins take off on very different virulence pathways. HIV-2 in West Africa became markedly less virulent between 1981 and 1993, infecting fewer people (despite the lack of safe sex practices) and possibly causing less severe disease in those it did infect.
118
In contrast, over the same period there emerged strains of HIV-1 that seemed to be more transmissible and to cause more rapid disease. Thus, tendencies toward both less and greater virulence seemed to be occurring simultaneously in the global AIDS epidemic.
Max Essex, Phyllis Kanki, and Souleymane MBoup studied HIV-2 closely and felt that there were inherent differences in the two species of AIDS viruses that could explain their opposite tendencies in virulence. Kevin DeCock felt, on the basis of his studies in Côte d'Ivoire, that HIV-2 was less transmissible than HIV-1, and probably always had been.
Biology theorist Paul Ewald of Amherst College in Massachusetts believed HIV-1 was also becoming less virulent. He argued that Kaposi's sarcoma, which was primarily seen among gay men with AIDS, was caused by a more virulent form of the virus that existed during early years of the epidemic. In Australia, Kaposi's sarcoma and AIDS deaths had declined markedly over the course of the epidemic, due, Ewald thought, to a shift toward less virulent HIV-1 strains.
119
But Australia's situation was not mirrored in the rest of the world in 1994: globally HIV-1 was spreading at an extraordinary pace, and strains of the virus had recently emerged that seemed to be especially adapted to rapid heterosexual or intravenous transmission. A Ugandan strain surfaced sometime in 1992 that appeared to cause full disease within less than twelve months after the time of infection.
120
On the basis of mathematical models, British researchers predicted that HIV-1 would continue its trend toward greater virulence so long as the rates of multiple partner sexual activity remained high in a given area. As sexual activity declined, or as it became more monogamous, the rates of successful mutation, the number of quasispecies, and the virulence of HIV-1 would decrease.
121
And on that one point Ewald agreed: namely, that multiple partner sex was the key to virulence for sexually transmissible microbes.
122
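The logic of such models can be sketched with a toy calculation; the formula and every number below are illustrative assumptions, not the British group's published work. Treat each viral strain as having a per-contact transmission probability and an infectious period that shrinks as virulence rises, and assume carriers move through sexual partnerships of a fixed average length. A rough reproductive number is then the number of partnerships a carrier passes through while infectious, multiplied by the chance of transmitting within each one.

    # Toy sketch of the virulence/partner-change trade-off. All parameters are
    # invented for illustration; this is not the published HIV model.

    def r0(per_act_prob, infectious_years, partnership_years, acts_per_year=100):
        """Crude reproductive number under serial partnerships of fixed length."""
        # Contacts with any one partner are limited by both the partnership
        # length and the carrier's infectious period.
        acts_per_partner = acts_per_year * min(partnership_years, infectious_years)
        p_per_partner = 1.0 - (1.0 - per_act_prob) ** acts_per_partner
        partners = max(infectious_years / partnership_years, 1.0)
        return partners * p_per_partner

    strains = {
        "fast, highly infectious strain": (0.05, 1.5),    # 5% per contact, infectious 1.5 years
        "slow, mildly infectious strain": (0.003, 12.0),  # 0.3% per contact, infectious 12 years
    }

    for label, (p, years) in strains.items():
        stable = r0(p, years, partnership_years=6.0)    # long-lasting partnerships
        rapid = r0(p, years, partnership_years=0.25)    # a new partner every three months
        print(f"{label}: R0 = {stable:.2f} with 6-year partnerships, "
              f"{rapid:.2f} with 3-month partnerships")

Under these assumed figures the durable, mildly infectious strain outcompetes the fast killer only when partnerships last for years, while rapid partner turnover rewards the short-lived but highly infectious strain; that is the qualitative pattern the models, and Ewald, predicted.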
At the root of much of the new thinking about virulence lay a key assumption: that microbes would be extremely virulent if long-term survival of the host wasn't important for the spread and survival of the microbial species.
123
If host population density increased, the microbes could afford to become more virulent, as they were guaranteed greater exposure to secondary and tertiary victims.
That theoretical view received some experimental support in 1993 when Allen Herre, of the Smithsonian Tropical Research Institute in Panama, made a startling observation on the relationship between fig tree wasps and the minute roundworms that parasitized the insects. After ten years of observation and manipulation, Herre concluded that the worms became more virulent when the size of the wasp population, and the number of broods occupying any given fig tree niche, grew. When population size was low, the parasites were of low virulence and were passed from female wasps to their offspring via infected eggs laid in the figs. When the wasp population size swelled, and various broods intermingled, the parasites spread horizontally, from wasp to wasp. This allowed the parasites to become more virulent and, among other things, to destroy the insects' eggs. The difference could be seen in the paradoxical observation that the figs might be healthier, and suffer less wasp larvae infestation, at times when the adult wasp population was at its peak.
124
At Harvard Medical School, John Mekalanos studied a host of known virulence factors and developed a technique for teasing out unknown bacterial virulence genes. He concluded that many microbes stored virulence factors, just as they did resistance genes, on plasmids and transposons, snapping them up when conditions were ripe for all-out activity, and discarding them as excess baggage when the time was right. Such virulence factors could be shared across microbial species.
Things that seemed to turn on known virulence factors included calcium fluxes, warmer temperatures (98.6°F inside a human body versus an external 60°F), the presence of iron, and a number of key chemicals.
125
But Mekalanos also showed that for every known virulence factor in a given microbe there were dozens awaiting discovery. It wasn't known what mechanisms might switch those genes on, or cause them to mutate.
Mekalanos disagreed with Ewald's theory that virulence was tightly linked to transmissibility. There were exceptions. For example, a huge dose of cholera vibrio was needed to cause a human infection—on the order of one million. In contrast, Shigella could cause infection and disease with fewer than a hundred bacteria. Nevertheless, cholera was far more lethal than shigellosis.
“It's more complicated than mere transmissibility,” Mekalanos said. “Microorganisms respond to a more complex array of pressures that decide levels of virulence.”
The most blatant source of pressure was the host's immune system. In most cases the microbial advantage might look like virulence because the host's disease progressed badly, but from the microbe's point of view what was transpiring could better be described as escape. Microbes had discovered a long list of ways to escape the immune system, including disguise, Trojan Horse-like use of immune system cells as modes of entry and avoidance, constant mutation of genes coding for their outer surfaces so that the immune system failed to recognize them, and manipulation of immune system chemicals to set off false alarms that would occupy the system while the microbes slipped into safe hiding places.
126
Theorists were busy trying to determine whether the balances between human immunity and microbial virulence were tipped by any particular identifiable contemporary factors. Nobel laureate Dr. Thomas Weller expressed concern that the ever-increasing numbers of severely immunosuppressed people on the planet posed a real threat for emergence of new disease problems. Cancer patients treated with high doses of chemotherapy or radiation, people infected with HIV, and individuals undergoing transplant operations all represented potential breeding sites for new or mutated microbes. Weller worried about a possible “piggyback” effect, with one microbial population taking advantage of severe immunodeficiencies produced by another microbe or medical treatment.
127
Another population of immunosuppressed individuals consisted of those suffering from chronic malnutrition. Wherever a significant percentage of the Homo sapiens population was starving was likely to be a spawning ground for disease.
128
Vaccines, where available, protected people against disease, but not against infection. Microbes could enter the body, but even highly virulent organisms found themselves facing an immune system that was primed and ready to mass-produce antibodies. Battles ensued; the invader was vanquished.
129
If a sufficient number of Homo sapiens in a given area possessed such immunity it would be possible to essentially eliminate the microbe. Unable to find a Homo sapiens host in which it could replicate, the microbial population would nearly disappear. Nearly. In this state, known as herd immunity, humans (or livestock animals) never suffered disease, though they might be infected, unless the necessary level of immunity in the overall population slacked off. For that reason, schoolchildren vaccine campaigns had to reach a critical threshold of successful completion or the unvaccinated children would be at great risk for disease.
130
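The arithmetic behind that threshold is simple to sketch; the reproductive numbers and vaccine efficacy below are assumed for illustration rather than taken from the text. If each case in a fully susceptible population would infect R0 others on average, chains of transmission fade out once more than 1 - 1/R0 of the population is immune, and the vaccination coverage required rises further when the vaccine protects only a fraction of those who receive it.

    # Back-of-the-envelope herd immunity arithmetic; the R0 values and the
    # efficacy figure are illustrative assumptions, not figures from the text.

    def herd_immunity_threshold(r0):
        """Fraction of the population that must be immune to halt sustained spread."""
        return 1.0 - 1.0 / r0

    def coverage_needed(r0, vaccine_efficacy):
        """Vaccination coverage required when not every recipient is protected."""
        return herd_immunity_threshold(r0) / vaccine_efficacy

    for disease, r0_value in [("measles", 15.0), ("pertussis", 12.0), ("polio", 6.0)]:
        immune = herd_immunity_threshold(r0_value)
        coverage = coverage_needed(r0_value, vaccine_efficacy=0.95)
        print(f"{disease}: {immune:.0%} must be immune; "
              f"{coverage:.0%} coverage needed with a 95%-effective vaccine")

For highly contagious childhood diseases the margin is thin, so coverage that slips only slightly below the threshold leaves unvaccinated children exposed, which is the point the passage makes about school vaccination campaigns.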
Herd immunity faced tough challenges in the age of air travel because individuals who carried microbes to which they were personally immune could fly into geographic areas where herd immunity was extremely low. Under such circumstances, even organisms not generally thought to be particularly virulent could produce devastating epidemics.
The best example of the phenomenon was the estimated 56 million American Indians who succumbed to disease following the arrival of Europeans—and their microbes. That die-off continued 500 years later, into the 1990s, as Old World microbes reached the Xikrin, Surui, and other Amazon Indians.
Yale University epidemiologist Francis Black argued forcefully in the 1990s that the terrifying death toll among New World natives was not a straightforward question of their having naive immune systems that hadn't previously been exposed to the European microbes. Such an explanation was, he said, overly facile and flew in the face of evidence that new diseases commonly afflicted other populations of peoples without exacting such horrendous tolls. For example, new diseases were also introduced into sub-Saharan Africa by European explorers, and though they claimed many lives, nowhere were there wholesale microbial genocides, as were witnessed in South America.
