Authors: Randolph M. Nesse
The list of threats we face from antibiotic-resistant bacteria is long and frightening. A plasmid-mediated ability to prevent binding of erythromycin has made over 20 percent of pneumococcal bacteria resistant to treatment with that drug in France. Some strains of the cholera now threatening thousands in South America are resistant to all five previously effective drugs. Amoxicillin is no longer effective against 30 to 50 percent of pathogenic
E. coli. It appears that we are indeed running, together with the Red Queen, as fast as we can just to stay in the same place.
Perhaps most frightening of all, one third of all cases of tuberculosis in New York City are caused by tuberculosis bacilli resistant to one antibiotic, while 3 percent of new cases and 7 percent of recurrent cases are resistant to two or more antibiotics. People with tuberculosis resistant to multiple drugs have about a 50 percent chance of survival. This is about the same as before antibiotics were invented! Tuberculosis is still the most common cause of death from infection in developing countries, causing 26 percent of avoidable adult deaths and 6.7 percent of all deaths. TB rates in the United States fell steadily until 1985 but have increased 18 percent since then. About half of these cases resulted from impaired immune function in people with AIDS, the rest from increased opportunity for contagion and drug-resistant pathogens.
Increasing resistance to antibiotics is the most widely known and appreciated kind of pathogen evolution. Since their discovery in the 1940s, an enormous number of studies have established many medically important conclusions:
1. Bacterial resistance to antibiotics arises not by the gradual development of tolerance by individual bacteria but by rare gene mutations or new genes introduced by plasmids.
2. Gene mutations can be transmitted by plasmid infection or other processes to different species of bacteria.
3. The presence of an antibiotic causes the initially rare mutant strain to increase and gradually replace the ancestral type.
4. If the antibiotic is removed, ancestral strains slowly replace the resistant forms. (Points 3 and 4 are illustrated by the simulation sketch that follows this list.)
5. Mutations within a resistant strain can confer still greater resistance, so that increasing the dose of an antibiotic may be effective only temporarily.
6. Low concentrations of an antibiotic, which may retard bacterial growth only slightly, will eventually select for strains that resist the slight retardation.
7. Mutations that confer still higher levels of resistance arise in such partially adapted strains more often than in the original nonresistant strain.
8. Resistance to one antibiotic may confer resistance to another, especially if the two are chemically related.
9. Finally, the disadvantage of resistant strains in the absence of an antibiotic is gradually lost by further evolutionary changes, so that resistance can prevail even where no antibiotics have been used for a long time.
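To see how points 3 and 4 play out, here is a minimal sketch in Python, our own toy model rather than anything from the studies above; the fitness values and population parameters are illustrative assumptions. A resistant strain starting at one cell in a million sweeps to fixation while the drug is present, then declines only slowly once the drug is withdrawn and resistance becomes a slight handicap.

```python
# Toy two-strain selection model. The fitness values are illustrative
# assumptions, not measurements from any particular study.

def resistant_frequency(generations, p0, w_resistant, w_ancestral):
    """Frequency of the resistant strain after repeated rounds of selection."""
    p = p0
    for _ in range(generations):
        mean_fitness = p * w_resistant + (1 - p) * w_ancestral
        p = p * w_resistant / mean_fitness
    return p

# Antibiotic present: the initially rare mutant is strongly favored (point 3).
p_with_drug = resistant_frequency(200, p0=1e-6,
                                  w_resistant=1.10, w_ancestral=0.90)

# Antibiotic removed: resistance now carries a slight cost (point 4).
p_after_withdrawal = resistant_frequency(5000, p0=p_with_drug,
                                         w_resistant=0.99, w_ancestral=1.00)

print(f"Resistant fraction after 200 generations on the drug: {p_with_drug:.6f}")
print(f"After 5,000 further generations off the drug:         {p_after_withdrawal:.6f}")
```

Point 9 is the caveat to this tidy reversal: in real populations, further mutations can erase the cost of resistance, so the slow decline the model predicts may never occur.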
The implications of these findings for medical practice are now widely appreciated. If one antibiotic doesn’t alleviate your disease, it may be better to try another, instead of increasing the dose of the first. Avoid long-term exposure to antibiotics; taking a daily penicillin pill to ward off infection is accepted therapy for some conditions, such as infection of vulnerable heart valves, but has the incidental effect of selecting for resistant strains. Unfortunately, we may often be exposed to this side effect without knowing it, by consuming
meat or eggs or milk from animals routinely dosed with antibiotics. This is a hazard that has recently provoked conflict between food producers and public health activists. The problem of antibiotic use in farm animals needs to be more widely recognized and carefully evaluated in relation to whatever economic gains may be claimed. As Harold Neu, professor of medicine at Columbia University, says in concluding his 1992 article “The Crisis in Antibiotic Resistance,” “The responsibility of reducing resistance lies with the physician who uses antimicrobial agents and with patients who demand antibiotics when the illness is viral and when antibiotics are not indicated. It is also critical for the pharmaceutical industry not to promote inappropriate use of antibiotics for humans or for animals because this selective pressure has been what has brought us to this crisis.” Such advice is unlikely to be heeded. As Matt Ridley and Bobbi Low point out in a recent article in
The Atlantic Monthly, moral exhortations for the good of the many are often welcomed but rarely acted upon. To get people to cooperate for the good of the whole requires sanctions that make lack of cooperation expensive.
Viruses don’t have the same kind of metabolic machinery as bacteria and are not controllable by fungal antibiotics, but there are drugs that can combat them. An important recent example is zidovudine (AZT), used to delay the onset of AIDS in HIV-infected individuals. Unfortunately, AZT, like antibiotics, is not as reliable as it once was because some HIV strains are now (no surprise) resistant to AZT. HIV is a retrovirus, a really minimal sort of organism with special limitations and special strengths. It has no DNA of its own. Its minute RNA code acts by slowly subverting the DNA-replicating machinery of the host to make copies of itself. The cells it exploits include those of the immune system. The virus can hide inside these cells, where it is largely invulnerable to the host’s antibodies.
A retrovirus’s lack of self-contained proliferation machinery is both its weakness and its strength. It reproduces and evolves more slowly than DNA viruses or bacteria. Another weakness is its low level of reproductive precision, which means that it produces an appreciable number of defective copies of itself. This functional weakness can be an evolutionary strength, however, because some of the defective copies may be better at evading the host’s immune system or antiviral drugs. Another strength of retroviruses is their lack of any easily exploited Achilles’ heel in their simple makeup.
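That “low level of reproductive precision” is worth putting in rough numbers. Below is a back-of-the-envelope sketch using figures of the sort commonly quoted for HIV (an error rate near three in a hundred thousand bases copied, a genome near ten thousand bases); both values are assumptions for illustration, but the conclusion is robust: an appreciable fraction of new virus copies carry at least one change.

```python
import math

# Back-of-the-envelope estimate of how often reverse transcription produces
# a variant copy. Both parameter values are illustrative assumptions.
error_rate_per_base = 3e-5   # assumed reverse-transcriptase error rate
genome_length = 9700         # approximate HIV genome length in bases

expected_mutations = error_rate_per_base * genome_length

# Poisson approximation: chance that a given copy carries one or more changes.
p_at_least_one = 1 - math.exp(-expected_mutations)

print(f"Expected mutations per copy: {expected_mutations:.2f}")
print(f"Copies carrying at least one mutation: {p_at_least_one:.0%}")
```

Roughly a quarter of copies altered in each round of replication, multiplied over the enormous numbers of virions an infection produces, supplies ample raw material for the within-host selection described next.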
It takes months or years for HIV to evolve resistance to AZT, in marked contrast to the few weeks it takes bacteria to evolve significant levels of resistance to some antibiotics. Unfortunately, HIV has a long time to evolve in any given host. A single infection, after years of replication, mutation, and selection, can result in a diverse mixture of competing strains of the virus within a single host. The predominant strains will be those best able to cope with whatever difficulties must be overcome (e.g., AZT or another drug). They will be the ones that most rapidly divert host resources to their own use—in other words, the most virulent.
The evolution of virulence is a widely misunderstood process. Conventional wisdom has it that parasites should always be evolving toward reduced virulence. The reasoning assumes, correctly, that the longer the host lives, the longer the parasites can live and the longer they can disperse offspring to new hosts. Any damage to the host on which they depend will ultimately damage all dependent parasites, and the most successful parasites should be those that help the host in some way. The expected evolutionary sequence starts with a virulent parasite that becomes steadily more benign until finally it may become an important aid to the host’s survival.
There are several things wrong with this seemingly reasonable argument. For example, it ignores a pathogen’s ultimate requirement of dispersing offspring to new hosts. This dispersal, as noted in the previous chapter, frequently makes use of host defenses, such as coughing and sneezing, that are activated only as a result of appreciable virulence. A rhinovirus that does not stimulate the host to defend itself with abundant secretion of mucus and sneezing is unlikely to reach new hosts.
Another error in the traditional view is the assumption that evolution is a slow process not only on a time scale of generations, but also in absolute time. Such a belief arises from a failure to appreciate the capacity for rapid evolution of any parasite that will go through hundreds or thousands of generations in one host’s lifetime. If the
virulence of the amoeba that causes dysentery is too low or too high for maximizing its fitness, the virulence can be expected to evolve quickly toward whatever level is currently ideal. We should not expect the present virulence of any pathogen to be in transit from one level to another unless conditions have changed recently. By “recently,” we mean last week or last month, not the last ice age, which is what an evolutionary biologist often means by “recently.”
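The difference between generational and absolute time is easy to quantify. A minimal calculation with round-number assumptions (a bacterium dividing every half hour, a human generation of twenty-five years) shows what a month of infection means to a fast-dividing pathogen:

```python
# Generations available to a fast-dividing pathogen during one month of
# infection, using round-number assumptions for the generation times.
hours_in_month = 30 * 24
bacterial_generation_hours = 0.5   # assumed: one division every 30 minutes
human_generation_years = 25        # assumed human generation time

bacterial_generations = hours_in_month / bacterial_generation_hours
equivalent_human_years = bacterial_generations * human_generation_years

print(f"Pathogen generations in one month: {bacterial_generations:.0f}")
print(f"Same generational span for humans: {equivalent_human_years:,.0f} years")
```

Under these assumptions, a single month-long infection gives the pathogen as much generational room as thirty-six thousand years of human evolution, which is why “recently” here means last week, not the last ice age.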
Yet another flaw in the conventional wisdom is its neglect of selection among different parasites within hosts, as we just implied in our discussion of HIV. What good would it do a liver fluke to restrain itself so as not to harm the host if that host is about to die of shigellosis? The fluke and the Shigella are competing for the same pool of resources within the host, and the one that most ruthlessly exploits that pool will be the winner. Likewise, if there is more than one Shigella strain, the one that most effectively converts the host’s resources to its own use will disperse the most progeny before the host dies. As a rule, all else being equal, such within-host selection favors increased virulence, while between-host selection acts to decrease it. A recent comparative study of eleven species of fig wasps and their parasites confirmed that increased opportunities for parasite transmission are associated with increased parasite virulence.
As with many other applications of evolutionary theory, careful quantitative reasoning is needed to understand the balance between natural selection within and between hosts. The graph below (Figure 4–1) is a naive representation of what we have in mind.
An adequate theory of the evolution of virulence must take into account the rate of establishment, in a given host, of new infections; the extent to which these competing pathogens differ in virulence; the rate of origin of new strains by mutation within a host; and the extent to which these new strains differ in virulence. From such considerations it should be possible to infer the expected levels of virulence for a given pathogen, assuming that conditions stay the same, which they never really do. The most important changes would be those that alter the means by which a pathogen reaches new hosts. If dispersal depends not only on a host’s survival but also on its mobility, any damage to the host is especially harmful to the pathogen. If you are so sick from a cold that you stay home in bed, you are unlikely to come into contact with many people that your virus might infect. If you feel well enough to be up and about, you may be able to disperse it far and wide. It is very much in a cold virus’s interest to avoid making you
really sick. By contrast, the malaria agent Plasmodium gets no benefit from the host’s feeling well. In fact, as shown by experiments with rabbits and mice, a prostrate host is more vulnerable to mosquitoes. People in the throes of a malarial attack are not likely to expend much effort warding off insects. Mosquitoes can feast on them at leisure and spread the disease far and wide.
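The trade-off depicted in Figure 4–1 can be made concrete with a toy model; this is entirely our illustration, and both the functional forms and the parameter values are assumptions. Suppose that exploiting the host harder raises the rate of dispersal, but with diminishing returns, while the same exploitation adds to the host’s death rate and so shortens the infectious period. Total productivity, the rate times the duration, or the area under the production curve, then peaks at an intermediate virulence:

```python
# Toy virulence trade-off model; the functional forms and parameter values
# are illustrative assumptions.

def transmission_rate(virulence):
    # Assumed: dispersal grows with host exploitation, but with
    # diminishing returns.
    return virulence ** 0.5

def infectious_period(virulence, background_mortality=0.1):
    # Assumed: virulence adds to the host's death rate, cutting the time
    # available for transmission.
    return 1.0 / (background_mortality + virulence)

def total_productivity(virulence):
    # Rate of dispersal times its duration: the area under the
    # production curve.
    return transmission_rate(virulence) * infectious_period(virulence)

# Grid search for the virulence that maximizes lifetime transmission.
candidates = [i / 1000 for i in range(1, 2001)]
best = max(candidates, key=total_productivity)
print(f"Productivity-maximizing virulence: {best:.2f}")  # ~0.10 with these numbers
```

Within-host competition pushes virulence above this optimum (strategy A in the figure), while transmission between hosts pulls it back toward the peak (strategy B). Change the assumed shape of transmission_rate and the peak moves: if a prostrate host can still transmit, as with vector-borne or waterborne pathogens, high virulence costs little and the favored level rises sharply.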
FIGURE 4–1. SELECTION WITHIN AND BETWEEN HOSTS. A shows the effects of an extremely virulent pathogen, which would be favored by natural selection within a host. It exploits its host to maximize the current rate of dispersal of new individuals to new hosts. It may kill the host quickly, but while the host lives it does better than any competing pathogen. B shows the effects of a pathogen that is favored by selection between pathogen communities of different hosts. It maximizes its long-term total productivity (rate of reproduction times duration, graphically the area under the production curve). Host death in B is most likely from something other than the pathogen.
This evolutionary perspective suggests that diseases spread by personal contact should generally be less virulent than those conveyed by insects or other vectors. Do the facts fit this expectation? They do indeed. Among Paul Ewald’s important discoveries is the truth of this generalization and its importance for public health. He has shown that diseases from vector-borne pathogens tend to be more severe than those spread by personal contact and that mosquito-borne infections are generally mild in the mosquito and severe in vertebrate hosts. This is to be expected because any harm to the
mosquito would make it less likely to bite another vertebrate. For gastrointestinal pathogens, the death rate is lower for direct, as compared to waterborne, transmission, as long as really sick hosts can effectively contaminate the water supply. As pure water became the norm in the United States early in this century, the deadly
Shigella dysenteriae was displaced by the less virulent Shigella flexneri. As water was purified in South Asia during the middle of the century, the lethal form of cholera was steadily displaced by a more benign form, and the transition took place earliest at the places where water was first purified.