The Coming Plague
Laurie Garrett
With every passing year in America's AIDS epidemic the impact upon the nation's poorest urban areas grew more severe. It compounded the effects of other plights—homelessness, drug abuse, alcoholism, high infant mortality, syphilis, gonorrhea, violence—all of which conspired to increase levels of desperation where dreams of urban renewal had once existed.
As the virus found its way into communities of poverty, the burden on urban public hospitals was critical. Unlike Canada and most of Western Europe, the United States had no system of national health care. By 1990 an estimated 37 million Americans were without any form of either public or private health insurance. Too rich to qualify for government-supported health care, which was intended only for the elderly and the indigent, but too poor to purchase private insurance, millions of Americans simply prayed that they wouldn't fall ill. Another 43 million Americans were either chronically uninsured or underinsured, possessing such minimal coverage that the family could be bankrupted by the required deductible and co-payments in the event of serious illness.
Any disease that hit poor urban Americans disproportionately would tax the public hospital system. But AIDS, which was particularly costly and labor-intensive to treat, threatened to be the straw that broke the already weakened back of the system.
“We are fighting a war here,” declared Dr. Emilio Carrillo, president of the New York City Health and Hospitals Corporation, which ran the city's network of public medical facilities. “People are sick and dying from AIDS, tuberculosis is rampant, malnutrition, drug addiction, and other diseases resulting from poverty are also at epidemic levels, while at every level of government, city, state, and federal, the health care system is facing cutbacks. Only the number of sick people and people in need of basic health care is not being cut back. Among them there have been no reductions, no downsizing. They are still coming in to us for treatment.”
A 1990 survey of 100 of the nation's largest public hospitals (conducted by the National Association of Public Hospitals) revealed worsening situations in all American cities and predicted collapse of the “public safety net” offered by the system. A microbe that had emerged in America only a decade earlier was threatening to topple the system.
By 1987, 3 percent of the women giving birth in hospitals in New York City were HIV-positive, as were some 25 percent of their babies, according to the U.S. Public Health Service. Nearly two-thirds of those births took place in public hospitals located in largely African-American or Hispanic neighborhoods of Brooklyn and the Bronx. The following year the state of New York concluded that one out of every 61 babies born in the state was infected with the virus. But that rate varied radically by neighborhood: in posh, semi-rural communities located far from New York City fewer than one out of every 749 babies was born HIV-positive in 1988, while in desperately poor neighborhoods of the South Bronx one out of every 43 newborns, or 2.34 percent, was infected—and every one of them was born in a public hospital.
Those numbers could only be expected to worsen as the epidemic's demographics shifted into younger, predominantly heterosexual population groups.
A significant percentage of the nation's HIV-positive population was also homeless, living on the streets of American cities. A 1991 study of homeless men and women in San Francisco, led by Andrew Moss, found that 3 percent of those who had no identifiable risk factors for HIV exposure were infected. Another 8 percent of the homeless were HIV-positive due to injecting drug use, prostitution, or sex with an infected individual. Overall, more than one out of every ten homeless adults in San Francisco carried the virus.
HIV wasn't the only microbe that was exploiting opportunities in America's urban poor population: hepatitis B (which by 1992 was responsible for 30 percent of all sexually transmitted disease in America), syphilis, gonorrhea, and chancroid were all appearing less commonly in Caucasian gay men and with alarming, escalating frequency in the heterosexual urban poor, particularly those who used crack cocaine or heroin. By 1990 two-thirds of New York State's syphilis cases, for example, were African-Americans residing in key areas of poverty, and within that population male and female infection rates were equal.
In 1993 the New York City Health Department announced that life expectancy for men in the city had declined, for the first time since World War II, from a 1981 level of 68.9 years to a 1991 level of 68.6 years. This occurred even though outside New York City life expectancies for men in the state had risen during that time from 71.5 years to 73.4 years. Though rising homicide rates played a role, city officials credited AIDS with the bulk of that downward shift. By 1987 AIDS was already the leading cause of premature death for New York City men of all races and classes; by 1988 it was the number one cause for African-American women as well.
Well before AIDS was claiming significant numbers of Americans, Harlem Hospital chief of surgery Dr. Harold Freeman calculated that men growing up in Bangladesh had a better chance of surviving to their sixty-fifth birthday than did African-American men in Harlem, the Bronx, or Brooklyn. Again, violence played a significant role in the equation, but it alone could not explain why hundreds of thousands of men living in the wealthiest nation on earth were living shorter lives than their counterparts in one of the planet's poorest Third World nations. Average life expectancy for Harlem's African-American men born between 1950 and 1970 was just 49 years. Freeman indicted disease, poverty, and inequitable access to medical care as the primary factors responsible for the alarming death rate among African-American men.
Well before a new tuberculosis epidemic struck several U.S. cities, the warning signs were there for all to see: rising homelessness, fiscal reductions in social services, complacency in the public health sector, rampant drug abuse, and increases in a number of other infectious diseases. The emergence of novel strains of multiply drug-resistant TB came amid a host of clangs, whistles, and bells that should have served as ample warning to humanity. But the warning fell on unhearing ears.
During the Ronald Reagan presidency American fiscal policies favored expansion of the investment and monetary sectors of society and simultaneous contraction of social service sectors. Economist Paul Krugman of the Massachusetts Institute of Technology estimated that 44 percent of all income growth in America between 1979 and 1989 went to the wealthiest 1 percent of the nation's families, or about 800,000 men, women, and children. On the basis of Federal Reserve Board data, Krugman calculated that total wealth (which included far more than the cash income measured above) was more concentrated in the hands of the nation's super-rich than at any time since the 1920s. By 1989, the top 1 percent richest Americans controlled 39 percent of the nation's wealth.
Several studies showed that by the end of 1993 more than 25 million Americans were hungry, consuming inadequate amounts of food. In 1993 one in ten Americans was compelled to stand at least once a week on a breadline, eat in a soup kitchen, or find food through a charitable agency. And the numbers of people living below the federally defined poverty line increased three times faster between 1982 and 1992 than the overall population size. In 1992 some 14.5 percent of all American citizens lived in conditions of legally defined poverty. Most were single mothers and their children.
Though difficult to measure precisely, the numbers of homeless people in America rose steadily between 1975 and 1993, and the demographics of the population shifted from the traditional hard-core group of older male vagrants and alcoholics to a younger, more heterogeneous contingent that included large numbers of military service veterans, chronically institutionalized mental patients, individuals with severe cocaine or heroin habits, and newly unemployed families and individuals. Estimates of the size of the nation's homeless population ranged from about 200,000 to 2,200,000, based on head counts in emergency shelters and a variety of statistical approaches to the problem.
Even more difficult to calculate was the rise in housing density in urban areas. As individuals and whole families faced hardships that could lead to homelessness, they moved in with friends and relatives. One estimate for New York City during the 1980s suggested that 35,000 households were doubled up in public housing, along with 73,000 double-density private households. Assuming each family averaged four members, that could mean that more than 400,000 men, women, and children were packed into double-density housing.
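A rough check of that figure, using only the numbers given above (35,000 doubled-up public-housing households, 73,000 double-density private households, and the assumed average of four members per family):

$$(35{,}000 + 73{,}000)\ \text{households} \times 4\ \text{people per household} \approx 432{,}000\ \text{people},$$

which is consistent with the estimate of more than 400,000 people in double-density housing.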
Finally, a large percentage of the urban poor population cycled annually in and out of the criminal justice system. Young men, in particular, were frequently incarcerated in overcrowded jails and prisons. In 1982 President Ronald Reagan called for a war on drugs: by 1990 more men were in federal prisons on drug charges alone than had comprised the entire 1980 federal prison population for all crimes combined. The pace of federal, state, and county jail construction never came close to matching the needs created by the high arrest rates. As a result, jail cells were overcrowded, and judges often released prisoners after shortened terms, allowing them to return to the community. This, too, would prove advantageous to the microbes.
Some of the microbial impact of this urban Thirdworldization might have been controllable had the U.S. public health system been vigilant. But at all tiers, from the grass roots to the federal level, the system was by the mid-1980s in a very sorry state. Complacent after decades of perceived victories over the microbes, positioned as the runt sibling to curative medicine and fiscally pared to the bone by successive rounds of budget cuts in all layers of government, public health in 1990 was a mere shadow of its former self.
An Institute of Medicine investigation determined that public health and disease control efforts in the United States were in a shambles. Key problems included “a lack of agreement about the public health mission” between various sectors of government and research; a clear failure of public health advocates to participate in “the dynamics of American politics”; lack of cooperation between medicine and public health; inadequate training and leadership; and severe funding deficiencies at all levels.
“In the committee's view,” they wrote, “we have let down our public health guard as a nation and the health of the public is unnecessarily threatened as a result.”
Measles provided an example of public health's disarray that proved painfully embarrassing to officials during the 1980s. In 1963 a safe, effective measles vaccine became widely available in the United States and childhood cases of the sometimes lethal disease declined steadily thereafter. In 1962 half a million children in the United States contracted measles; by 1977 fewer than 35,000 cases were reported annually and many experts forecast that virtual eradication of the disease would soon be achieved.
But problems were already apparent in 1977: many children who had been vaccinated before the age of fourteen or fifteen months went on to develop measles, and researchers soon understood that timing was crucial to achieving effective immunization. Vaccination schedules were adjusted accordingly, executed nationwide with vigor, and the number of measles cases in the country continued to decline. The only serious emergences of the microbe took place in communities where a significant number of parents refused, for religious reasons, to have their children vaccinated.
By the early 1980s the United States had achieved 99 percent primary measles vaccination coverage for young children, and just 1,497 measles cases occurred in the country in 1983.
In 1985, however, a fifteen-year-old girl returned from a trip to England to her Corpus Christi, Texas, home and promptly developed the rash characteristic of measles. The virus quickly spread through her high school and the local junior high school. Ninety-nine percent of the students had, during infancy, received their primary live-measles immunizations; 88 percent had also had their recommended boosters. Nevertheless, fourteen students developed measles.
Blood tests performed during the outbreak on more than 1,800 students revealed that 4.1 percent of the children, despite vaccination, weren't making antibodies against the virus, and the lowest levels of antibody production were among those who hadn't had boosters. All the ailing teens fit that category. The clear message was: (1) primary immunization, in the absence of a booster, was inadequate to guarantee protection against measles; and (2) having even a handful of vulnerable individuals in a group setting was enough to produce a serious outbreak.
The crucial importance of proper timing of vaccination and booster follow-up was further supported by other measles outbreaks among groups of youngsters whose primary vaccination rates exceeded 97 percent.
In 1989 the measles rate in the United States climbed considerably. More than 18,000 cases of measles occurred, producing 41 deaths: a tenfold increase since 1983. Forty percent of the cases involved young people who had received their primary, but not booster, vaccinations; the remainder had had no shots, or their vaccinations were administered at improper times.
