THAT IMAGE made people who knew Gettler laugh. He liked viciously competitive bowling, the nosebleed seats at baseball games, and any and all horse races, as long as he could place a bet. He called his bookie from his office every week. He talked Alice into vacationing at the Saratoga racetrack. He never could resist a good card game, playing weekly with fellow gamblers from the office. His wife joked that the real reason he had moved into her family’s house was so that she’d have company when he was out with the boys.
Gettler brought that competitive streak directly into the laboratory. He hated to give up on a chemical analysis, and he positively hated the idea that some poisoner off the street could outwit him. He was such a good chemist that Bellevue had refused to turn him loose from its pathology work. Further, he genuinely enjoyed the teaching work he was doing for NYU. So if Norris wanted Gettler to work for him—and he definitely did—he had to settle for a joint appointment.
It would be a challenge, Norris warned his new hire. No other city in the United States had a dedicated toxicology laboratory. Gettler would have to design the lab from scratch and invent a methodology for the New York office. There were no training programs in forensic toxicology; there were astonishingly few books on the subject, most of them based on European research.
In the days of coroners like Patrick Riordan, high-profile poisoning cases in New York had been sent to chemistry professors at the city’s well-respected universities. And they had often done good work. But as Gettler noted, those chemists “had but little experience in toxicological analyses of human organs” and little time to spend on the work itself, usually squeezing it in around other demands. In one case, a professor had taken twenty months to analyze some organs (during which time he had taken a three-month vacation to Europe).
In other words, the job—uncharted and overwhelming—was perfect for Gettler, a man who liked to cram as much work as possible into his hours, loved to juggle multiple challenges, and believed passionately in the power of chemistry. He took a straightforward approach. If a test didn’t exist, he would invent it. If research methods didn’t exist, he would develop them himself. If a new poison or drug came on the market, he went off to a butcher shop, just around the corner from his Brooklyn home, and bought three pounds of liver. He would arrive at Bellevue carrying his twine-wrapped and bloodstained parcel under his arm. In his laboratory, he would slice the liver into neat segments, inject each with a different drug, and grind up the organ pieces. He would then experiment with different ways of extracting the injected chemicals from the tissue, looking for ways to detect ever smaller and smaller amounts of each poison.
It was a rough beginning, a bloody one, and a messy one, but he had to start somewhere.
GETTLER USED his liver test mostly on newly invented poisons, the ones that he anticipated could someday be a problem. For the familiar, everyday toxins, he could often obtain a body for study instead. In the case of wood alcohol, for example, Gettler could now, unfortunately, count on a steady supply of cadavers from the morgue.
Wood alcohol—technically known as methyl alcohol, but also as wood spirit, hydroxymethane, carbinol, colonial spirit, Columbian spirit, and, some years later, methanol—was in itself nothing new. The ancient Egyptians had used it in their embalming processes. For centuries it had been the essential ingredient in homemade whiskey. Its chemical formula had been identified in 1661 by a chemist who called it “spirit of box” because he’d made it by distilling boxwood. The term methyl was derived from the Greek methy (meaning wine) and hyle (meaning wood, or, more precisely, patch of trees).
The chemical structure of wood alcohol is simple: three hydrogen atoms bonded to a single carbon atom (in a cluster known as a methyl group), with one oxygen atom and another hydrogen atom tagging along. It is also simple to make, as industrialists and moonshiners had realized, requiring little more than wood and heat. The process was called destructive distillation. Slabs and slices of wood went into a closed container and were heated to at least 400 degrees Fahrenheit (204 degrees Celsius). As the wood cooked into charcoal, its natural liquids vaporized. That vapor could be cooled, condensed, and distilled into a rather murky soup containing methyl alcohol, acetone, and acetic acid. A second distilling would separate out the pure methyl alcohol, a liquid as clear as glass and as odorless as ice, from the other ingredients.
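In chemical shorthand, the structure and the distillation described above might be sketched roughly as follows (a schematic, not a balanced reaction; the crude vapor contains other minor components as well):

```latex
% Methanol: three hydrogens on one carbon (the methyl group, CH3),
% with a hydroxyl group (OH) attached
\mathrm{CH_3{-}OH}

% Destructive distillation (schematic, not balanced):
\mathrm{wood} \xrightarrow{\;\geq 400^{\circ}\mathrm{F}\;}
\mathrm{charcoal}
+ \underbrace{\mathrm{CH_3OH}}_{\text{methyl alcohol}}
+ \underbrace{\mathrm{(CH_3)_2CO}}_{\text{acetone}}
+ \underbrace{\mathrm{CH_3COOH}}_{\text{acetic acid}}
```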
By the end of the nineteenth century, manufacturing plants called wood factories were clustered along the East Coast, more than forty in Pennsylvania alone. The factories burned thousands of trees a year, mostly birch, beech, maple, oak, elm, and alder, to fill a near-endless appetite for charcoal and wood by-products. Methyl alcohol could be used as a solvent, to make varnish, as an ingredient in dyes, and as a fuel. Some countries, such as England and Germany, prohibited its use in domestic products, considering it too risky. But the United States allowed it into a host of household materials, including essence of peppermint, lemon extract, cologne, aftershaves such as citrus-scented Florida water or bay rum, and liniments such as witch hazel.
It was also used to “denature” grain alcohol, which essentially meant changing drinkable spirits into a lethal industrial product. Methyl alcohol was so poisonous—an estimated two tablespoons could kill a child—that one needed only a little to turn drinking alcohol into a toxic substance. The resulting “industrial” alcohol was exempt from government liquor taxes but was also required by law to be labeled as a poison.
Long before Gettler arrived as city toxicologist, wood alcohol had achieved a sinister reputation. It was mountain alcohol, cheap street alcohol, and everyone knew that it could, and did, kill its share of unlucky drinkers. But at this particular moment Gettler suspected—no, he was positive—that the reputation of wood alcohol was destined to get worse.
“THE PROHIBITION by our government of the manufacture of distilled liquors will unquestionably lead to much ‘moonshining,’ adulteration and dilution of liquors offered to the public,” he predicted in a 1918 article in the Journal of the American Medical Association.
Prohibition hovered just a legislative breath away when Gettler made his report. The Eighteenth Amendment to the U.S. Constitution, which would prohibit the manufacture, sale, transportation, import, and export of “intoxicating liquors,” had passed both houses of Congress in December 1917. A full thirty-six states were needed to ratify the amendment to make it law. Within a year fifteen states approved the measure, and the Ohio-based Anti-Saloon League and the Women’s Christian Temperance Union were demanding that the rest of the country fall into line.
The Anti-Saloon League, the most powerful of those lobbying groups, had sent its best political organizer to New York City, which it liked to call “the liquor center of America.” The league routinely cited federal records showing that in one week city residents consumed some 75,000 quarts of gin, 76,000 quarts of brandy, 500 quarts of absinthe, and more than 500,000 quarts of beer, wine, and miscellaneous spirits. That unrestricted guzzling of alcohol was about to stop, organizer William Anderson promised. “I’m willing to work night and day,” he said. “We won’t admit defeat.”
With such promises—or threats—occurring daily, Gettler suspected that the city’s equally dedicated drinkers were already bracing against the disappearance of legal alcohol. They were building secret stills and stockpiling supplies. “That such is the case even at this early period is quite evident from the recent poisoning in this city of over thirty persons, six of whom died,” he wrote. The whiskey that killed them contained “a considerable amount of wood alcohol.”
Wood alcohol came wonderfully cheap—a few cents a glass. It could be distilled out of discarded wood chips, sawdust, lumber scrap, bits and pieces of dead plants—and it tasted just fine going down. “The refined wood alcohol tastes like ethyl [grain] alcohol and, moreover is considerably cheaper,” Gettler wrote, “hence the adulterator buys the latter, ignorant that severe poisoning, blindness and often death lurk within it.”
Why is one alcohol so poisonous when another—spirits made from grain—is so much safer? It has to do with the way wood alcohol’s chemistry interacts with human metabolic processes. As the body’s enzymes break apart the carbon, hydrogen, and oxygen that form the alcohol, those atoms form new and more dangerous breakdown products. The deadly chemical detritus consists mostly of formaldehyde and formic acid. Methyl alcohol is toxic in its own right, as Gettler noted, but formic acid is at least six times more deadly. Further, methyl alcohol metabolizes comparatively slowly, lingering in the body. The conversion to the “more dangerous poisons” can take up to five days, meaning that the wood alcohol drinker can stew in an increasingly lethal cocktail for the better part of a week.
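The breakdown pathway described above can be written out in modern chemical notation (the enzyme names are the standard modern ones, not given in the original account):

```latex
% Metabolic breakdown of methanol (schematic):
% methanol -> formaldehyde -> formic acid
\mathrm{CH_3OH}
\;\xrightarrow{\text{alcohol dehydrogenase}}\;
\underbrace{\mathrm{HCHO}}_{\text{formaldehyde}}
\;\xrightarrow{\text{aldehyde dehydrogenase}}\;
\underbrace{\mathrm{HCOOH}}_{\text{formic acid}}
```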
Gettler wanted doctors to be aware of, and to watch for, the pattern of a wood alcohol death: a sudden sense of weakness, severe abdominal pain and vomiting, blindness, a slip into unconsciousness, heart failure. He wanted them to brace for the onset of Prohibition, which ensured new and more dangerous alcohol issues. Both he and Norris anticipated a dramatic rise in wood alcohol deaths.
Neither man was persuaded by the ideals that supporters of the new amendment expressed. Neither believed that the prohibition of legal alcohol was likely to create a society that suddenly rejected the pleasures of beer, wine, and cocktails. Rather, “we feel it timely to warn physicians, coroners and health officers in order that they may be on their guard” and anticipate an epidemic of wood alcohol poisoning.
Gettler’s message to doctors was simple: This is poison. Warn everyone. Do it now.
POISON WAS already in the air that spring of 1918, both figuratively and literally. It tracked through the newspaper headlines in a litany of horror stories—dying children, blinded villagers, dead and disfigured soldiers.
The United States had declared war on Germany in April 1917, joining the Great War. The first American soldiers arrived in France that June; some two months later, Germany launched the use of the war’s most devastating poison gas, bombarding battlefields with bright yellow shells loaded with mustard gas. In November the British captured a stockpile of enemy mustard gas shells (Germany produced more than ten thousand tons of the gas during the war) and returned the attack, bombarding the Kaiser’s troops with their own poison bombs.
Mustard gas derives from a thick, oily liquid that takes its color from the sulfur it contains. Mixed with an explosive variety of other chemicals, the liquid—called sulfur mustard—fragments into a brownish-yellow aerosol mist. The “mustard” itself is the point, though; the concentrated sulfur it contains mixes with other ingredients to become a ferocious form of sulfuric acid. Known technically as a vesicant, or blistering agent, it burns on contact, through uniforms, through leather, and through skin, raising a thick layer of oozing yellow blisters, searing the eyes into crusted blindness. If inhaled, mustard gas plasters bloody blisters across the lining of the lungs, making breathing a rasping, painful misery. German military strategists considered it a disabling agent rather than a killer. It was rarely instantly lethal (although scientists would later find it to be a potent carcinogen) but always excruciatingly painful.
Medical personnel, as well as soldiers in their letters home, described the effects of poison gases. “I wish people who talk about going on with this war whatever the cost could see the soldiers suffering from mustard gas poisoning,” wrote one nurse, who told of teenage boys strapped down to their beds, fighting for breath, their voices burned away to a hoarse whisper, praying to die.
Although traditional weapons killed far more people in the Great War, poison gas gave a new nightmare edge to the fighting. “The chemists’ war,” some people nicknamed it, as Germany experimented with other gases, releasing lethal greenish clouds of chlorine; the French introduced phosgene, which combined chlorine and carbon monoxide; and the Americans developed Lewisite—an ugly combination of chlorine and arsenic.
By the summer of 1918 the United States was also manufacturing mustard gas—“the deadliest instrument of warfare yet devised,” the New York newspapers called it. The chief army chemist explained that it was the most useful of the gases, because unlike phosgene it didn’t break down in sunlight, and unlike Lewisite it was stable when wet. The Germans, it turned out, had first considered cyanide gas, but they’d decided it dispersed too easily in air. That disintegration made its effects mild compared to those of the heavier, oily droplets of mustard gas, which settled in a poisonous blanket over the trenches where soldiers had sought protection.
The United States had hesitated at first to use mustard gas, the military said, because it seemed somehow more akin to torture than other weapons. It might seem peculiar to say that, when men and boys were so routinely blown to bits on the battlefield. But poison seemed a different kind of evil, insidious and cowardly, without the redeeming heroics of combat.
ON THE HOME FRONT, New York City—like much of the world in those days—was suffering from another murderous by-product of the war. It could be seen in the new look of the streets, patterned by a blizzard of white masks. Letter carriers, transit workers, train passengers, office and factory workers were wearing, even required to wear, protective masks, not against gas but against the so-called Spanish flu, which seemed to be blowing in from the battlefields.