In Meat We Trust
Maureen Ogle

Fragmentation plus convenience fueled a self-perpetuating cycle: the easier it was to put dinner on the table without cooking, the less relevant cooking skills became. Kids who grew up in homes where no one cooked became adults who didn’t know how to cook and relied on manufacturers, grocery stores, and microwave ovens to do it for them. Nor, it’s worth noting, did economic upheaval derail the long-term trend. In the mid-1970s, even amid inflation and unemployment, a manufacturer of plastic packaging materials was delighted by soaring sales—delighted, but puzzled—and conducted a study to determine what drove its good fortune. The answer: grocery stores were installing “deli” departments to meet the demands of “young and leisure-oriented shoppers” (read: young adult baby boomers) who subsisted on prepared foods like fried chicken, macaroni and cheese, and presliced meats and cheeses. Hence the demand for take-it-home packaging. “The supermarkets are crying for anything new that will stop people from going out to eat,” mused a Tyson executive in 1979. The company embraced the new niches and dumped millions into the “precooked frozen” market, moving beyond conventional TV dinners with their tinfoil compartments of sliced chicken and pasty mashed potatoes into chicken-based hot dogs, corn dogs, and bologna; packaged, presliced chicken; chicken and turkey “ham”; boneless turkey breasts; chicken patties and steaks; and frozen, ready-to-cook chicken Kiev and prefried chicken that only needed to be heated before eating. “I think my mother could cook it better,” Don Tyson confided to a reporter who asked about the fried chicken, “but I’m not sure my wife could.” Nor did it matter in an era when convenience trumped taste: “People who eat precooked frozen today are not as fussy as the previous generation,” he added, and predicted that “succeeding generations” would prove even “less discriminating.”

What’s most remarkable is how little Americans spent to satisfy their desires. From 1960 to 1990, the cost of food fell by a third; even during the inflation-dogged 1970s, and the passion for eating away from home notwithstanding, Americans spent a minuscule amount on food. In the early 1990s, on average, consumers paid out just 11 percent of their disposable income to feed themselves. Obviously many households spent more. People earning less than $10,000 a year, for example, devoted about 35 percent of their income to food. But even those in what was then the lower range of the middle class—households with incomes of $20,000 to $30,000—spent only about 17 percent; the wealthiest spent less than 9 percent. And of course that was good for the economy: people had money left to buy other consumer goods.

But more than demographics roiled the culinary landscape. By the late 1970s, Keys’s fat-is-bad theory had become gospel, and the nation’s medical experts urged the public to cut back on fat and cholesterol, a message many Americans interpreted as “Don’t eat beef and pork.” That view got a federal stamp of approval in 1977 when a Senate committee chaired by George McGovern of South Dakota issued a report recommending that everyone eat more poultry and fish and reduce their intake of “meat,” by which it meant pork and beef. The report also documented the extent to which the political establishment had embraced consumer activism and Naderist ideas, complete, it must be said, with an establishment-like dollop of hypocrisy. Naderites had long criticized government agencies for relying on information and advice from industry insiders trying to protect their turf, but when it suited their cause, they did precisely the same. In this case, Nick Mottern, who wrote the McGovern report, was a Nader acolyte eager to challenge the powerful meat lobby. Mottern relied on expertise provided by D. M. Hegsted, a Harvard professor who endorsed and admired the work of Ancel Keys. The staffers who assisted Mottern gathered information primarily from newspaper and magazine coverage that affirmed the view the committee wanted to promote, namely, the Keysian version of the relation between diet, fat, and good health. (“We really were totally naive,” a staff member later conceded.) The press conference to introduce the committee’s final findings was a masterpiece of glib assertion. Senator McGovern summarized the document’s largely unsupported claims about the relationship between diet and health and then introduced experts who espoused still more assertions as if those were fact, all of which reporters dutifully recorded and passed on to the public.

Beef and pork producers as well as meatpackers took offense at the document because it specifically indicted meat (as a man involved in the fracas put it, all “hell broke loose”), and McGovern released a revised version that avoided that word. But opposition to the report came from more than just the meat industry. In 1980, for example, the National Academy of Sciences published a study that challenged the heart-healthy mantra. Consumer advocates denounced the academy’s findings as biased because one of the report’s authors had once worked as a consultant for the egg industry. The man pointed out the lunacy of that criticism: during his career he’d received a quarter-million dollars in grants from industry sources, but $10 million from government agencies. How could he be a corporate patsy because of $250,000, but not a government stooge thanks to $10 million? (It’s worth mentioning that Hegsted, the Harvard scholar who tutored Mottern in the “correct” view, devoted his later career to research funded in part by Frito-Lay.)

But the damage was done; in the minds of many Americans, beef and pork had become public enemies one and two, and the bad news kept coming. In the early 1980s, several widespread, and widely reported, disease outbreaks were traced back to beef tainted with a newly discovered and exceptionally virulent form of an otherwise common bacterium, Escherichia coli O157:H7. After investigators tracked one of the episodes back to a South Dakota cattle herd, they concluded that feeding antibiotics to livestock had potentially fatal consequences for humans. There could no longer be any doubt, argued the researchers, that “antimicrobial-resistant organisms of animal origin cause serious human illness.” No doubt in their minds, but plenty in other people’s. In the wake of the findings, a consumer advocacy group petitioned the FDA to ban drug additives in livestock feed, but a hearing on the request ended like every other discussion of the subject: it raised more questions than it answered, and the scientists’ seemingly irrefutable evidence proved to be both debatable and refutable.

Between lethal bacteria on one hand and heart disease on the other, beef and pork consumption plunged. A 1983 consumer poll documented pork’s woes. Those surveyed complained that pork contained too much salt, cholesterol, fat, and calories. Forty-five percent said they’d cut back on fresh pork for “health reasons,” and nearly a quarter said they’d reduced their consumption of all pork products, fresh or processed. Even McDonald’s, the wizard of food, couldn’t work its magic on pork. In the summer of 1980, the company began testing a “McRib” sandwich, rib-shaped slabs of ground and chopped pork slathered with barbecue sauce. The pork industry salivated at the potential of this new menu item, but the McRib proved a no-go; the company pulled it from the menu in 1983. Part of the problem lay in preference: Kansas City–style barbecue sauce leans toward sweet, and North Carolina’s toward tart; McDonald’s one-taste-suits-all could not overcome those regional differences. The condiment also made for messy eating, a detriment to Americans accustomed to eating on the run and in their cars. But in the end, McDonald’s conceded that the McRib succumbed to consumer resistance: good taste and low price could not overcome pork’s bad reputation. McDonald’s fared better with its Egg McMuffin, which also contained pork, apparently because it suited a consumer niche: when analysts dissected the ten-pound-per-capita drop in pork consumption, they discovered that Americans would eat pork as long as it was processed and convenient—whether as bacon, “lean” microwaveable sausages, or Egg McMuffins.

The toppling of King Beef was more shocking. Per-capita consumption dropped from 131 pounds in 1976 to 105 in 1980 to 97 pounds a decade later. “A story about the beef industry belongs in the obituary column,” mourned Ken Monfort. A Nebraska cattle raiser agreed. “Nobody eats beef anymore,” he mused. “Sometimes I wonder if I would be better off not getting out of bed in the morning.” Cattle feeders pooled their funds to support pro-beef advertising campaigns—“the Mercedes of Meat” and “Somehow, nothing satisfies like beef”—but those did little to bolster the king’s sagging reputation. In desperation, members of the California Cattlemen’s Association petitioned the national cattlemen’s group to end the use of low-level antibiotics. “We thought everybody would always eat beef,” said the California organization’s director, “but it turned out not to be true”; his group’s members reasoned that eliminating antibiotics might persuade some people to come home to beef. The National Cattlemen’s Association refused to go along, but Paul Engler, whose cattle-feeding operation was by then the largest in the world, announced that he would stop using two controversial antibiotics. He didn’t believe that antibiotic-laced feeds were dangerous, he explained, but many consumers did. The “inference” of danger was already out there, he argued, so “why jeopardize the demand for your product?” “By dropping antibiotics,” added a company vice president, “we are trying to teach the public that beef is healthy.”

But Engler’s decision had no effect on beef sales, although how much of the decline was due to fear of fat and calories and how much to potentially lethal bacteria was not clear. A financial analyst warned cattlemen that it was time to accept “the harsh reality that the collapse in consumer taste for beef is permanent.” “It’s a declining industry,” he emphasized, “and the only question is how far it will decline.” Even the president of the National Livestock and Meat Board conceded that the days when meat makers could take consumers for granted were over. “It’s the younger, more highly educated, high-income people who are turning away from beef toward more vegetables and white meat in their diet,” he said. “These are the opinion leaders that are eventually going to be influencing the eating habits of our bread and butter customers.”

It’s not clear what role income and education played in the shift, but he was correct about the ascent of “white meat,” by which he meant chicken. Every report about heart disease, fat, and cholesterol touted the virtues of poultry (and, to a lesser extent, fish) as a healthy alternative to beef and pork. Consumers didn’t need much convincing, in part because chicken consistently cost less than the other two meats, a factor of biology: cattle and hogs needed months of expensive grains to reach market weight, but a broiler was table-ready in eight weeks or less. As important, chicken was everywhere shoppers and diners wanted to be. From its inception, the broiler industry had worked the convenience angle more aggressively than its pork and beef counterparts, and packaged chicken products abounded. Don Tyson hit the broiler jackpot in the early eighties when he won a contract to supply McDonald’s with its newest offering: the Chicken McNugget, which consisted of a bit of chicken, a lot of “filler” and batter, and even more calories and fat. McNuggets were an instant success—and drew instant fire from Michael Jacobson: in a complaint filed with the FTC, Jacobson’s Center for Science in the Public Interest accused the chain of false advertising. McDonald’s described the McNugget’s contents as “delicious chunks of juicy breast and thigh meat,” but Jacobson pointed out that the bites also contained sodium phosphate, chicken skin, and beef fat. Who cared? Want to eat healthy? Eat chicken. Eat a McNugget: a bit of chicken and a lot of calories and fat. All of it added up. In 1960, Americans ate twenty-eight pounds of chicken per capita; by 1970, that had risen to forty. In 1980, they put away forty-eight pounds, and in 1987, broiler makers squawked with delight as poultry toppled King Beef. It’s no accident that in the late eighties, pork producers adopted an ad campaign that touted pork as “the other white meat.”

 

But even broiler producers couldn’t take their market for granted, not in an era of media bloat and heightened consumer awareness. The final decade of the century dished up plenty of evidence that when it came to meat, whether beef, pork, or poultry, whether on the table or on the farm, something had gone wrong.

In 1993, scores of people became ill, and some died, in a food-poisoning episode traced to undercooked hamburgers purchased at Jack in the Box, a northwestern fast-food chain. The culprit proved to be the same one scientists had linked to bacterial resistance a decade earlier: E. coli O157:H7. The tiny organism became a household name after the Jack in the Box incident, and the tragedy highlighted the flaws in a food safety system designed for a premicrobial era. Federal meat inspection dated back to 1906, when inspectors were trained to look for diseased livestock, not diseased meat. They were right to do so: back then, epizootic diseases routinely ravaged poultry flocks and cattle and hog herds, and the USDA poured money into researching and eradicating those scourges. That work proved so successful that by the 1950s, many once-common vaccinations were no longer necessary. Indeed, researchers theorized that E. coli O157:H7 had flourished because cattle ranchers and feeders had reduced or eliminated once-routine vaccinations. But meat inspection procedures had not kept pace with science. Federal rules allowed inspectors to condemn foods that were “so infected” that eating them might “give rise to food poisoning,” but the men and women charged with monitoring slaughterhouse output had few tools for identifying microorganisms. A 1974 court ruling made that even more difficult: bacteria were not an adulterant, and inspectors were not required to consider their presence when giving a carcass the thumbs-up or -down. But the proliferation of O157:H7 and the Jack in the Box episode amounted to a line in the sand. Critics demanded that meat inspection be overhauled. Strategies that worked back when slaughterhouse lines moved at the pace of a single-load rifle were useless on kill lines that operated at machine-gun pace, and speed, many argued, contributed to cross-contamination that led to tragedies like the one at Jack in the Box.
