'Big Chicken': The Medical Mystery That Traced Back To Slaughterhouse Workers

In the 1950s, the poultry industry began dunking birds in antibiotic baths. It was supposed to keep meat fresher and healthier. That's not what happened, as Maryn McKenna recounts in her new book.


In the 1950s, the poultry industry began dunking birds in antibiotic baths. It was supposed to keep meat fresher and healthier. That's not what happened, as Maryn McKenna recounts in her new book. Express/Getty Images


Editor's note: In the 1950s, the U.S. poultry industry began adopting a new process: Acronizing. Ads that ran in women's magazines pictured crisp-skinned whole chickens that tasted "fresh," "wholesome" and "country sweet" thanks to a "revolutionary process which helps maintain freshness in perishables" like chicken. In reality, Acronizing referred to the use of antibiotics. Birds were doused in a diluted solution of antibiotics while they were being butchered. The goal was to keep the meat from spoiling, allowing birds to be sold not just days, but weeks after slaughter.

But as Acronizing became widespread, so too did its misuse. Slaughterhouse workers didn't always get training on how to use the antibiotics properly, and even those who did sometimes used far more of the drugs in their solutions than the manufacturers called for. That meant some birds might be getting more antibiotics than could be denatured through the heat of cooking.

As Maryn McKenna writes in her new book, Big Chicken: The Incredible Story of How Antibiotics Created Modern Agriculture and Changed the Way the World Eats, which examines the use of antibiotics in modern agriculture, "it was possible that housewives were unwittingly feeding their families tetracycline-laced fish and chicken. And doctors would soon discover that the people responsible for getting those proteins to dinner tables were being exposed to antibiotics in a manner that no one had accounted for."

Below is an excerpt from Chapter 4 of the book.


Reimert Ravenholt, a physician at the Seattle Department of Public Health, was puzzled. It was the winter of 1956, and for weeks now, local doctors had been calling him, describing blue-collar men coming into their offices with hot, red rashes and swollen boils running up their arms. The men were feverish and in so much pain they had to stay home from work, sometimes for weeks.

The puzzle was not what was afflicting them. That was easy to establish: It was Staphylococcus aureus, or staph, a common cause of skin infections. Ravenholt happened to have a lot of experience with staph. He was the health department's chief of communicable diseases, the person who recognized and tracked down outbreaks, and for the entire previous year, he had been dealing with a staph epidemic in Seattle's hospitals. The organism had infected 1,300 women immediately after they gave birth, and more than 4,000 newborn babies, killing 24 mothers and children. It was a dreadful episode.

The thing that was keeping Ravenholt up at night now was not the cause of this apparent new outbreak: It was the victims. Medicine already knew that staph could spread rapidly through a hospital, carried unknowingly by health care workers as they went from patient to patient. But outside of hospitals, it was equally taken for granted that staph infections occurred individually and by happenstance. Unless there was an explicit health care connection — a shared nurse or doctor, a crib in a nursery shared by many other newborns — there was no reason to suppose two staph cases were linked. The men coming down with the bug, several a month for five months in a row, were not linked by any hospital or doctor, yet they all had the same pattern of lesions in the same places on their arms and hands.

The outbreak looked like a mystery, one that required a detective. Fortunately, Ravenholt was one. He was a graduate of the Epidemic Intelligence Service, an elite training program for epidemiologists—disease detectives—maintained by the CDC. Ravenholt was one of the first graduates of the two-year program, which was designed to create a rapid-reaction force that could deploy across the country. It had begun in 1951, and Ravenholt entered the next year. When Seattle-area doctors began calling him in 1956, fewer than 100 people in the United States had been schooled, as he had been, in what the CDC called "shoe-leather epidemiology": sleuthing the details of disease outbreaks by leaving the office to meet victims, wherever they happened to be.

Thanks to that training, Ravenholt was equipped to recognize the pattern of an outbreak, even though everything that was known about staph indicated that an outbreak with no hospital connection ought not to exist. The 31-year-old physician called the doctors who had seen the men, pored over the medical records, tracked down the patients, and interviewed them all. It did not take long to discover that they were in fact connected. They had not gone to the same hospital — or any hospital, for that matter — but they did share another institution, one that they visited every day: their workplace. They were slaughter workers at a single poultry processing plant.

Ravenholt called the plant's owners. He half-expected that they would refuse to talk to him and was surprised when they said he could come by. When he got there, they told him why they allowed the visit: They were struggling with poor-quality poultry, sold to them by local farms, that showed the same problems processors would complain about to Congress later that year. They wanted it known that they were doing what they could to get out a clean, quality product, and they felt they were being undermined.

They showed him what they were dealing with. Birds that looked healthy turned out, once killed and defeathered, to be riddled with hidden abscesses, pockets of pus layered in their breast muscles. Ravenholt took some of the pus and cultured bacteria from it. The lesions were caused by staph. He told the owners the bacteria in the abscesses were leaking out when the birds were cut apart, contaminating the ice bath where the just-killed chickens were chilled, and getting into nicks and cuts that the knife-wielding workers naturally accumulated over the course of a workday. Well, that was frustrating, the owners said back to him. They had spent a lot of money and invested a lot of time to add a hygienic new process, called Acronizing, that was supposed to prevent bacterial contamination. They had only installed it in May.

May was when the workers' doctors had started calling.

Ravenholt had never heard of Acronizing before, but he instantly perceived the contradiction. If the point of the antibiotic soak was to kill bacteria that cause spoilage, it also should have killed the staph bacteria that were oozing from the meat and infecting the workers. He asked the plant owners for the names of all the farmers who raised the birds that were killed at the slaughterhouse. There were 21 farms, and he wrote letters to all of them, asking them whether there had been disease outbreaks in any of their poultry. Fifteen wrote back, and all of them assured him that their flocks showed no visible signs of illness. Thirteen of the 15 said they were shocked to hear of the problems, because they were taking special steps to keep their poultry healthy. They were dosing their chickens with Aureomycin to prevent them from developing any disease.

The lab tools that were available in 1956 were much cruder than the ones that exist today; it was more difficult and time-consuming then to distinguish between staph strains or demonstrate that a cluster of cases of illness came from a single source. Ravenholt could not prove in a lab that the antibiotic doses, the chickens' lesions, the antibiotic soaks, and the workers' health problems were linked. But he was confident that what happened had proceeded like this: Drugs in the feed had affected bacteria in the birds, habituating them to antibiotics, and the low dose of the same antibiotics in the chilling bath had eliminated all the bacteria except for the ones that had become resistant. Those had survived to infect the workers who were plunging their hands and arms into the contaminated water.

Ravenholt is in his 90s now and still lives in Seattle. More than 60 years later, his memory of his conclusions then is sharp.

"Instead of the old tried-and-true preventatives of contamination, they had switched to these miraculous new drugs that they thought did everything," he told me. "Instead of preventing a problem, it was like putting kerosene on a blaze."

By the time he was done investigating, the problem had spread from one slaughterhouse to several, and fully half the workers in the plants had the same hot, painful abscesses and boils. Even without lab evidence, that was enough to demonstrate that Acronizing was creating a problem. Ravenholt was able to persuade the slaughter plant owners to cease using the antibiotic dunks, and when they stopped, the outbreak did too.

With the outbreak over and other diseases clamoring for his attention, Ravenholt had no reason to keep poking at the issue of the plant workers' illnesses. But the episode nagged at him, and periodically he looped back to the problem of how the men became infected, scrutinizing any blip that suggested farms and slaughterhouses might be conducting illnesses into the city undetected. He conducted a survey of meat cutters in processing plants, asking about lacerations and boils and hospitalizations. The workers he interviewed all told the same story: of skin eruptions that hurt and ached, gave them fevers, kept them away from work, and recurred for years. They believed their problems originated in the meat and fish they were handling. The illnesses had names on the cutting floors, they told him. They were called "pork infection" and "fish poisoning."

Ravenholt thought back to the terrible 1955 hospital outbreak in mothers and babies. He had assumed at the time that the staph ravaging mothers and newborns in Seattle's hospitals had arisen there first and then leaked into the outside world. Now it occurred to him that the bacterial traffic might have gone the other way. Perhaps the virulent staph originated in the meat trade, affected by the antibiotics that the animals consumed while they were living and that they soaked in after they died. Meat cutters were overwhelmingly men, but maybe one of them had brought the bacteria home on his bloody clothing or his soaked boots or in the cuts on his injured hands. Maybe he had passed the bacterium without knowing it to his pregnant wife or girlfriend, and she had carried it innocently into a hospital and sparked an explosion of disease.

It was years later and there was no way to know. And there was not even a ripple of concern yet in the wider world about the possibility of resistant bacteria arising from antibiotic use in food animals. But at the CDC, Ravenholt had learned that diseases could echo in odd ways down the decades of a career; an outbreak that seemed mysterious at the time might eventually be explained by a discovery years later. So he noted his concerns, in case they might be useful in the future. In 1961 he wrote:

The outbreak of boils among workers in a poultry-processing plant ... is the only such outbreak in this community in at least the last 15 years ... That outbreak coincided in time and place with the use of the chlortetracycline process, which was discontinued shortly thereafter ...

These findings suggest that the use of tetracycline in the processing of poultry somehow caused the outbreak ... And if so, that possibly hospital outbreaks ... are in some way, not yet defined, related to the use of tetracycline.

THE OUTBREAK THAT Ravenholt unraveled was a small one, even by the standards of the 1950s, and until he published his description in 1961, it received no attention outside Seattle. But elsewhere in the United States, the problem of disease organisms in food and food workers, and the ways in which antibiotic use might be affecting them, was gathering attention.

The first sign of trouble surfaced in cheese — or rather, in milk that was supposed to become cheese but would not coagulate. The reason was penicillin. Automatic milking machines had recently come on the market, replacing the laborious hand milking that dairy farmers had done for millennia. The suction generated by the machines was tough on cows' udders, bruising them and causing infections. Injecting large doses of penicillin into the teats cured the problem, but the antibiotic lingered in the udder and could contaminate the cow's milk for a while. To prevent any of that penicillin from being consumed accidentally, the FDA required dairymen to throw away any milk that was collected in the first few days after the drugs were injected. (The British government had a similar but looser rule that hinged on how advanced a cow's infection was.) Yet some farmers must have balked at the small profit lost in that discarded milk — because starting in the mid-1950s, penicillin allergies in both countries suddenly became much more common.

This was strange timing, because penicillin had just been made prescription only, precisely because enthusiastic buyers of the drug had sensitized themselves into becoming allergic when it was sold over the counter. With the introduction of prescription penicillin, allergies to the drug should have been decreasing. They were not. Doctors reported adults and, even more, children — who drink more milk than adults — breaking out in the kinds of rashes that previously had affected nurses who handled raw penicillin in the early days. In 1956, the FDA tested milk that it bought in supermarkets across the United States and found that more than 11 percent of the samples contained penicillin; some contained so much that the milk could have been administered as a drug. By 1963, the situation was serious enough for the World Health Organization to flag it in a special report.

Other foods were getting close examination as disease vehicles rather than resistance risks. In March 1964, the CDC summoned physicians, epidemiologists, and federal planners to its headquarters in Atlanta to discuss an urgent trend: Salmonella infections in the United States had increased 20-fold in 20 years. Eggs seemed to be to blame. In the largest single outbreak, liquid eggs — ones that are broken open, combined, frozen while still raw, and sold to food service companies — sickened more than 800 ill and fragile patients in 22 hospitals. Dr. Alexander Langmuir, the founder of the Epidemic Intelligence Service, who had trained Ravenholt, complained: "It certainly piques our pride that in these days of heart surgery, artificial kidneys and organ transplants, we cannot take dominance over a minuscule little bacillus ... that gets into our hospitals, causes no end of trouble, and has us stumped."

Foodborne illness outbreaks in institutions — hospitals, prisons, schools — were usually assumed to be the fault of whoever was in the kitchen. The CDC's investigation established that this was wrong. There was no way that identical outbreaks could have happened in so many hospital kitchens at the same time, caused by the same food, and yet be unconnected. Salmonella was not a kitchen problem; it was a food system one. That shift in emphasis enraged the egg industry, in a way that would echo through every foodborne outbreak thereafter, pitting the suffering of the victims against companies' lost sales. After the egg outbreaks were publicized, "There was probably an egg price depression of somewhere near a cent a dozen," Dr. Wade Smith, Jr., a veterinarian with the Tennessee egg producer Blanton-Smith, fumed during the CDC meeting. "A cent a dozen does not sound like very much to those of us who buy a dozen eggs a week. But a cent a dozen for six months is approximately half a bird's production."

The concern for foodborne outbreaks and the new worries over resistant foodborne bacteria forced a reexamination of Acronizing. At the USDA, several scientists who had been monitoring poultry plants — watching how much drug they used in the chilling bath and how long they soaked the birds — went back to their federal laboratory to try to recreate the process. Their results, once they replicated what slaughterhouses were doing, confirmed Ravenholt's suspicions from years before. Acronizing treatment changed the mix of bacteria on the surface of meat, encouraging resistant bacteria to develop and multiply — resistant bacteria that were present only on pieces of meat that had been Acronized.

Everyday shoppers were probably not reading the scientific publications where those results were made public. Nevertheless, in supermarkets and home kitchens, a cultural shift was occurring: Consumers were scrutinizing food additives and losing trust in food production. "We have felt for a long time that something was wrong with the poultry we buy," "A Consumer" wrote to the editor of the Pottstown, Pennsylvania, Mercury. "It does not have the good flavor that it had in the past, regardless of how it is prepared. We would like to see a ban on the use of all dyes and preservatives in the food we buy, including the acronizing of chicken." Lois Reed of Twin Falls, Idaho, wrote to the Montana Standard-Post: "How about the acronizing of chickens? When you purchase one such chicken you are completely in the dark as to the time it was prepared for market — two days ago — six months ago — who knows? ... We are doing ourselves and our children a great injustice by being indifferent to these various practices. Our very lives depend upon action now!" "Non-Acronized" began to appear in grocery-store advertisements across the country — including in the Helena, Montana, Independent Record; the Bend, Oregon, Bulletin; and the Eau Claire, Wisconsin, Daily Telegram — as prominently displayed as "Acronized" had been just a few years before. "Even your children can tell the difference," Capuchino Foods promised in the San Mateo, California, Post. Colorado and then Massachusetts banned Acronized birds from being sold within their borders.

The weight of negative opinion changed the FDA's mind. In September 1966, the agency canceled the licenses it had granted a decade earlier for Acronizing and the rival process, Biostat [from Pfizer]. Antibiotics could no longer be added to food as it was packaged. But the agency did nothing about antibiotics fed to animals before they were slaughtered and became food. That was not yet on the public's agenda, and only a few scientists were concerned. One was Marie E. Coates, a scientist at England's National Institute for Research in Dairying, who studied poultry nutrition. In 1962, at a conference on antibiotics and agriculture that was held periodically at the University of Nottingham, she worried aloud:

Widespread use of antibiotic feed supplements may induce the establishment of strains of organisms resistant to their action. The least harmful result would be the loss of efficiency of antibiotics as growth promoters. A more disastrous consequence might be the development of resistance in pathogens against which antibiotics are at present the only means of defense.

Coates was prescient. Just a few years later, a little more than a hundred miles away, a tragic outbreak would demonstrate that she was right to be afraid.


Excerpted from Big Chicken by Maryn McKenna; published by National Geographic Partners on Sept. 12.