Death in the Pot

Before the regulation of the food industry, there was very little preventing us from eating poison.

By Deborah Blum

Kitchen Scene, by Peter Wtewael, c. 1620. The Metropolitan Museum of Art, Rogers Fund, 1906.

 


The following menu for a 1902 Christmas dinner party stands—as far as I know—as one of the most unusual ever printed. And also one of the least appetizing.

Apple Sauce.
Borax.
Soup.
Borax. Turkey. Borax.
Borax.
Canned Stringed Beans.
Sweet Potatoes. White Potatoes.
Turnips.
Borax.
Chipped Beef. Cream Gravy.
Cranberry Sauce. Celery. Pickles.
Rice Pudding.
Milk. Bread and Butter. Tea.
Coffee.
A Little Borax.

Unless, of course, one happens to enjoy meals spiced up by the taste of borax—a little metallic, sweet and unpleasant, or so they say—a preservative used to keep meat from rotting in the late nineteenth and early twentieth centuries.

This particular menu grew from a series of federal experiments that ran from 1902 to 1907 and were designed to test the toxicity of food additives. In these tests, groups of volunteers—popularly known as “Poison Squads”—agreed to dine dangerously in the interests of science, working their way through a laundry list of suspect compounds.

Borax came first on the list, partly because it was so widely used by meat processors. Derived from the element boron, it slowed decomposition but could also react with proteins and firm them up, giving rotting meat a more shapely appearance. Borax had thus figured in the “embalmed beef” scandal of the Spanish-American War, in which officers in the U.S. Army accused their suppliers of shipping tons of refrigerated beef that was treated with “secret chemicals” and canned beef that was no more than a “bundle of fibers.” “It looked well but had an odor similar to that of a dead human body after being injected with preservatives,” an Army medical officer wrote of the refrigerated meat, adding that when cooked, the product tasted rather depressingly like boric acid.

But beyond the disgust element was another more important question concerning borax: was it actually safe to eat? This troubling issue was the reason why squad members were imbibing the compound at Christmas, the reason for the Poison Squad experiments themselves. Established by a famously outspoken, crusading chemist from the U.S. Department of Agriculture, Harvey Washington Wiley, the squads were also meant to answer another, larger question: were manufacturers actually poisoning the food supply? 

Businesses had a nearly free hand to do so at that point. At the turn of the twentieth century, the federal government did not regulate food safety, did not require testing of food products in advance, and did not hold companies liable for resulting illnesses. Neither did it require food producers to inform consumers what materials actually went into a food product. Wiley had a range of alarming compounds on his test list beyond borax, including formaldehyde (used to slow the souring of old milk) and copper sulfate (used to restore color to canned vegetables).

Mauvais Sujet, by Ford Madox Brown, 1863. Tate, London. 

The “Poison Squad” nickname did not originate with Wiley, who worried that the moniker anticipated the experiment’s conclusion. He had persuaded the U.S. Congress to fund the research under the less conclusive title of “hygienic table trials.” But the catchier description came naturally to journalists who followed the story, especially as squad members began sickening. Neither did Wiley write up that borax-rich menu for Christmas dinner 1902; his menus were dry catalogs of carefully measured portions. The restaurant-style menu was more likely a protest penned by disgruntled squad members. They were weary, as one told the Washington Post, “of eating chemically treated foods in apothecary doses.”

In its way, though, that very complaint illustrated Wiley’s reason for doing the experiments at all. He believed that everyone in the country was eating chemically treated foods, that all consumers were receiving daily apothecary doses. He suspected that the country was, in fact, suffering from a coast-to-coast epidemic of food poisoning caused by commercial food production. But he wanted more than his own suspicions. He wanted to prove it, and then, he wanted to stop it.

 

It may not surprise you to know that almost every description of Wiley by academic and popular historians first praises his devotion to food safety and goes on to marvel at his superlative ego. “No one ever accused Harvey Washington Wiley of false modesty,” writes British author Bee Wilson in her lively book, Swindled: The Dark History of Food Fraud, from Poisoned Candy to Counterfeit Coffee.

There is, of course, a kind of soaring hubris—or maybe just soaring naiveté—in the idea that a single person, or even a single government, could put an end to the poisoning of the food supply. We’ve been altering and poisoning food—deliberately and accidentally, for evil reasons and with good intentions—throughout most of our history. There’s certainly no sign that we’ve given it up today. 

“The adulteration of foods is as old as commerce itself,” wrote historian and U.S. Food and Drug Administration employee F. Leslie Hart some sixty years ago in the Food, Drug, Cosmetic Law Journal. Hart’s evidence included fines for adulterating food that appear in ancient Sanskrit laws dating back to around 300 bc. Warnings about purveying risky foodstuffs filter into the Bible, including a very pointed passage in Leviticus concerning the consumption of bad meat. Similar warnings occur in Chinese writings dating back to the second century bc, as well as in the literature of the ancient Greeks and Romans. Pliny the Elder wrote of wine purveyors who, “I regret to say, employ noxious herbs” to color wine, creating a more beautiful and more toxic drink.

Thank God for tea! What would the world do without tea? How did it exist? I am glad I was not born before tea.

—Sydney Smith, 1855

Of course, the ancients were also fully aware that foods could be dangerous without human help, hence the warnings regarding meat consumption. And they’d learned from long experience that even routinely safe foods carried unexpected risks. Consider the wonderfully bizarre story of “mad honey” and the Greek army commanded by Xenophon in 401 bc. Returning from an unsuccessful campaign in Persia, Xenophon’s men raided beehives along the eastern edge of the Black Sea, acquiring a treasure trove of local honey. By day’s end, the raiding party was immobilized. They were like men “greatly intoxicated,” wrote Xenophon; his soldiers suffered from nausea, an inability to walk straight, and lethargy. Over three centuries later, the Roman general Pompey’s troops also encamped by the Black Sea and gorged themselves on the local honey. Pompey lost three squadrons to enemy fighters who had deliberately placed honeycombs in the path of his troops.

So what is mad honey? It’s just honey, but it comes from bees feeding on some very poisonous flowering plants that flourish along the Black Sea (and elsewhere), notably rhododendrons. These plants contain a class of poisons called grayanotoxins that act directly on the nervous system. The classic symptoms range from tingling and numbness to dizziness, nausea, impaired speech, and loss of balance. Some victims report a sense of being surrounded by spinning lights; others complain of tunnel vision. “Mad-honey poisoning” can also be fatal, as the compromised nervous system starts shutting down the lungs and heart.

In Xenophon’s case, the honey was just a conveyance for plant poisons. As was known long ago—and should be well known today—plants, as well as fungi, are not entirely harmless as a group. Some are rich with essential nutrients, others with wicked poisons. The history of eating wild mushrooms is a case in point. Many are harmless, but people die every year from mistaking a lethal toadstool for a garden variety. In the first week of April 2011, mushrooms eaten at a religious ceremony in India’s Lakhimpur district killed at least two people and sickened another eleven. Such events were probably more common before we started domestic production of very safe mushroom varieties—and such events occasionally altered history.

The prime culprit in this story is a rather pretty mushroom with a pale greenish-gold cap and white underside, known scientifically as Amanita phalloides and more informally as the death cap. The mushroom’s native habitat sprawls across Europe and into North Africa (although it now exists in many other countries, including the United States, accidentally brought in with other imports). It is dismayingly similar to a number of edible species, such as the straw mushroom, and it is so deadly that scientists estimate it takes only half a cap to kill an adult human. History conjectures that the Greek poet and playwright Euripides lost his wife, two sons, and daughter to a dinner that included this particular mushroom. The death cap is also blamed in the suspected poisoning of Pope Clement VII and the accidental death of Emperor Charles VI.

Many historians believe that a stew of these mushrooms was probably the weapon used to murder the Roman emperor Claudius in 54. Most also suspect that his death was engineered by his wife, Agrippina the Younger, so that her sixteen-year-old son Nero could become emperor. Also widely suspected at the time was the emperor’s taste-tester, Halotus. Despite that possibility—or perhaps because of it—Halotus went on to become Nero’s taste-tester as well.

It seems that history records only a few deaths of taste-testers, especially in Europe. The insistence of the well-born and powerful on hiring taste-testers speaks as much to paranoia as to real risks. Not that there weren’t other poisonous plotters. In Renaissance Italy, the notorious Borgia family was suspected of brewing up its own special poisons to eliminate its enemies. It was thought that the Spanish government sent a physician to England to poison Elizabeth I; the doctor, Rodrigo Lopez, was caught and executed on evidence so thin that even the queen doubted it. It was also widely reported that while staying at the Louvre, Henry IV of France refused to eat anything that he hadn’t cooked himself (eggs, mostly) or poured himself.

Perhaps the most powerful twentieth-century ruler to use a taste-tester was China’s Mao Zedong; if his successors have followed that practice, they haven’t boasted about it. It was announced, however, during the 2008 Olympic Games in Beijing that white mice would be taste-testing food served to athletes, though this was mostly due to fears of poor food-handling practices and bacterial contamination.

Rainy Day in Camp, by Winslow Homer, 1871. The Metropolitan Museum of Art, Gift of Mrs. William F. Milton, 1923.

The focus on assassination, though, misses the point; most homicidal poisonings are everyday matters. History’s best-known poisoners, for the most part, aimed low rather than high. They mixed arsenic into oatmeal, aconitine into cake and curry, mercury into figs, and they served these toxic snacks to husbands and wives, lovers and mistresses, friends, family, and business partners, repeatedly demonstrating that deliberate poisoning of food is mostly a domestic affair. Consider the horrifying example of Mary Ann Cotton, a British arsenic killer born in 1832, who was suspected of killing around twenty people—including three husbands, one lover, and most of her own children—probably by mixing arsenic into their morning cereal or evening soup. Many of her adult victims had made wills or taken out life-insurance policies in her favor. Her children, she complained, were simply inconvenient. Unfortunately for Cotton, she had discussed the inconvenient nature of her youngest stepson, Charles, with a public official, who was shocked when the boy suddenly died. He began an investigation that discovered lethal levels of arsenic in the child and led authorities to explore the many deaths surrounding Cotton over the previous fifteen to twenty years.

Mary Ann Cotton was convicted of murdering her son and hanged in 1873. The late nineteenth century heralded the end of what historians sometimes refer to as the golden age of poisoners. Up until the mid-1800s, scientists had no real test for arsenic in a human body, much less for the myriad other poisons available. Toxic substances were so easy to acquire and so poorly researched that many poison killers simply assumed they wouldn’t be caught. Cotton’s execution—resting neatly on scientific evidence—was among many reminders that the world had changed. Scientists were catching up with poisoners and with the catalog of toxic compounds as well.

 

Harvey Washington Wiley had the new power of science in mind when he planned out his poison trials. Manufacturers had long felt free to add toxic substances to foods, chemically altering them almost at will. The best way to counter this practice, he thought, was with solid research showing that borax and formaldehyde and their ilk were a genuine danger. With the results of his poison-squad trials in hand, Wiley believed he could finally persuade the U.S. government that it needed to step in and protect the nation’s food consumers.

Wiley’s plan built on influential work done in Britain, starting with Frederick Accum, a German chemist then living in London, whose 1820 book A Treatise on Adulterations of Food, and Culinary Poisons shocked readers on both sides of the Atlantic. Accum’s book, experts agree, changed the conversation about food adulteration. The treatise was combative to its core, starting with its wonderfully inflammatory title page featuring the image of a pot with a coiling snake forming its handles, a skull bubbling out of its cloth-draped top, and a Biblical quotation from the Book of Kings inscribed on its front: “There is Death in the Pot.” Accum documented not only such trickery as mixing floor sweepings into pepper, or peas and beans into coffee, but also the use of poisonous compounds to improve the look of food—candies dyed red using the heavy metal lead, pickles made green with copper compounds, the common use of cyanide-rich cherry-laurel leaves to flavor custards.

To eat is to appropriate by destruction.

—Jean-Paul Sartre, 1943

“Most of the articles are transmitted to the consumer in a disguised state, or in such a form that their real nature cannot possibly be detected by the unwary,” he wrote in an impassioned plea for change. It would take another forty years and the work of other equally outraged scientists—along with the discovery that arsenic was being used to color candy, lethally poisoning children—before Britain passed its first law regulating food safety in 1860. But no such legislation existed in the United States, even into the first years of the twentieth century.

Wiley was determined to change this situation, to prove the danger with his poison trials. Still, to avoid doing real harm, he selected young men for his experiments, reasoning that they would be healthy enough to withstand a daily dose of poison. Nevertheless, once the borax trials got under way, the squad members began losing weight, some complaining of stomach pains and severe nausea. Two years later, when Wiley began testing benzoic acid on another group of twelve recruits, only three lasted until the end; the rest became so ill that they had to withdraw. The dramatic results, reported nationally by newspapers, prompted a rising sense of public unease.

Wiley had originally sought minimal publicity, hoping to simply impress policymakers with the evidence. But—like Accum before him—Wiley came to appreciate the power of public interest. While the federal government hesitated, he collaborated instead with journalists such as Samuel Hopkins Adams at Collier’s Weekly and with groups such as the National Consumers League. In his 1929 book The History of a Crime Against the Food Law, he even took to quoting a popular song about the experiments, “The Song of the Poison Squad”:

For we are the Pizen Squad.
On Prussic acid we break our fast;
We lunch on a morphine stew;
We dine with a match-head consommé,
Drink carbolic acid brew.

Not that the squads ever dealt with those poisons. Wiley just liked making the parallel to such famously evil substances. Four years after the squad work started, the United States passed its first law regulating food safety, the Pure Food and Drugs Act of 1906. Wiley’s work and constant advocacy were so influential that people of the time sometimes called it the Wiley Act. Wiley himself thought that fitting; he compared himself to a general who “wins a great battle.”

In reality, the law also benefited hugely from the support of President Theodore Roosevelt, a crusader himself, and from Upton Sinclair’s horrifying 1906 novel about the Chicago meatpacking industry, The Jungle. (Sinclair had meant the book to incite reform of workers’ rights. He famously complained later, “I aimed at the public’s heart, and by accident I hit it in the stomach.”)

That first food-safety law did not establish the U.S. Food and Drug Administration but rather empowered the Bureau of Chemistry, where Wiley was chief and where he remained for another six years, leaving in frustration over what he saw as limited support from his supervisors. The FDA was officially created in 1930 but didn’t gain real power until 1938, when Congress passed a new law, the Federal Food, Drug, and Cosmetic Act, giving it the enforcement authority that laid the foundation for the agency of today.

A wild turkey being prepared by chef and blacksmith Angelo Garro, San Francisco. Photograph by Lena Herzog. © Lena Herzog.

Since that time, really dangerous food—and the term food poisoning itself—has tended to mean bacterial contamination rather than toxic-chemical contamination. Still, the public continues to worry about pesticide residues, preservatives, and food dyes—the FDA recently investigated concerns that food dyes might contribute to attention-deficit disorders. But thanks to the work of Wiley, his valiant poison squads, and a host of other crusaders, we don’t fear being killed by arsenic-dyed candy or formaldehyde-improved milk, as we once did.

On the other hand, this has not been a good decade for those who fear bacterial food poisoning. Salmonella and E. coli outbreaks—in beef, eggs, sprouts, and nut products—have plagued the nation’s food supply. A 2007 recall of a contaminated peanut product swept through forty-seven states, where it had sickened more than six hundred people; in 2008 and 2009, bacteria-contaminated peanut butter may have killed nine people and sickened more than seven hundred in the United States; in April 2011, Canadian investigators reported that suspected contamination of walnuts by the lethal bacterial strain E. coli O157:H7 had killed one person and sickened another thirteen.

In response to such outbreaks, the federal government recently passed another law, the Food Safety Modernization Act, in an attempt to break this newest cycle of food poisoning. The most recent budget does not fully fund the law, but even with full funding, one can make an easy prediction. We’ve been trying to regulate food poisoning out of existence since Biblical times. We’ve reduced it; we’ve saved countless lives by doing so. But we’ll never really erase it from our history. Wiley, Accum, and even the Sanskrit scribes could have told you why: food, in all its chemical complications and possibilities, remains the most dangerous substance we will ever eat.
