
Archived Article — The Daily Perspective is no longer active. This article was published on 28 February 2026 and is preserved as part of the archive.

Health

Beer, Bugs, and Chatbots: How AI Helped Crack a County Fair Outbreak

Illinois health officials leaned on ChatGPT during a Salmonella investigation, raising questions about AI's role in public health.

Key Points
  • A Salmonella outbreak linked to an Illinois county fair in 2024 sickened at least 13 people across five counties.
  • Investigators traced the likely source to a makeshift beer cooler built from agricultural drainage pipe that was never properly cleaned.
  • County health officials used ChatGPT to help evaluate their hypothesis, prompting debate about AI's usefulness in outbreak investigations.
  • The AI provided assurance but not certainty; experts note that critically reviewing AI outputs can take as long as independent research.
  • New cooler sanitation protocols were introduced, regardless of whether the chatbot's involvement actually changed the outcome.

There is a particular kind of detective story that public health produces: unglamorous, often inconclusive, and fought out not in a courtroom but in spreadsheets, swabs, and reluctant interviews. The latest entry in that genre involves a county fair in rural Illinois, a makeshift beer cooler assembled from agricultural drainage pipe, and an AI chatbot asked to weigh in on whether Salmonella can thrive in a poorly drained ice bin. The answer, it turns out, is yes. Whether ChatGPT was needed to reach that conclusion is a different question entirely.

The sole beer tent at the Brown County fair kept its stock cold in a makeshift cooler built from farm drainage tile. Photo: Getty Images

The story begins, as many outbreaks do, with an unlikely tip. According to a report published this week in the Morbidity and Mortality Weekly Report, the Brown County sheriff in Illinois first flagged a potential problem on 5 August 2024, when an unusual number of prospective jurors told the court they had a stomach bug. A week later, the state health department confirmed a case of Salmonella enterica serotype Agbeni. Those two threads, pulled together, opened a full investigation.

Health officials ultimately identified 13 cases, seven laboratory-confirmed and six probable, spread across five counties. The common thread was attendance at the Brown County fair, an annual event held in a community of roughly 4,200 people that nonetheless draws an estimated 36,000 visitors from the surrounding region. The 2024 fair ran from 30 July to 4 August, meaning it had already packed up by the time investigators made the connection.

Food vendors were the obvious first suspect. Salmonella species are well-documented causes of food poisoning, colonising the intestinal tracts of poultry, cattle, and other animals before finding their way into the human food supply. Nine of the 13 affected people had eaten from at least one food vendor at the fair. But four had eaten nothing at all from any vendor, which ruled food out as the sole source. The investigators also noted that hand-washing facilities at the fair were limited, and that ten of the 13 sickened individuals said they had not washed their hands. Still, the one thing every single case had in common was simpler and, in retrospect, more telling: they had all drunk a cold, canned beer from the fair's only beer tent.

The cooler holding those beers was, to put it diplomatically, not a piece of commercial food-service equipment. It was constructed from a three-metre length of non-food-grade corrugated black plastic agricultural drainage tile, divided into four internal compartments. It was hosed down at the start of the fair, never fully drained or cleaned again, and simply topped up with fresh ice as the previous batch melted. Beer tent workers handled the cans and ice with their bare hands. One worker eventually disclosed, reluctantly, that leftover food had been placed in the cooler overnight during the opening days of the fair.

The investigators formed their hypothesis: the cooler had become contaminated with Salmonella, which then spread to the exterior of beer cans, which people then drank from directly. With the cooler long since dismantled, confirmation was impossible. So county health official Katherine Houser, the author of the MMWR report, turned to ChatGPT.

The chatbot was asked whether S. Agbeni would grow in an improperly drained cooler, whether sources other than ice were plausible given only canned beverages were available, and whether similar outbreaks had been documented in the scientific literature. ChatGPT responded that the cooler was a "credible and likely" source. The investigators were satisfied, and new sanitation protocols for coolers were subsequently required.

Houser was measured in her assessment of the AI's contribution. "AI was effective in this rural setting for rapid situational awareness," she wrote, while also acknowledging the limitations of generative AI tools, including the risk of inaccuracies and a lack of source transparency. All AI-generated summaries, she noted, were checked against primary literature before being incorporated into the investigation's conclusions.

That caveat is worth sitting with. A basic search of PubMed, the US federal database of peer-reviewed scientific literature, surfaces examples of Salmonella contamination in ice quickly and without the risk of hallucinated citations. If the AI's outputs had to be verified against primary sources anyway, the net time saving is, at best, ambiguous. Critically reviewing a chatbot's answer is not a shortcut if it takes as long as simply finding the answer yourself.

This is not an argument against AI in public health settings. There are genuine use cases: synthesising large volumes of epidemiological data, flagging patterns across disparate datasets, or assisting under-resourced rural health departments with initial triage. The CDC's MMWR has long served as the record of outbreak investigations precisely because granular, well-documented case reports like this one accumulate into a body of knowledge that benefits future investigators. Adding AI to that workflow is not inherently reckless.

But there is a difference between a tool that accelerates good investigative work and one that provides the psychological comfort of confirmation. In this case, the investigators had already formed their hypothesis through solid conventional methods: interviews, case mapping, and process of elimination. ChatGPT told them what they had already concluded. Whether that amounts to genuine assistance or an expensive way of feeling confident about a well-reasoned guess is the kind of question the public health community will need to answer carefully, and with evidence, as AI tools become more embedded in official practice.

For now, the Brown County fair has better cooler hygiene rules, 13 people recovered from an unpleasant few days, and one beer tent's improvised drainage pipe has passed into the annals of outbreak investigation lore. That, at least, is an unambiguously good outcome, regardless of which tool deserves the credit.

Nina Papadopoulos

Nina Papadopoulos is an AI editorial persona created by The Daily Perspective, offering sharp, sardonic culture criticism spanning arts, entertainment, media, and the cultural moment. As an AI persona, her articles are generated using artificial intelligence with editorial quality controls.