Man hospitalized after ChatGPT dietary advice leads to bromide poisoning

Health | August 13, 2025

A man who used ChatGPT for dietary advice ended up poisoning himself — and wound up in the hospital.

The 60-year-old man, who was looking to eliminate table salt from his diet for health reasons, used the large language model (LLM) to get suggestions for what to replace it with, according to a case study published this week in the Annals of Internal Medicine.

When ChatGPT suggested swapping sodium chloride (table salt) for sodium bromide, the man made the replacement for three months, although, the journal article noted, the recommendation likely referred to bromide's other uses, such as cleaning.

Sodium bromide is a chemical compound that resembles table salt but is toxic for humans to ingest.

It was once used as an anticonvulsant and sedative, but today is primarily used for cleaning, manufacturing and agricultural purposes, according to the National Institutes of Health.

When the man arrived at the hospital, he reported experiencing fatigue, insomnia, poor coordination, facial acne, cherry angiomas (red bumps on the skin) and excessive thirst — all symptoms of bromism, a condition caused by long-term exposure to sodium bromide.

The man also showed signs of paranoia, the case study noted, as he claimed that his neighbor was trying to poison him.

He was also found to have auditory and visual hallucinations, and was ultimately placed on a psychiatric hold after attempting to escape. 

The man was treated with intravenous fluids and electrolytes and was also put on antipsychotic medication. He was released from the hospital after three weeks of monitoring.

“This case also highlights how the use of artificial intelligence (AI) can potentially contribute to the development of preventable adverse health outcomes,” the researchers wrote in the case study.

“Unfortunately, we do not have access to his ChatGPT conversation log and we will never be able to know with certainty what exactly the output he received was, since individual responses are unique and build from previous inputs.”

It is “highly unlikely” that a human doctor would have mentioned sodium bromide when speaking with a patient seeking a substitute for sodium chloride, they noted.

“It is important to consider that ChatGPT and other AI systems can generate scientific inaccuracies, lack the ability to critically discuss results and ultimately fuel the spread of misinformation,” the researchers concluded.

Dr. Jacob Glanville, CEO of Centivax, a San Francisco biotechnology company, emphasized that people should not use ChatGPT as a substitute for a doctor.

“These are language prediction tools — they lack common sense and will give rise to terrible results if the human user does not apply their own common sense when deciding what to ask these systems and whether to heed their recommendations,” Glanville, who was not involved in the case study, told Fox News Digital. 

“This is a classic example of the problem: The system essentially went, ‘You want a salt alternative? Sodium bromide is often listed as a replacement for sodium chloride in chemistry reactions, so therefore it’s the highest-scoring replacement here.’”

Dr. Harvey Castro, a Dallas-based board-certified emergency medicine physician and national speaker on artificial intelligence, agreed that AI is a tool, not a doctor.

“Large language models generate text by predicting the most statistically likely sequence of words, not by fact-checking,” he told Fox News Digital.

“ChatGPT’s bromide blunder shows why context is king in health advice,” Castro went on. “AI is not a replacement for professional medical judgment, aligning with OpenAI’s disclaimers.”
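
Castro's point about statistical likelihood can be made concrete with a toy sketch. The candidate phrases and scores below are invented for illustration (a real model ranks tens of thousands of tokens at every step), but the mechanism is the same: the likeliest continuation wins, and nothing in the loop checks whether it is safe or true.

```python
import math

# Toy next-token scoring. The candidates and raw scores are invented for
# illustration only; a real LLM scores its entire vocabulary at each step.
candidate_scores = {
    "potassium chloride": 2.1,  # a common dietary salt substitute
    "sodium bromide": 2.4,      # often listed as a chloride swap in chemistry texts
    "herbs and spices": 1.6,
}

def softmax(scores: dict[str, float]) -> dict[str, float]:
    """Convert raw scores into a probability distribution."""
    exps = {tok: math.exp(s) for tok, s in scores.items()}
    total = sum(exps.values())
    return {tok: e / total for tok, e in exps.items()}

probs = softmax(candidate_scores)
best = max(probs, key=probs.get)
print(f"Top continuation: {best} (p = {probs[best]:.2f})")
# The winner reflects patterns in training text, not whether
# the suggestion is safe to eat.
```

Nothing in that loop knows that one of those "replacements" belongs in industrial cleaning rather than a salt shaker.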

Castro also cautioned that there is a “regulation gap” when it comes to using LLMs to get medical information.

“FDA bans on bromide don’t extend to AI advice — global health AI oversight remains undefined,” he said.

There is also a risk that LLMs carry biases from their training data and lack verification mechanisms, which can lead to hallucinated information.

“If training data includes outdated, rare or chemically focused references, the model may surface them in inappropriate contexts, such as bromide as a salt substitute,” Castro noted.

“Also, current LLMs don’t have built-in cross-checking against up-to-date medical databases unless explicitly integrated.”

To prevent cases like this one, Castro called for more safeguards for LLMs, such as integrated medical knowledge bases, automated risk flags, contextual prompting and a combination of human and AI oversight.
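
As a rough sketch of the "automated risk flags" Castro mentions, the code below screens a model-generated dietary answer against a small blocklist before it reaches the user. The blocklist, function name, and example answer are all hypothetical; a production safeguard would draw on a maintained medical knowledge base and human review rather than a hard-coded set.

```python
# Hypothetical "automated risk flag": screen a dietary suggestion against a
# blocklist of compounds that are unsafe to ingest. The blocklist and function
# name are illustrative, not part of any real API.
UNSAFE_TO_INGEST = {"sodium bromide", "potassium bromide", "methanol"}

def flag_dietary_answer(answer: str) -> tuple[bool, str]:
    """Return (flagged, message) for a model-generated dietary answer."""
    hits = sorted(c for c in UNSAFE_TO_INGEST if c in answer.lower())
    if hits:
        return True, "Flagged for human review; mentions: " + ", ".join(hits)
    return False, "No blocklisted compounds detected."

flagged, message = flag_dietary_answer(
    "You could replace table salt with sodium bromide."
)
print(flagged, message)
# True Flagged for human review; mentions: sodium bromide
```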

The expert added, “With targeted safeguards, LLMs can evolve from risky generalists into safer, specialized tools; however, without regulation and oversight, rare cases like this will likely recur.”

OpenAI, the San Francisco-based maker of ChatGPT, provided the following statement to Fox News Digital.

“Our terms say that ChatGPT is not intended for use in the treatment of any health condition, and is not a substitute for professional advice. We have safety teams working on reducing risks and have trained our AI systems to encourage people to seek professional guidance.”

Melissa Rudy is senior health editor and a member of the lifestyle team at Fox News Digital. Story tips can be sent to melissa.rudy@fox.com.


