60-Year-Old Hospitalised After Following ChatGPT Diet Advice


A 60-year-old man in the US was hospitalised with severe psychiatric symptoms after following a diet plan suggested by ChatGPT, according to the New York Post. The man, who had no prior psychiatric or medical history, had asked the AI chatbot how to remove sodium chloride (table salt) from his diet. ChatGPT suggested replacing it with sodium bromide, a compound that resembles table salt but is used mainly in industrial cleaning and certain medicines, and is toxic in large amounts.

Over three months, the man consumed sodium bromide purchased online, while restricting water intake and adhering to strict dietary rules. He developed paranoia, hallucinations, excessive thirst, and coordination issues before being admitted to hospital. Doctors treated him with fluids, electrolytes, and antipsychotics, and he spent three weeks recovering in both medical and psychiatric units.

The case, published in a journal of the American College of Physicians, highlights the risks of relying solely on AI for health advice. The authors warned that AI tools can produce scientific inaccuracies and spread misinformation. OpenAI, the developer of ChatGPT, notes in its terms of use that outputs may not be accurate and should not replace professional medical guidance.