Man took diet advice from ChatGPT, ended up hospitalized with hallucinations
Health · 8/13/2025

A 60-year-old man ended up in the ER after becoming convinced his neighbor was trying to poison him. In reality, the culprit was diet advice from ChatGPT.

A man was hospitalized for weeks and suffered hallucinations after poisoning himself based on dietary advice from ChatGPT. A case study published Aug. 5 in the academic journal Annals of Internal Medicine states that the 60-year-old man decid... [4060 chars]