Man took diet advice from ChatGPT, ended up hospitalized with hallucinations
A 60-year-old man ended up in the ER convinced his neighbor was trying to poison him. In reality, the culprit was diet advice from ChatGPT.

A man was hospitalized for weeks and suffered hallucinations after poisoning himself based on dietary advice from ChatGPT. A case study published Aug. 5 in the academic journal Annals of Internal Medicine states that the 60-year-old man decid... [4060 chars]

Source: Yahoo News Canada
