ChatGPT’s salt advice lands man in hospital


A 60-year-old man developed bromide poisoning after ChatGPT suggested swapping table salt for sodium bromide — a chemical used in pesticides and long barred from the food supply.

- Used sodium bromide in his cooking for three months, leading to paranoia and hallucinations

- Bromide toxicity was once common before FDA restrictions in the 1970s–80s; it is now extremely rare

- The case highlights the risks of AI dispensing complex medical or nutrition advice without expert oversight

- Patient required hospitalization and treatment to recover

Even the smartest AI can’t replace basic medical safety checks.