Calls for clear guardrails and consumer education before wider rollout of OpenAI’s health advice platform ...
A man was hospitalized with severe physical and psychiatric symptoms after replacing table salt with sodium bromide in his diet, advice he said he received from ChatGPT, according to a case study ...
A 60-year-old man’s attempt to improve his health by using an artificial intelligence chatbot for dietary advice ended with a hospital stay after he accidentally poisoned himself, according to a case ...
A man who used ChatGPT for dietary advice ended up poisoning himself — and wound up in the hospital. The 60-year-old man, who was looking to eliminate table salt from his diet for health reasons, used ...
A 60-year-old man developed a rare medical condition after ChatGPT advised him on alternatives to table salt, according to a case study published in The Annals of Internal Medicine. The patient ...
Man in hospital after ChatGPT diet advice goes wrong
A man gave himself a psychological condition after turning to ChatGPT for medical advice. The unnamed man, 60, told doctors he was trying to eliminate table salt from his diet, having read about its ...
A 60-year-old man spent three weeks being treated at a hospital after replacing table salt with sodium bromide following consultation with a popular artificial intelligence chatbot. Three physicians ...
ChatGPT isn't stupid: an LLM has no thoughts on anything at all, let alone bad ones. It's a stochastic parrot, mindlessly repeating whatever the training data gave it. The people aren't necessarily ...
The new ChatGPT-5 is wiser than the last one. It wouldn't have encouraged a man to eat sodium bromide, which literally drove him crazy. The guy's goal was to lower his sodium use, so he asked GPT-3.5, ...