Man who asked ChatGPT about cutting out salt from his diet was hospitalized with hallucinations


A 60-year-old man spent three weeks being treated in a hospital after replacing table salt with sodium bromide following a consultation with a popular artificial intelligence chatbot.
Three doctors published a case report on the incident in the Annals of Internal Medicine at the beginning of the month. According to the report, the man had no psychiatric history prior to arriving at the hospital “expressing concern that his neighbor was poisoning him.”
The man shared that he distilled his own water at home, and the report noted that he seemed “paranoid” about the water he was offered. Bromism, or elevated bromide levels, was considered after a laboratory workup and a consultation with Poison Control, according to the report.
“In the first 24 hours of admission, he expressed increasing paranoia and auditory and visual hallucinations, which, after an attempt to escape, resulted in an involuntary psychiatric hold for grave disability,” the case report said.
Once his condition had improved, the man shared that he had taken it upon himself to conduct a “personal experiment” to eliminate table salt from his diet after reading about its negative health effects. The report indicates that he did so after consulting ChatGPT, an artificial intelligence chatbot.
He said the replacement lasted three months.
The three doctors, all from the University of Washington, noted in the report that they did not have access to the patient’s conversation logs with ChatGPT. However, they themselves asked ChatGPT 3.5 what chloride could be replaced with.
According to the report, the answer they received included bromide.
“Though the reply stated that context matters, it did not provide a specific health warning, nor did it inquire about why we wanted to know, as we presume a medical professional would do,” the report said.
A representative for OpenAI, the company that created ChatGPT, did not immediately respond to a request for comment. The company noted in a statement to Fox News that its terms of use state the bot should not be used to treat any health condition.
“We have safety teams working on risk reduction and have trained our AI systems to encourage people to seek professional advice,” the statement said.
Bromide toxicity was a more common toxic syndrome in the early 1900s, according to the report, because bromide was present in a number of over-the-counter medications. It was believed to contribute to 8% of psychiatric admissions at the time, the report said.
It is now a rare syndrome, but cases have recently reappeared “as bromide-containing substances have become more readily available with widespread use of the internet,” the report said.