Man develops rare condition after asking ChatGPT about giving up table salt

A US medical journal has warned against using ChatGPT for health information after a man developed a rare condition following an interaction with the chatbot about removing table salt from his diet.
An article in the Annals of Internal Medicine reported a case in which a 60-year-old man developed bromism, also known as bromide toxicity, after consulting ChatGPT.
The article described bromism as a “well-recognised” syndrome in the early 20th century that was thought to have contributed to almost one in 10 psychiatric admissions at the time.
The patient told doctors that after reading about the negative effects of sodium chloride, or table salt, he consulted ChatGPT about eliminating chloride from his diet and began taking sodium bromide over a three-month period. This was despite reading that “chloride can be swapped with bromide, though likely for other purposes, such as cleaning”. Sodium bromide was used as a sedative in the early 20th century.
The article’s authors, from the University of Washington in Seattle, said the case highlighted “how the use of artificial intelligence can potentially contribute to the development of preventable adverse health outcomes”.
They added that because they could not access the patient’s ChatGPT conversation log, it was not possible to determine the advice the man had received.
However, when the authors themselves asked ChatGPT what chloride could be replaced with, the response also included bromide, provided no specific health warning and did not ask why they were seeking such information, “as we presume a medical professional would”, they wrote.
The authors warned that ChatGPT and other AI apps could “generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation”.
OpenAI, the company behind the chatbot, announced an upgrade to ChatGPT last week and said one of its biggest strengths was health. It said ChatGPT, now powered by the GPT-5 model, would be better at answering health-related questions and would also be more proactive in “flagging potential concerns”, such as serious physical or mental illness.
However, it stressed that the chatbot was not a replacement for professional help. The chatbot’s guidelines also state that it is not “intended for use in the diagnosis or treatment of any health condition”.
The journal article, which was published last week before the launch of GPT-5, said the patient appeared to have used an earlier version of ChatGPT.
While acknowledging that AI could act as a bridge between scientists and the public, the article said the technology also carried the risk of promoting “decontextualised information”, and that it was highly unlikely a medical professional would have suggested sodium bromide when a patient asked for a replacement for table salt.
As a result, the authors said, doctors would need to consider the use of AI when checking where patients obtained their information.
The authors said the bromism patient had presented at a hospital claiming his neighbour might be poisoning him. He also said he had multiple dietary restrictions. Although thirsty, he was noted to be paranoid about the water he was offered.
He tried to escape from the hospital within 24 hours of being admitted and, after being sectioned, was treated for psychosis. Once the patient had stabilised, he reported several other symptoms indicative of bromism, such as facial acne, excessive thirst and insomnia.