Man develops psychosis after following ChatGPT's salt-free diet advice

Reducing salt intake is often a solid way to improve your overall health. Swapping classic sodium chloride for sodium bromide, however, is a solid way to give yourself acne, involuntary muscle spasms, and paranoid psychosis. Knowing this, it is probably best to avoid the compound entirely, even if ChatGPT tells you otherwise. In a recent case, a patient who reportedly followed the generative AI's nutritional suggestion ended up on an involuntary psychiatric hold at a hospital for three weeks.
In the early 20th century, bromide salts were available in a range of over-the-counter drugs targeting problems such as anxiety, insomnia, and hysteria. Historical records indicate that 5 to 10 percent of psychiatric institution admissions at the time were attributable to bromide poisoning, or bromism. Although it is not nearly as prevalent a medical problem today, ingesting too much of the compound still leads to serious problems, including pustular rashes, nausea and vomiting, as well as neurological symptoms such as confusion and hallucinations.
Bromide poisoning cases largely disappeared once the United States Food and Drug Administration began banning its use in 1975, but the numbers have ticked back up in recent years thanks to its reintroduction in unregulated sedatives and dietary supplements. Combine that with generative AI programs' repeatedly documented questionable (if not downright dangerous) suggestions, and it was probably only a matter of time before a situation like the one detailed in a case report published in the Annals of Internal Medicine presented itself.
According to the doctors, a 60-year-old man with no prior psychiatric or medical history arrived at their hospital's emergency room, saying that a neighbor was poisoning him. He did not initially disclose any medications or dietary supplements, and his vital signs and physical examination came back "normal."
Things started to go downhill shortly after the hospital staff admitted him to a room. Once there, he admitted to several dietary restrictions and said that he distilled his own water at home. Although he told staff that he was extremely thirsty, he grew paranoid about the water they offered him. It only got worse from there.
"In the first 24 hours of admission, he expressed increasing paranoia and auditory and visual hallucinations, which, after an attempt to escape, resulted in an involuntary psychiatric hold for grave disability," the doctors wrote.
Subsequent consultations with poison control experts, along with additional laboratory tests, led the medical team to believe that bromism was the most likely explanation for their patient's erratic behavior. After a multi-day regimen of intravenous fluids, electrolyte repletion, and the antipsychotic risperidone, the doctors were finally able to obtain his full history.
According to the patient, he began looking for ways to cut salt from his diet after recently reading about its potential negative health effects. But the man wasn't trying to simply reduce his sodium intake. He reportedly tried to eliminate it entirely.
"He was surprised that he could only find literature related to reducing sodium from one's diet," the doctors wrote. "Inspired by his history of studying nutrition in college, he decided to conduct a personal experiment to eliminate chloride from his diet."
When he consulted ChatGPT, the bot reportedly suggested swapping chloride for bromide. He then did exactly that, for three months.
The case report's authors note that their patient may have misinterpreted ChatGPT's suggestion because of how he phrased his prompt. The program may not have registered it as a medical question, and offered bromide "likely for other purposes, such as cleaning."
Unfortunately, the team never had access to transcripts of the man's chat logs, so they cautioned that this theory is only speculation. Even so, their own experiment with ChatGPT 3.5 indicated that the hypothesis is certainly plausible.
"[W]hen we asked ChatGPT 3.5 what chloride can be replaced with, we also produced a response that included bromide," they wrote. "Though the reply stated that context matters, it did not provide a specific health warning, nor did it inquire about why we wanted to know, as we presume a medical professional would do."
The patient's physical condition eventually normalized, and his psychotic symptoms subsided over his three-week hospital stay. He received the all-clear at his post-discharge checkup two weeks later. It's as good a reminder as any that, while AI can be decent at certain things, it's in your best interest to leave medical consultations to human professionals.