A “ChatGPT” Recipe Causes an American to Experience Hallucinations

An American man who replaced table salt with sodium bromide on advice he said he received from the AI program “ChatGPT” was hospitalized in a psychiatric facility after a severe deterioration in his health.

According to reports, the man was deeply concerned about the effects of table salt on his health, so he consulted the “ChatGPT” app, which, by his account, recommended sodium bromide as a substitute.

For three months, he used the substance consistently in his daily diet, leading to a dangerous accumulation of bromide in his body.

According to the medical report, his blood bromide level exceeded 1,700 milligrams per liter, while the normal range is 0.9 to 7.3 milligrams per liter, causing chronic bromide poisoning.

The poisoning manifested as cognitive impairment and hallucinations, which escalated within a day of his hospital admission into grandiose delusions and psychotic symptoms. After he attempted to flee the hospital, compulsory psychiatric treatment became necessary.

In media statements, doctors explained that they could not review the original advice the AI had given because no chat logs were available, and they warned against relying on AI systems for medical advice without consulting specialists.

After three weeks of intensive treatment, the patient’s physical and psychological condition stabilized, and a follow-up examination two weeks later confirmed a full recovery.

— Agencies