Sunday, 28 September 2025

ChatGPT medical advice linked to rare psychosis case, report says

A man developed the rare condition bromism after following ChatGPT's advice, highlighting the dangers of relying on AI for medical decisions.

A recent medical case has raised concerns about the dangers of relying on artificial intelligence for health advice, after a man developed a rare and largely forgotten condition following guidance from ChatGPT. The report, published in the Annals of Internal Medicine, details how the AI chatbot’s suggestions allegedly contributed to a case of bromide intoxication, or bromism, which can cause severe neuropsychiatric symptoms, including psychosis and hallucinations.

From salt swap to hospitalisation

The incident involved a 60-year-old man who suspected his neighbour was secretly poisoning him. Influenced by reports on the potential harms of sodium chloride – commonly known as table salt – he sought advice from ChatGPT on possible alternatives. Acting on the chatbot’s response, he replaced salt with sodium bromide, a compound whose use in medicine was discontinued decades ago.

Over time, the man developed bromism, a condition rarely seen since the mid-20th century. When admitted to hospital, doctors noted he was extremely thirsty yet refused the water offered to him, preferring instead to distil his own. His restrictive eating and drinking habits, along with mounting paranoia, soon escalated.

“In the first 24 hours of admission, he expressed increasing paranoia and auditory and visual hallucinations, which, after attempting to escape, resulted in an involuntary psychiatric hold for grave disability,” the report noted.

Bromism: a condition from another era

Bromism, caused by chronic bromide exposure, was once a well-known problem in the early 1900s. Bromine-based salts were historically prescribed for neurological and mental health disorders, especially epilepsy, and were also used in sleep aids. However, prolonged use was found to trigger nervous system issues, ranging from delusions and poor coordination to fatigue, tremors, and even coma in severe cases.

By 1975, bromide use in over-the-counter medicines was banned in the United States due to these risks. Today, bromism is considered extremely rare, which makes this recent case particularly striking.

The treating medical team could not directly access the patient’s ChatGPT conversation, so they ran their own tests using ChatGPT 3.5. They reported receiving similarly problematic suggestions, including bromide as a salt replacement, without adequate health warnings or follow-up questions.

“When we asked ChatGPT 3.5 what chloride can be replaced with, we also produced a response that included bromide. Though the reply stated that context matters, it did not provide a specific health warning, nor did it inquire about why we wanted to know, as we presume a medical professional would do,” the researchers said.

AI in healthcare: promise and pitfalls

This is not the first time ChatGPT has been linked to medical outcomes. Earlier this year, a widely shared story described how a mother used the chatbot to help identify her son’s rare neurological disorder after numerous doctors had been unable to diagnose it. In that case, the advice led to effective treatment.

However, experts stress that AI can only offer safe and accurate recommendations when provided with detailed context and should never replace a proper medical evaluation. A study in the journal Genes found that ChatGPT’s ability to diagnose rare conditions was “very weak”, reinforcing the need for caution.

OpenAI, when contacted by LiveScience about the incident, responded: “You should not rely on output from our services as a sole source of truth or factual information, or as a substitute for professional advice.” The company added that its safety teams work to reduce risks and that its systems are trained to encourage users to seek professional advice.

With the launch of GPT-5, OpenAI has emphasised improvements in safety and accuracy, aiming to produce “safe completions” that guide users away from harmful recommendations. Nonetheless, the fundamental limitation remains: AI cannot reliably assess a patient’s clinical features without direct medical examination. Experts agree that AI may be a valuable tool in a healthcare setting, but only when deployed under the supervision of certified medical professionals.

