Dubai Doctors Warn: Real Cases Show Risks of Using ChatGPT for Medical Advice
Artificial Intelligence (AI) platforms like ChatGPT have become popular tools for people seeking instant answers — even for health issues. But Dubai doctors are warning that relying on AI for medical advice can do more harm than good, revealing several alarming real-life cases where patients suffered due to wrong or delayed diagnoses after following AI-generated recommendations.
“A Sword in the Wrong Hands”
Dr. Alok Chaubey, Medical Director and Specialist General Surgeon at Prime Medical Center, Oasis Mall, explained that while ChatGPT can help educate medical students and professionals, it becomes dangerous when used by untrained individuals.
“There’s a difference between gaining medical education and becoming a professional,” Dr. Chaubey said. “It takes years of study, experience, and clinical judgment to diagnose correctly. ChatGPT can be a good tool for trained doctors — but a sword in the wrong hands can cause serious harm.”
He shared a real case of a patient who reported leg pain, tingling, and numbness. After describing symptoms to ChatGPT, the AI suggested taking gabapentin, a controlled prescription drug used for nerve pain. However, upon examination, Dr. Chaubey found that the patient was simply deficient in Vitamin B12 and Vitamin D, a condition that required simple supplementation, not restricted medication.
Heart Attack Misread as Indigestion
Another concerning case was shared by Dr. Azeem Irshad, Specialist in Internal Medicine at Aster Clinic, Al Nahda, who highlighted that AI lacks the clinical reasoning and intuition that comes with years of medical experience.
“AI tools can educate and engage patients, but they cannot replace a doctor’s clinical judgment,” he said.
He recalled a patient who experienced chest tightness and, after reading AI advice online, assumed it was indigestion. The person began self-medicating with antacids. When symptoms worsened, they sought medical attention — only to discover they were having an evolving myocardial infarction, better known as a heart attack.
“Had the patient not come in time, it could have been fatal,” Dr. Irshad said. “Fortunately, quick medical intervention prevented a life-threatening situation.”
He also shared that other patients who ignored persistent fever after AI suggested it was “just viral” were later diagnosed with typhoid fever or autoimmune diseases. Similarly, people suffering from prolonged fatigue or abdominal discomfort, which AI attributed to stress, were eventually diagnosed with hypothyroidism or inflammatory bowel disease (IBD).
“AI suggestions often generalize symptoms and miss subtle warning signs,” Dr. Irshad added. “Only a doctor can correlate symptoms with medical history, physical exams, and diagnostic tests.”
He emphasized that the safest approach is “AI-assisted, doctor-led care”, where technology supports but never replaces professional expertise.
AI Worsened Skin Infections
Dermatology is another area where AI-based self-diagnosis has caused serious harm, according to Dr. Nishit Bodiwala, Specialist Dermatologist at Prime Medical Center.
“We’ve seen several cases where patients used AI recommendations and ended up worsening their skin condition,” he said.
In one instance, a 38-year-old man had severe itching in his groin and inner thighs. ChatGPT told him it was contact dermatitis and suggested cortisone cream. In reality, the man had a fungal infection, and the steroid cream made the infection worse.
In another case, a 44-year-old woman developed flu-related hives and, after reading AI advice, took oral steroids. The treatment backfired — her condition worsened into infective urticaria, requiring antibiotics and proper flu medication.
“In dermatology, correct diagnosis often depends on physical examination and years of experience,” Dr. Bodiwala said. “AI can’t inspect a rash, assess its texture, or understand its history. It’s supplementary at best — never a replacement for a clinician.”
AI Is a Tool — Not a Doctor
All three doctors agreed that while AI can help patients understand symptoms and prepare for medical consultations, using it as a substitute for professional care is risky and potentially life-threatening.
AI lacks critical abilities such as:
Conducting physical exams
Recognizing emergency symptoms
Understanding patient history and comorbidities
Evaluating lab results or imaging scans
“AI can inform, but it’s the physician who interprets and heals,” Dr. Irshad summarized. “Technology should empower — not replace — human expertise.”
Why People Still Ask AI for Health Advice
Many users admit they turn to AI tools like ChatGPT for quick answers, privacy, or reassurance, especially for mild or embarrassing symptoms. However, experts warn that AI-generated medical advice, while confident in tone, can be misleading or incomplete.
What’s more, AI does not have access to real-time health data or physical evaluations, which are essential to any accurate diagnosis. Doctors stress that even minor conditions can escalate if treated incorrectly — as seen in several Dubai cases.