Utah's medical board is demanding an immediate suspension of Doctronic, a controversial pilot program that allows AI to diagnose and treat patients without physician oversight. The program has been operating for months but now faces intense regulatory scrutiny over safety concerns and questions about whether the state jumped from prototype to deployment without adequate safeguards.
The program works like this: patients describe symptoms through a digital interface, Doctronic analyzes the input using large language models trained on medical literature, and the system provides diagnosis and treatment recommendations. No human doctor reviews the AI's decisions before they reach the patient. That's the part making regulators nervous.
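The flow described above can be sketched in a few lines. This is purely illustrative: the function names, data shapes, and the idea of a model-reported confidence score are my assumptions, not details of the real Doctronic system.

```python
# Hypothetical sketch of a Doctronic-style pipeline. All names and
# values here are illustrative assumptions, not the real system.
from dataclasses import dataclass


@dataclass
class Recommendation:
    diagnosis: str
    treatment: str
    confidence: float  # assumed self-reported confidence, 0.0-1.0


def analyze_symptoms(symptoms: str) -> Recommendation:
    # Stand-in for the LLM call; the real system reportedly queries
    # large language models trained on medical literature.
    if "chest pain" in symptoms.lower():
        return Recommendation("angina", "refer to cardiology", 0.6)
    return Recommendation("viral infection", "rest and fluids", 0.9)


def triage(symptoms: str) -> Recommendation:
    # As described in the article, the pilot returns the AI's output
    # directly: no physician reviews it before it reaches the patient.
    return analyze_symptoms(symptoms)
```

The notable design choice, and the one drawing regulatory fire, is that `triage` has no human-review step between the model's output and the patient.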
AI in healthcare is inevitable and potentially beneficial. Systems that help doctors spot patterns in imaging, flag drug interactions, or suggest differential diagnoses can improve care. But Utah appears to have skipped past "assistant" and gone straight to "replacement." And the medical board is now saying: not so fast.
The specific concerns center on several documented cases where Doctronic missed diagnoses that a physician would likely have caught. Details are limited due to patient privacy, but sources familiar with the program describe instances of the AI failing to recognize symptom patterns indicating serious conditions, recommending treatments contraindicated by patient history the system didn't adequately account for, and generating confident-sounding recommendations based on incomplete information.
This highlights a fundamental problem with current AI systems: they're excellent pattern matchers but poor at reasoning about what they don't know. A good doctor will say "I'm not sure, let's run more tests." An AI trained to provide answers will often just provide an answer, even when certainty isn't warranted.
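One common mitigation for this failure mode is an abstention threshold: the system declines to answer when its confidence is low and escalates instead. The sketch below is a generic illustration of that technique, not a description of anything Doctronic does; the threshold value and confidence score are assumed.

```python
# Illustrative only: one way to make a system say "I'm not sure."
# The threshold and the confidence score are assumptions for the sketch.
ABSTAIN_THRESHOLD = 0.8


def answer_or_abstain(diagnosis: str, confidence: float) -> str:
    # A calibrated system escalates when confidence is low,
    # mirroring a doctor's "I'm not sure, let's run more tests."
    if confidence < ABSTAIN_THRESHOLD:
        return "uncertain: escalate to a physician for more tests"
    return diagnosis
```

The point is not the specific threshold but the existence of a third option besides "answer" and "wrong answer": deferring to a human. A system without that option will always produce something, warranted or not.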
The program's defenders point to efficiency gains and improved access in underserved areas. Utah has rural communities with genuine physician shortages. If AI can provide basic triage and treatment recommendations, maybe that's better than nothing. But "better than nothing" is a low bar for medical care.
There's also a liability question. When the AI makes a mistake, who's responsible? The state that approved the pilot? The company that built the system? The patient who trusted it? Traditional medical malpractice law assumes a licensed physician is making decisions. This program operates in regulatory gray space.