New Zealand's health service has ordered staff to stop using ChatGPT to write clinical notes after discovering the AI tool was being used to document patient information, RNZ reports.
Health NZ issued the directive after privacy concerns were raised about medical professionals using the AI chatbot to generate or edit clinical documentation containing sensitive patient information.
The revelation raises serious questions about how widespread the practice had become, and whether patient data may have been compromised by being fed into ChatGPT, which may retain user inputs and, under default settings, use them to train its models.
For patients, it's an unsettling thought: the doctor typing away during your appointment may have been pasting your medical details into ChatGPT.
Health NZ officials confirmed that some staff had been using the AI tool, though the exact number and scope of usage remains unclear. The practice appears to have emerged as overworked healthcare workers sought ways to manage crushing administrative burdens, including the extensive documentation required for patient records.
But using ChatGPT for clinical notes creates multiple problems. The AI can introduce errors or hallucinations into medical records. Patient privacy is compromised when sensitive health information is sent to external servers. And there are serious questions about who owns, and who can access, data processed through the system.
The ban comes as healthcare systems worldwide grapple with how to handle AI tools that staff are increasingly using without official approval or oversight. While AI may offer genuine productivity benefits, the healthcare sector's strict privacy requirements and patient safety obligations make unauthorized use particularly problematic.
Health NZ says it's now reviewing policies around AI tool usage and will provide guidance on approved technologies for clinical documentation. The organization emphasized that patient safety and privacy must remain paramount, even as the healthcare system faces staffing shortages and administrative pressures.
For healthcare workers already stretched thin, losing access to an efficiency tool—even an inappropriate one—won't be welcome news. But when it comes to patient privacy, there's no room for shortcuts.
