
Archived Article — The Daily Perspective is no longer active. This article was published on 10 March 2026 and is preserved as part of the archive.

Health

How to use AI for health advice safely, according to doctors

Doctors are increasingly using AI as a tool to enhance their practice, but it works best when patients and clinicians use it together wisely

Key Points
  • AI tools like ChatGPT are being used by millions for health advice, but they can make mistakes or suggest unsafe treatments without proper oversight
  • Doctors using AI in their practice see benefits like more time with patients, but only when AI handles administrative tasks rather than making clinical decisions
  • Regulators and health bodies in Australia are developing guidance on safe AI use, emphasising that clinicians must always apply human judgment to any AI output

If you have ever googled a symptom at 2am, you are not alone. Millions of Australians are now turning to AI tools like ChatGPT for health information. But what actually happens when you ask artificial intelligence to diagnose your rash or suggest a treatment plan?

The honest answer is: sometimes it helps, sometimes it misleads you, and sometimes it could put you in real danger.

Doctors who use AI in their practice offer a simple lesson that applies whether you are a patient or a clinician. AI works best as a springboard for conversations with a medical professional, not as a replacement for one.

The good news first. A study published in the New England Journal of Medicine found that AI systems could frequently identify difficult cases, and a follow-up comparison with a leading human diagnostician showed only a slight human advantage. Some patients report that AI-generated warnings sent them to the emergency room in time. One woman was eventually diagnosed with immune thrombocytopenic purpura, a rare autoimmune disorder that lowers platelet counts and increases the risk of bleeding.

For doctors themselves, AI has become genuinely transformative. Physicians cited AI's top benefits as transcription (48 per cent) and streamlined administrative tasks (46 per cent). When AI handles note-taking during patient visits, doctors can look patients in the eye instead of typing. In an American Medical Association survey, 93 per cent of physicians using AI said they can now give patients their full attention.

Here is where the caution comes in. In one case, an AI advised a patient to try the anti-parasitic drug ivermectin as a treatment for testicular cancer. The drug itself probably would not hurt; what would hurt is forgoing appropriate treatment for a cancer that is treatable. AI systems can also hallucinate, making confident claims about medical facts that are simply not true.

While specialised AI chatbots offer faster responses and easier access, hallucinations, inconsistent answers and data privacy concerns could limit their potential. These tools also operate outside the privacy laws that govern real doctors. Your data could be used to train the AI systems themselves, raising serious confidentiality issues.

Patients and doctors alike stress that AI is not a replacement for a doctor, and that treating it as one is dangerous. Without clinical oversight, doctors warn, misdiagnosis, misleading advice and simple human misunderstanding become significant risks.

In Australia, health regulators are taking this seriously. The Australian Commission on Safety and Quality in Health Care has developed an AI Clinical Use Guide to help clinicians use AI safely in day-to-day practice. Practitioners must apply human judgment to any AI output; TGA approval of a tool does not change a practitioner's responsibility to apply their own oversight and judgment.

The practical takeaway is straightforward. If you use AI for health questions, treat it as an information-gathering tool that might prompt you to see a doctor, not as your doctor itself. If you are a patient discussing something an AI suggested with your clinician, be honest about it. If you are a doctor considering AI tools for your practice, focus on how they can free up your time for actual patient care rather than replacing the human judgment that medicine fundamentally requires.

Large language models are competitive with humans in simulated tests of diagnostic reasoning. But simulated tests are not real patients, and real patients deserve the human judgment, accountability and oversight that only a qualified doctor can provide.

Ella Sullivan

Ella Sullivan is an AI editorial persona created by The Daily Perspective, covering food, pets, travel and consumer affairs with warm, relatable and practical advice. Articles published under this persona are generated using artificial intelligence with editorial quality controls.