Some clinicians are using AI to write health records. What do you need to know?

  • Written by Stacy Carter, Professor and Director, Australian Centre for Health Engagement, Evidence and Values, University of Wollongong

Imagine this. You’ve finally summoned up the courage to see a GP about an embarrassing problem. You sit down. The GP says:

before we start, I’m using my computer to record my appointments. It’s AI – it will write a summary for the notes and a letter to the specialist. Is that OK?

Wait – AI writing our medical records? Why would we want that?

Records are essential for safe and effective health care. Clinicians must make good records to keep their registration[1]. Health services must provide good record systems to be accredited[2]. Records are also legal documents: they can be important in insurance claims or legal actions.

But writing stuff down (or dictating notes or letters) takes time. During appointments, clinicians can have their attention divided between good record-keeping and good communication with the patient. Sometimes clinicians need to work on records after hours, at the end of an already-long day.

So there’s understandable excitement[3], from all kinds of health-care professionals, about “ambient AI” or “digital scribes”.

What are digital scribes?

This is not old-school transcription software, where you dictate a letter and the software types it up word for word.

Digital scribes are different. They use AI – large language models with generative capabilities – similar to ChatGPT (or sometimes, GPT-4[4] itself).

The application silently records the conversation between a clinician and a patient (via a phone, tablet or computer microphone, or a dedicated sensitive microphone). The AI converts the recording to a word-for-word transcript.

The AI system then uses the transcript, and the instructions it is given, to write a clinical note and/or letters for other doctors, ready for the clinician to check.
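To make that pipeline concrete, here is a minimal sketch in Python. It is purely illustrative: every function name is a hypothetical placeholder, not the API of any real digital scribe product, and actual clinical scribes are purpose-built systems with their own speech-recognition and language models.

```python
# Illustrative sketch only: all names are hypothetical placeholders,
# not the API of any real digital scribe product.

def transcribe_audio(recording_path: str) -> str:
    """Speech-to-text step: turn the recorded consultation into a word-for-word transcript."""
    # In practice, a purpose-built medical speech-recognition model runs here.
    return "word-for-word transcript of the consultation"

def draft_clinical_note(transcript: str, instructions: str) -> str:
    """Generative step: a large language model drafts a note or letter from the transcript."""
    # In practice, an LLM is prompted with the transcript plus templated instructions.
    return "draft clinical note, ready for the clinician to check"

def scribe_appointment(recording_path: str) -> str:
    transcript = transcribe_audio(recording_path)
    draft = draft_clinical_note(
        transcript,
        instructions="Summarise this consultation as a GP progress note.",
    )
    # The draft is never final: the clinician remains responsible for checking
    # and correcting it before it becomes part of the medical record.
    return draft
```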

Most clinicians know little about these technologies: they are experts in their speciality, not in AI. The marketing materials promise to “let AI take care of your clinical notes so you can spend more time with your patients.”

Put yourself in the clinician’s shoes. You might say “yes please!”

[Image: GP talks to a patient] Some clinicians will welcome the chance to cut down their workload. Stephen Barnes/Shutterstock[5]

How are they regulated?

Recently, the Australian Health Practitioner Regulation Agency[6] released a code of practice for using digital scribes. The Royal Australian College of General Practitioners released a fact sheet[7]. Both warn clinicians that they remain responsible for the contents of their medical records.

Some AI applications are regulated as medical devices[8], but many digital scribes are not. So it’s often up to health services or clinicians to work out whether scribes are safe and effective.

What does the research say so far?

There’s very limited data or real-world evidence on the performance of digital scribes.

In a big Californian hospital system, researchers followed 9,000 doctors for ten weeks in a pilot test of a digital scribe[9].

Some doctors liked the scribe: their work hours decreased, they communicated better with patients. Others didn’t even start using the scribe.

And the scribe made mistakes – for example, recording the wrong diagnosis, or recording that a test had been done when it still needed to be done.

So what should we do about digital scribes?

The recommendations[10] of the first Australian National Citizens’ Jury on AI in Health Care[11] show what Australians want from health care AI, and provide a great starting point.

Building on those recommendations, here are some things to keep in mind about digital scribes the next time you head to the clinic or emergency department:

1) You should be told if a digital scribe is being used.

2) Only scribes designed for health care should be used in health care. Regular, publicly available generative AI tools (like ChatGPT or Google Gemini) should not be used in clinical care.

3) You should be able to consent, or refuse consent, for use of a digital scribe. You should have any relevant risks explained, and be able to agree or refuse freely.

4) Clinical digital scribes must meet strict privacy standards. You have a right to privacy and confidentiality[12] in your health care. The whole transcript of an appointment may contain a lot more detail than a clinical note usually would. So ask:

  • are the transcripts and summaries of your appointments processed in Australia, or another country?
  • how are they kept secure and private (for example, are they encrypted)?
  • who can access them?
  • how are they used (for example, are they used to train AI systems)?
  • does the scribe access other data from your record to make the summary? If so, is that data ever shared?
[Image: Clinician writes paper notes in a clinic corridor] Clinicians need to adhere to privacy standards. PeopleImages.com - Yuri A/Shutterstock[13]

Is human oversight enough?

Generative AI systems can make things up, get things wrong, or misunderstand some patients’ accents. But they will often communicate these errors in a way that sounds very convincing. This means careful human checking is crucial.

Doctors are told by tech and insurance companies that they must check every summary or letter (and they must). But it’s not that simple[14]. Busy clinicians might become over-reliant on the scribe and just accept the summaries. Tired or inexperienced clinicians might think their memory must be wrong, and the AI must be right (known as automation bias).

Some have suggested[15] these scribes should also be able to create summaries for patients. We don’t own our own health records, but we usually have a right to access them. Knowing a digital scribe is in use may increase consumers’ motivation to see what is in their health record.

Clinicians have always written notes about our embarrassing problems, and have always been responsible for these notes. The privacy, security, confidentiality and quality of these records have always been important.

Maybe one day, digital scribes will mean better records and better interactions with our clinicians. But right now, we need good evidence that these tools can deliver in real-world clinics, without compromising quality, safety or ethics.

References

  1. ^ keep their registration (www.ahpra.gov.au)
  2. ^ good record systems to be accredited (www.safetyandquality.gov.au)
  3. ^ understandable excitement (www.goldcoast.health.qld.gov.au)
  4. ^ GPT-4 (www.medicalrepublic.com.au)
  5. ^ Stephen Barnes/Shutterstock (www.shutterstock.com)
  6. ^ Australian Health Practitioner Regulation Agency (www.ahpra.gov.au)
  7. ^ released a fact sheet (www.racgp.org.au)
  8. ^ regulated as medical devices (www.tga.gov.au)
  9. ^ in a pilot test of a digital scribe (catalyst.nejm.org)
  10. ^ recommendations (www.mja.com.au)
  11. ^ Australian National Citizens’ Jury on AI in Health Care (www.uow.edu.au)
  12. ^ right to privacy and confidentiality (www.safetyandquality.gov.au)
  13. ^ PeopleImages.com - Yuri A/Shutterstock (www.shutterstock.com)
  14. ^ that simple (www.nature.com)
  15. ^ Some have suggested (www.pulseit.news)

Read more https://theconversation.com/some-clinicians-are-using-ai-to-write-health-records-what-do-you-need-to-know-237762
