The Times Australia
Generative AI and deepfakes are fuelling health misinformation. Here’s what to look out for so you don’t get scammed

  • Written by Lisa M. Given, Professor of Information Sciences & Director, Social Change Enabling Impact Platform, RMIT University

False and misleading health information online and on social media is on the rise, thanks to rapid developments in deepfake technology and generative artificial intelligence (AI).

This allows videos, photos and audio of respected health professionals to be manipulated – for example, to appear as if they are endorsing fake health-care products, or to solicit sensitive health information from Australians.

So, how do these kinds of health scams work? And what can you do to spot them?

Accessing health information online

In 2021, three in four Australians over 18 said they accessed health services[1] – such as telehealth consultations with doctors – online. One 2023 study showed 82% of Australian parents consulted social media[2] about health-related issues, alongside doctor consultations.

However, the worldwide growth[3] in health-related misinformation (factually incorrect material) and disinformation (material intended to mislead) is exponential.

From Medicare email and text phishing scams[4], to sales of fake pharmaceuticals[5], Australians are at risk of losing money – and damaging their health – by following false advice.

What is deepfake technology?

An emerging area of health-related scams[6] is linked to the use of generative AI tools to create deepfake videos, photos and audio recordings. These deepfakes are used to promote fake health-care products or lead consumers to share sensitive health information with people they believe can be trusted.

A deepfake is a photograph or video of a real person, or a sound recording of their voice, that is altered to make the person appear to do or say something they haven’t done or said.

Until recently, people used photo- or video-editing software to create fake images, such as superimposing one person’s face on another’s body. Adobe Photoshop even advertises its software’s ability to “face swap[7]” to “ensure everyone is looking their absolute best” in family photos.

While creating deepfakes isn’t new, health-care practitioners and organisations are raising alarm bells about the speed and hyper-realism that can be achieved with generative AI tools. When these deepfakes are shared via social media platforms, which significantly increase the reach of misinformation[8], the potential for harm also increases.

How is it being used in health scams?

In December 2024, for example, Diabetes Victoria[9] called attention to the use of deepfake videos showing experts from The Baker Heart and Diabetes Institute in Melbourne promoting a diabetes supplement.

The media release[10] from Diabetes Australia made clear these videos were not real and were made using AI technology.

Neither organisation endorsed the supplements or approved the fake advertising, and the doctor portrayed in the video had to alert his patients[11] to the scam.

This isn’t the first time fake images of doctors have been used to sell products. In April 2024, scammers used deepfake images of Dr Karl Kruszelnicki[12] to sell pills to Australians via Facebook. While some users reported the posts to the platform, they were told the ads did not violate the platform’s standards.

In 2023, TikTok Shop came under scrutiny[13], with sellers manipulating doctors’ legitimate TikTok videos to (falsely) endorse products. Those deepfakes received more than 10 million views.

What should I look out for?

A 2024 review of more than 80 scientific studies[14] found several ways to combat misinformation online. These included social media platforms alerting readers about unverified information and teaching digital literacy skills to older adults.

Unfortunately, many of these strategies focus on written materials or require access to accurate information to verify content. Identifying deepfakes requires different skills.

Australia’s eSafety Commissioner provides helpful resources[15] to guide people in identifying deepfakes.

Importantly, they recommend considering the context itself. Ask yourself – is this something I would expect this person to say? Does this look like a place I would expect this person to be?

The commissioner also recommends people look and listen carefully, to check for:

  • blurring, cropped effects or pixelation

  • skin inconsistency or discoloration

  • video inconsistencies, such as glitches, and lighting or background changes

  • audio problems, such as badly synced sound

  • irregular blinking or movement that seems unnatural

  • content gaps in the storyline or speech.

Ask yourself: is this something I’d expect this person to say? MAYA LAB/Shutterstock[16]

How else can I stay safe?

If you have had your own images or voices altered, you can contact the eSafety Commissioner[17] directly for help in having that material removed.

The British Medical Journal has also published advice specific to dealing with health-related deepfakes[18], advising people to:

  • contact the person who is endorsing the product to confirm whether the image, video, or audio is legitimate

  • leave a public comment on the site to question whether the claims are true (this can also prompt others to be critical of the content they see and hear)

  • use the online platform’s reporting tools to flag fake products and to report accounts sharing misinformation

  • encourage others to question what they see and hear, and to seek advice from health-care providers.

This last point is critical. As with all health-related information, consumers must make informed decisions in consultation with doctors, pharmacists and other qualified health-care professionals.

As generative AI technologies become increasingly sophisticated, there is also a critical role for government in keeping Australians safe. The release in February 2025 of the long-awaited Online Safety Review[19] makes this clear.

The review recommended Australia adopt duty of care legislation[20] to address “harms to mental and physical wellbeing” and grievous harms from “instruction or promotion of harmful practices”.

Given the potentially harmful consequences of following deepfake health advice, duty of care legislation is needed to protect Australians and support them to make appropriate health decisions.

References

  1. ^ said they accessed health services (www.statista.com)
  2. ^ Australian parents consulted social media (pubmed.ncbi.nlm.nih.gov)
  3. ^ worldwide growth (journals.sagepub.com)
  4. ^ Medicare email and text phishing scams (www.servicesaustralia.gov.au)
  5. ^ fake pharmaceuticals (www.abc.net.au)
  6. ^ emerging area of health-related scams (www.bmj.com)
  7. ^ face swap (www.adobe.com)
  8. ^ reach of misinformation (www.apa.org)
  9. ^ Diabetes Victoria (www.diabetesvic.org.au)
  10. ^ media release (www.diabetesvic.org.au)
  11. ^ alert his patients (www.abc.net.au)
  12. ^ deepfake images of Dr Karl Kruszelnicki (www.abc.net.au)
  13. ^ TikTok Shop came under scrutiny (www.mediamatters.org)
  14. ^ review of more than 80 scientific studies (journals.sagepub.com)
  15. ^ helpful resources (www.esafety.gov.au)
  16. ^ MAYA LAB/Shutterstock (www.shutterstock.com)
  17. ^ contact the eSafety Commissioner (www.esafety.gov.au)
  18. ^ dealing with health-related deepfakes (www.bmj.com)
  19. ^ Online Safety Review (www.aph.gov.au)
  20. ^ duty of care legislation (theconversation.com)

Read more https://theconversation.com/generative-ai-and-deepfakes-are-fuelling-health-misinformation-heres-what-to-look-out-for-so-you-dont-get-scammed-246149
