The Times Australia

What to do if you, or someone you know, is targeted with deepfake porn or AI nudes

  • Written by Nicola Henry, Professor & Australian Research Council Future Fellow, Social and Global Studies Centre, RMIT University

This week, about 50 female students from Victoria’s Bacchus Marsh Grammar School had fake, sexually explicit images of them shared without their consent[1] on Instagram and Snapchat. Images of their faces, purportedly obtained from social media, were stitched onto pornographic images using artificial intelligence (AI).

Deepfake porn, or what our team calls[2] “AI-generated image-based sexual abuse”, involves the use of AI to create a nude and/or sexual image of a person, depicting them doing or saying things they haven’t actually done or said.

Celebrities and public figures[3], predominantly women, have experienced such abuse for nearly a decade, with various deepfake porn sites and “nudify apps” readily available online.

But as these technologies become more accessible and sophisticated, we’re starting to see this problem creep into our homes and schools. Teens – and even children – are now being targeted.

How widespread is deepfake abuse?

In 2023, my colleagues and I surveyed[4] more than 16,000 adults in ten countries and found that, despite widespread media coverage (particularly in Western countries), the concept of deepfake porn isn’t well known. When informed about it, however, most respondents indicated it should be criminalised.

Among respondents from Australia, 3.7% had been a victim of deepfake porn as an adult – the highest rate among the countries we surveyed.

At the same time, 2.4% of Australian respondents said they had created, shared or threatened to share a deepfake photo or video of another person without their consent. This figure was higher than in every other country we surveyed except the United States.

Men were more likely to report being a victim of deepfake abuse, and more likely to report being a perpetrator. Men were also less likely to find the viewing, creating and/or sharing of deepfake pornography to be problematic.

What can you do if you’re targeted?

Image-based abuse can be a distressing experience. But victims should know they’re not alone, it isn’t their fault and there is plenty of help out there. Here are some steps they can take.

1. Report it

Creating or sharing deepfake sexual images of minors is a criminal offence under Australia’s federal child sexual abuse material[5] (“child pornography”) laws. It’s also a criminal offence to share non-consensual deepfake porn of an adult (and a crime to create it if you’re in Victoria).

Whether you’re the victim, or someone you know is, you can report deepfake abuse to digital platforms[6], to the Australian Centre to Counter Child Exploitation[7] (if the person depicted is a minor) and to the eSafety Commissioner[8].

School children in a blur.
Creating or sharing deepfake sexual images of minors is a criminal offence in Australia. LBeddoe/Shutterstock[9]

If you’re in danger, contact the police or ambulance on triple zero (000). If it’s not an emergency, you can call the Police Assistance Line[10] (131 444) or your local police station. The same steps apply if you’re a bystander who has come across non-consensual deepfake pornography of someone else online.

The eSafety Commissioner can take action against image-based abuse under the federal Online Safety Act[11], and can work with victims and their supporters to get the content taken down within 24 hours. They can also issue formal warnings and take-down notices, and seek civil penalties against individuals and technology companies that fail to take action.

Unfortunately, the deepfake content may continue to circulate even after it is taken down from the initial platform.

2. Seek help

If you’ve been targeted, it’s a good idea to talk to someone you trust, such as a friend, family member, teacher, counsellor or psychologist.

Our website has a list of relevant support services[12] for victim-survivors of image-based abuse, including specialist services for Aboriginal and Torres Strait Islander people, migrants and refugees, young people, people with disabilities, people from LGBTQI+ communities and sex workers.

Even if you’re not ready to talk about the experience, you can still find useful information about image-based abuse online, including on the eSafety Commissioner’s website[13].

Two people holding hands.
It’s a good idea to talk to someone you trust. fizkes/Shutterstock[14]

We’ve also developed a chatbot called Umibot[15], which provides free confidential advice and support to people who have experienced image-based abuse, including deepfake abuse. Umibot also has information for bystanders and perpetrators.

If you’re Aboriginal or Torres Strait Islander, you can check out WellMob[16]. This is an online resource made by Indigenous Australians to provide information on social and emotional wellbeing.

Resources for young people are also available from ReachOut[17], Beyond Blue[18], Youth Law Australia[19] and Kids Helpline[20].

3. Create a digital hash to stop the spread

The United Kingdom’s Revenge Porn Helpline and Meta have developed two digital hashing tools for victim-survivors. These are Stop NCII[21] for adults, and Take It Down[22] for minors.

Anyone in the world can use these tools to generate an anonymous digital hash (a unique numerical code) by scanning the image from their device. This hash is then shared with the companies participating[23] in the scheme (including Facebook, Instagram, Pornhub, TikTok and OnlyFans) so they can detect and block any matches on their platforms. You aren’t required to upload the image itself, which means it never leaves your device and no one else sees it.

It’s important to note this tool won’t block the image from appearing on platforms that aren’t part of the scheme. You also need to have access to the images in the first place to use the tool.
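The hashing step described above can be sketched in a few lines of code. Note this is a simplified illustration only: it uses a standard cryptographic hash (SHA-256) as a stand-in, whereas the real Stop NCII and Take It Down tools use their own image-matching hash methods. The point is simply that a fixed-length fingerprint is computed on your own device, so only the hash – not the image – would ever be shared.

```python
# Simplified sketch of on-device hashing (NOT the actual Stop NCII method).
# A fixed-length fingerprint is derived locally from the image bytes; the
# image itself is never uploaded -- only the hash is shared for matching.
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest that identifies the image without revealing it."""
    return hashlib.sha256(image_bytes).hexdigest()

# Two copies of the same file produce the same fingerprint,
# which is how a participating platform can recognise a match.
h1 = fingerprint(b"example image bytes")
h2 = fingerprint(b"example image bytes")
assert h1 == h2

# A different file produces a completely different fingerprint,
# and the hash reveals nothing useful about the original content.
h3 = fingerprint(b"different image bytes")
assert h1 != h3
```

One practical limitation of exact cryptographic hashes is that any change to the file (resizing, re-encoding) changes the hash entirely, which is why real-world schemes use perceptual image hashing instead.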

4. Block, report and distance yourself from the perpetrator (if it’s safe to do so)

You can block the perpetrator(s) through your mobile and on social media, and report them to the relevant platforms and authorities. In the case of platforms, it’s not always clear what will be done once a report is lodged, so it’s a good idea to ask about this.

If the perpetrator is someone you know, such as a classmate or student, authorities can take action to ensure you don’t interact with that person anymore.

Last week, a boy was expelled from Melbourne’s Salesian College[24] after he used AI to create sexually explicit images of a female teacher.

5. Boost your online safety

The eSafety Commissioner has step-by-step video guides[25] on a range of online safety topics, from how to change your privacy settings on social media, to how to choose strong passwords.

Specialist support resources are also available for women experiencing family or domestic violence.

References

  1. ^ shared without their consent (www.abc.net.au)
  2. ^ our team calls (arxiv.org)
  3. ^ Celebrities and public figures (theconversation.com)
  4. ^ surveyed (arxiv.org)
  5. ^ federal child sexual abuse material (www.criminalsolicitorsmelbourne.com.au)
  6. ^ digital platforms (www.imagebasedabuse.com)
  7. ^ Australian Centre to Counter Child Exploitation (www.accce.gov.au)
  8. ^ eSafety Commissioner (www.esafety.gov.au)
  9. ^ LBeddoe/Shutterstock (www.shutterstock.com)
  10. ^ Police Assistance Line (www.police.vic.gov.au)
  11. ^ Online Safety Act (www.esafety.gov.au)
  12. ^ relevant support services (www.imagebasedabuse.com)
  13. ^ eSafety commissioner’s website (www.esafety.gov.au)
  14. ^ fizkes/Shutterstock (www.shutterstock.com)
  15. ^ Umibot (umi.rmit.edu.au)
  16. ^ WellMob (wellmob.org.au)
  17. ^ ReachOut (au.reachout.com)
  18. ^ Beyond Blue (www.beyondblue.org.au)
  19. ^ Youth Law Australia (yla.org.au)
  20. ^ Kids Helpline (kidshelpline.com.au)
  21. ^ Stop NCII (stopncii.org)
  22. ^ Take It Down (tidstart.ncmec.org)
  23. ^ companies participating (stopncii.org)
  24. ^ Melbourne’s Salesian College (www.dailymail.co.uk)
  25. ^ video guides (www.esafety.gov.au)

Read more https://theconversation.com/what-to-do-if-you-or-someone-you-know-is-targeted-with-deepfake-porn-or-ai-nudes-232175
