The Times Australia
The Times World News

What to do if you, or someone you know, is targeted with deepfake porn or AI nudes

  • Written by: Nicola Henry, Professor & Australian Research Council Future Fellow, Social and Global Studies Centre, RMIT University

This week, about 50 female students from Victoria’s Bacchus Marsh Grammar School had fake, sexually explicit images of them shared without their consent[1] on Instagram and Snapchat. Images of their faces, purportedly obtained from social media, were stitched onto pornographic images using artificial intelligence (AI).

Deepfake porn, or what our team calls[2] “AI-generated image-based sexual abuse”, involves the use of AI to create a nude and/or sexual image of a person doing or saying things they haven’t said or done.

Celebrities and public figures[3], predominantly women, have experienced such abuse for nearly a decade, with various deepfake porn sites and “nudify apps” readily available online.

But as these technologies become more accessible and sophisticated, we’re starting to see this problem creep into our homes and schools. Teens – and even children – are now being targeted.

How widespread is deepfake abuse?

In 2023, my colleagues and I surveyed[4] more than 16,000 adults in ten countries and found that, despite widespread media coverage (particularly in Western countries), the concept of deepfake porn isn’t well known. When informed about it, however, most respondents indicated it should be criminalised.

Among respondents from Australia, 3.7% had been a victim of deepfake porn as an adult. This was the highest rate reported from the countries we surveyed.

At the same time, 2.4% of Australian respondents said they had created, shared or threatened to share a deepfake photo or video of another person without their consent. This figure, too, was higher than in every other country we surveyed except the United States.

Men were more likely to report being a victim of deepfake abuse, and more likely to report being a perpetrator. Men were also less likely to find the viewing, creating and/or sharing of deepfake pornography to be problematic.

What can you do if you’re targeted?

Image-based abuse can be a distressing experience. But victims should know they’re not alone, it isn’t their fault and there is plenty of help out there. Here are some steps they can take.

1. Report it

Creating or sharing deepfake sexual images of minors is a criminal offence under Australia’s federal child sexual abuse material[5] (“child pornography”) laws. It’s also a criminal offence to share non-consensual deepfake porn of an adult (and a crime to create it if you’re in Victoria).

Whether you’re the victim, or someone you know is, you can report deepfake abuse to digital platforms[6], to the Australian Centre to Counter Child Exploitation[7] (if the person depicted is a minor) and to the eSafety Commissioner[8].

Creating or sharing deepfake sexual images of minors is a criminal offence in Australia. LBeddoe/Shutterstock[9]

If you’re in danger, contact the police or ambulance on triple zero (000). If it’s not an emergency, you can call the Police Assistance Line[10] (131 444) or your local police station. The same steps apply if you’re a bystander who has come across non-consensual deepfake pornography of someone else online.

The eSafety Commissioner can take action against image-based abuse under the federal Online Safety Act[11], and can work with victims and their supporters to get the content taken down within 24 hours. The commissioner can also issue formal warnings and take-down notices, and seek civil penalties against individuals and technology companies that fail to act.

Unfortunately, the deepfake content may continue to circulate even after it is taken down from the initial platform.

2. Seek help

If you’ve been targeted, it’s a good idea to talk to someone you trust, such as a friend, family member, teacher, counsellor or psychologist.

Our website has a list of relevant support services[12] for victim-survivors of image-based abuse, including specialist services for Aboriginal and Torres Strait Islander people, migrants and refugees, young people, people with disabilities, people from LGBTQI+ communities and sex workers.

Even if you’re not ready to talk about the experience, you can still find useful information about image-based abuse online, including on the eSafety Commissioner’s website[13].

It’s a good idea to talk to someone you trust. fizkes/Shutterstock[14]

We’ve also developed a chatbot called Umibot[15], which provides free confidential advice and support to people who have experienced image-based abuse, including deepfake abuse. Umibot also has information for bystanders and perpetrators.

If you’re Aboriginal or Torres Strait Islander, you can check out WellMob[16]. This is an online resource made by Indigenous Australians to provide information on social and emotional wellbeing.

Resources for young people are also available from ReachOut[17], Beyond Blue[18], Youth Law Australia[19] and Kids Helpline[20].

3. Create a digital hash to stop the spread

The United Kingdom’s Revenge Porn Helpline and Meta have developed two digital hashing tools for victim-survivors. These are Stop NCII[21] for adults, and Take It Down[22] for minors.

Anyone in the world can use these tools to generate an anonymous digital hash (a unique numerical code) by scanning the image on their device. The hash is then shared with the companies participating[23] in the scheme (including Facebook, Instagram, Pornhub, TikTok and OnlyFans) so they can detect and block any matches on their platforms. The image itself is never uploaded and never leaves your device, so no one else sees it.

It’s important to note this tool won’t block the image from appearing on platforms that aren’t part of the scheme. You also need to have access to the images in the first place to use the tool.
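For readers curious about what a “digital hash” actually is, the idea can be sketched in a few lines of Python. This is purely illustrative: it uses an ordinary cryptographic hash (SHA-256), whereas the real Stop NCII and Take It Down services use perceptual hashing so that resized or re-encoded copies of an image can still be matched. The function name is ours, not part of either service.

```python
import hashlib

def image_hash(path: str) -> str:
    """Return a SHA-256 digest of a file's bytes.

    The digest is a short, fixed-length code that identifies the
    file without revealing its contents, so it can be shared with
    platforms while the image itself stays on your device.
    """
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large files don't need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()
```

The key point the sketch illustrates is one-way matching: platforms receiving the code can check whether an uploaded file produces the same code, but cannot reconstruct the image from it.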

4. Block, report and distance yourself from the perpetrator (if it’s safe to do so)

You can block the perpetrator(s) through your mobile and on social media, and report them to the relevant platforms and authorities. In the case of platforms, it’s not always clear what will be done once a report is lodged, so it’s a good idea to ask about this.

If the perpetrator is someone you know, such as a classmate or student, authorities can take action to ensure you don’t interact with that person anymore.

Last week, a boy was expelled from Melbourne’s Salesian College[24] after he used AI to create sexually explicit images of a female teacher.

5. Boost your online safety

The eSafety Commissioner has step-by-step video guides[25] on a range of online safety topics, from how to change your privacy settings on social media, to how to choose strong passwords.

For women experiencing family or domestic violence, additional support resources may also be helpful.

References

  1. ^ shared without their consent (www.abc.net.au)
  2. ^ our team calls (arxiv.org)
  3. ^ Celebrities and public figures (theconversation.com)
  4. ^ surveyed (arxiv.org)
  5. ^ federal child sexual abuse material (www.criminalsolicitorsmelbourne.com.au)
  6. ^ digital platforms (www.imagebasedabuse.com)
  7. ^ Australian Centre to Counter Child Exploitation (www.accce.gov.au)
  8. ^ eSafety Commissioner (www.esafety.gov.au)
  9. ^ LBeddoe/Shutterstock (www.shutterstock.com)
  10. ^ Police Assistance Line (www.police.vic.gov.au)
  11. ^ Online Safety Act (www.esafety.gov.au)
  12. ^ relevant support services (www.imagebasedabuse.com)
  13. ^ eSafety commissioner’s website (www.esafety.gov.au)
  14. ^ fizkes/Shutterstock (www.shutterstock.com)
  15. ^ Umibot (umi.rmit.edu.au)
  16. ^ WellMob (wellmob.org.au)
  17. ^ ReachOut (au.reachout.com)
  18. ^ Beyond Blue (www.beyondblue.org.au)
  19. ^ Youth Law Australia (yla.org.au)
  20. ^ Kids Helpline (kidshelpline.com.au)
  21. ^ Stop NCII (stopncii.org)
  22. ^ Take It Down (tidstart.ncmec.org)
  23. ^ companies participating (stopncii.org)
  24. ^ Melbourne’s Salesian College (www.dailymail.co.uk)
  25. ^ video guides (www.esafety.gov.au)

Read more https://theconversation.com/what-to-do-if-you-or-someone-you-know-is-targeted-with-deepfake-porn-or-ai-nudes-232175
