What to do if you, or someone you know, is targeted with deepfake porn or AI nudes

  • Written by Nicola Henry, Professor & Australian Research Council Future Fellow, Social and Global Studies Centre, RMIT University

This week, about 50 female students from Victoria’s Bacchus Marsh Grammar School had fake, sexually explicit images of them shared without their consent[1] on Instagram and Snapchat. Images of their faces, purportedly obtained from social media, were stitched onto pornographic images using artificial intelligence (AI).

Deepfake porn, or what our team calls[2] “AI-generated image-based sexual abuse”, involves the use of AI to create a nude and/or sexual image of a person doing or saying things they haven’t said or done.

Celebrities and public figures[3], predominantly women, have experienced such abuse for nearly a decade, with various deepfake porn sites and “nudify apps” readily available online.

But as these technologies become more accessible and sophisticated, we’re starting to see this problem creep into our homes and schools. Teens – and even children – are now being targeted.

How widespread is deepfake abuse?

In 2023, my colleagues and I surveyed[4] more than 16,000 adults in ten countries and found that, despite widespread media coverage (particularly in Western countries), the concept of deepfake porn isn’t well known. When informed about it, however, most respondents indicated it should be criminalised.

Among respondents from Australia, 3.7% had been a victim of deepfake porn as an adult. This was the highest rate among the countries we surveyed.

At the same time, 2.4% of Australian respondents said they had created, shared or threatened to share a deepfake photo or video of another person without their consent. This was higher than in every other country we surveyed except the United States.

Men were more likely to report being a victim of deepfake abuse, and more likely to report being a perpetrator. Men were also less likely to find the viewing, creating and/or sharing of deepfake pornography to be problematic.
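
To put those percentages in rough perspective, here is a small illustrative calculation in Python. It assumes the 16,000 respondents were split roughly evenly across the ten countries, which isn’t stated above, so the resulting counts are ballpark figures only.

# Illustrative arithmetic only: an even split of respondents across the ten
# surveyed countries is an assumption, not a figure from the survey.
total_respondents = 16_000
countries = 10
assumed_australian_sample = total_respondents / countries  # roughly 1,600 people

victim_rate = 0.037        # 3.7% reported being a victim of deepfake porn
perpetration_rate = 0.024  # 2.4% reported creating, sharing or threatening to share

print(f"Estimated victims in the Australian subsample: {assumed_australian_sample * victim_rate:.0f}")
print(f"Estimated perpetrators in the Australian subsample: {assumed_australian_sample * perpetration_rate:.0f}")

On those assumptions, the 3.7% and 2.4% figures correspond to roughly 59 and 38 people in the Australian subsample.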

What can you do if you’re targeted?

Image-based abuse can be a distressing experience. But victims should know they’re not alone, it isn’t their fault and there is plenty of help out there. Here are some steps they can take.

1. Report it

Creating or sharing deepfake sexual images of minors is a criminal offence under Australia’s federal child sexual abuse material[5] (“child pornography”) laws. It’s also a criminal offence to share non-consensual deepfake porn of an adult (and a crime to create it if you’re in Victoria).

Whether you’re the victim, or someone you know is, you can report deepfake abuse to digital platforms[6], to the Australian Centre to Counter Child Exploitation[7] (if the person depicted is a minor) and to the eSafety Commissioner[8].

Creating or sharing deepfake sexual images of minors is a criminal offence in Australia. LBeddoe/Shutterstock[9]

If you’re in danger, contact the police or ambulance on triple zero (000). If it’s not an emergency, you can call the Police Assistance Line[10] (131 444) or your local police station. The same steps apply if you’re a bystander who has come across non-consensual deepfake pornography of someone else online.

The eSafety Commissioner can take action against image-based abuse under the federal Online Safety Act[11], and can work with victims and their supporters to get the content taken down within 24 hours. They can also issue formal warnings and take-down orders, and seek civil penalties against individuals and technology companies that fail to act.

Unfortunately, the deepfake content may continue to circulate even after it is taken down from the initial platform.

2. Seek help

If you’ve been targeted, it’s a good idea to talk to someone you trust, such as a friend, family member, teacher, counsellor or psychologist.

Our website has a list of relevant support services[12] for victim-survivors of image-based abuse, including specialist services for Aboriginal and Torres Strait Islander people, migrants and refugees, young people, people with disabilities, people from LGBTQI+ communities and sex workers.

Even if you’re not ready to talk about the experience, you can still find useful information about image-based abuse online, including on the eSafety Commissioner’s website[13].

It’s a good idea to talk to someone you trust. fizkes/Shutterstock[14]

We’ve also developed a chatbot called Umibot[15], which provides free confidential advice and support to people who have experienced image-based abuse, including deepfake abuse. Umibot also has information for bystanders and perpetrators.

If you’re Aboriginal or Torres Strait Islander, you can check out WellMob[16]. This is an online resource made by Indigenous Australians to provide information on social and emotional wellbeing.

Resources for young people are also available from ReachOut[17], Beyond Blue[18], Youth Law Australia[19] and Kids Helpline[20].

3. Create a digital hash to stop the spread

The United Kingdom’s Revenge Porn Helpline and Meta have developed two digital hashing tools for victim-survivors. These are Stop NCII[21] for adults, and Take It Down[22] for minors.

Anyone in the world can use these tools to generate an anonymous digital hash (a unique numerical code) by scanning the image on their device. The hash is then shared with the companies participating[23] in the scheme (including Facebook, Instagram, Pornhub, TikTok and OnlyFans) so they can detect and block any matches on their platforms. The image itself is never uploaded, which means it doesn’t leave your device and no one else sees it.

It’s important to note this tool won’t block the image from appearing on platforms that aren’t part of the scheme. You also need to have access to the images in the first place to use the tool.
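
For readers wondering what a “digital hash” actually is, the short Python sketch below shows the general principle: a fixed-length fingerprint is computed from the image file on your own device, and only that fingerprint would ever need to be shared. Stop NCII and Take It Down run their own hashing inside their tools, and matching in practice generally has to survive resizing and re-compression, unlike the plain cryptographic digest used here, so this is an illustration of the concept rather than their implementation. The filename is a placeholder.

# A minimal sketch of the idea behind image hashing: compute a fingerprint
# locally and share only the fingerprint, never the image. This is NOT the
# Stop NCII or Take It Down implementation; it only illustrates the concept.
import hashlib
from pathlib import Path

def fingerprint_image(path: str) -> str:
    """Return a SHA-256 fingerprint of the file at `path`, computed locally."""
    digest = hashlib.sha256()
    with Path(path).open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):  # read in 8 KB chunks
            digest.update(chunk)
    return digest.hexdigest()

# "my_photo.jpg" is a hypothetical filename used for illustration.
print(fingerprint_image("my_photo.jpg"))

The privacy property is the point: participating platforms receive only the fingerprint, never the image itself.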

4. Block, report and distance yourself from the perpetrator (if it’s safe to do so)

You can block the perpetrator(s) through your mobile and on social media, and report them to the relevant platforms and authorities. In the case of platforms, it’s not always clear what will be done once a report is lodged, so it’s a good idea to ask about this.

If the perpetrator is someone you know, such as a classmate or student, authorities can take action to ensure you don’t interact with that person anymore.

Last week, a boy was expelled from Melbourne’s Salesian College[24] after he used AI to create sexually explicit images of a female teacher.

5. Boost your online safety

The eSafety Commissioner has step-by-step video guides[25] on a range of online safety topics, from how to change your privacy settings on social media to how to choose strong passwords.
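
As a small companion to that advice, here is one way to generate a strong random password locally using Python’s standard secrets module. It’s a generic sketch rather than a tool from the eSafety Commissioner, and a reputable password manager’s built-in generator does the same job.

# A minimal, generic example of generating a strong random password locally.
import secrets
import string

def generate_password(length: int = 16) -> str:
    """Return a random password drawn from letters, digits and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())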

For women experiencing family or domestic violence, the following resources may also be helpful:

References

  1. ^ shared without their consent (www.abc.net.au)
  2. ^ our team calls (arxiv.org)
  3. ^ Celebrities and public figures (theconversation.com)
  4. ^ surveyed (arxiv.org)
  5. ^ federal child sexual abuse material (www.criminalsolicitorsmelbourne.com.au)
  6. ^ digital platforms (www.imagebasedabuse.com)
  7. ^ Australian Centre to Counter Child Exploitation (www.accce.gov.au)
  8. ^ eSafety Commissioner (www.esafety.gov.au)
  9. ^ LBeddoe/Shutterstock (www.shutterstock.com)
  10. ^ Police Assistance Line (www.police.vic.gov.au)
  11. ^ Online Safety Act (www.esafety.gov.au)
  12. ^ relevant support services (www.imagebasedabuse.com)
  13. ^ eSafety commissioner’s website (www.esafety.gov.au)
  14. ^ fizkes/Shutterstock (www.shutterstock.com)
  15. ^ Umibot (umi.rmit.edu.au)
  16. ^ WellMob (wellmob.org.au)
  17. ^ ReachOut (au.reachout.com)
  18. ^ Beyond Blue (www.beyondblue.org.au)
  19. ^ Youth Law Australia (yla.org.au)
  20. ^ Kids Helpline (kidshelpline.com.au)
  21. ^ Stop NCII (stopncii.org)
  22. ^ Take It Down (tidstart.ncmec.org)
  23. ^ companies participating (stopncii.org)
  24. ^ Melbourne’s Salesian College (www.dailymail.co.uk)
  25. ^ video guides (www.esafety.gov.au)

Read more https://theconversation.com/what-to-do-if-you-or-someone-you-know-is-targeted-with-deepfake-porn-or-ai-nudes-232175
