The Times Australia

Deepfake, AI or real? It’s getting harder for police to protect children from sexual exploitation online

  • Written by Terry Goldsworthy, Associate Professor in Criminal Justice and Criminology, Bond University

Artificial intelligence (AI), now an integral part of our everyday lives, is becoming increasingly accessible and ubiquitous. Consequently, there’s a growing trend of AI advancements being exploited for criminal activities.

One significant concern is that AI gives offenders the ability to produce images[1] and videos depicting real or deepfake child sexual exploitation material.

This is particularly important here in Australia. The Cyber Security Cooperative Research Centre has identified the country as the third-largest market[2] for online sexual abuse material.

So, how is AI being used to create child sexual exploitation material? Is it becoming more common? And importantly, how do we combat this crime to better protect children?

Spreading faster and wider

In the United States, the Department of Homeland Security[3] refers to AI-created child sexual abuse material as being:

the production, through digital media, of child sexual abuse material and other wholly or partly artificial or digitally created sexualised images of children.

The agency has recognised a variety[4] of ways in which AI is used to create this material. These include generating images or videos that depict real children, and using deepfake technologies, such as de-aging[5] or the misuse of a person’s innocent images, audio or video, to generate offending content.

Deepfakes[6] refer to hyper-realistic multimedia content generated using AI techniques and algorithms. This means any given material could be partially or completely fake.

The Department of Homeland Security has also found guides on how to use AI to generate child sexual exploitation material on the dark web[7].

The child safety technology company Thorn[8] has also identified a range of ways AI is used in creating this material. It noted in a report[9] that AI can impede victim identification. It can also create new ways to victimise and revictimise children.

Concerningly, the ease with which the technology can be used helps generate more demand. Criminals can then share information about how to make this material (as the Department of Homeland Security found), further proliferating the abuse.

How common is it?

In 2023, an Internet Watch Foundation investigation revealed alarming statistics[10]. Within a month, a dark web forum hosted 20,254 AI-generated images. Analysts assessed that 11,108 of these images were most likely criminal. Using UK laws, they identified 2,562 that satisfied the legal requirements for child sexual exploitation material. A further 416 were criminally prohibited images.

Similarly, the Australian Centre to Counter Child Exploitation, set up in 2018, received more than 49,500 reports[11] of child sexual exploitation material in the 2023–2024 financial year, an increase of about 9,300 over the previous year.

About 90% of deepfake material[12] online is believed to be sexually explicit. While we don’t know exactly how much of it involves children, the statistics above suggest a substantial share does.

Australia has recorded thousands of reports of child sexual exploitation. Shutterstock[13]

These data highlight the rapid proliferation of AI in producing realistic and damaging child sexual exploitation material that is difficult to distinguish from genuine images.

This has become a significant national concern. The issue was particularly highlighted during the COVID pandemic when there was a marked increase in the production and distribution of exploitation material.

This trend has prompted an inquiry and a subsequent submission[14] to the Parliamentary Joint Committee on Law Enforcement by the Cyber Security Cooperative Research Centre. As AI technologies become even more advanced and accessible, the issue will only get worse.

Detective Superintendent Frank Rayner from the research centre has said[15]:

the tools that people can access online to create and modify using AI are expanding and they’re becoming more sophisticated, as well. You can jump onto a web browser and enter your prompts in and do text-to-image or text-to-video and have a result in minutes.

Making policing harder

Traditional methods of identifying child sexual exploitation material, which rely on recognising known images and tracking their circulation, are inadequate[16] in the face of AI’s ability to rapidly generate new, unique content.

Moreover, the growing realism of AI-generated exploitation material is adding to the workload of the Australian Federal Police’s victim identification unit. Federal Police Commander Helen Schneider has said[17]:

it’s sometimes difficult to discern fact from fiction and therefore we can potentially waste resources looking at images that don’t actually contain real child victims. It means there are victims out there that remain in harmful situations for longer.

However, emerging strategies[18] are being developed to address these challenges.

One promising approach involves leveraging AI technology itself[19] to combat AI-generated content. Machine learning algorithms can be trained to detect subtle anomalies and patterns specific to AI-generated images, such as inconsistencies in lighting, texture or facial features the human eye might miss.
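The idea behind such detectors can be illustrated with a deliberately simple sketch. This is a toy model, not any agency's actual system: the "pixel-difference score" feature, the noisy-versus-smooth image simulation and the threshold rule are all invented for illustration. Real detectors learn far richer features (lighting, texture, facial geometry) from large labelled data sets, but the principle is the same: find a measurable statistical anomaly that separates generated content from genuine content.

```python
import random

def pixel_diff_score(img):
    # Mean absolute difference between adjacent pixel values:
    # a crude stand-in for the texture inconsistencies a real
    # detector would learn from data.
    return sum(abs(a - b) for a, b in zip(img, img[1:])) / (len(img) - 1)

def make_real(n=64):
    # Simulated "camera" image: carries high-frequency sensor noise.
    return [random.gauss(128, 40) for _ in range(n)]

def make_fake(n=64):
    # Simulated "generated" image: unnaturally smooth, drifting
    # in tiny steps rather than jumping like real sensor noise.
    img = [random.uniform(0, 255)]
    for _ in range(n - 1):
        img.append(img[-1] + random.uniform(-0.5, 0.5))
    return img

def fit_threshold(reals, fakes):
    # "Train" the detector: place the decision boundary midway
    # between the average score of each class.
    r = sum(pixel_diff_score(i) for i in reals) / len(reals)
    f = sum(pixel_diff_score(i) for i in fakes) / len(fakes)
    return (r + f) / 2

def predict(img, threshold):
    # Noisy (high-score) images are classed as real,
    # overly smooth ones as generated.
    return "real" if pixel_diff_score(img) > threshold else "generated"
```

In this toy setup the two classes separate cleanly, so a single learned threshold suffices; the difficulty in practice is that modern generators leave far subtler traces, which is why production systems rely on deep models trained on millions of examples rather than one hand-picked statistic.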

AI technology can also be used to detect exploitation material, including content[20] that was previously hidden. This is done by gathering large data sets from across the internet, which are then assessed by experts.

Collaboration is key

According to Thorn[21], any response to the use of AI in child sexual exploitation material should involve AI developers and providers, data hosting platforms, social platforms and search engines. Working together would help minimise the possibility of generative AI being further misused.

In 2024, major technology companies including Google, Meta and Amazon formed an alliance to fight the use of AI for such abusive material. The chief executives of the major social media companies[22] also faced a US Senate committee on how they are preventing online child sexual exploitation and the use of AI to create these images.

Collaboration between technology companies and law enforcement is essential in the fight against the further proliferation of this material. By leveraging their technological capabilities and working together proactively, they can address this serious national concern more effectively than either could alone.

References

  1. ^ produce images (www.dhs.gov)
  2. ^ third-largest market (cybersecuritycrc.org.au)
  3. ^ Department of Homeland Security (www.dhs.gov)
  4. ^ a variety (www.sciencedirect.com)
  5. ^ de-aging (www.respeecher.com)
  6. ^ Deepfakes (asistdl.onlinelibrary.wiley.com)
  7. ^ dark web (theconversation.com)
  8. ^ Thorn (www.thorn.org)
  9. ^ report (www.nist.gov)
  10. ^ alarming statistics (www.iwf.org.uk)
  11. ^ more than 49,500 reports (www.abc.net.au)
  12. ^ 90% of deepfake materials (www.abc.net.au)
  13. ^ Shutterstock (www.shutterstock.com)
  14. ^ submission (cybersecuritycrc.org.au)
  15. ^ has said (www.abc.net.au)
  16. ^ are inadequate (www.abc.net.au)
  17. ^ said (www.abc.net.au)
  18. ^ emerging strategies (stacks.stanford.edu)
  19. ^ leveraging AI technology itself (www.thorn.org)
  20. ^ content (www.sciencedirect.com)
  21. ^ Thorn (www.thorn.org)
  22. ^ chief executives of the major social media companies (time.com)

Read more https://theconversation.com/deepfake-ai-or-real-its-getting-harder-for-police-to-protect-children-from-sexual-exploitation-online-232820
