The Times Australia

Deepfake, AI or real? It’s getting harder for police to protect children from sexual exploitation online

  • Written by: Terry Goldsworthy, Associate Professor in Criminal Justice and Criminology, Bond University

Artificial intelligence (AI), now an integral part of our everyday lives, is becoming increasingly accessible and ubiquitous. Consequently, AI advances are increasingly being exploited for criminal activity.

One significant concern is the ability AI provides to offenders to produce images[1] and videos depicting real or deepfake child sexual exploitation material.

This is particularly important here in Australia. The Cyber Security Cooperative Research Centre has identified the country as the third-largest market[2] for online sexual abuse material.

So, how is AI being used to create child sexual exploitation material? Is it becoming more common? And importantly, how do we combat this crime to better protect children?

Spreading faster and wider

In the United States, the Department of Homeland Security[3] refers to AI-created child sexual abuse material as being:

the production, through digital media, of child sexual abuse material and other wholly or partly artificial or digitally created sexualised images of children.

The agency has recognised a variety[4] of ways in which AI is used to create this material. These include generating images or videos that depict real children, and using deepfake technologies, such as de-aging[5], or misusing a person’s innocent images (or audio or video) to generate offending content.

Deepfakes[6] refer to hyper-realistic multimedia content generated using AI techniques and algorithms. This means any given material could be partially or completely fake.

The Department of Homeland Security has also found guides on how to use AI to generate child sexual exploitation material on the dark web[7].

The child safety technology company Thorn[8] has also identified a range of ways AI is used in creating this material. It noted in a report[9] that AI can impede victim identification. It can also create new ways to victimise and revictimise children.

Concerningly, the ease with which the technology can be used helps generate more demand. Criminals can then share information about how to make this material (as the Department of Homeland Security found), further proliferating the abuse.

How common is it?

In 2023, an Internet Watch Foundation investigation revealed alarming statistics[10]. Within a month, a dark web forum hosted 20,254 AI-generated images. Analysts assessed that 11,108 of these images were most likely criminal. Using UK laws, they identified 2,562 that satisfied the legal requirements for child sexual exploitation material. A further 416 were criminally prohibited images.

Similarly, the Australian Centre to Counter Child Exploitation, set up in 2018, received more than 49,500 reports[11] of child sexual exploitation material in the 2023–2024 financial year, an increase of about 9,300 over the previous year.

About 90% of deepfake materials[12] online are believed to be explicit. While we don’t know exactly how many involve children, the statistics above suggest many do.

Australia has recorded thousands of reports of child sexual exploitation. Shutterstock[13]

These data highlight the rapid proliferation of AI in producing realistic and damaging child sexual exploitation material that is difficult to distinguish from genuine images.

This has become a significant national concern. The issue was particularly highlighted during the COVID pandemic when there was a marked increase in the production and distribution of exploitation material.

This trend has prompted an inquiry and a subsequent submission[14] to the Parliamentary Joint Committee on Law Enforcement by the Cyber Security Cooperative Research Centre. As AI technologies become even more advanced and accessible, the issue will only get worse.

Detective Superintendent Frank Rayner from the research centre has said[15]:

the tools that people can access online to create and modify using AI are expanding and they’re becoming more sophisticated, as well. You can jump onto a web browser and enter your prompts in and do text-to-image or text-to-video and have a result in minutes.

Making policing harder

Traditional methods of identifying child sexual exploitation material, which rely on recognising known images and tracking their circulation, are inadequate[16] in the face of AI’s ability to rapidly generate new, unique content.

Moreover, the growing realism of AI-generated exploitation material is adding to the workload of the Australian Federal Police’s victim identification unit. Federal Police Commander Helen Schneider has said[17]:

it’s sometimes difficult to discern fact from fiction and therefore we can potentially waste resources looking at images that don’t actually contain real child victims. It means there are victims out there that remain in harmful situations for longer.

However, emerging strategies[18] are being developed to address these challenges.

One promising approach involves leveraging AI technology itself[19] to combat AI-generated content. Machine learning algorithms can be trained to detect subtle anomalies and patterns specific to AI-generated images, such as inconsistencies in lighting, texture or facial features the human eye might miss.

AI technology can also be used to detect exploitation material, including content[20] that was previously hidden. This is done by gathering large data sets from across the internet, which are then assessed by experts.

Collaboration is key

According to Thorn[21], any response to the use of AI in child sexual exploitation material should involve AI developers and providers, data hosting platforms, social platforms and search engines. Working together would help minimise the possibility of generative AI being further misused.

In 2024, major technology companies such as Google, Meta and Amazon came together to form an alliance to fight the use of AI for such abusive material. The chief executives of the major social media companies[22] also faced a US Senate committee on how they are preventing online child sexual exploitation and the use of AI to create these images.

The collaboration between technology companies and law enforcement is essential in the fight against the further proliferation of this material. By leveraging their technological capabilities and working together proactively, they can address this serious national concern more effectively than working on their own.

References

  1. ^ produce images (www.dhs.gov)
  2. ^ third-largest market (cybersecuritycrc.org.au)
  3. ^ Department of Homeland Security (www.dhs.gov)
  4. ^ a variety (www.sciencedirect.com)
  5. ^ de-aging (www.respeecher.com)
  6. ^ Deepfakes (asistdl.onlinelibrary.wiley.com)
  7. ^ dark web (theconversation.com)
  8. ^ Thorn (www.thorn.org)
  9. ^ report (www.nist.gov)
  10. ^ alarming statistics (www.iwf.org.uk)
  11. ^ more than 49,500 reports (www.abc.net.au)
  12. ^ 90% of deepfake materials (www.abc.net.au)
  13. ^ Shutterstock (www.shutterstock.com)
  14. ^ submission (cybersecuritycrc.org.au)
  15. ^ has said (www.abc.net.au)
  16. ^ are inadequate (www.abc.net.au)
  17. ^ said (www.abc.net.au)
  18. ^ emerging strategies (stacks.stanford.edu)
  19. ^ leveraging AI technology itself (www.thorn.org)
  20. ^ content (www.sciencedirect.com)
  21. ^ Thorn (www.thorn.org)
  22. ^ chief executives of the major social media companies (time.com)

Read more https://theconversation.com/deepfake-ai-or-real-its-getting-harder-for-police-to-protect-children-from-sexual-exploitation-online-232820
