The Times Australia
Virtual child sexual abuse material depicts fictitious children – but can be used to disguise real abuse

  • Written by Larissa Christensen, Senior Lecturer in Criminology & Justice | Co-leader of the Sexual Violence Research and Prevention Unit (SVRPU), University of the Sunshine Coast

Child sexual abuse material (previously known as child pornography) can be a confronting and uncomfortable topic.

Child sexual abuse material specifically refers to the possession, viewing, sharing, and creation of images or videos containing sexual or offensive material involving children.

But less publicised is another form of child sexual abuse material: virtual child sexual abuse material (VCSAM).

Read more: What's in a name? Online child abuse material is not 'pornography'[1]

What’s virtual child sexual abuse material (VCSAM)?

VCSAM is sexual content depicting fictitious children in formats such as text, drawings, deepfakes[2], or computer-generated graphics[3]. It’s also known as fictional child pornography, pseudo pornography, or fantasy images.

Recent technological advancements mean fictitious children can now be virtually indistinguishable[4] from real children in child sexual abuse material.

Some offenders create VCSAM through a morphing technique, which uses technology to transform real images[5] into exploitative ones.

A non-sexual image of a real child could be visually altered[6] to include sexual content. For example, an image of a child holding a toy could be altered to depict the child holding adult genitals.

Morphing can also happen in reverse, where an image of an adult is morphed[7] to look like a child – for example, adult breasts altered to look prepubescent.

A darkened picture of a hooded man sitting alone at a laptop.
Offenders can manipulate both real and fictitious images to produce VCSAM. John Williams RUS/Shutterstock

Another type of VCSAM involves photo-editing multiple images[8] into a single, more realistic airbrushed image.

But what might be most troubling about VCSAM is it may still feature images and videos of real children being sexually abused.

In fact, certain software can be used to make images and videos of real victims look like “fictional” drawings or cartoons[9].

In this way, offenders can effectively disguise a real act of child sexual abuse[10], potentially preventing law enforcement from bringing victims to safety.

It may also enable repeat offenders to avoid detection.

Read more: How the world's biggest dark web platform spreads millions of items of child sex abuse material — and why it's hard to stop[11]

Why do some people engage with VCSAM?

There’s limited evidence revealing why some people might engage with VCSAM.

To learn more about this offending group, we recently investigated the possible psychological basis[12] for people who engage with such material.

We discovered several potential reasons[13] why offenders might use VCSAM.

Some used it for relationship-building.

Although this offending group is diverse, some offenders who use child sexual abuse material have been found to have limited intimate relationships[14] and heightened loneliness[15].

Online communities of other deviant but like-minded people may therefore provide offenders with a greater sense of belonging, social validation, and support[16]. Such interactions may also, in turn, serve as positive reinforcement for their criminal behaviour.

A phone screen showing social media apps like Facebook, WhatsApp, Instagram etc.
Online communities may positively reinforce criminal behaviour among child sexual abuse offenders. Ritchie B. Tongo/AAP

Others may use this material to achieve sexual arousal.

It could be argued the material may also normalise[17] the sexualisation of children.

In fact, professionals in child welfare and law enforcement seem to share the concern that VCSAM may “fuel the abuse” of children[18] by framing the offenders’ criminal behaviour as acceptable.

Sometimes the material is used for “grooming”.

Adult offenders may show child sexual abuse material to children, breaking down the child’s inhibitions[19] to falsely normalise[20] the abusive act being depicted.

This is one form of grooming – that is, predatory conduct[21] aimed at facilitating later sexual activity with a child.

Such material can also be used to teach children[22] how to engage in sexual activities.

For example, offenders may use VCSAM to show children material depicting young – and, most alarmingly, happy – cartoon characters engaging in sexual activities[23].

An urgent cause for concern

Clearly, VCSAM is incredibly harmful.

It can be used to disguise the abuse[24] of real children, as a gateway to “contact offending[25]” against children (meaning abusing them in real life), and as a grooming[26] technique.

Child welfare and law enforcement officials have sounded the alarm about the increasing creation and distribution of VCSAM for over a decade.

And it seems this problem will only escalate with the development of increasingly sophisticated software[27] and digital technologies.

So while VCSAM remains illegal and offenders are frequently prosecuted, detecting – and ultimately preventing – these often obscure acts of abuse remains a challenge.

Read more: It takes a village: law reform can't be the only response to online child abuse material[28]

References

  1. ^ What's in a name? Online child abuse material is not 'pornography' (theconversation.com)
  2. ^ text, drawings, deepfakes (theconversation.com)
  3. ^ computer-generated graphics (link.springer.com)
  4. ^ virtually indistinguishable (heinonline.org)
  5. ^ transform real images (link.springer.com)
  6. ^ visually altered (heinonline.org)
  7. ^ image of an adult is morphed (heinonline.org)
  8. ^ photo-editing multiple images (heinonline.org)
  9. ^ victims look like “fictional” drawings or cartoons (webarchive.nationalarchives.gov.uk)
  10. ^ disguise a real act of child sexual abuse (webarchive.nationalarchives.gov.uk)
  11. ^ How the world's biggest dark web platform spreads millions of items of child sex abuse material — and why it's hard to stop (theconversation.com)
  12. ^ we recently investigated the possible psychological basis (link.springer.com)
  13. ^ several potential reasons (link.springer.com)
  14. ^ limited intimate relationships (www.tandfonline.com)
  15. ^ heightened loneliness (www.tandfonline.com)
  16. ^ sense of belonging, social validation, and support (link.springer.com)
  17. ^ normalise (webarchive.nationalarchives.gov.uk)
  18. ^ “fuel the abuse” of children (webarchive.nationalarchives.gov.uk)
  19. ^ breaking down the child’s inhibitions (onlinelibrary.wiley.com)
  20. ^ falsely normalise (www.researchgate.net)
  21. ^ predatory conduct (www.justice.vic.gov.au)
  22. ^ used to teach children (onlinelibrary.wiley.com)
  23. ^ cartoon characters engaging in sexual activities (link.springer.com)
  24. ^ disguise the abuse (www.tandfonline.com)
  25. ^ contact offending (www.researchgate.net)
  26. ^ grooming (www.tandfonline.com)
  27. ^ increasingly sophisticated software (webarchive.nationalarchives.gov.uk)
  28. ^ It takes a village: law reform can't be the only response to online child abuse material (theconversation.com)

Read more https://theconversation.com/virtual-child-sexual-abuse-material-depicts-fictitious-children-but-can-be-used-to-disguise-real-abuse-180248
