Australia has fined X Australia over child sex abuse material concerns. How severe is the issue – and what happens now?

  • Written by Marten Risius, Senior Lecturer in Business Information Systems, The University of Queensland

Australia’s eSafety Commissioner, Julie Inman Grant, has found X (formerly Twitter) guilty of serious non-compliance with a transparency notice on child sex abuse material. The commissioner has issued X with an infringement notice for A$610,500.

The commissioner first issued[1] transparency notices to Google, X (then Twitter), Twitch, TikTok and Discord in February under the Online Safety Act 2021. Under this legislation, the commissioner has powers to require online service providers to report on how they are mitigating unlawful or harmful content[2].

The commissioner determined Google and X did not sufficiently comply[3] with the notices given to them. Google was warned for providing overly generic responses to specific questions, while X’s non-compliance was found to be more serious.

For several key questions, X’s response was blank, incomplete or inaccurate. For example[4], X did not adequately disclose:

  • the time it takes to respond to reports of child sexual exploitation material
  • the measures in place to detect child sexual exploitation material in live streams
  • the tools and technologies used to detect this material
  • the teams and resources used to ensure safety.

How severe is the issue?

In June, the Stanford Internet Observatory released a crucial report on child sex abuse material. It was the first quantitative analysis[5] of child sex abuse material on the public sites of the most popular social media platforms.

The researchers’ findings highlighted that Instagram and X (then Twitter) are particularly prolific platforms for advertising the sale of self-generated child sex abuse material.

These materials, and the accounts posting them, are often marked by specific recurring features. They may mention particular words or phrases paired with variations on the term “pedo[6]”. Or they might have certain hashtags or emojis in their bios. Using these features, the researchers identified 405 accounts advertising the sale of self-generated child sex abuse material on Instagram, and 128 on Twitter.
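
As a rough illustration of the kind of signal described above, the sketch below flags accounts whose bios or posts contain terms from a watchlist. The watchlist entries, account structure and function names are hypothetical placeholders for illustration only; they are not the Stanford team’s actual indicators or methodology.

```python
# Minimal sketch of keyword/hashtag-based account flagging.
# The watchlist terms below are made-up placeholders, not real indicators.

from dataclasses import dataclass

# Hypothetical indicator terms; real investigations use curated, evolving lists.
WATCHLIST = {"#exampleterm", "example phrase"}

@dataclass
class Account:
    handle: str
    bio: str
    recent_posts: list

def flag_account(account: Account) -> bool:
    """Return True if the bio or any recent post contains a watchlist term."""
    text = " ".join([account.bio, *account.recent_posts]).lower()
    return any(term in text for term in WATCHLIST)

accounts = [
    Account("user_a", "collector #exampleterm", ["hello world"]),
    Account("user_b", "photography fan", ["nice sunset today"]),
]

print([a.handle for a in accounts if flag_account(a)])  # ['user_a']
```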

They found searching for such content on Instagram may result in an alert of potential child sex abuse material. However, the prompt still presents a clickthrough to “see results anyway”:

Instagram presents a prompt that alerts users to potential child sex abuse material, but lets them click through to see it anyway. Thiel, D., DiResta, R., and Stamos, A. (2023). Stanford Digital Repository, CC BY-NC-ND[7][8]

Stanford’s analysis found Instagram’s recommendation algorithms are particularly effective in promoting child sex abuse material once it has been accessed.

Although the researchers focused on publicly available networks and content, they also found some platforms implicitly allow[9] the trading of child sex abuse material in private channels.

As for X, they found the platform even allowed the public posting of known, automatically identifiable child sex abuse material.
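
“Known, automatically identifiable” material generally means content whose digital fingerprint matches an existing database of previously verified material, typically detected with industry perceptual-hashing tools such as PhotoDNA. The sketch below illustrates only the general idea, using an ordinary cryptographic hash and a made-up hash set; it is not a description of any platform’s actual pipeline.

```python
# Illustrative sketch: matching an upload against a set of known hashes.
# Real systems use perceptual hashes (e.g. PhotoDNA) that tolerate resizing
# and re-encoding; plain SHA-256, used here for simplicity, only catches
# byte-identical copies. The hash value below is made up.

import hashlib

# Hypothetical database of hashes of previously verified material.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def is_known_material(upload: bytes) -> bool:
    """Return True if the upload's hash appears in the known-hash database."""
    return hashlib.sha256(upload).hexdigest() in KNOWN_HASHES

# A platform would run this check at upload time and block or report matches.
print(is_known_material(b"some uploaded file bytes"))  # False in this example
```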

Why does X have this content?

The creation and trading of this content is commonly regarded as one of the most harmful abuses of online services.

All major platforms – including X[10] – have policies that ban child sex abuse material from their public services. Most sites also explicitly prohibit related activities such as posting this content in private chats, and the sexualisation or grooming of children.

Even self-proclaimed free-speech advocate Elon Musk declared that removing child exploitation material[11] was his top priority after taking over the platform late last year.

Moderating child sex abuse material is challenging work, and can’t be done through user reporting alone. Platforms that allow nudity, such as X[12], have a responsibility to distinguish between minors and adults – both in terms of who is depicted in the content and who is sharing it.

They should scrutinise content shared voluntarily by minors, and ideally should also weed out any AI-generated[13] child sex abuse material.

Musk fired hundreds[14] of employees responsible for content moderation after taking over at X. It seems likely the gutting of X’s trust and safety workforce has reduced its ability to respond to both the harmful material and the eSafety notices.

Platforms could advance their moderation mechanisms[15] by transparently sharing data with researchers. Instead, X has made this unaffordable[16].

Read more: How the world's biggest dark web platform spreads millions of items of child sex abuse material — and why it's hard to stop[17]

Does the fine go far enough?

After years of leniency towards social media platforms, governments are now demanding increased accountability[18] from them for their content, as well as data privacy and child protection matters.

Non-compliance now attracts hefty fines in many jurisdictions. For instance, last year US federal regulators imposed[19] a US$150 million (A$236.3 million) fine on X to settle claims it had misleadingly used email addresses and phone numbers for targeted advertising.

This year, Ireland’s privacy regulator slapped Meta, Facebook’s parent company, with a €1.2 billion (almost A$2 billion) fine for mishandling user information[20].

This year the Australian Federal Court also ordered[21] two subsidiaries of Meta, Facebook Israel and Onavo Inc, to pay A$10 million each for engaging in conduct liable to mislead in breach of Australian consumer law.

The latest fine of A$610,500, though small in comparison, is a blow to X’s reputation, given its declining revenue and dwindling advertiser trust amid poor content moderation and the reinstatement of banned accounts.

What happens now?

X has 28 days to settle the fine. If it doesn’t, eSafety can initiate civil penalty proceedings and bring it to court. Depending on the court’s decision, the cumulative fine could escalate to A$780,000 per day, retroactive to the initial non-compliance in March.

But the fine’s impact extends beyond just financial implications. By spotlighting the issue of child sex abuse material on X, it could increase pressure on advertisers to pull their ads, or empower other governments to follow suit.

Earlier this month, India’s Ministry of Electronics and IT sent notices[22] to X, YouTube and Telegram, instructing them to remove child sex abuse material for users accessing the sites from India – while threatening heavy fines and penalties for non-compliance.

It seems X is in hot water. To get out, it’ll need to make a 180-degree turn on its approach to moderating content – especially that which harms and exploits minors.

Read more: Beginning of the end: how Elon Musk’s removal of the block function on X could trigger its hellish demise[23]

References

  1. ^ first issued (www.esafety.gov.au)
  2. ^ unlawful or harmful content (www.esafety.gov.au)
  3. ^ did not sufficiently comply (media.licdn.com)
  4. ^ For example (www.esafety.gov.au)
  5. ^ first quantitative analysis (stacks.stanford.edu)
  6. ^ pedo (www.theverge.com)
  7. ^ Thiel, D., DiResta, R., and Stamos, A. (2023). Stanford Digital Repository (purl.stanford.edu)
  8. ^ CC BY-NC-ND (creativecommons.org)
  9. ^ platforms implicitly allow (stacks.stanford.edu)
  10. ^ including X (help.twitter.com)
  11. ^ child exploitation material (www.wired.com)
  12. ^ such as X (help.twitter.com)
  13. ^ AI-generated (stacks.stanford.edu)
  14. ^ hundreds (fortune.com)
  15. ^ their moderation mechanisms (www.npr.org)
  16. ^ made this unaffordable (www.wired.co.uk)
  17. ^ How the world's biggest dark web platform spreads millions of items of child sex abuse material — and why it's hard to stop (theconversation.com)
  18. ^ demanding increased accountability (journals.sagepub.com)
  19. ^ federal regulators imposed (www.washingtonpost.com)
  20. ^ mishandling user information (www.theguardian.com)
  21. ^ also ordered (www.accc.gov.au)
  22. ^ sent notices (pib.gov.in)
  23. ^ Beginning of the end: how Elon Musk’s removal of the block function on X could trigger its hellish demise (theconversation.com)

Read more https://theconversation.com/australia-has-fined-x-australia-over-child-sex-abuse-material-concerns-how-severe-is-the-issue-and-what-happens-now-215696
