The Times Australia

A new online safety bill could allow censorship of anyone who engages with sexual content on the internet

  • Written by Zahra Zsuzsanna Stardust, Adjunct Lecturer, Centre for Social Research in Health, Research Assistant, Faculty of Law and Justice, UNSW

Under new draft laws[1], the eSafety Commissioner[2] could order your nude selfies, sex education or slash fiction[3] to be taken down from the internet with just 24 hours’ notice.

Officially, the Morrison government’s new bill aims to improve online safety.

But in doing so, it gives broad, discretionary powers to the commissioner, with serious ramifications for anyone who engages with sexual content online.

Broad new powers

After initial consultation in 2019, the federal government released the draft[4] online safety bill last December. Public submissions closed on the weekend.

The bill contains several new initiatives, from cyberbullying protections for children to new ways to remove non-consensual intimate imagery[5].

Julie Inman Grant was appointed as the government’s eSafety Commissioner in 2016. Lukas Coch/AAP

Crucially, it gives the eSafety Commissioner[6] — a federal government appointee[7] — a range of new powers.

It contains rapid website-blocking provisions to prevent the circulation of “abhorrent violent material” (such as live-streaming terror attacks). It reduces the timeframe for “takedown notices[8]” (where a hosting provider is directed to remove content) from 48 to 24 hours. It can also require search engines to delete links and app stores to prevent downloads, with civil penalties of up to $111,000 for non-compliance.

But one concerning element of the bill that has not received wide public attention is its takedown notices for so-called “harmful online content”.

A move towards age verification

Due to the impracticality of classifying the entire internet[9], regulators are now moving towards systems that require access restrictions for certain content and make use of user complaints to identify harmful material.

In this vein, the proposed bill will require online service providers to use technologies to prevent children gaining access to sexual material.

Read more: Coalition plans to improve online safety don't address the root cause of harms: the big tech business model[10]

Controversially, the bill gives the commissioner power to impose their own specific “restricted access system[11]”.

This means the commissioner could decide that, to access sexual content, users must upload their identity documents, scan their fingerprints, undergo facial recognition technology[12] or have their age estimated by artificial intelligence based on behavioural signals.

But there are serious issues[13] with online verification systems, which have already been considered and abandoned in comparable countries[14]. The United Kingdom dropped its plans in 2019, following implementation difficulties and privacy concerns.

The worst-case scenario here is governments collect databases of people’s sexual preferences and browsing histories that can be leaked, hacked[15], sold or misused.

eSafety Commissioner as ‘chief censor’

The bill also creates an “online content scheme[16]”, which identifies content that users can complain about.

The bill permits any Australian internet user to make complaints about “class 1” and “class 2”[17] content that is not subject to a restricted access system. These categories are extremely broad, ranging from actual, to simulated, to implied sexual activity, as well as explicit nudity.

In practice, people can potentially complain about any material depicting sex that they find on the internet, even on specific adult sites, if there is no mechanism to verify the user’s age.

The potential for complaints about sexual material online is very broad under the proposed laws. www.shutterstock.com

The draft laws then allow the commissioner to conduct investigations and order removal notices as they “think fit”. There are no criteria for what warrants removal, no requirement to give reasons, and no process for users to be notified or have opportunity to respond to complaints.

With no requirement to publish transparent enforcement data, the commissioner could remove content that is neither harmful nor unlawful, while remaining specifically exempt from liability for damages or civil proceedings.

This means users will have little clarity on how to actually comply with the scheme.

Malicious complaints and self-censorship

The potential ramifications of the bill are broad. They are likely to affect sex workers, sex educators, LGBTIQ health organisations, kink communities, online daters, artists and anyone who shares or accesses sexual content online.

While previous legislation was primarily concerned with films, print publications, computer games and broadcast media, this bill applies to social media, instant messaging, online games, websites, apps and a range of electronic and internet service providers.

Sex education material may be subject to complaints. www.shutterstock.com

It means links to sex education and harm reduction material for young people could be deleted by search engines. Hook up apps such as Grindr or Tinder could be made unavailable for download. Escort advertising platforms could be removed. Online kink communities like Fetlife could be taken down.

The legislation could embolden users, including anti-pornography advocates, disgruntled customers or ex-partners, to make vexatious complaints about sexual content, even where there is nothing harmful about it.

The complaints system is also likely to have a disproportionate impact on sex workers, especially those who turned to online work[18] during the pandemic, and who already face a high level of malicious complaints.

Sex workers consistently report restrictive terms of service[19] as well as shadowbanning and deplatforming[20], where their content is stealthily or selectively removed from social media.

Read more: How the 'National Cabinet of Whores' is leading Australia's coronavirus response for sex workers[21]

The requirement for service providers to restrict children’s access to sexual content also provides a financial incentive to take an over-zealous approach. Providers may employ artificial intelligence at scale to screen and detect nudity[22] (which can confuse sex education with pornography[23]), apply inappropriate age verification mechanisms that compromise user privacy[24], or, where this is too onerous or expensive, take the simpler route of prohibiting sexual content altogether[25].

In this sense, the bill may operate in a similar way to United States “FOSTA-SESTA” anti-trafficking legislation[26], which prohibits websites from promoting or facilitating prostitution. This resulted in the pre-emptive closure of essential sites[27] for sex worker safety, education and community building.

New frameworks for sexual content moderation

Platforms have been notoriously poor[28] when it comes to dealing with sexual content. But governments have not been any better[29].

We need new ways to think about moderating sexual content.

Historically, obscenity legislation has treated all sexual content as if it was lacking in value unless it was redeemed by literary, artistic or scientific merit[30]. Our current classification framework of “offensiveness” is also based on outdated notions of “morality, decency and propriety[31]”.

Read more: The Chatterley Trial 60 years on: a court case that secured free expression in 1960s Britain[32]

Research[33] into sex and social media[34] suggests we should not simply conflate sex with risk[35].

Instead, some have proposed human rights approaches[36]. These draw on a growing body of literature[37] that sees sexual health, pleasure and satisfying sexual experiences as compatible with bodily autonomy, safety and freedom from violence.

Others have pointed to the need for improved sex education, consent skills and media literacy[38] to equip users to navigate online space.

What’s obvious is we need a more nuanced approach to decision-making that imagines sex beyond “harm”, thinks more comprehensively about safer spaces, and recognises the cultural value in sexual content.

References

  1. ^ new draft laws (www.communications.gov.au)
  2. ^ eSafety Commissioner (www.esafety.gov.au)
  3. ^ slash fiction (www.bustle.com)
  4. ^ released the draft (minister.infrastructure.gov.au)
  5. ^ non-consensual intimate imagery (theconversation.com)
  6. ^ eSafety Commissioner (www.esafety.gov.au)
  7. ^ appointee (www.esafety.gov.au)
  8. ^ takedown notices (www.esafety.gov.au)
  9. ^ classifying the entire internet (www.alrc.gov.au)
  10. ^ Coalition plans to improve online safety don't address the root cause of harms: the big tech business model (theconversation.com)
  11. ^ restricted access system (www.esafety.gov.au)
  12. ^ facial recognition technology (www.nytimes.com)
  13. ^ serious issues (www.tandfonline.com)
  14. ^ considered and abandoned by similar countries (www.theguardian.com)
  15. ^ leaked, hacked (www.smh.com.au)
  16. ^ online content scheme (www.communications.gov.au)
  17. ^ “class 1” and “class 2” (www.businessinsider.com.au)
  18. ^ turned to online work (redbook.scarletalliance.org.au)
  19. ^ restrictive terms of service (medium.com)
  20. ^ shadowbanning and deplatforming (hackinghustling.org)
  21. ^ How the 'National Cabinet of Whores' is leading Australia's coronavirus response for sex workers (theconversation.com)
  22. ^ screen and detect nudity (www.theguardian.com)
  23. ^ confuse sex education with pornography (mashable.com)
  24. ^ compromise user privacy (www.computerweekly.com)
  25. ^ prohibiting sexual content altogether (nypost.com)
  26. ^ United States “FOSTA-SESTA” anti-trafficking legislation (hackinghustling.org)
  27. ^ pre-emptive closure of essential sites (hackinghustling.org)
  28. ^ notoriously poor (www.highsnobiety.com)
  29. ^ not been any better (www.starobserver.com.au)
  30. ^ literary, artistic or scientific merit (www.mtsu.edu)
  31. ^ morality, decency and propriety (www.artslaw.com.au)
  32. ^ The Chatterley Trial 60 years on: a court case that secured free expression in 1960s Britain (theconversation.com)
  33. ^ Research (vimeo.com)
  34. ^ sex and social media (books.emeraldinsight.com)
  35. ^ conflate sex with risk (mitpress.mit.edu)
  36. ^ human rights approaches (vimeo.com)
  37. ^ body of literature (worldsexualhealth.net)
  38. ^ media literacy (www.tandfonline.com)

Read more https://theconversation.com/a-new-online-safety-bill-could-allow-censorship-of-anyone-who-engages-with-sexual-content-on-the-internet-154739
