A new online safety bill could allow censorship of anyone who engages with sexual content on the internet
- Written by Zahra Zsuzsanna Stardust, Adjunct Lecturer, Centre for Social Research in Health, and Research Assistant, Faculty of Law and Justice, UNSW
Under new draft laws[1], the eSafety Commissioner[2] could order your nude selfies, sex education materials or slash fiction[3] to be taken down from the internet with just 24 hours' notice.
Officially, the Morrison government’s new bill aims to improve online safety.
But in doing so, it gives broad, discretionary powers to the commissioner, with serious ramifications for anyone who engages with sexual content online.
Broad new powers
After initial consultation in 2019, the federal government released the draft[4] online safety bill last December. Public submissions closed on the weekend.
The bill contains several new initiatives, from cyberbullying protections for children to new ways to remove non-consensual intimate imagery[5].
Crucially, it gives the eSafety Commissioner[6], a federal government appointee[7], a range of new powers.
It contains rapid website-blocking provisions to prevent the circulation of “abhorrent violent material” (such as the live-streaming of terror attacks). It reduces the timeframe for “takedown notices[8]” (where a hosting provider is directed to remove content) from 48 to 24 hours. It also allows the commissioner to require search engines to delete links, and app stores to prevent downloads, with civil penalties of up to $111,000 for non-compliance.
But one concerning element of the bill that has not received wide public attention is its takedown notices for so-called “harmful online content”.
A move towards age verification
Because classifying the entire internet[9] is impractical, regulators are moving towards systems that restrict access to certain content and rely on user complaints to identify harmful material.
In this vein, the proposed bill will require online service providers to use technologies that prevent children from gaining access to sexual material.
Read more: Coalition plans to improve online safety don't address the root cause of harms: the big tech business model[10]
Controversially, the bill gives the commissioner power to impose their own specific “restricted access system[11]”.
This means the commissioner could decide that, to access sexual content, users must upload their identity documents, scan their fingerprints, undergo facial recognition technology[12] or have their age estimated by artificial intelligence based on behavioural signals.
But there are serious issues[13] with online verification systems, and comparable countries have already considered and abandoned them[14]. The United Kingdom dropped its plans in 2019, following implementation difficulties and privacy concerns.
The worst-case scenario is that governments amass databases of people’s sexual preferences and browsing histories that can be leaked, hacked[15], sold or misused.
eSafety Commissioner as ‘chief censor’
The bill also creates an “online content scheme[16]”, which identifies content that users can complain about.
The bill permits any Australian internet user to make complaints about “class 1” and “class 2”[17] content that is not subject to a restricted access system. These categories are extremely broad, ranging from actual to simulated to implied sexual activity, as well as explicit nudity.
In practice, people can potentially complain about any material depicting sex that they find on the internet, even on specific adult sites, if there is no mechanism to verify the user’s age.
The draft laws then allow the commissioner to conduct investigations and issue removal notices as they “think fit”. There are no criteria for what warrants removal, no requirement to give reasons, and no process for users to be notified of complaints or given the opportunity to respond to them.
With no requirement to publish transparent enforcement data, the commissioner can simply remove content that is neither harmful nor unlawful. The commissioner is also specifically exempt from liability for damages or civil proceedings.
This means users will have little clarity on how to actually comply with the scheme.
Malicious complaints and self-censorship
The potential ramifications of the bill are broad. They are likely to affect sex workers, sex educators, LGBTIQ health organisations, kink communities, online daters, artists and anyone who shares or accesses sexual content online.
While previous legislation was primarily concerned with films, print publications, computer games and broadcast media, this bill applies to social media, instant messaging, online games, websites, apps and a range of electronic and internet service providers.
It means links to sex education and harm reduction material for young people could be deleted by search engines. Hook-up apps such as Grindr or Tinder could be made unavailable for download. Escort advertising platforms could be removed. Online kink communities like Fetlife could be taken down.
The legislation could embolden users, including anti-pornography advocates, disgruntled customers or ex-partners, to make vexatious complaints about sexual content, even where there is nothing harmful about it.
The complaints system is also likely to have a disproportionate impact on sex workers, especially those who turned to online work[18] during the pandemic, and who already face a high level of malicious complaints.
Sex workers consistently report restrictive terms of service[19] as well as shadowbanning and deplatforming[20], where their content is stealthily or selectively removed from social media.
Read more: How the 'National Cabinet of Whores' is leading Australia's coronavirus response for sex workers[21]
The requirement for service providers to restrict children’s access to sexual content also provides a financial incentive to take an over-zealous approach. Providers may employ artificial intelligence at scale to screen and detect nudity[22] (which can confuse sex education with pornography[23]), apply inappropriate age verification mechanisms that compromise user privacy[24], or, where this is too onerous or expensive, take the simpler route of prohibiting sexual content altogether[25].
In this sense, the bill may operate in a similar way to United States “FOSTA-SESTA” anti-trafficking legislation[26], which prohibits websites from promoting or facilitating prostitution. This resulted in the pre-emptive closure of essential sites[27] for sex worker safety, education and community building.
New frameworks for sexual content moderation
Platforms have been notoriously poor[28] when it comes to dealing with sexual content. But governments have not been any better[29].
We need new ways to think about moderating sexual content.
Historically, obscenity legislation has treated all sexual content as if it were lacking in value unless redeemed by literary, artistic or scientific merit[30]. Our current classification framework of “offensiveness” is also based on outdated notions of “morality, decency and propriety[31]”.
Read more: The Chatterley Trial 60 years on: a court case that secured free expression in 1960s Britain[32]
Research[33] into sex and social media[34] suggests we should not simply conflate sex with risk[35].
Instead, some have proposed human rights approaches[36]. These draw on a growing body of literature[37] that sees sexual health, pleasure and satisfying sexual experiences as compatible with bodily autonomy, safety and freedom from violence.
Others have pointed to the need for improved sex education, consent skills and media literacy[38] to equip users to navigate online space.
What is clear is that we need a more nuanced approach to decision-making: one that imagines sex beyond “harm”, thinks more comprehensively about safer spaces, and recognises the cultural value of sexual content.
References
- ^ new draft laws (www.communications.gov.au)
- ^ eSafety Commissioner (www.esafety.gov.au)
- ^ slash fiction (www.bustle.com)
- ^ released the draft (minister.infrastructure.gov.au)
- ^ non-consensual intimate imagery (theconversation.com)
- ^ eSafety Commissioner (www.esafety.gov.au)
- ^ appointee (www.esafety.gov.au)
- ^ takedown notices (www.esafety.gov.au)
- ^ classifying the entire internet (www.alrc.gov.au)
- ^ Coalition plans to improve online safety don't address the root cause of harms: the big tech business model (theconversation.com)
- ^ restricted access system (www.esafety.gov.au)
- ^ facial recognition technology (www.nytimes.com)
- ^ serious issues (www.tandfonline.com)
- ^ considered and abandoned by similar countries (www.theguardian.com)
- ^ leaked, hacked (www.smh.com.au)
- ^ online content scheme (www.communications.gov.au)
- ^ “class 1” and “class 2” (www.businessinsider.com.au)
- ^ turned to online work (redbook.scarletalliance.org.au)
- ^ restrictive terms of service (medium.com)
- ^ shadowbanning and deplatforming (hackinghustling.org)
- ^ How the 'National Cabinet of Whores' is leading Australia's coronavirus response for sex workers (theconversation.com)
- ^ screen and detect nudity (www.theguardian.com)
- ^ confuse sex education with pornography (mashable.com)
- ^ compromise user privacy (www.computerweekly.com)
- ^ prohibiting sexual content altogether (nypost.com)
- ^ United States “FOSTA-SESTA” anti-trafficking legislation (hackinghustling.org)
- ^ pre-emptive closure of essential sites (hackinghustling.org)
- ^ notoriously poor (www.highsnobiety.com)
- ^ not been any better (www.starobserver.com.au)
- ^ literary, artistic or scientific merit (www.mtsu.edu)
- ^ morality, decency and propriety (www.artslaw.com.au)
- ^ The Chatterley Trial 60 years on: a court case that secured free expression in 1960s Britain (theconversation.com)
- ^ Research (vimeo.com)
- ^ sex and social media (books.emeraldinsight.com)
- ^ conflate sex with risk (mitpress.mit.edu)
- ^ human rights approaches (vimeo.com)
- ^ body of literature (worldsexualhealth.net)
- ^ media literacy (www.tandfonline.com)