The Times Australia

Tech companies’ proposed new safety codes won’t protect all kids online

  • Written by Toby Murray, Professor of Cybersecurity, School of Computing and Information Systems, The University of Melbourne

In July last year, Australia’s eSafety Commissioner, Julie Inman Grant, directed[1] tech companies to develop codes of practice to keep children safe from online porn and harmful content. Now, after seven months, the industry has submitted draft codes[2] to eSafety for approval.

eSafety is currently assessing the draft codes.

Assuming Grant approves the new codes, what can we expect the future to look like for children and teens online? And how effective will the proposed codes be at protecting children?

A coordinated approach

The codes submitted for approval were developed by a group of industry associations[3].

They cover social media platforms such as Facebook and Snapchat. But they also cover internet service providers, search engines such as Google, online messaging services such as WhatsApp, online gaming platforms, as well as the manufacturers of the computers, mobile phones and software we use to access online services.

The codes will also cover online app stores such as those operated by Apple and Google. However, app store codes aren’t expected to be released until late March.

As well as covering a range of companies, the codes also cover a range of harms. They aim to protect kids not only from online pornography but also content that promotes self-harm, eating disorders, suicide and violence.

Given the difficulty of protecting kids from this kind of content, this coordinated approach is absolutely essential[4].

If the draft codes are approved, companies will have six months[5] to implement the proposed safety measures. They will face fines of up to A$50 million for non-compliance.

What’s in store?

The draft codes are broken up across different parts of the tech ecosystem. The requirements they place on individual tech platforms depend on the danger harmful content on each platform poses to children.

Large social media platforms such as Facebook, Instagram and X (formerly Twitter) are likely to be categorised among the most dangerous. That's because users can encounter extremely harmful content on these platforms, such as child sexual abuse material or terrorist material. These platforms also serve millions of people and allow users to create public profiles, maintain "friend" lists and share content widely.

According to the draft codes, these platforms will need to implement the most stringent safety measures. These include using age-assurance[6] measures to prevent children below a service's minimum age from accessing it, maintaining an appropriately resourced trust and safety[7] team, and using automated systems to detect and remove child abuse and pro-terror material.

On the other hand, less risky platforms won’t be subject to any requirements under the draft codes. These include online platforms that allow only limited communication within a specific group of people and without social media features such as friends lists and public profiles. Platforms for communication within a primary school such as Compass[8] would be among the least risky.

Online search engines such as Google and Bing – which provide access to adult and self-harm content, but are legitimately used by children – will be required to implement appropriate measures to prevent children accessing that content.

This may include enabling safe-search features and establishing child-user accounts. These accounts would include features that automatically blur harmful content and filter it from search results and recommendation algorithms.

The codes also cover emerging harmful technology, such as deepfake porn apps powered by generative artificial intelligence. Like traditional porn sites, these will be required to implement age-assurance technology to prevent children from using them.

The eSafety commissioner, Julie Inman Grant, is currently assessing the proposed online safety codes. Mick Tsikas/AAP

What about age assurance?

The codes specifically define[9] what age-assurance measures are considered “appropriate”.

Importantly, the fact that an age-checking system can be bypassed does not disqualify it. Instead, age-assurance measures must involve "reasonable steps" to ensure someone is of age, while balancing privacy concerns[10].

Requiring users to self-declare their age is not considered appropriate. So expect porn sites to do away with click-through dialogs asking visitors to declare they are adults.

Instead, sites will have a range of options for assuring their users’ ages, including photo ID, estimating age based on facial images or video, having a parent attest to a child’s age, leveraging credit card checks, or AI-based methods for age inference.

Different measures are likely to be used by different companies and systems.

For example, Apple has already announced a range of new child safety measures[11] that appear to align with many parts of the draft codes. These include making it easier for parents to set up child safety features on kids’ iPads and iPhones, using a parent’s payment information to ensure they can safely attest to their child’s age, as well as app store integration of child safety features to enable app developers to make their apps safer for children.

On the other hand, adult sites and apps are likely to adopt age-assurance mechanisms that users perceive to be more private. For paying subscribers, they are likely to leverage credit card information already on file to verify users' ages.

Non-subscribers may instead be required to submit to a facial scan or other AI-based methods to estimate their age.

Publicly available data[12] on state-of-the-art systems for age estimation from facial images suggests the best systems have an average error of 3.7 years.

Whether eSafety will agree such technology is “appropriate” remains to be seen. However, if it is adopted, there is a real risk many teens will remain able to access online porn and harmful deepfake apps despite these new codes.

References

  1. ^ directed (theconversation.com)
  2. ^ draft codes (onlinesafety.org.au)
  3. ^ group of industry associations (onlinesafety.org.au)
  4. ^ essential (theconversation.com)
  5. ^ six months (www.theguardian.com)
  6. ^ age-assurance (theconversation.com)
  7. ^ trust and safety (www.forbes.com.au)
  8. ^ Compass (www.compass.education)
  9. ^ define (onlinesafety.org.au)
  10. ^ privacy concerns (pursuit.unimelb.edu.au)
  11. ^ new child safety measures (techcrunch.com)
  12. ^ Publicly available data (archive.is)

Read more https://theconversation.com/tech-companies-proposed-new-safety-codes-wont-protect-all-kids-online-251266
