Social media giants are not complying with under-16s social media ban, new report finds
- Written by Lisa M. Given, Professor of Information Sciences & Director, Social Change Enabling Impact Platform, RMIT University

Nearly four months into Australia’s social media ban for under-16s, the online regulator today released its first detailed compliance update report[1] on how the world-first policy is progressing.
eSafety’s report comes at a crucial time, with many other countries eyeing the progress of the ban. Since the ban took effect on December 10 last year, I have spoken with journalists from Canada, France, Germany, Japan, New Zealand, the United Kingdom and elsewhere. Everyone asks two questions: how successful is the ban, and are children still accessing social media platforms?
The new report paints a complicated picture – and leaves other key questions about the social media ban unanswered.
A number of compliance concerns
The report acknowledges social media companies have taken “some steps” to comply with the social media legislation (which restricts account holders to those aged 16 and older). Some 4.7 million accounts were removed by mid-January and another 310,000 by early March.
However, the report also highlights “compliance concerns” in four key areas:
- Messaging to under-16s on some platforms encouraged children to attempt age assurance even where they declared themselves to be underage
- Some platforms enabled under-16s to repeatedly attempt the same age-assurance method to ultimately pass age checks
- Pathways for reporting age-restricted accounts have generally not been accessible and effective, particularly for parents
- Some platforms appear not to have done enough to prevent under-16s having accounts.
The report explains the eSafety Commissioner, Julie Inman Grant, is now investigating Facebook, Instagram, Snapchat, TikTok and YouTube for “potential non-compliance”. None of these companies has yet been fined. A decision about any enforcement action will be made by the middle of the year.
The report comes a week after the Australian government registered a new legislative rule[2] to ensure the definition of social media platforms includes those “that have addictive or otherwise harmful design features”. These include:
- infinite scroll, which shows new content with no end point
- feedback features, such as displaying “likes” or “upvotes”, which can pressure people to compare themselves to others, and
- time-limited features such as disappearing “stories” that create a sense of urgency and encourage constant checking.
This rule change was implemented in the same week Meta and Google (parent companies of Instagram and YouTube) were found liable[3] by a jury in the United States for the addictive features of their social media platforms.
A ‘constantly evolving’ landscape
The removal of more than 5 million accounts in four months sounds impressive. But the number of accounts removed does not equal the number of users removed.
Many people hold several social media accounts, so it remains unclear how many children under 16 remain on one or more platforms. The report also doesn't detail how many new accounts children have created since the legislation took effect.
The report also does not estimate the number of under-16s who now use alternative platforms. However, there have been reports of a significant spike[4] in downloads of non-mainstream platforms (such as RedNote, Yope and Lemon8[5]) since December.
The report acknowledges the social media landscape is “constantly evolving” and that it’s impossible to maintain a complete list of platforms that fall under the age restrictions. However, eSafety does maintain a list[6] of the initial platforms included under the ban legislation, and those that have self-identified and agreed to comply. These include Bluesky, dating platforms (such as Tinder) and Lemon8, but other platforms remain accessible to under-16s.
Since December, there have also been questions about whether Australia’s ban should extend to other platforms.
Reports point to the legislation’s “loophole” for gaming apps[7] and exclusions for messaging apps[8] such as WhatsApp and Messenger, as well as other platforms that include social networking features.
Roblox, which was initially considered under the ban and then exempted, has also made headlines[9] related to child safety. It is currently being reviewed by the government over concerns about child grooming[10].
Unanswered questions
As eSafety continues to investigate issues related to compliance with the legislation, several key questions remain unanswered.
One is to do with the “reasonable steps” social media companies must take to comply with social media age restrictions. The report says this is “ultimately a question for the courts to determine”. It also explains that defining what steps are reasonable must be considered “in the context of the platform’s service, technological feasibility, and the regulatory landscape”.
But if a company uses age-assurance technologies whose inbuilt error rates[11] allow some children to slip through the checks, will that company be considered to have taken reasonable steps to control account access?
A second question is whether eSafety will extend its compliance checks beyond the five mainstream platforms currently being investigated.
As new platforms are launched, and as children continue to seek new ways to connect with peers online, the potential spaces where they can encounter harm continue to grow. Is self-assessment by technology companies sufficient to enforce legislation intended to apply to all platforms that meet the definition of an age-restricted platform?
Finally, will the government continue to add new rules to keep kids safe?
One key limitation experts like me have highlighted since 2024[12] is that restricting access to accounts does not address the actual harms posed by content, algorithms and other platform features.
The government has completed consultation on its digital duty of care legislation[13]. But it is still unclear when this legislation will be introduced.
The new report on social media restrictions shows there is a long road ahead for compliance. And if we want to fully address the harms posed by these platforms, new legislation that actually targets the root problems is needed.
References
- ^ compliance update report (www.esafety.gov.au)
- ^ new legislative rule (www.infrastructure.gov.au)
- ^ found liable (theconversation.com)
- ^ significant spike (www.ft.com)
- ^ RedNote, Yope and Lemon8 (theconversation.com)
- ^ maintain a list (www.esafety.gov.au)
- ^ “loophole” for gaming apps (www.smh.com.au)
- ^ exclusions for messaging apps (www.9news.com.au)
- ^ made headlines (theconversation.com)
- ^ concerns about child grooming (www.abc.net.au)
- ^ inbuilt error rates (theconversation.com)
- ^ highlighted since 2024 (theconversation.com)
- ^ digital duty of care legislation (www.infrastructure.gov.au)
