The Times Australia

Australia’s new approach to tackling fake news on digital platforms

  • Written by Andrea Carson, Professor of Political Communication, Department of Politics, Media and Philosophy, La Trobe University

An urgent problem for governments around the world in the digital age is how to tackle the harms caused by mis- and disinformation, and Australia is no exception.

Together, mis- and disinformation fall under the umbrella term of “fake news”. While this phenomenon isn’t new, the internet makes its rapid, vast spread unprecedented.

It’s a tricky problem and hard to police because of the sheer amount of misinformation online. But, left unchecked, public health and safety, electoral integrity, social cohesion and ultimately democracy are at risk. The COVID-19 pandemic taught us not to be complacent, as fake news about COVID treatments led to deadly consequences[1].

Read more: COVID misinformation is a health risk – tech companies need to remove harmful content not tweak their algorithms[2]

But what’s the best way to manage fake news spread? How can it be done without government overreach, which risks the freedom and diversity of expression necessary for deliberation in healthy democracies?

Last month, Minister for Communications Michelle Rowland released a draft exposure bill[3] to step up Australia’s fight against harmful online mis- and disinformation.

It offers more stick (hefty penalties) and less carrot (voluntary participation) than the current approach to managing online content.

If passed, the bill will see Australia shift from a voluntary to a mandatory co-regulatory model.

Following an EU model

According to the draft, disinformation is spread intentionally, while misinformation is not.

But both can cause serious harms including hate speech, financial harm and disruption of public order, according to the Australian Communications and Media Authority (ACMA).

To date, research[4] shows countries tend to approach this problem in three distinct ways:

  • non-regulatory “supporting activities” such as digital literacy campaigns and fact-checking units to debunk falsehoods

  • voluntary or mandatory co-regulatory measures involving digital platforms and existing media authorities

  • anti-fake news laws.

The Albanese government’s draft bill will bring us closer to the European Union-style model of mandatory co-regulation.

Platforms remain responsible, not government

Initial opinions about the bill are divided. Some commentators[5] have called the proposed changes “censorship”, arguing it will have a chilling effect on free speech.

These comments are often unhelpful because they conflate co-regulation with more draconian measures, such as the anti-fake news laws adopted in illiberal states like Russia, where governments arbitrarily rule what information is “fake”.

For example, Russia amended its Criminal Code[6] in 2022 to make the spread of “fake” information an offence punishable with jail terms of up to 15 years, to suppress the media and political dissent about its war in Ukraine.

To be clear, under the proposed Australian bill, platforms continue to be responsible for the content on their services – not governments.

The new powers allow ACMA to look under platforms’ hoods to see how they deal with online mis- and disinformation that can cause serious harm, and to request changes to their processes (not their content). ACMA can also set industry standards as a last resort.

Read more: Why public trust in elections is being undermined by global disinformation campaigns[7]

The proposed changes don’t give ACMA arbitrary powers to determine what content is true or false, nor can it direct specific posts to be removed. Content of private messages, authorised electoral communications, parody and satire, and news media all remain outside the scope of the proposed changes.

None of this is new. Since 2021, Australia has had a voluntary Code of Practice on Disinformation and Misinformation[8], developed for digital platforms by their industry association (known as DIGI).

This followed government recommendations arising out of a lengthy Australian Competition and Consumer Commission (ACCC) inquiry into digital platforms. This first effort at online regulation, built on an opt-in model, was a good start at stemming harmful content.

But voluntary codes have shortfalls. The most obvious is that not all platforms choose to participate, and some cherry-pick which areas of the code they will respond to.

The proposed changes

The Australian government is now seeking to deliver on a bipartisan[9] promise to strengthen the regulator’s powers to tackle online mis- and disinformation by shifting to a mandatory co-regulatory model.

Under the proposed changes, ACMA will be given new information gathering powers and capacity to formally request an industry association (such as DIGI) vary or replace codes that aren’t up to scratch.

Platform compliance with registered codes will be compulsory, and noncompliance will attract warnings, fines and, if unresolved, hefty court-approved penalties.

These penalties are steep – as much as 5% of a platform’s annual global turnover if repeatedly in breach of industry standards.

The move from voluntary to mandatory regulation in Australia is logical given the EU[10] has set the foundation for other countries to hold digital technology companies responsible for curbing mis- and disinformation on their platforms.

Questions remain

But the draft bill raises important questions to address before it’s legislated as planned for later this year. Among them are:

  • how best to define mis- and disinformation? (At present, the definitions differ from DIGI’s.)

  • how to deal with the interrelationship between mis- and disinformation, especially for election content? This matters because research[11] shows the same content can be labelled either “disinformation” or “misinformation” depending on the online user’s motive, which can be hard to divine

  • why exclude online news media content? Research has shown news media can also be a source of harmful misinformation (such as 2019 election stories about the “Death Tax[12]”).

While aiming to mitigate harmful mis- and disinformation is noble, how it will work in practice remains to be seen.

An important guard against unintended consequences is to ensure ACMA’s powers are carefully defined along with terms and likely circumstances requiring action, with mechanisms for appeal.

Public submissions close[13] August 6.

References

  1. ^ deadly consequences (www.ncbi.nlm.nih.gov)
  2. ^ COVID misinformation is a health risk – tech companies need to remove harmful content not tweak their algorithms (theconversation.com)
  3. ^ draft exposure bill (www.infrastructure.gov.au)
  4. ^ research (eprints.qut.edu.au)
  5. ^ commentators (www.skynews.com.au)
  6. ^ Criminal Code (cpj.org)
  7. ^ Why public trust in elections is being undermined by global disinformation campaigns (theconversation.com)
  8. ^ Code of Practice on Disinformation and Misinformation (digi.org.au)
  9. ^ bipartisan (www.paulfletcher.com.au)
  10. ^ the EU (digital-strategy.ec.europa.eu)
  11. ^ research (benjamins.com)
  12. ^ Death Tax (benjamins.com)
  13. ^ close (www.infrastructure.gov.au)

Read more https://theconversation.com/more-stick-less-carrot-australias-new-approach-to-tackling-fake-news-on-digital-platforms-209599
