
The AEC wants to stop AI and misinformation. But it’s up against a problem that is deep and dark

  • Written by Susan Grantham, Lecturer in Communication, Griffith University

From the moment you open your social media feed, you’re stepping into a digital battleground where not all political messages are what they seem.

The upcoming federal election will see an influx of deepfakes, doctored images, and tailored narratives that blur the line between fact and fiction.

Last week, the Australian Electoral Commission (AEC) relaunched its Stop and Consider[1] campaign. The campaign urges voters to pause and reflect, particularly regarding information about how to vote. But its message applies to all forms of misinformation.

‘Stop and Consider’ factsheet. Australian Electoral Commission

AEC Commissioner Jeff Pope warns:

A federal election must be held in the next few months, so now is the perfect time to encourage all Australians to have a healthy degree of scepticism when it comes to what they see, hear or read.

The simple directives outlined in this campaign are designed to slow the spread of misleading information in a digital age where algorithms boost engagement at speed.

So how effective is it likely to be in helping voters sift the real from the fake? While the campaign benefits from the AEC’s credibility and its accessible message, it also faces significant hurdles.

Digital deception in action

In 2024, AI made a notable impact on international political campaigns.

In the US, the Federal Communications Commission fined a political consultant[2] $6 million for orchestrating fake robocalls that featured an AI-generated deepfake of President Joe Biden’s voice.

During India’s 2024 election, Meta (which owns Facebook) approved AI-manipulated ads spreading disinformation and hate[3], exacerbating divisive narratives and highlighting its failure to regulate harmful content.

Meanwhile, the Australian Labor Party deployed an AI-generated video[4] of opposition leader Peter Dutton as part of its online efforts.

Additionally, the Liberal Party has again engaged the duo Topham Guerin[5], known for their use of AI and controversial political tactics.

Political leaders[6] are increasingly turning to platforms like TikTok[7] to attract votes. But TikTok’s design encourages endless scrolling, making it easy for users to miss subtle inaccuracies.

Adding to these concerns is a recent scam[8] in which doctored images and fabricated celebrity headlines were circulated, creating an illusion of legitimacy and defrauding many Australians of their money.

These incidents are a stark reminder of how quickly digital manipulation can mislead, whether in commercial scams or political messaging.

Sophie Monk was one of the celebrities who featured in a recent online scam. Daily Mail

But are we taking it seriously?

South Korea has taken a decisive stance against AI-generated deepfakes in political campaigns[9] by banning them outright. Penalties include up to seven years in prison or fines of 50 million won (A$55,400). This measure forms part of a broader legal framework designed to enforce transparency, accountability, and ethical AI use.

In Australia, teal independents are calling for stricter truth in political advertising laws[10]. The proposed laws aim to impose civil penalties for misleading political ads, including disinformation and hate speech.

However, combating misinformation created by anonymous or unknown parties, such as AI-generated deepfakes, remains a challenge that may require further regulatory measures and technological solutions[11].

All of this is unfolding at a time when the approach to fact-checking is itself in flux. In January, Meta made headlines by scrapping its third-party fact-checking program in the US in favour of a “community notes” system. The change was championed by CEO Mark Zuckerberg as a way to reduce censorship and protect free expression.

However, critics warn that without independent oversight[12], misinformation could spread more easily, potentially leading to a surge in hate speech and harmful rhetoric. These shifts in digital policy only add to the challenge of ensuring that voters receive reliable information.

So, will the AEC’s campaign have any effect?

Amid these challenges, the “Stop and Consider” campaign arrives at a critical moment. Yet despite scholars’ repeated calls[13] to embed digital literacy in school curriculums[14] and community programs, these recommendations often go unheard.

The campaign is a positive step, offering guidance in an era of rapid digital manipulation. The simple message – to pause and verify political content – can help foster a more discerning electorate.

However, given the volume of misinformation and sophisticated targeting techniques, the campaign alone is unlikely to be a silver bullet. Political campaigns are growing ever more sophisticated. With the introduction of anonymous deepfakes, voters, educators, regulators, and platforms must work together to ensure the truth isn’t lost in digital noise.

A robust foundation in digital literacy is vital, not only for this campaign to work but also to help society distinguish credible sources from deceptive content. We must empower future voters to navigate the complexities of our digital world and engage more fully in democracy.

Globally, diverse strategies provide valuable insights.

While Australia’s “Stop and Consider” campaign takes a reflective approach, Sweden’s “Bli inte lurad[15]” initiative is refreshingly direct. It warns citizens: “Don’t be fooled.”

By delivering clear, actionable tips to spot scams and misleading content[16], the Swedish model leverages its strong tradition of public education and consumer protection.

This no-nonsense strategy reinforces digital literacy efforts. It also highlights that safeguarding the public from digital manipulation requires both proactive education and robust regulatory measures.

It may be time for Australian regulators to act decisively to protect the integrity of democracy.

References

  1. ^ Stop and Consider (www.aec.gov.au)
  2. ^ the Federal Communications Commission fined a political consultant (www.reuters.com)
  3. ^ Meta (which owns Facebook) approved AI-manipulated ads spreading disinformation and hate (www.theguardian.com)
  4. ^ AI-generated video (theconversation.com)
  5. ^ again engaged duo Topham Guerin (7ampodcast.com.au)
  6. ^ Political leaders (theconversation.com)
  7. ^ platforms like TikTok (www.tandfonline.com)
  8. ^ recent scam in which (www.dailymail.co.uk)
  9. ^ decisive stance against AI-generated deepfakes in political campaigns (www.nec.go.kr)
  10. ^ truth in political advertising laws (www.afr.com)
  11. ^ may require further regulatory measures and technological solutions (www.theguardian.com)
  12. ^ critics warn that without independent oversight (theconversation.com)
  13. ^ Yet despite scholars’ repeated calls (theconversation.com)
  14. ^ embed digital literacy in school curriculums (theconversation.com)
  15. ^ Bli inte lurad (bliintelurad.se)
  16. ^ clear, actionable tips to spot scams and misleading content (www.theguardian.com)

Read more https://theconversation.com/the-aec-wants-to-stop-ai-and-misinformation-but-its-up-against-a-problem-that-is-deep-and-dark-248773
