In a year of global elections, how do we stop the spread of misinformation? ‘Prebunking’ is part of the solution
- Written by Christopher Arnott, PhD Candidate, Griffith University
Half the global population are voting in elections in 2024[1]. Many already have. This has prompted concerns about fairness and electoral integrity, particularly with the growth of generative AI. A global tracker[2] has identified dozens of instances of AI-generated misinformation being used in elections so far this year.
One such example was in January, when residents of New Hampshire received a robocall[3] impersonating US President Joe Biden. AI platforms such as Eleven Labs[4] can produce convincing reproductions of anyone’s voice. In response, the US Federal Communications Commission effectively banned AI-generated robocalls[5].
However, banning AI-generated content is difficult, if not impossible. Speaking[6] about misinformation and generative AI in elections, Australia’s electoral commissioner, Tom Rogers, acknowledged the risks, but also emphasised “prebunking” as an essential tool for preventing misinformation.
So, what is prebunking and how will it help protect electoral integrity?
What is prebunking?
Prebunking is similar to debunking, but as the name suggests, occurs before misinformation is received.
Prebunking is based on the idea of psychological inoculation[7]. If we anticipate misinformation, and the tactics used, we can be better at identifying it. Similar to how a vaccine works, prebunking gives your brain the ability to recognise misinformation tactics.
Professor of social psychology Sander Van der Linden and his colleagues have developed a game called Bad News[8] to help people identify these tactics. Players act as a fake news tycoon who has 15 minutes to gain followers without losing credibility.
Studies[9] show 15 minutes of playing Bad News increases someone’s ability and confidence to detect misinformation.
The long-term efficacy[10] remains to be seen. However, what these studies demonstrate is that knowledge of misinformation tactics makes them easier to spot. And unfortunately, they are all too common.
Tactics in plain sight
For example, in last year’s Voice to Parliament referendum, Liberal National Party Senator Jacinta Nampijinpa Price suggested the Australian Electoral Commission remote booths had rigged the results[11]. Remote polling booths recorded a majority “yes” vote. This example demonstrates both attempts to create a conspiracy and discredit the electoral commission.
Earlier this year, the Tasmanian Liberal Party sought to impersonate Jacqui Lambie’s party[12]. In 2019, the Liberal Party also admitted[13] Chinese language signs were supposed to look like official electoral commission signs. Both are examples of impersonation.
Labor, too, has used these tactics in the past. In 2022, the party claimed[14] the cashless debit card would be expanded to aged pensioners. And in 2016 and 2022, there was the infamous Mediscare campaign[15], which said there were secret plans to privatise Medicare. Both campaigns used conspiracy and appeals to emotion.
Prebunking ensures voters can be vigilant. Unlike debunking, prebunking gives voters the capacity to recognise potential deception and manipulation. In March 2022, the electoral commission launched a disinformation register[16] to help combat political misinformation at the 2022 election. It’s focused on disinformation which undermines electoral integrity and confidence in Australia’s democracy. The AEC also published material to help voters understand disinformation tactics[17].
The prebunking attempts by the electoral commission do not comment on misinformation which deceives voters about candidates and policies. Recent elections show that misinformation tactics at Australian elections are as common as a cane toad. And just as ugly.
But what about debunking?
Debunking can be effective[18] in preventing people believing misinformation.
However, debunking is less effective when people have reasons to accept misinformation as true. Put simply, preexisting attitudes shape how a person evaluates new information when deciding whether it is false or misleading. People believe what they want to believe.
Further, some people strongly distrust[19] the media, and this attitude increases hostility towards fact checkers, whom they perceive as propagandists.
Repeated exposure to false claims can lead to people believing them. After all, we only use 10% of our brains. Just kidding! This stat about brain use is a common example of false claims[20] becoming accepted knowledge.
Studies[21] have shown that repeated exposure to misinformation can increase false and inaccurate beliefs, even if the stories point out the falsity.
A bit of both
Unfortunately, prebunking, like debunking, is not a silver bullet. Both show some effectiveness[22].
Prebunking can help teach people to spot manipulation. Unlike debunking, prebunking provides a framework for the sceptical to remain vigilant without resorting to conspiracies. Prebunking allows people to examine the motivations of persuaders. In doing so, it builds cognitive skills[23].
However, the research to date indicates prebunking effects may be short-lived[24]. One potential explanation is that participants may not spend sufficient time engaging with prebunking materials for the skills to become a habit[25].
In contrast, while debunking is helpful, the effects are more pronounced among those who already believe and trust that fact checkers are not part of a government conspiracy. Emerging evidence[26] suggests repeated exposure to corrected information can produce changes in attitudes over time.
Read more: Can we be inoculated against climate misinformation? Yes – if we prebunk rather than debunk[27]
Cognitive psychology indicates “belief updating” occurs when beliefs and attitudes are weighed against new information. Known as Bayesian inference[28], this process takes new information and assesses how it reflects existing beliefs.
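As a toy illustration of this updating process (the numbers here are hypothetical, not from any study cited above), Bayes’ rule says a person’s new confidence in a claim is their prior confidence weighted by how likely the evidence would be if the claim were true versus false:

```python
# Toy sketch of Bayesian belief updating with made-up numbers.
# Bayes' rule: P(true | evidence) =
#   P(evidence | true) * P(true) / P(evidence)

def update_belief(prior, likelihood_if_true, likelihood_if_false):
    """Return the posterior probability that a claim is true."""
    # Total probability of seeing the evidence at all.
    evidence = likelihood_if_true * prior + likelihood_if_false * (1 - prior)
    return likelihood_if_true * prior / evidence

# A voter who is 60% sure a claim is true sees a fact-check that would be
# far more likely to appear if the claim were false.
posterior = update_belief(prior=0.6,
                          likelihood_if_true=0.1,
                          likelihood_if_false=0.8)
print(round(posterior, 2))  # 0.16
```

In this sketch, strong counter-evidence drags belief from 60% down to about 16%; but notice that if the prior were near 1, the same evidence would barely move it, which is one way to think about why debunking struggles against entrenched beliefs.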
An example of this updating is climate change. In 2012, 64%[29] of Australians surveyed accepted climate change as real. In 2021, that figure grew to 81%. Over time, the Australian population has updated its views on climate change. This is likely due, at least in part, to a mixture of both prebunking and debunking.
While the next federal election isn’t likely to be held until 2025, prebunking can build voters’ confidence in their ability to identify misinformation. Luckily, these common tactics are easily spotted once you know what to look for. With ten months before the next election, there’s plenty of time to practise.
References
- ^ elections in 2024 (theconversation.com)
- ^ global tracker (www.wired.com)
- ^ robocall (www.nbcnews.com)
- ^ Eleven Labs (elevenlabs.io)
- ^ banned AI-generated robocalls (www.forbes.com)
- ^ Speaking (www.abc.net.au)
- ^ psychological inoculation (www.ncbi.nlm.nih.gov)
- ^ Bad News (www.getbadnews.com)
- ^ Studies (www.tandfonline.com)
- ^ long-term efficacy (api.repository.cam.ac.uk)
- ^ rigged the results (www.theguardian.com)
- ^ impersonate Jacqui Lambie’s party (theconversation.com)
- ^ admitted (www.theguardian.com)
- ^ claimed (www.abc.net.au)
- ^ Mediscare campaign (www.abc.net.au)
- ^ disinformation register (www.aec.gov.au)
- ^ tactics (www.aec.gov.au)
- ^ effective (journals.sagepub.com)
- ^ distrust (www.tandfonline.com)
- ^ false claims (www.psychologicalscience.org)
- ^ Studies (wires.onlinelibrary.wiley.com)
- ^ effectiveness (www.taylorfrancis.com)
- ^ cognitive skills (journals.sagepub.com)
- ^ short-lived (api.repository.cam.ac.uk)
- ^ habit (jyx.jyu.fi)
- ^ evidence (link.springer.com)
- ^ Can we be inoculated against climate misinformation? Yes – if we prebunk rather than debunk (theconversation.com)
- ^ Bayesian inference (seeing-theory.brown.edu)
- ^ 64% (australiainstitute.org.au)