The Times Australia


Conspiracy theorist tactics show it’s too easy to get around Facebook’s content policies

  • Written by Amelia Johns, Associate Professor, Digital and Social Media, School of Communication, University of Technology Sydney

During the COVID pandemic, social media platforms were swarmed by far-right and anti-vaccination communities that spread dangerous conspiracy theories.

These included the false claims that vaccines are a form of population control[1], and that the virus was a “deep state” plot[2]. Governments and the World Health Organization redirected precious resources from vaccination campaigns to debunk these falsehoods.

As the tide of misinformation grew, platforms were accused of not doing enough to stop the spread. To address these concerns, Meta, the parent company of Facebook, made several policy announcements in 2020–21. However, it hesitated to remove “borderline[3]” content – content that didn’t cause direct physical harm – save for one policy change[4] in February 2021 that expanded its content removal lists.

To stem the tide, Meta instead relied more heavily on algorithmic moderation techniques that reduce the visibility of misinformation in users’ feeds, search results and recommendations – a practice known as shadowbanning. It also used fact-checkers to label misinformation.

While shadowbanning is widely seen as a concerningly opaque technique[5], our new research[6], published in the journal Media International Australia, instead asks: was it effective?

What did we investigate?

We used two measures to answer this question. First, after identifying 18 Australian far-right and anti-vaccination accounts that consistently shared misinformation between January 2019 and July 2021, we analysed the performance of these accounts using key metrics.

Second, we mapped this performance against five content moderation policy announcements for Meta’s flagship platform, Facebook.

The findings revealed two divergent trends. After March 2020, the accounts’ overall performance – that is, their median performance – declined. Yet their mean performance increased after October 2020.

This is because, while the majority of the monitored accounts underperformed, a few overperformed, and strongly so. In fact, they continued to overperform and attract new followers even after the aforementioned policy change in February 2021.

Shadowbanning as a badge of pride

To examine why, we scraped and thematically analysed comments and user reactions from posts on these accounts. We found users had a high motivation to stay engaged with problematic content. Labelling and shadowbanning were viewed as motivating challenges.

Specifically, users frequently used “social steganography[7]” – deliberate typos or code words for key terms – to evade algorithmic detection. We also saw conspiracy “seeding”[8], where users added links to archiving sites or less moderated sites in comments to redistribute content Facebook had labelled as misinformation, and to avoid detection.

In one example, a user added a link to a BitChute[9] video with keywords that dog-whistled support for QAnon-style conspiracies. As terms such as “vaccine” were believed to trigger algorithmic detection, emoji or other code names were used in their place:

A friend sent me this link, it’s [sic.] refers to over 4000 deaths of individuals after getting 💉 The true number will not come out, it’s not in the public’s interest to disclose the amount of people that have died within day’s [sic.] of jab.

While many conspiracy theories were targeted at government and public health authorities, platform suppression of content fuelled further conspiracies regarding big tech and their complicity with “Big Pharma” and governments.

This was evident in the use of keywords such as MSM (“mainstream media”) to reference QAnon-style agendas:

MSM are in on this whole thing, only report on what the elites tell them to. Clearly you are not doing any research but listening to msm […] This is a completely experimental ‘vaccine’.

Another comment thread showed reactions to Meta’s dangerous organisations policy update[10], under which accounts that regularly shared QAnon content were labelled “extremist”. In the reactions, MSM and “the agenda” appeared frequently.

Read more: QAnon is spreading outside the US – a conspiracy theory expert explains what that could mean[11]

Some users recommended that sensitive content be moved to alternative platforms. We observed one anti-vaccination influencer complain that their page was being shadowbanned by Facebook and call on their followers to recommend a “good, censorship free, livestreaming platform”.

The replies suggested moderation-lite sites such as Rumble[12]. Similar recommendations were made for Twitch, a livestreaming site popular with gamers that has since attracted far-right political influencers[13].

As one user said:

I know so many people who get censored on so many apps especially Facebook and Twitch seems to work for them.

How can content moderation fix the problem?

These coordinated tactics – detecting shadowbans, resisting labelling and fighting the algorithm – provide some insight into why engagement didn’t dim on some of these “overperforming” accounts, despite all the policies Meta put in place.

This shows that Meta’s suppression techniques, while partially effective in containing the spread, do nothing to prevent those invested in sharing (and finding) misinformation from doing so.

Firmer policies on content removal and user banning would help address the problem. However, Meta’s announcement last year suggests[14] the company has little appetite for this. Any loosening of policy will all but ensure this misinformation playground continues to thrive.

Read more: A researcher asked COVID anti-vaxxers how they avoid Facebook moderation. Here's what they found[15]

Read more https://theconversation.com/conspiracy-theorist-tactics-show-its-too-easy-to-get-around-facebooks-content-policies-226118
