
What is shadowbanning? How do I know if it has happened to me, and what can I do about it?

  • Written by Marten Risius, Senior Lecturer in Business Information Systems, The University of Queensland

Tech platforms use recommender algorithms to control society’s key resource: attention[1]. With these algorithms they can quietly demote or hide certain content instead of just blocking or deleting it[2]. This opaque practice is called “shadowbanning”.

While platforms will often deny they engage in shadowbanning, there’s plenty of evidence it’s well and truly present. And it’s a problematic form of content moderation[3] that desperately needs oversight.

What is shadowbanning?

Simply put, shadowbanning is when a platform reduces the visibility of content without alerting the user[4]. The content can often still be accessed, but conditions are placed on how it circulates.

It may no longer appear as a recommendation, in a search result, in a news feed, or in other users’ content queues[5]. One example would be burying a comment underneath many[6] others[7].
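
To make the mechanics concrete, here is a minimal, purely hypothetical sketch of how demotion can work inside a feed-ranking algorithm: the post is never deleted, it simply scores too low to surface. The scoring formula and the "visibility_factor" field are illustrative assumptions, not any platform's actual system.

```python
# Illustrative only: a toy feed-ranking function showing how quiet demotion can work.
# Nothing here reflects a real platform's algorithm.

from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    predicted_engagement: float   # e.g. estimated interactions per impression
    visibility_factor: float = 1.0  # hypothetical: < 1.0 means the post has been demoted

def rank_feed(posts: list[Post]) -> list[Post]:
    """Order posts for a user's feed; demoted posts sink rather than disappear."""
    return sorted(
        posts,
        key=lambda p: p.predicted_engagement * p.visibility_factor,
        reverse=True,
    )

feed = rank_feed([
    Post("a", predicted_engagement=0.9),
    Post("b", predicted_engagement=0.8, visibility_factor=0.1),  # shadowbanned
    Post("c", predicted_engagement=0.5),
])
print([p.post_id for p in feed])  # ['a', 'c', 'b'] - post "b" is buried, not blocked
```

The point of the sketch is that the demoted post remains on the platform and can still be reached directly; it simply stops being surfaced to other users.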

The term “shadowbanning” first appeared in 2001, when it referred to making posts invisible to everyone except the poster in an online forum[8]. Today’s version of it (where content is demoted through algorithms) is much more nuanced[9].

Shadowbans are distinct from other moderation approaches in a number of ways. They are:

  • usually algorithmically enforced
  • informal, in that they are not explicitly communicated[10]
  • ambiguous, since they don’t decisively punish users who violate platform policies.

Which platforms shadowban content?

Platforms such as Instagram, Facebook[11] and Twitter[12] generally deny performing shadowbans, but typically do so by referring to the original 2001 understanding of it[13].

When shadowbanning has been reported, platforms have explained it away by citing technical glitches, users’ failure to create engaging content, or the chance outcomes of black-box algorithms[14].

That said, most platforms will admit to visibility reduction[15] or “demotion” of content. And that’s still shadowbanning as the term is now used.

In 2018, Facebook and Instagram became the first major platforms to admit[16] they algorithmically reduced user engagement with “borderline” content[17] – which in Meta CEO Mark Zuckerberg’s words included “sensationalist and provocative content”.

YouTube, Twitter, LinkedIn and TikTok[18] have since announced similar strategies to deal with sensitive content[19].

In one survey[20] of 1,006 social media users, 9.2% reported they had been shadowbanned. Of these, 8.1% were on Facebook, 4.1% on Twitter, 3.8% on Instagram, 3.2% on TikTok, 1.3% on Discord, 1% on Tumblr and less than 1% on YouTube, Twitch, Reddit, NextDoor, Pinterest, Snapchat and LinkedIn.

Further evidence for shadowbanning comes from surveys[21], interviews[22], internal whistle-blowers[23], information leaks[24], investigative[25] journalism[26] and empirical analyses[27] by researchers[28].

Why do platforms shadowban?

Experts think shadowbanning by platforms likely increased in response to criticism of big tech’s inadequate handling of misinformation[29]. Over time moderation has become an increasingly politicised issue, and shadowbanning offers an easy way out.

The goal is to mitigate content that’s “lawful but awful”. This content trades under different names across platforms, whether it’s dubbed[30] “borderline”, “sensitive”, “harmful”, “undesirable” or “objectionable”.

Through shadowbanning, platforms can dodge accountability and avoid outcries over “censorship”. At the same time, they still benefit financially from shadowbanned content that’s perpetually sought out[31].

Who gets shadowbanned?

Recent[32] studies[33] have found between 3% and 6.2% of sampled Twitter accounts had been shadowbanned at least once.

The research identified specific characteristics that increased the likelihood of posts or accounts being shadowbanned:

  • new accounts (less than two weeks old) with fewer followers (below 200)
  • uncivil language being used, such as negative or offensive terms
  • pictures being posted without text
  • accounts displaying bot-like behaviour.

On Twitter, having a verified account (a blue checkmark) reduced the chances of being[34] shadowbanned[35].

Of particular concern is evidence that shadowbanning disproportionately targets people in marginalised groups. In 2020 TikTok had to apologise for marginalising the black community through its “Black Lives Matter” filter[36]. In 2021, TikTok users reported that using the word “Black” in their bio page would lead to their content being flagged as “inappropriate[37]”. And in February 2022, keywords related to the LGBTQ+ movement[38] were found to be shadowbanned.

Overall, Black, LGBTQ+ and Republican users report more frequent and harsher content moderation across Facebook, Twitter, Instagram and TikTok[39].

How can you know if you’ve been shadowbanned?

Detecting shadowbanning is difficult. However, there are some ways you can try to figure out if it has happened to you:

  • rank the performance of the content in question against your “normal” engagement[40] levels[41] – if a certain post has greatly under-performed for no obvious reason, it may have been shadowbanned (a rough sketch of this check appears after this list)

  • ask others to use their accounts to search for your content – but keep in mind if they’re a “friend” or “follower” they may still be able to see your shadowbanned content, whereas other users may not

  • benchmark your content’s reach against content from others who have comparable engagement – for instance, a black content creator can compare their TikTok views to those of a white creator with a similar following

  • refer to shadowban detection tools available for different platforms such as Reddit[42] (r/CommentRemovalChecker) or Twitter (hisubway[43]).
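
To illustrate the first tip, below is a rough sketch of benchmarking a post against your usual reach. The threshold, the use of a median baseline and the view counts are all illustrative assumptions; real engagement figures would come from each platform’s own analytics.

```python
# A rough way to compare a post against your "normal" engagement baseline.
# The 25% threshold and the median baseline are illustrative assumptions only.

from statistics import median

def looks_demoted(recent_view_counts: list[int], new_post_views: int,
                  threshold: float = 0.25) -> bool:
    """Flag a post whose reach is far below your typical baseline."""
    baseline = median(recent_view_counts)
    if baseline == 0:
        return False  # not enough history to judge
    return new_post_views < threshold * baseline

# Example: posts usually get around 1,000 views, but the new one got 80.
print(looks_demoted([950, 1200, 1100, 980, 1040], 80))  # True - worth investigating
```

A result like this is only a hint, not proof: under-performance can also come from timing, topic or simple chance.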

Read more: Deplatforming online extremists reduces their followers – but there's a price[44]

What can users do about shadowbanning?

Shadowbans last for varying amounts of time depending on the demoted content and platform. On TikTok, they’re said to[45] last about two weeks. If your account or content is shadowbanned, there aren’t many options to immediately reverse this.

But some strategies can help reduce the chance of it happening, as researchers have found[46]. One is to self-censor. For instance, users may avoid ethnic identification labels such as “AsianWomen”.

Users can also experiment with external tools that estimate the likelihood of content being flagged, and then manipulate the content so it’s less likely to be picked up by algorithms. If certain terms are likely to be flagged, they’ll use phonetically similar alternatives, like “S-E-G-G-S” instead of “sex”.
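
As a rough illustration of what such an external checking tool might do, the sketch below scans a draft caption against a list of terms assumed to attract demotion. The term list and the simple word-matching rule are hypothetical; they are not drawn from any platform’s real filters.

```python
# A minimal sketch of a hypothetical "flag likelihood" checker: it reports which
# words in a draft caption appear on an assumed watch list. The list below is
# purely illustrative and not based on any platform's actual moderation rules.

import re

ASSUMED_SENSITIVE_TERMS = {"sex", "gun", "drugs"}  # illustrative assumption only

def terms_likely_to_be_flagged(caption: str) -> set[str]:
    """Return any words in the caption that appear on the assumed watch list."""
    words = set(re.findall(r"[a-z']+", caption.lower()))
    return words & ASSUMED_SENSITIVE_TERMS

draft = "New podcast episode on sex education in schools"
print(terms_likely_to_be_flagged(draft))  # {'sex'} - the user might choose to rephrase
```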

Shadowbanning impairs the free exchange of ideas and excludes minorities. It can be exploited by trolls falsely flagging content. It can cause financial harm to users trying to monetise content. It can even trigger emotional distress[47] through isolation.

As a first step, we need to demand transparency from platforms on their shadowbanning policies and enforcement. This practice has potentially severe ramifications for individuals and society. To fix it, we’ll need to scrutinise it with the thoroughness it deserves.


References

  1. ^ attention (onlinelibrary.wiley.com)
  2. ^ blocking or deleting it (ieeexplore.ieee.org)
  3. ^ moderation (law.yale.edu)
  4. ^ without alerting the user (journals.sagepub.com)
  5. ^ content queues (law.yale.edu)
  6. ^ many (yahootechpulse.easychair.org)
  7. ^ others (apo.org.au)
  8. ^ in an online forum (journals.sagepub.com)
  9. ^ much more nuanced (law.yale.edu)
  10. ^ not explicitly communicated (journals.sagepub.com)
  11. ^ Instagram, Facebook (www.businessinsider.com)
  12. ^ Twitter (blog.twitter.com)
  13. ^ 2001 understanding of it (law.yale.edu)
  14. ^ through black-box algorithms (www.tandfonline.com)
  15. ^ visibility reduction (ieeexplore.ieee.org)
  16. ^ platforms to admit (www.facebook.com)
  17. ^ borderline” content (journals.sagepub.com)
  18. ^ TikTok (onlinelibrary.wiley.com)
  19. ^ sensitive content (journals.sagepub.com)
  20. ^ In one survey (cdt.org)
  21. ^ surveys (files.osf.io)
  22. ^ interviews (journals.sagepub.com)
  23. ^ whistle-blowers (www.technologyreview.com)
  24. ^ leaks (journals.sagepub.com)
  25. ^ investigative (www.vice.com)
  26. ^ journalism (www.vice.com)
  27. ^ analyses (papers.ssrn.com)
  28. ^ researchers (ieeexplore.ieee.org)
  29. ^ inadequate handling of misinformation (journals.sagepub.com)
  30. ^ it’s dubbed (www.tandfonline.com)
  31. ^ sought out (journals.sagepub.com)
  32. ^ Recent (papers.ssrn.com)
  33. ^ studies (ieeexplore.ieee.org)
  34. ^ chances of being (papers.ssrn.com)
  35. ^ shadowbanned (www.washingtonpost.com)
  36. ^ filter (newsroom.tiktok.com)
  37. ^ inappropriate (www.nbcnews.com)
  38. ^ to the LGBTQ+ movement (www.dw.com)
  39. ^ TikTok (files.osf.io)
  40. ^ engagement (files.osf.io)
  41. ^ levels (law.yale.edu)
  42. ^ Reddit (www.online-tech-tips.com)
  43. ^ hisubway (files.osf.io)
  44. ^ Deplatforming online extremists reduces their followers – but there's a price (theconversation.com)
  45. ^ said to (blog.hootsuite.com)
  46. ^ as researchers have found (journals.sagepub.com)
  47. ^ trigger emotional distress (files.osf.io)

Read more https://theconversation.com/what-is-shadowbanning-how-do-i-know-if-it-has-happened-to-me-and-what-can-i-do-about-it-192735
