How 'engagement' makes you vulnerable to manipulation and misinformation on social media

  • Written by Filippo Menczer, Luddy Distinguished Professor of Informatics and Computer Science, Indiana University
Facebook has been quietly experimenting[1] with reducing the amount of political content it puts in users’ news feeds. The move is a tacit acknowledgment that the way the company’s algorithms work can be a problem[2].

The heart of the matter is the distinction between provoking a response and providing content people want. Social media algorithms – the rules their computers follow in deciding the content that you see – rely heavily on people’s behavior to make these decisions. In particular, they watch for content that people respond to or “engage” with by liking, commenting and sharing.

As a computer scientist[3] who studies the ways large numbers of people interact using technology, I understand the logic of using the wisdom of the crowds[4] in these algorithms. I also see substantial pitfalls in how the social media companies do so in practice.

From lions on the savanna to likes on Facebook

The concept of the wisdom of crowds assumes that using signals from others’ actions, opinions and preferences as a guide will lead to sound decisions. For example, collective predictions[5] are normally more accurate than individual ones. Collective intelligence is used to predict financial markets, sports[6], elections[7] and even disease outbreaks[8].

Throughout millions of years of evolution, these principles have been coded into the human brain in the form of cognitive biases that come with names like familiarity[9], mere-exposure[10] and bandwagon effect[11]. If everyone starts running, you should also start running; maybe someone saw a lion coming and running could save your life. You may not know why, but it’s wiser to ask questions later.

Your brain picks up clues from the environment – including your peers – and uses simple rules[12] to quickly translate those signals into decisions: Go with the winner, follow the majority, copy your neighbor. These rules work remarkably well in typical situations because they are based on sound assumptions. For example, they assume that people often act rationally, it is unlikely that many are wrong, the past predicts the future, and so on.

Technology allows people to access signals from much larger numbers of other people, most of whom they do not know. Artificial intelligence applications make heavy use of these popularity or “engagement” signals, from selecting search engine results to recommending music and videos, and from suggesting friends to ranking posts on news feeds.

Not everything viral deserves to be

Our research shows that virtually all web technology platforms, such as social media and news recommendation systems, have a strong popularity bias[13]. When applications are driven by cues like engagement rather than explicit search engine queries, popularity bias can lead to harmful unintended consequences.

Social media like Facebook, Instagram, Twitter, YouTube and TikTok rely heavily on AI algorithms to rank and recommend content. These algorithms take as input what you “like,” comment on and share – in other words, content you engage with. The goal of the algorithms is to maximize engagement by finding out what people like and ranking it at the top of their feeds.
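
As a rough illustration of that logic, a purely engagement-driven ranker can be sketched in a few lines of Python. The post fields and weights below are hypothetical assumptions chosen only to show the idea; they are not any platform's actual formula.

    # A minimal, hypothetical engagement-based feed ranker. The weights and
    # post fields are illustrative assumptions, not any platform's real formula.
    from dataclasses import dataclass

    @dataclass
    class Post:
        text: str
        likes: int
        comments: int
        shares: int

    def engagement_score(post: Post) -> float:
        # Weight deeper interactions (comments, shares) more heavily than likes.
        return 1.0 * post.likes + 3.0 * post.comments + 5.0 * post.shares

    def rank_feed(posts: list[Post]) -> list[Post]:
        # Most-engaged-with content goes first; quality never enters the score.
        return sorted(posts, key=engagement_score, reverse=True)

    feed = rank_feed([
        Post("Careful explainer from a newsroom", likes=40, comments=2, shares=1),
        Post("Outrage-bait rumor", likes=90, comments=60, shares=45),
    ])
    print([post.text for post in feed])  # the rumor ranks first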

[Video: A primer on the Facebook algorithm.]

On the surface this seems reasonable. If people like credible news, expert opinions and fun videos, these algorithms should identify such high-quality content. But the wisdom of the crowds makes a key assumption here: that recommending what is popular will help high-quality content “bubble up.”

We tested this assumption[14] by studying an algorithm that ranks items using a mix of quality and popularity. We found that in general, popularity bias is more likely to lower the overall quality of content. The reason is that engagement is not a reliable indicator of quality when few people have been exposed to an item. In these cases, engagement generates a noisy signal, and the algorithm is likely to amplify this initial noise. Once the popularity of a low-quality item is large enough, it will keep getting amplified.
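
The sketch below is a toy simulation of this dynamic, not the model from the study: items are scored by a mix of popularity and quality, users mostly see whatever is ranked first, and each view turns into an engagement only with a probability tied to quality. Because the first engagements are essentially coin flips, a lower-quality item that gets lucky early can take the top slot and keep it. All of the numbers are assumptions for illustration.

    # Toy simulation of popularity bias (an illustrative sketch with assumed
    # parameters, not the model from the study cited above).
    import random

    QUALITIES = (0.6, 0.4)   # item 0 is genuinely better than item 1

    def score(i, engagements, popularity_weight=0.9):
        # Rank by a mix of popularity (share of all engagement) and quality.
        total = sum(engagements) or 1
        return (popularity_weight * engagements[i] / total
                + (1 - popularity_weight) * QUALITIES[i])

    def simulate(seed, steps=2000):
        random.seed(seed)
        engagements = [0, 0]
        for step in range(steps):
            if step < 20 or random.random() < 0.1:
                shown = random.randrange(2)   # cold start / occasional exploration
            else:
                shown = max((0, 1), key=lambda i: score(i, engagements))
            if random.random() < QUALITIES[shown]:   # engagement is a noisy quality signal
                engagements[shown] += 1
        return engagements

    # Across random seeds, the low-quality item sometimes wins the early coin
    # flips and then stays on top purely because it is already popular.
    for seed in range(8):
        high, low = simulate(seed)
        print(f"seed {seed}: high-quality {high} engagements, low-quality {low}")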

Algorithms aren’t the only thing affected by engagement bias – it can affect people[15], too. Evidence shows that information is transmitted via “complex contagion[16],” meaning the more times someone is exposed to an idea online, the more likely they are to adopt and reshare it. When social media tells people an item is going viral, their cognitive biases kick in and translate into the irresistible urge to pay attention to it and share it.
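
A toy version of that idea: the chance of resharing stays low after a single exposure but climbs steeply with repeated exposures. The specific probabilities below are assumptions chosen for illustration, not measured values.

    # Toy "complex contagion" curve: one exposure rarely convinces anyone,
    # but repeated exposures push the chance of resharing up sharply.
    # The numbers are illustrative assumptions, not empirical estimates.
    def reshare_probability(exposures: int) -> float:
        if exposures <= 1:
            return 0.02                                  # a single exposure is easy to ignore
        return min(0.7, 0.02 + 0.15 * (exposures - 1))   # reinforcement compounds

    for n in (1, 2, 4, 8):
        print(f"{n} exposures -> {reshare_probability(n):.2f} chance of resharing")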

Not-so-wise crowds

We recently ran an experiment using a news literacy app called Fakey[17]. It is a game developed by our lab that simulates a news feed like those of Facebook and Twitter. Players see a mix of current articles from fake news, junk science, hyper-partisan and conspiratorial sources, as well as from mainstream sources. They get points for sharing or liking news from reliable sources and for flagging low-credibility articles for fact-checking.

We found that players are more likely to like or share and less likely to flag[18] articles from low-credibility sources when they can see that many other users have engaged with those articles. Exposure to the engagement metrics thus creates a vulnerability.

The wisdom of the crowds fails because it is built on the false assumption that the crowd is made up of diverse, independent sources. There may be several reasons this is not the case.

First, because of people’s tendency to associate with similar people, their online neighborhoods are not very diverse. The ease with which a social media user can unfriend those with whom they disagree pushes people into homogeneous communities, often referred to as echo chambers[19].

Second, because many people’s friends are friends of each other, they influence each other. A famous experiment[20] demonstrated that knowing what music your friends like affects your own stated preferences. Your social desire to conform distorts your independent judgment.

Third, popularity signals can be gamed. Over the years, search engines have developed sophisticated techniques to counter so-called “link farms[21]” and other schemes to manipulate search algorithms. Social media platforms, on the other hand, are just beginning to learn about their own vulnerabilities[22].

People aiming to manipulate the information market have created fake accounts[23], like trolls and social bots[24], and organized[25] fake networks[26]. They have flooded the network[27] to create the appearance that a conspiracy theory[28] or a political candidate[29] is popular, tricking both platform algorithms and people’s cognitive biases at once. They have even altered the structure of social networks[30] to create illusions about majority opinions[31].


Dialing down engagement

What to do? Technology platforms are currently on the defensive. During elections they are becoming more aggressive[33] in taking down fake accounts and harmful misinformation[34]. But these efforts can be akin to a game of whack-a-mole[35].

A different, preventive approach would be to add friction[36]. In other words, to slow down the process of spreading information. High-frequency behaviors such as automated liking and sharing could be inhibited by CAPTCHA[37] tests or fees. This would not only decrease opportunities for manipulation, but with less information people would be able to pay more attention to what they see. It would leave less room for engagement bias to affect people’s decisions.
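
As a rough sketch of what such friction could look like, the hypothetical rate limiter below flags accounts that share faster than a person plausibly could and routes them to an extra verification step. The thresholds, names and the idea of gating on a rolling one-minute window are assumptions for illustration, not any platform's actual policy.

    # Hypothetical friction mechanism: if an account shares faster than a human
    # plausibly could, require an extra check (e.g., a CAPTCHA-style challenge)
    # before the action goes through. Thresholds and names are illustrative.
    from collections import defaultdict, deque

    MAX_SHARES = 10         # allowed shares...
    WINDOW_SECONDS = 60.0   # ...per rolling minute (assumed threshold)

    recent_shares = defaultdict(deque)   # account id -> timestamps of recent shares

    def needs_challenge(account_id: str, now: float) -> bool:
        # Return True if this share should be gated behind extra verification.
        timestamps = recent_shares[account_id]
        while timestamps and now - timestamps[0] > WINDOW_SECONDS:
            timestamps.popleft()         # forget shares outside the window
        timestamps.append(now)
        return len(timestamps) > MAX_SHARES

    # An automated account firing off a share every 0.2 seconds trips the limit
    # almost immediately; a typical human user never would.
    for i in range(30):
        if needs_challenge("suspect-account", now=i * 0.2):
            print(f"share {i}: extra verification required")
            break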

It would also help if social media companies adjusted their algorithms to rely less on engagement to determine the content they serve you.

References

  1. ^ quietly experimenting (about.fb.com)
  2. ^ can be a problem (www.wired.com)
  3. ^ computer scientist (scholar.google.com)
  4. ^ wisdom of the crowds (www.penguinrandomhouse.com)
  5. ^ collective predictions (www.investopedia.com)
  6. ^ financial markets, sports (augur.net)
  7. ^ elections (iemweb.biz.uiowa.edu)
  8. ^ disease outbreaks (www.centerforhealthsecurity.org)
  9. ^ familiarity (doi.org)
  10. ^ mere-exposure (socialpsychonline.com)
  11. ^ bandwagon effect (www.psychologytoday.com)
  12. ^ simple rules (global.oup.com)
  13. ^ popularity bias (doi.org)
  14. ^ tested this assumption (doi.org)
  15. ^ affect people (www.scientificamerican.com)
  16. ^ complex contagion (doi.org)
  17. ^ a news literacy app called Fakey (fakey.iuni.iu.edu)
  18. ^ more likely to like or share and less likely to flag (doi.org)
  19. ^ echo chambers (doi.org)
  20. ^ famous experiment (doi.org)
  21. ^ link farms (www.webopedia.com)
  22. ^ vulnerabilities (theconversation.com)
  23. ^ fake accounts (www.washingtonpost.com)
  24. ^ social bots (cacm.acm.org)
  25. ^ organized (ojs.aaai.org)
  26. ^ fake networks (www.washingtonpost.com)
  27. ^ flooded the network (doi.org)
  28. ^ conspiracy theory (www.newsweek.com)
  29. ^ political candidate (ojs.aaai.org)
  30. ^ altered the structure of social networks (doi.org)
  31. ^ illusions about majority opinions (doi.org)
  33. ^ aggressive (www.nytimes.com)
  34. ^ taking down fake accounts and harmful misinformation (www.socialmediatoday.com)
  35. ^ whack-a-mole (www.marketplace.org)
  36. ^ friction (www.theguardian.com)
  37. ^ CAPTCHA (www.cloudflare.com)

Read more https://theconversation.com/how-engagement-makes-you-vulnerable-to-manipulation-and-misinformation-on-social-media-145375
