The Times Australia

Facebook became Meta – and the company's dangerous behavior came into sharp focus in 2021: 4 essential reads

  • Written by Eric Smalley, Science + Technology Editor

Meta, née Facebook, had a rough year in 2021, in public opinion[1] if not financially[2]. Revelations from whistleblower Frances Haugen, first detailed in a Wall Street Journal investigative series[3] and then presented in congressional testimony[4], show that the company was aware of the harm it was causing.

Growing concerns about misinformation, emotional manipulation and psychological harm came to a head this year when Haugen released internal company documents showing that the company’s own research confirmed the societal and individual harm its Facebook, Instagram and WhatsApp platforms cause.

The Conversation gathered four articles from our archives that delve into research that explains Meta’s problematic behavior.

1. Addicted to engagement

At the root of Meta’s harmfulness is its set of algorithms, the rules the company uses to choose what content you see. The algorithms are designed to boost the company’s profits, but they also allow misinformation to thrive.

The algorithms work by increasing engagement – in other words, by provoking a response from the company’s users. Indiana University’s Filippo Menczer[5], who studies the spread of information and misinformation in social networks, explains that engagement plays into people’s tendency to favor posts that seem popular. “When social media tells people an item is going viral, their cognitive biases kick in[6] and translate into the irresistible urge to pay attention to it and share it,” he wrote.

One result is that low-quality information that gets an initial boost can garner more attention than it otherwise deserves. Worse, this dynamic can be gamed by people aiming to spread misinformation.

“People aiming to manipulate the information market have created fake accounts, like trolls and social bots, and organized fake networks,” Menczer wrote. “They have flooded the network to create the appearance that a conspiracy theory or a political candidate is popular, tricking both platform algorithms and people’s cognitive biases at once.”
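The dynamic Menczer describes can be sketched as a toy "rich get richer" simulation. Everything below is an illustrative assumption, not the platforms' actual ranking code: attention is allocated in proportion to a post's current share count, so a post seeded with a fake early boost keeps attracting shares that an identical, unboosted post never sees.

```python
import random

random.seed(42)

# Two posts with identical quality; one was seeded with 50 fake shares
# by coordinated accounts, the other grew organically from a single share.
posts = {
    "organic": {"quality": 0.2, "shares": 1},
    "boosted": {"quality": 0.2, "shares": 50},
}

for _ in range(10_000):
    # Popularity-weighted pick: attention goes to what already looks viral.
    total = sum(p["shares"] for p in posts.values())
    r = random.uniform(0, total)
    for p in posts.values():
        r -= p["shares"]
        if r <= 0:
            # The user shares it only if its quality also convinces them.
            if random.random() < p["quality"]:
                p["shares"] += 1
            break

print("organic:", posts["organic"]["shares"])
print("boosted:", posts["boosted"]["shares"])
```

Even though both posts are equally (un)convincing, the artificially boosted one ends up with far more shares, because every share it gains increases its chance of being seen again.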

Read more: Facebook whistleblower Frances Haugen testified that the company's algorithms are dangerous – here's how they can manipulate you[7]

2. Kneecapping teen girls’ self-esteem

Some of the most disturbing revelations concern the harm Meta’s Instagram social media platform causes adolescents, particularly teen girls. University of Kentucky psychologist Christia Spears Brown[8] explains that Instagram can lead teens to objectify themselves by focusing on how their bodies appear to others. It also can lead them to make unrealistic comparisons of themselves with celebrities and filtered and retouched images of their peers.

Even when teens know the comparisons are unrealistic, they end up feeling worse about themselves. “Even in studies in which participants knew the photos they were shown on Instagram were retouched and reshaped, adolescent girls still felt worse about their bodies after viewing them[9],” she wrote.

“The choices being made inside of Facebook are disastrous for our children,” whistleblower Frances Haugen told Congress.

The problem is widespread because Instagram is where teens tend to hang out online. “Teens are more likely to log on to Instagram than any other social media site. It is a ubiquitous part of adolescent life,” Brown wrote. “Yet studies consistently show that the more often teens use Instagram, the worse their overall well-being, self-esteem, life satisfaction, mood and body image.”

Read more: Facebook has known for a year and a half that Instagram is bad for teens despite claiming otherwise – here are the harms researchers have been documenting for years[10]

3. Fudging the numbers on harm

Meta has, not surprisingly, pushed back against claims of harm despite the revelations in the leaked internal documents. The company has provided research that shows that its platforms do not cause harm[11] in the way many researchers describe, and claims that the overall picture from all research on harm is unclear.

University of Washington computational social scientist Joseph Bak-Coleman[12] explains that Meta’s research can be both accurate and misleading. The explanation lies in averages. Meta’s studies look at effects on the average user. Given that Meta’s social media platforms have billions of users, harm to many thousands of people can be lost[13] when all of the users’ experiences are averaged together.

“The inability of this type of research to capture the smaller but still significant numbers of people at risk – the tail of the distribution – is made worse by the need to measure a range of human experiences in discrete increments,” he wrote.
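Bak-Coleman's point about averages can be seen with a few lines of arithmetic. The numbers below are purely illustrative (not Meta's data): a platform whose effect on well-being is slightly positive for most users but severely negative for a small vulnerable tail.

```python
import statistics

# One million hypothetical users: a tiny benefit for the vast majority,
# serious harm for a few thousand in the tail of the distribution.
typical_users = [0.01] * 995_000
vulnerable_users = [-2.0] * 5_000

effects = typical_users + vulnerable_users
avg = statistics.mean(effects)

# The average is indistinguishable from zero, even though 5,000 people
# in the tail are badly harmed.
print(f"average effect: {avg:+.5f}")
print("severely harmed users:", sum(e <= -2.0 for e in effects))
```

A study reporting only the average would conclude the platform is harmless; the 5,000 harmed users are invisible in that single number.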

Read more: The thousands of vulnerable people harmed by Facebook and Instagram are lost in Meta's 'average user' data[14]

4. Hiding the numbers on misinformation

Just as evidence of emotional and psychological harm can be lost in averages, evidence of the spread of misinformation can be lost without the context of another type of math: fractions. Despite substantial efforts to track misinformation on social media, it’s impossible to know the scope of the problem without knowing the number of overall posts social media users see each day. And that’s information Meta doesn’t make available to researchers.

The overall number of posts is the denominator to the misinformation numerator in the fraction that tells you how bad the misinformation problem is, explains UMass Amherst’s Ethan Zuckerman[15], who studies social and civic media.
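Zuckerman's fraction is simple to state, and the counts below are hypothetical precisely because Meta does not publish the real denominator: the same number of flagged misinformation posts implies a very different prevalence depending on how many posts users saw in total.

```python
# Hypothetical numerator: misinformation items observed by researchers.
misinfo_posts = 5_000_000

# Two hypothetical denominators: without knowing which is closer to the
# truth, the same numerator reads as either a crisis or a rounding error.
for total_posts in (100_000_000, 10_000_000_000):
    prevalence = misinfo_posts / total_posts
    print(f"{misinfo_posts:,} / {total_posts:,} = {prevalence:.2%}")
```

The first denominator makes misinformation 5% of everything users see; the second makes it 0.05%. Without the denominator, researchers cannot say which world we live in.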

The denominator problem is compounded by the distribution problem, which is the need to figure out where misinformation is concentrated. “Simply counting instances of misinformation found on a social media platform leaves two key questions unanswered[17]: How likely are users to encounter misinformation, and are certain users especially likely to be affected by misinformation?” he wrote.

This lack of information isn’t unique to Meta. “No social media platform makes it possible for researchers to accurately calculate how prominent a particular piece of content is across its platform,” Zuckerman wrote.

Read more: Facebook has a misinformation problem, and is blocking access to data about how much there is and who is affected[18]

Editor’s note: This story is a roundup of articles from The Conversation’s archives.

References

  1. ^ public opinion (www.cnn.com)
  2. ^ financially (investor.fb.com)
  3. ^ investigative series (www.wsj.com)
  4. ^ congressional testimony (www.c-span.org)
  5. ^ Filippo Menczer (scholar.google.com)
  6. ^ their cognitive biases kick in (theconversation.com)
  7. ^ Facebook whistleblower Frances Haugen testified that the company's algorithms are dangerous – here's how they can manipulate you (theconversation.com)
  8. ^ Christia Spears Brown (scholar.google.com)
  9. ^ adolescent girls still felt worse about their bodies after viewing them (theconversation.com)
  10. ^ Facebook has known for a year and a half that Instagram is bad for teens despite claiming otherwise – here are the harms researchers have been documenting for years (theconversation.com)
  11. ^ its platforms do not cause harm (about.fb.com)
  12. ^ Joseph Bak-Coleman (scholar.google.com)
  13. ^ harm to many thousands of people can be lost (theconversation.com)
  14. ^ The thousands of vulnerable people harmed by Facebook and Instagram are lost in Meta's 'average user' data (theconversation.com)
  15. ^ Ethan Zuckerman (scholar.google.com)
  16. ^ Sign up today (memberservices.theconversation.com)
  17. ^ leaves two key questions unanswered (theconversation.com)
  18. ^ Facebook has a misinformation problem, and is blocking access to data about how much there is and who is affected (theconversation.com)

Read more https://theconversation.com/facebook-became-meta-and-the-companys-dangerous-behavior-came-into-sharp-focus-in-2021-4-essential-reads-173417
