Facebook became Meta – and the company's dangerous behavior came into sharp focus in 2021: 4 essential reads

  • Written by Eric Smalley, Science + Technology Editor

Meta, née Facebook, had a rough year in 2021, in public opinion[1] if not financially[2]. Revelations from whistleblower Frances Haugen, first detailed in a Wall Street Journal investigative series[3] and then presented in congressional testimony[4], show that the company was aware of the harm it was causing.

Growing concerns about misinformation, emotional manipulation and psychological harm came to a head this year when Haugen released internal company documents showing that the company’s own research confirmed the societal and individual harm its Facebook, Instagram and WhatsApp platforms cause.

The Conversation gathered four articles from our archives that delve into research that explains Meta’s problematic behavior.

1. Addicted to engagement

At the root of Meta’s harmfulness is its set of algorithms, the rules the company uses to choose what content you see. The algorithms are designed to boost the company’s profits, but they also allow misinformation to thrive.

The algorithms work by increasing engagement – in other words, by provoking a response from the company’s users. Indiana University’s Filippo Menczer[5], who studies the spread of information and misinformation in social networks, explains that engagement plays into people’s tendency to favor posts that seem popular. “When social media tells people an item is going viral, their cognitive biases kick in[6] and translate into the irresistible urge to pay attention to it and share it,” he wrote.

One result is that low-quality information that gets an initial boost can garner more attention than it otherwise deserves. Worse, this dynamic can be gamed by people aiming to spread misinformation.

“People aiming to manipulate the information market have created fake accounts, like trolls and social bots, and organized fake networks,” Menczer wrote. “They have flooded the network to create the appearance that a conspiracy theory or a political candidate is popular, tricking both platform algorithms and people’s cognitive biases at once.”
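
To make this feedback loop concrete, here is a minimal, hypothetical Python sketch of an engagement-only ranker. It is not Meta's actual algorithm, and the post names and counts are invented; the point is simply that a ranking signal built on engagement alone will surface whatever gets the most reactions, regardless of quality.

```python
# Hypothetical illustration -- not Meta's code. A toy feed ranked purely
# by engagement, with invented posts and numbers.

posts = [
    {"id": "careful-news-report", "quality": 0.9, "shares": 120, "comments": 40},
    {"id": "conspiracy-meme", "quality": 0.1, "shares": 900, "comments": 300},
]

def engagement_score(post):
    # Engagement-only scoring: quality never enters the calculation.
    return post["shares"] + 2 * post["comments"]

feed = sorted(posts, key=engagement_score, reverse=True)
for post in feed:
    print(post["id"], engagement_score(post))

# The low-quality post ranks first because fake accounts inflated its
# engagement -- and ranking it first exposes it to more real users,
# closing the loop Menczer describes.
```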

Read more: Facebook whistleblower Frances Haugen testified that the company's algorithms are dangerous – here's how they can manipulate you[7]

2. Kneecapping teen girls’ self-esteem

Some of the most disturbing revelations concern the harm Meta’s Instagram social media platform causes adolescents, particularly teen girls. University of Kentucky psychologist Christia Spears Brown[8] explains that Instagram can lead teens to objectify themselves by focusing on how their bodies appear to others. It can also lead them to make unrealistic comparisons of themselves with celebrities and with filtered and retouched images of their peers.

Even when teens know the comparisons are unrealistic, they end up feeling worse about themselves. “Even in studies in which participants knew the photos they were shown on Instagram were retouched and reshaped, adolescent girls still felt worse about their bodies after viewing them[9],” she wrote.

“The choices being made inside of Facebook are disastrous for our children,” whistleblower Frances Haugen told Congress.

The problem is widespread because Instagram is where teens tend to hang out online. “Teens are more likely to log on to Instagram than any other social media site. It is a ubiquitous part of adolescent life,” Brown wrote. “Yet studies consistently show that the more often teens use Instagram, the worse their overall well-being, self-esteem, life satisfaction, mood and body image.”

Read more: Facebook has known for a year and a half that Instagram is bad for teens despite claiming otherwise – here are the harms researchers have been documenting for years[10]

3. Fudging the numbers on harm

Meta has, not surprisingly, pushed back against claims of harm despite the revelations in the leaked internal documents. The company has provided research showing that its platforms do not cause harm[11] in the way many researchers describe, and it claims that the overall picture from all research on harm is unclear.

University of Washington computational social scientist Joseph Bak-Coleman[12] explains that Meta’s research can be both accurate and misleading. The explanation lies in averages. Meta’s studies look at effects on the average user. Given that Meta’s social media platforms have billions of users, harm to many thousands of people can be lost[13] when all of the users’ experiences are averaged together.

“The inability of this type of research to capture the smaller but still significant numbers of people at risk – the tail of the distribution – is made worse by the need to measure a range of human experiences in discrete increments,” he wrote.
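
A toy calculation makes the averaging problem visible. All of the numbers below are invented for illustration; they are not drawn from Meta's research or from Bak-Coleman's article.

```python
# Hypothetical figures: how harm to a large absolute number of people can
# vanish when averaged over a platform-scale user base.

total_users = 1_000_000_000       # assumed platform size
vulnerable_users = 100_000        # assumed size of the harmed "tail"
harm_to_vulnerable = -50.0        # assumed well-being change for that group
harm_to_others = 0.0              # assumed change for everyone else

average_effect = (
    vulnerable_users * harm_to_vulnerable
    + (total_users - vulnerable_users) * harm_to_others
) / total_users

print(average_effect)  # -0.005 -- essentially zero "on average",
                       # even though 100,000 people were seriously harmed
```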

Read more: The thousands of vulnerable people harmed by Facebook and Instagram are lost in Meta's 'average user' data[14]

4. Hiding the numbers on misinformation

Just as evidence of emotional and psychological harm can be lost in averages, evidence of the spread of misinformation can be lost without the context of another type of math: fractions. Despite substantial efforts to track misinformation on social media, it’s impossible to know the scope of the problem without knowing the number of overall posts social media users see each day. And that’s information Meta doesn’t make available to researchers.

The overall number of posts is the denominator to the misinformation numerator in the fraction that tells you how bad the misinformation problem is, explains UMass Amherst’s Ethan Zuckerman[15], who studies social and civic media.


The denominator problem is compounded by the distribution problem, which is the need to figure out where misinformation is concentrated. “Simply counting instances of misinformation found on a social media platform leaves two key questions unanswered[17]: How likely are users to encounter misinformation, and are certain users especially likely to be affected by misinformation?” he wrote.
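
A small, hypothetical calculation illustrates both problems. None of these figures are real, and Meta does not publish the true denominator; the point is only that the same count of misinformation posts means very different things depending on the total number of posts seen and on how exposure is concentrated.

```python
# Hypothetical illustration of the denominator and distribution problems.

misinfo_posts_found = 5_000_000       # numerator: flagged misinformation posts
total_posts_viewed = 10_000_000_000   # denominator: all posts users saw (unknown in practice)

prevalence = misinfo_posts_found / total_posts_viewed
print(f"Prevalence: {prevalence:.4%}")   # 0.0500% of viewed posts

# Distribution problem: the same misinformation is far more worrying if a
# small group of users sees most of it.
total_users = 1_000_000
heavily_exposed_users = 50_000           # invented concentration figure
print(f"Share of users heavily exposed: {heavily_exposed_users / total_users:.1%}")  # 5.0%
```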

This lack of information isn’t unique to Meta. “No social media platform makes it possible for researchers to accurately calculate how prominent a particular piece of content is across its platform,” Zuckerman wrote.

Read more: Facebook has a misinformation problem, and is blocking access to data about how much there is and who is affected[18]

Editor’s note: This story is a roundup of articles from The Conversation’s archives.

References

  1. ^ public opinion (www.cnn.com)
  2. ^ financially (investor.fb.com)
  3. ^ investigative series (www.wsj.com)
  4. ^ congressional testimony (www.c-span.org)
  5. ^ Filippo Menczer (scholar.google.com)
  6. ^ their cognitive biases kick in (theconversation.com)
  7. ^ Facebook whistleblower Frances Haugen testified that the company's algorithms are dangerous – here's how they can manipulate you (theconversation.com)
  8. ^ Christia Spears Brown (scholar.google.com)
  9. ^ adolescent girls still felt worse about their bodies after viewing them (theconversation.com)
  10. ^ Facebook has known for a year and a half that Instagram is bad for teens despite claiming otherwise – here are the harms researchers have been documenting for years (theconversation.com)
  11. ^ its platforms do not cause harm (about.fb.com)
  12. ^ Joseph Bak-Coleman (scholar.google.com)
  13. ^ harm to many thousands of people can be lost (theconversation.com)
  14. ^ The thousands of vulnerable people harmed by Facebook and Instagram are lost in Meta's 'average user' data (theconversation.com)
  15. ^ Ethan Zuckerman (scholar.google.com)
  17. ^ leaves two key questions unanswered (theconversation.com)
  18. ^ Facebook has a misinformation problem, and is blocking access to data about how much there is and who is affected (theconversation.com)

Read more https://theconversation.com/facebook-became-meta-and-the-companys-dangerous-behavior-came-into-sharp-focus-in-2021-4-essential-reads-173417
