The Times Australia
Could Apple's child safety feature backfire? New research shows warnings can increase risky sharing

  • Written by Bennett Bertenthal, Professor of Psychological and Brain Sciences, Indiana University
Apple’s plan to roll out tools to limit the spread of child sexual abuse material has drawn praise from some privacy and security experts and from child protection advocacy groups. There has also been an outcry about invasions of privacy[1].

These concerns have obscured another, even more troublesome problem that has received very little attention: Apple’s new feature relies on warning messages that research shows can backfire.

One of these new features adds a parental control option to Messages that blocks the viewing of sexually explicit pictures. The expectation is that parental surveillance of the child’s behavior will decrease the viewing or sending of sexually explicit photos, but this is highly debatable.

We are[2] two psychologists[3] and a computer scientist[4], and we have conducted extensive research on why people share risky images online. Our recent research reveals that warnings about privacy on social media neither reduce photo sharing nor increase concern about privacy. In fact, such warnings, including Apple’s new child safety features, can increase rather than reduce[5] risky sharing of photos.

Apple’s child safety features

Apple announced on Aug. 5, 2021 that it plans to introduce new child safety features in three areas[6]. The first, relatively uncontroversial feature is that Apple’s search app and virtual assistant Siri will provide parents and children with resources and help[7] if they encounter potentially harmful material.

The second feature will scan images on people’s devices that are also stored in iCloud Photos to look for matches in a database of child sexual abuse images provided by the National Center for Missing and Exploited Children and other child safety organizations. After a threshold for these matches is reached, Apple manually reviews each machine match to confirm the content of the photo, and then disables the user’s account and sends a report to the center. This feature has generated much controversy[8].

The last feature adds a parental control option to Messages, Apple’s texting app, that blurs sexually explicit pictures when children attempt to view them. It also warns the children about the content, presents helpful resources and assures them it is OK if they do not want to view the photo. If the child is 12 or under, parents will get a message if the child views or shares a risky photo.

There has been little public discussion of this feature, perhaps because the conventional wisdom is that parental control is necessary and effective. This is not always the case, however, and such warnings can backfire[9].

When warnings backfire

In general, people are more likely than not to avoid risky sharing, but it’s important to reduce the sharing that does occur. An analysis of 39 studies[10] found that 12% of young people forwarded a sext, or sexually explicit image or video, without consent, and 8.4% had a sext of themselves forwarded without consent. Warnings might seem like an appropriate way to reduce this kind of sharing, but contrary to expectation, we have found that warnings about privacy violations often backfire.

Teens talk about sharing nude photos.

In one series of experiments, we tried to decrease the likelihood of sharing embarrassing or degrading photos on social media by reminding participants that they should consider the privacy and security of others. Across multiple studies, we have tried different reminders about the consequences of sharing photos, similar to the warnings to be introduced in Apple’s new child safety tools.

Remarkably, our research often reveals paradoxical effects[11]. Participants who received warnings as simple as stating that they should take others’ privacy into account were more likely to share photos than participants who did not receive this warning. When we began this research, we were sure that these privacy nudges would reduce risky photo sharing, but they didn’t.

The results have been consistent since our first two studies showed that warnings backfired. We have now observed this effect multiple times, and have found that several factors, such as a person’s humor style or photo sharing experience on social media[12], influence their willingness to share photos and how they might respond to warnings.

Although it’s not clear why warnings backfire, one possibility is that individuals’ concerns about privacy are lessened[13] when they underestimate the risks of sharing. Another possibility is reactance, or the tendency for seemingly unnecessary rules or prompts to elicit the opposite effect from what was intended[14]. Just as a forbidden fruit becomes sweeter, so too might constant reminders about privacy concerns make risky photo sharing more attractive.

Will Apple’s warnings work?

It is possible that some children will be more inclined to send or receive sexually explicit photos after receiving a warning from Apple. There are numerous reasons why this behavior may occur, ranging from curiosity – adolescents often learn about sex from peers[15] – to challenging parents’ authority and reputational concerns, such as being seen as cool by sharing apparently risky photos. During a stage of life when risk-taking tends to peak[16], it’s not hard to see how adolescents might find earning a warning from Apple to be a badge of honor rather than a genuine cause for concern.


Apple announced on Sept. 3, 2021 that it is delaying the rollout of these new CSAM tools[18] because of concerns expressed by the privacy and security community. The company plans to take additional time over the coming months to collect input and make improvements before releasing these child safety features.

This plan is not sufficient, however, without also knowing whether Apple’s new features will have the desired effect on children’s behavior. We encourage Apple to engage with researchers to ensure that their new tools will reduce rather than encourage problematic photo sharing.

References

  1. ^ outcry about invasions of privacy (arstechnica.com)
  2. ^ are (scholar.google.com)
  3. ^ psychologists (scholar.google.co.uk)
  4. ^ computer scientist (scholar.google.com)
  5. ^ can increase rather than reduce (ieeexplore.ieee.org)
  6. ^ new child safety features in three areas (www.apple.com)
  7. ^ will provide parents and children with resources and help (www.theverge.com)
  8. ^ generated much controversy (www.wsj.com)
  9. ^ and such warnings can backfire (www.whitbyschool.org)
  10. ^ analysis of 39 studies (dx.doi.org)
  11. ^ our research often reveals paradoxical effects (ieeexplore.ieee.org)
  12. ^ such as a person’s humor style or photo sharing experience on social media (dl.acm.org)
  13. ^ individuals’ concerns about privacy are lessened (dx.doi.org)
  14. ^ elicit the opposite effect from what was intended (dx.doi.org)
  15. ^ learn about sex from peers (dx.doi.org)
  16. ^ risk-taking tends to peak (doi.org)
  17. ^ Sign up today (theconversation.com)
  18. ^ delaying the rollout of these new CSAM tools (www.wsj.com)

Read more https://theconversation.com/could-apples-child-safety-feature-backfire-new-research-shows-warnings-can-increase-risky-sharing-167035
