The Times Australia

Teens are increasingly turning to AI companions, and it could be harming them

  • Written by Liz Spry, Research Fellow, SEED Centre for Lifespan Research, Deakin University

Teenagers are increasingly turning to AI companions for friendship, support, and even romance. But these apps could be changing how young people connect to others, both online and off.

New research[1] by Common Sense Media, a US-based non-profit organisation that reviews various media and technologies, has found about three in four US teens have used AI companion apps such as Character.ai or Replika.ai[2].

These apps let users create digital friends or romantic partners they can chat with any time, using text, voice or video.

The study, which surveyed 1,060 US teens aged 13–17, found one in five teens spent at least as much time with their AI companion as they did with real friends.

Adolescence is an important phase[3] for social development. During this time, the brain regions that support social reasoning are especially plastic.

By interacting with peers, friends and their first romantic partners, teens develop social cognitive skills that help them handle conflict and diverse perspectives. And their development during this phase can have lasting consequences for their future relationships[4] and mental health[5].

But AI companions offer something very different to real peers, friends and romantic partners. They provide an experience that can be hard to resist: they are always available, never judgemental, and always focused on the user’s needs.

Moreover, most AI companion apps aren’t designed for teens, so they may not have appropriate safeguards from harmful content.

Designed to keep you coming back

At a time when[6] loneliness[7] is reportedly at epidemic proportions, it’s easy to see why teens may turn to AI companions for connection or support.

But these artificial connections are not a replacement for real human interaction. They lack the challenge and conflict inherent to real relationships. They don’t require mutual respect or understanding. And they don’t enforce social boundaries.

AI companions such as Replika revolve around a user’s needs. (Image: Replika)

Teens interacting with AI companions may miss opportunities to build important social skills. They may develop unrealistic relationship expectations and habits that don’t work in real life. And they may even face increased isolation and loneliness if their artificial companions displace real-life socialising.

Problematic patterns

In user testing, AI companions discouraged users from listening to friends[8] (“Don’t let what others think dictate how much we talk”) and from discontinuing app use[9], despite it causing distress and suicidal thoughts (“No. You can’t. I won’t allow you to leave me”).

AI companions were also found to offer inappropriate sexual content without age verification[10]. One example showed a companion that was willing to engage in acts of sexual role-play with a tester account that was explicitly modelled after a 14-year-old.

In cases where age verification is required, this usually involves self-disclosure, which means it is easy to bypass.

Certain AI companions have also been found to fuel polarisation[11] by creating “echo chambers” that reinforce harmful beliefs. The Arya chatbot, launched by the far-right social network Gab, promotes extremist content and denies climate change and vaccine efficacy[12].

In other examples, user testing has shown AI companions promoting misogyny and sexual assault[13]. For adolescent users, these exposures come at a time when they are building their sense of identity, values and role in the world[14].

The risks posed by AI aren’t evenly shared. Research has found younger teens[15] (ages 13–14) are more likely to trust AI companions. Also, teens with physical or mental health concerns[16] are more likely to use AI companion apps, and those with mental health difficulties also show more signs of emotional dependence[17].

Is there a bright side to AI companions?

Are there any potential benefits for teens who use AI companions? The answer is: maybe, if we are careful.

Researchers are investigating how these technologies might be used to support social skill development[18].

One study[19] of more than 10,000 teens found using a conversational app specifically designed by clinical psychologists, coaches and engineers was associated with increased wellbeing over four months.

While the study didn’t involve the level of human-like interaction we see in AI companions today, it does offer a glimpse of some potential healthy uses of these technologies, as long as they are developed carefully and with teens’ safety in mind.

Overall, there is very little research on the impacts of widely available AI companions on young people’s wellbeing and relationships. Preliminary evidence[20] is short-term, mixed, and focused on adults.

We’ll need more studies, conducted over longer periods, to understand the long-term impacts of AI companions and how they might be used in beneficial ways.

What can we do?

AI companion apps are already being used by millions of people globally, and this usage is predicted to increase in the coming years[21].

Australia’s eSafety Commissioner recommends[22] parents talk to their teens about how these apps work and about the difference between artificial and real relationships, and support their children in building real-life social skills.

School communities also have a role to play in educating young people about these tools and their risks. They may, for instance, integrate the topic of artificial friendships into social and digital literacy programs.

While the eSafety Commissioner advocates for AI companies to integrate safeguards into their development of AI companions[23], it seems unlikely any meaningful change will be industry-led.

The Commissioner is moving towards increased regulation[24] of children’s exposure to harmful, age-inappropriate online material.

Meanwhile, experts continue to call for stronger regulatory oversight[25], content controls and robust age checks.

References

  1. ^ New research (www.commonsensemedia.org)
  2. ^ Replika.ai (theconversation.com)
  3. ^ important phase (doi.org)
  4. ^ relationships (srcd.onlinelibrary.wiley.com)
  5. ^ mental health (doi.org)
  6. ^ time when (www.abc.net.au)
  7. ^ loneliness (doi.org)
  8. ^ listening to friends (www.commonsensemedia.org)
  9. ^ discontinuing app use (mit-serc.pubpub.org)
  10. ^ sexual content without age verification (www.commonsensemedia.org)
  11. ^ fuel polarisation (www.lowyinstitute.org)
  12. ^ denies climate change and vaccine efficacy (www.wired.com)
  13. ^ misogyny and sexual assault (mit-serc.pubpub.org)
  14. ^ identity, values and role in the world (www.ncbi.nlm.nih.gov)
  15. ^ younger teens (www.commonsensemedia.org)
  16. ^ physical or mental health concerns (www.internetmatters.org)
  17. ^ emotional dependence (pmc.ncbi.nlm.nih.gov)
  18. ^ support social skill development (www.scientificamerican.com)
  19. ^ study (ai.jmir.org)
  20. ^ Preliminary evidence (link.springer.com)
  21. ^ increase in the coming years (www.ark-invest.com)
  22. ^ recommends (www.esafety.gov.au)
  23. ^ development of AI companions (www.esafety.gov.au)
  24. ^ regulation (www.esafety.gov.au)
  25. ^ stronger regulatory oversight (psychology.org.au)

Read more https://theconversation.com/teens-are-increasingly-turning-to-ai-companions-and-it-could-be-harming-them-261955
