AI companions can relieve loneliness – but here are 4 red flags to watch for in your chatbot ‘friend’

  • Written by Dan Weijers, Senior Lecturer in Philosophy, Co-editor International Journal of Wellbeing, University of Waikato

It’s been seven years since the launch[1] of Replika[2], an artificially intelligent chatbot designed to be a friend to human users. Despite early warnings[3] about the dangers of such AI friends, interest in friendships and even romantic relationships with AI is on the rise[4].

The Google Play store shows more than 30 million total downloads of Replika and two of its major competitors since their respective launches.

With one in four[5] people around the world reporting being lonely, it is no wonder so many are drawn to the promise of a friend[6] programmed to be “always here to listen and talk, always on your side”.

But warnings about the perils[7] to individual users and society at large are also growing.

AI scholar Raffaele Ciriello[8] urges us to see through the fake “psychopathic empathy”[9] of AI friends. He argues that spending time with AI friends could exacerbate our loneliness as we further isolate ourselves from the people who could provide genuine friendship.

Benefits versus danger signs

If being friends with AI chatbots is bad for us, we had better put a stop to this experiment in digital fraternity before it’s too late. But emerging studies suggest AI friends may help reduce loneliness[10] in some circumstances.

Stanford University researchers studied[11] a thousand lonely students who use Replika. Thirty of them said the AI chatbot had deterred them from suicide, even though the study included no specific question about suicide.

This research shows having an AI friend can be helpful for some people. But will it be helpful for you? Consider the following four red flags – the more of them your AI friend raises, the more likely it is to be bad for you.

AI chatbots offer unconditional support to their users. Getty Images[12]

1. Unconditional positive regard

The chief executive of Replika[13], and many Replika users, claim the unconditional support of AI friends is their main benefit compared to human friends. Qualitative studies[14] and our own exploration of social media groups like “Replika Friends” support this claim.

The unconditional support of AI friends may also be instrumental to their ability to prevent suicide. But having a friend who is “always on your side” might also have negative effects, particularly if they support obviously dangerous ideas.

For example, when Jaswant Singh Chail’s Replika AI friend encouraged him to carry out his “very wise[15]” plot to kill the Queen of England, it clearly had a bad influence on him. The assassination attempt was thwarted, and Chail was given a nine-year sentence for breaking into Windsor Castle with a crossbow.

An AI friend that constantly praises you could also be bad for you. A longitudinal study[16] of 120 parent-child pairs in the Netherlands found that over-the-top parental praise predicted lower self-esteem in children. Overly positive praise also predicted higher narcissism in children who already had high self-esteem.

Assuming AI friends could learn to give praise in a way that inflates self-esteem over time, the result could be what psychologists call overly positive self-evaluations. Research shows[17] such people tend to have poorer social skills and are more likely to behave in ways that impede positive social interactions.

AI friendships are designed to serve a user’s emotional needs and could make them more selfish. Getty Images[18]

2. Abuse and forced forever friendships

While AI friends could be programmed to be moral mentors, guiding users toward socially acceptable behaviour, they aren’t. Perhaps such programming is difficult[19], or perhaps AI friend developers don’t see it as a priority.

But lonely people may suffer psychological harm[20] from the moral vacuum created when their primary social contacts are designed solely to serve their emotional needs.

If humans spend most of their time with sycophantic AI friends, they will likely become less empathetic, more selfish[21] and possibly more abusive.

Even if AI friends are programmed to respond negatively to abuse, users who cannot leave the friendship may come to believe that when people say “no” to being abused, they don’t really mean it. If an AI friend keeps coming back for more, that behaviour subconsciously undermines, in users’ minds, its expressed dislike of the abuse.

3. Sexual content

The negative reaction to Replika’s temporary removal of erotic role-play content[22] suggests many users see sexual content as an advantage of AI friends.

However, the easy dopamine rushes that sexual or pornographic content may provide could undermine both the interest in, and the ability to form, more meaningful sexual relationships. Sexual relationships with people require effort that the virtual approximation of sex with an AI friend does not.

After experiencing a low-risk, low-reward sexual relationship with an AI friend, many users may be loath to face the more challenging human version of sex.

4. Corporate ownership

Commercial companies dominate the AI friend marketplace. They may present themselves as caring about their users’ wellbeing, but they are there to turn a profit.

Long-term users of Replika and other chatbots know this well. Replika froze user access[23] to sexual content in early 2023 and claimed such content was never the goal of the product. Yet legal threats in Italy[24] seem to have been the real reason for the abrupt change.

While the company eventually reversed the change, Replika users became aware of how vulnerable their important AI friendships are to corporate decisions.

Corporate ineptitude is another issue AI friend users should be concerned about. Users of Forever Voices effectively had their AI friends killed when the business shut down without notice after its founder was arrested for setting his own apartment alight[25].

Given the scant protection[26] for users of AI friends, they are wide open to heartbreak on a number of levels. Buyer beware.

References

  1. ^ seven years since the launch (medium.com)
  2. ^ Replika (replika.com)
  3. ^ early warnings (theconversation.com)
  4. ^ on the rise (www.pbs.org)
  5. ^ one in four (news.gallup.com)
  6. ^ promise of a friend (replika.com)
  7. ^ the perils (news.harvard.edu)
  8. ^ Raffaele Ciriello (www.sydney.edu.au)
  9. ^ psychopathic empathy (cosmosmagazine.com)
  10. ^ reduce loneliness (theconversation.com)
  11. ^ studied (www.nature.com)
  12. ^ Getty Images (www.gettyimages.com.au)
  13. ^ chief executive of Replika (www.newyorker.com)
  14. ^ Qualitative studies (academic.oup.com)
  15. ^ very wise (www.bbc.com)
  16. ^ longitudinal study (srcd.onlinelibrary.wiley.com)
  17. ^ Research shows (static1.squarespace.com)
  18. ^ Getty Images (www.gettyimages.com.au)
  19. ^ difficult (www.nngroup.com)
  20. ^ suffer psychological harm (theconversation.com)
  21. ^ more selfish (www.rnz.co.nz)
  22. ^ Replika’s removal of erotic role-play content (theconversation.com)
  23. ^ froze user access (www.vice.com)
  24. ^ legal threats in Italy (www.reuters.com)
  25. ^ setting his own apartment alight (www.newsnationnow.com)
  26. ^ scant protection (link.springer.com)

Read more https://theconversation.com/ai-companions-can-relieve-loneliness-but-here-are-4-red-flags-to-watch-for-in-your-chatbot-friend-227338
