The latest version of ChatGPT has a feature you’ll fall in love with. And that’s a worry

  • Written by Rob Brooks, Scientia Professor of Evolutionary Ecology; Academic Lead of UNSW's Grand Challenges Program, UNSW Sydney

If you’re a paid subscriber to ChatGPT, you may have noticed the artificial intelligence (AI) large language model has recently started to sound more human when you are having audio interactions with it.

That’s because the company behind the language model-cum-chatbot, OpenAI, is currently running a limited pilot of a new feature known as “advanced voice mode”.

OpenAI says this new mode[1] “features more natural, real-time conversations that pick up on and respond with emotion and non-verbal cues”. It plans[2] for all paid ChatGPT subscribers to have access to the advanced voice mode in coming months.

Advanced voice mode sounds strikingly human. There aren’t the awkward gaps we are used to with voice assistants; instead it seems to take breaths like a human would. It is also unfazed by interruption, conveys appropriate emotion cues and seems to infer the user’s emotional state from voice cues.

But at the same time as making ChatGPT seem more human, OpenAI has expressed concern[3] that users might respond to the chatbot as if it were human – by developing an intimate relationship with it.

This is not a hypothetical. For example, a social media influencer named Lisa Li has coded ChatGPT to be her “boyfriend”[4]. But why exactly do some people develop intimate relationships with a chatbot?

The evolution of intimacy

Humans have a remarkable capacity for friendship and intimacy. This is an extension of the way primates physically groom one another[5] to build alliances that can be called upon in times of strife.

But our ancestors also evolved a remarkable capacity to “groom” one another verbally[6]. This drove the evolutionary cycle in which the language centres in our brains became larger and what we did with language became more complex.

More complex language in turn enabled more complex socialising with larger networks of relatives, friends and allies. It also enlarged the social parts of our brains.

Language evolved alongside human social behaviour. The way we draw an acquaintance into friendship or a friend into intimacy is largely through conversation.

Experiments in the 1990s[7] revealed that conversational back-and-forth, especially when it involves disclosing personal details, builds the intimate sense our conversation partner is somehow part of us.

So I’m not surprised that attempts to replicate this process of “escalating self-disclosure” between humans and chatbots[8] result in humans feeling intimate with the chatbots[9].

And that’s just with text input. When the main sensory experience of conversation – voice – gets involved, the effect is amplified. Even voice-based assistants that don’t sound human, such as Siri and Alexa, still get an avalanche of marriage proposals[10].

The writing was on the lab chalkboard

If OpenAI were to ask me how to ensure users don’t form social relationships with ChatGPT, I would have a few simple recommendations.

First, don’t give it a voice. Second, don’t make it capable of holding up one end of an apparent conversation. Basically, don’t make the product you made.

The product is so powerful precisely because it does such an excellent job of mimicking the traits we use to form social relationships.

Close-up of GPT-4o displayed on a smartphone screen.
OpenAI should have known the risks of creating a human-like chatbot. QubixStudio/Shutterstock[11]

The writing has been on the laboratory chalkboard since the first chatbots flickered on nearly 60 years ago[12]. Computers have been recognised as social actors[13] for at least 30 years. The advanced voice mode of ChatGPT is merely the next impressive increment, not what the tech industry would gushingly call a “game changer”.

That users not only form relationships with chatbots but also develop very close personal feelings towards them became clear early last year, when users of the virtual friend platform Replika AI[14] found themselves unexpectedly cut off from the most advanced functions of their chatbots.

Replika was less advanced than the new version of ChatGPT. And yet the interactions were of such a quality that users formed surprisingly deep attachments.

The risks are real

Many people, starved[15] for the kind of company that listens in a non-judgmental way, will get a lot out of this new generation of chatbots. They may feel less lonely and isolated[16]. These benefits of technology should not be overlooked.

But the potential dangers of ChatGPT’s advanced voice mode are also very real.

Time spent chatting with any bot is time that can’t be spent interacting with friends and family. And people who spend a lot of time with technology[17] are at greatest risk[18] of displacing relationships with other humans.

As OpenAI identifies, chatting with bots can also contaminate existing relationships people have with other people. They may come to expect their partners or friends to behave like polite, submissive, deferential chatbots.

These bigger effects of machines on culture[19] are going to become more prominent. On the upside, they may also provide deep insights into how culture works.

References

  1. ^ OpenAI says this new mode (help.openai.com)
  2. ^ It plans (help.openai.com)
  3. ^ has expressed concern (openai.com)
  4. ^ has coded ChatGPT to be her “boyfriend” (edition.cnn.com)
  5. ^ groom one another (link.springer.com)
  6. ^ to “groom” one another verbally (www.hup.harvard.edu)
  7. ^ Experiments in the 1990s (journals.sagepub.com)
  8. ^ between humans and chatbots (academic.oup.com)
  9. ^ intimate with the chatbots (dl.acm.org)
  10. ^ an avalanche of marriage proposals (www.yahoo.com)
  11. ^ QubixStudio/Shutterstock (www.shutterstock.com)
  12. ^ nearly 60 years ago (dl.acm.org)
  13. ^ recognised as social actors (dl.acm.org)
  14. ^ users of the virtual friend platform Replika AI (theconversation.com)
  15. ^ starved (www.thelancet.com)
  16. ^ less lonely and isolated (psyche.co)
  17. ^ time with technology (www.usu.edu)
  18. ^ risk (www.sciencedirect.com)
  19. ^ effects of machines on culture (www.nature.com)

Read more https://theconversation.com/the-latest-version-of-chatgpt-has-a-feature-youll-fall-in-love-with-and-thats-a-worry-238073
