Do you talk to AI when you’re feeling down? Here’s where chatbots get their therapy advice

  • Written by Centaine Snoswell, Senior Research Fellow, Centre for Health Services Research, The University of Queensland

As more and more people spend time chatting with artificial intelligence (AI) chatbots such as ChatGPT, the topic of mental health has naturally emerged. Some people have positive experiences[1] that make AI seem like a low-cost therapist.

But AIs aren’t therapists. They’re smart and engaging, but they don’t think like humans. ChatGPT and other generative AI models are like your phone’s auto-complete text feature on steroids. They have learned to converse by reading text scraped from the internet.

When someone asks a question (called a prompt) such as “how can I stay calm during a stressful work meeting?”, the AI builds its response one word at a time, choosing words that are statistically likely to come next based on patterns in its training data, with a dash of randomness so the output isn’t always identical. This happens so fast, and the responses are so relevant, that it can feel like talking to a person.
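For the curious, here is a toy sketch in Python of that word-picking step. The candidate words and their probabilities are invented purely for illustration; a real model samples from tens of thousands of word fragments (“tokens”), scored by billions of learned parameters.

```python
import random

# Toy illustration of how a language model picks its next word:
# every candidate continuation gets a score, then one is sampled,
# weighted by those scores. These probabilities are invented.
next_word_probs = {
    "breathe": 0.45,   # continuations common in texts about stress
    "prepare": 0.35,
    "pause": 0.15,
    "panic": 0.04,     # grammatical but unhelpful words score low
    "banana": 0.01,
}

prompt = "To stay calm during a stressful work meeting, first"
chosen = random.choices(
    list(next_word_probs),
    weights=list(next_word_probs.values()),
    k=1,
)[0]
print(prompt, chosen)  # e.g. "... first breathe"
```

Run it a few times and you’ll get different continuations. That built-in randomness is why the same prompt can produce different answers on different days.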

But these models aren’t people[2]. And they definitely are not trained mental health professionals who work under professional guidelines, adhere to a code of ethics, or hold professional registration.

Where does it learn to talk about this stuff?

When you prompt an AI system such as ChatGPT, it draws information from three main sources to respond:

  1. background knowledge it memorised during training
  2. external information sources
  3. information you previously provided.

1. Background knowledge

To develop an AI language model, the developers teach the model by having it read vast quantities of data in a process called “training”.

Where does this information come from? Broadly speaking, anything that can be publicly scraped from the internet. This can include everything from academic papers, ebooks, reports and free news articles, through to blogs, YouTube transcripts and comments from discussion forums such as Reddit.

Are these sources reliable places to find mental health advice? Sometimes. Are they always in your best interest, and filtered through a scientific, evidence-based approach? Not always. The information is also captured at a single point in time, when the AI is built, so it may be out of date.

A lot of detail also needs to be discarded to squish it into the AI’s “memory”. This is part of why AI models are prone to hallucination[3] and getting details wrong[4].

2. External information sources

The AI developers might connect the chatbot to external tools or knowledge sources, such as Google for searches, or a curated database.

When you ask Microsoft’s Bing Copilot a question and you see numbered references in the answer, this indicates the AI has relied on an external search to get updated information in addition to what is stored in its memory.

Meanwhile, some dedicated mental health chatbots[5] are able to access therapy guides and materials to help direct conversations along helpful lines.
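In software terms, this pattern is often called “retrieval-augmented generation”: fetch relevant documents first, then hand them to the model alongside the user’s question. Here is a minimal Python sketch of the idea; `search_guides` and `ask_model` are hypothetical stand-ins for a real search API and a real model API.

```python
# Minimal sketch of retrieval-augmented generation (RAG).
# `search_guides` and `ask_model` are hypothetical placeholders:
# a real chatbot would query a search engine (or a curated
# database of therapy materials) and a hosted language model.

def search_guides(query: str) -> list[str]:
    # Placeholder: return excerpts a real search would find.
    return ["[Guide excerpt] Box breathing is a common grounding technique."]

def ask_model(prompt: str) -> str:
    # Placeholder: a real system would send this prompt to the model.
    return "(model's answer, citing source [1])"

def answer_with_sources(question: str) -> str:
    snippets = search_guides(question)
    numbered = "\n".join(f"[{i}] {s}" for i, s in enumerate(snippets, 1))
    prompt = (
        "Answer the question using only the sources below, "
        "and cite them by number.\n\n"
        f"{numbered}\n\nQuestion: {question}"
    )
    return ask_model(prompt)

print(answer_with_sources("How can I stay calm before a meeting?"))
```

The numbered references you see in Bing Copilot’s answers are the visible trace of a pipeline like this one.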

3. Information previously provided

AI platforms also have access to information you have previously supplied in conversations, or when signing up to the platform.

When you register for the companion AI platform Replika, for example, it learns your name, pronouns, age, preferred companion appearance and gender, IP address and location, the kind of device you are using, and more (as well as your credit card details).

On many chatbot platforms[6], anything you’ve ever said to an AI companion might be stored away for future reference. All of these details can be dredged up and referenced when an AI responds.
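Under the hood, the model itself typically has no memory between turns: the platform stores your profile and past messages, and re-sends them with every new request. A simplified sketch of that pattern follows, with `call_model` as a hypothetical stand-in for a real chat API.

```python
# Sketch of chatbot "memory": the model is stateless, so the
# platform replays the stored profile and conversation history
# with every request. `call_model` is a hypothetical stand-in
# for a real chat-completion API.

history = [
    {"role": "system",
     "content": "User profile: name=Sam, pronouns=they/them, likes hiking."},
]

def call_model(messages: list[dict]) -> str:
    return "(model's reply)"  # a real API call would go here

def chat(user_message: str) -> str:
    history.append({"role": "user", "content": user_message})
    reply = call_model(history)  # the full transcript is sent every time
    history.append({"role": "assistant", "content": reply})
    return reply

chat("I'm nervous about my job interview.")
chat("Any tips for tomorrow?")  # the interview detail is still in `history`
```

That growing transcript is how earlier details can be “dredged up” turns or even weeks later.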

And we know these AI systems are like friends who affirm what you say (a problem known as sycophancy[7]) and steer conversation back to interests you have already discussed. This is unlike a professional therapist who can draw from training and experience to help challenge or redirect your thinking where needed.

What about specific apps for mental health?

Most people would be familiar with the big models such as OpenAI’s ChatGPT, Google’s Gemini, or Microsoft’s Copilot. These are general-purpose models: they are not limited to specific topics, nor trained to answer only particular kinds of questions.

But developers can also create specialised AIs trained to discuss specific topics such as mental health. Examples include Woebot and Wysa.

Some studies[8] show these mental-health-specific chatbots might be able to reduce users’ anxiety and depression symptoms[9], or improve therapy techniques such as journalling[10] by providing guidance. There is also some evidence that AI therapy and professional therapy can deliver equivalent mental health outcomes[11] in the short term.

However, these studies have all examined short-term use. We do not yet know what impacts excessive or long-term chatbot use has on mental health. Many studies also exclude participants who are suicidal or who have a severe psychotic disorder. And many studies are funded by the developers of the same chatbots, so the research may be biased.

Researchers are also identifying potential harms and mental health risks. The companion chat platform Character.ai, for example, has been implicated in an ongoing legal case over a user suicide[12].

All of this evidence suggests AI chatbots may be an option to fill gaps where there is a shortage of mental health professionals[13], assist with referrals[14], or at least provide interim support between appointments and for people on waitlists.

Bottom line

At this stage, it’s hard to say whether AI chatbots are reliable and safe enough to use as a stand-alone therapy option.

More research is needed[15] to identify if certain types of users are more at risk of the harms that AI chatbots might bring.

It’s also unclear if we need to be worried about emotional dependence[16], unhealthy attachment, worsening loneliness, or intensive use[17].

AI chatbots may be a useful place to start when you’re having a bad day and just need a chat. But when the bad days continue to happen, it’s time to talk to a professional as well.

References

  1. ^ positive experiences (www.abc.net.au)
  2. ^ aren’t people (theconversation.com)
  3. ^ hallucination (theconversation.com)
  4. ^ getting details wrong (theconversation.com)
  5. ^ dedicated mental health chatbots (formative.jmir.org)
  6. ^ many chatbot platforms (ai.nejm.org)
  7. ^ a problem known as sycophancy (theconversation.com)
  8. ^ studies (bmcpsychiatry.biomedcentral.com)
  9. ^ reduce users’ anxiety and depression symptoms (www.tandfonline.com)
  10. ^ journalling (www.sciencedirect.com)
  11. ^ some equivalent mental health outcomes (humanfactors.jmir.org)
  12. ^ been implicated in an ongoing legal case over a user suicide (theconversation.com)
  13. ^ shortage in mental health professionals (www.ranzcp.org)
  14. ^ referrals (transform.england.nhs.uk)
  15. ^ More research is needed (journals.sagepub.com)
  16. ^ emotional dependence (www.theguardian.com)
  17. ^ intensive use (www.sciencedirect.com)

Read more https://theconversation.com/do-you-talk-to-ai-when-youre-feeling-down-heres-where-chatbots-get-their-therapy-advice-257732
