The Times Australia

Do you talk to AI when you’re feeling down? Here’s where chatbots get their therapy advice

  • Written by Centaine Snoswell, Senior Research Fellow, Centre for Health Services Research, The University of Queensland

As more and more people spend time chatting with artificial intelligence (AI) chatbots such as ChatGPT, the topic of mental health has naturally emerged. Some people have positive experiences[1] that make AI seem like a low-cost therapist.

But AIs aren’t therapists. They’re smart and engaging, but they don’t think like humans. ChatGPT and other generative AI models are like your phone’s auto-complete text feature on steroids. They have learned to converse by reading text scraped from the internet.

When someone asks a question (called a prompt) such as “how can I stay calm during a stressful work meeting?” the AI forms a response one word at a time, sampling words that best match the patterns it saw during training. This happens so fast, and the responses are so relevant, it can feel like talking to a person.
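
As a rough sketch of that word-by-word prediction idea (a toy, not any real model’s code), imagine counting which word follows which in a tiny “training” text and sampling continuations from those counts:

```python
import random

# Toy next-word predictor: count which word follows which in a tiny
# "training" corpus, then sample continuations from those counts.
# Real models learn from billions of words, but the word-by-word
# prediction loop is conceptually similar.
corpus = "stay calm and breathe deeply stay calm and focus".split()

follow_counts = {}
for current_word, following_word in zip(corpus, corpus[1:]):
    follow_counts.setdefault(current_word, []).append(following_word)

def next_word(word):
    """Sample a continuation, weighted by how often it followed `word`."""
    options = follow_counts.get(word)
    return random.choice(options) if options else None

# Generate a short reply one word at a time, starting from "stay".
word, reply = "stay", ["stay"]
for _ in range(4):
    word = next_word(word)
    if word is None:
        break
    reply.append(word)
print(" ".join(reply))
```

Because the continuation is sampled, the same starting word can produce different replies on different runs, which is one reason chatbot answers vary.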

But these models aren’t people[2]. And they definitely are not trained mental health professionals who work under professional guidelines, adhere to a code of ethics, or hold professional registration.

Where does it learn to talk about this stuff?

When you prompt an AI system such as ChatGPT, it draws information from three main sources to respond:

  1. background knowledge it memorised during training
  2. external information sources
  3. information you previously provided.
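
How these three sources might be merged into a single input for the model can be sketched as follows. This is purely illustrative: the function and field names are invented for this example and do not belong to any real platform.

```python
# Hypothetical sketch of combining the three sources into one input text.
# No real platform exposes exactly this function; names are illustrative.
def build_model_input(background, external_sources, prior_details, user_prompt):
    """Merge the three information sources with the user's question."""
    sections = [
        "Background knowledge: " + background,                    # source 1
        "External information: " + "; ".join(external_sources),   # source 2
        "Previously provided: " + "; ".join(prior_details),       # source 3
        "User: " + user_prompt,
    ]
    return "\n".join(sections)

model_input = build_model_input(
    background="Patterns memorised during training.",
    external_sources=["Search result: slow breathing reduces acute stress."],
    prior_details=["Name: Sam", "Mentioned an upcoming work meeting."],
    user_prompt="How can I stay calm during a stressful work meeting?",
)
print(model_input)
```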

1. Background knowledge

To develop an AI language model, the developers teach the model by having it read vast quantities of data in a process called “training”.

Where does this information come from? Broadly speaking, anything that can be publicly scraped from the internet: academic papers, eBooks, reports and free news articles, through to blogs, YouTube transcripts and comments on discussion forums such as Reddit.

Are these sources reliable places to find mental health advice? Sometimes. Are they always in your best interest and filtered through a scientific, evidence-based approach? Not always. The information is also captured at a single point in time when the AI is built, so it may be out of date.

A lot of detail also needs to be discarded to squish it into the AI’s “memory”. This is part of why AI models are prone to hallucination[3] and getting details wrong[4].

2. External information sources

The AI developers might connect the chatbot itself with external tools, or knowledge sources, such as Google for searches or a curated database.

When you ask Microsoft’s Bing Copilot a question and you see numbered references in the answer, this indicates the AI has relied on an external search to get updated information in addition to what is stored in its memory.

Meanwhile, some dedicated mental health chatbots[5] are able to access therapy guides and materials to help direct conversations along helpful lines.
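
A minimal sketch of that kind of lookup, assuming a tiny keyword-matching “database” (real systems use web search engines or curated clinical material, not this simple word overlap):

```python
import string

# Toy retrieval over a small, made-up knowledge base, sketching how a
# chatbot might consult external material before answering.
knowledge_base = [
    "Box breathing: inhale, hold, exhale and hold for four counts each.",
    "Grounding exercises use the senses to refocus attention.",
    "Regular sleep supports mood regulation.",
]

def tokens(text):
    """Lowercase words with punctuation stripped."""
    cleaned = text.lower().translate(str.maketrans("", "", string.punctuation))
    return set(cleaned.split())

def retrieve(query, documents, top_k=1):
    """Rank documents by how many words they share with the query, best first."""
    query_words = tokens(query)
    ranked = sorted(documents, key=lambda d: len(query_words & tokens(d)), reverse=True)
    return ranked[:top_k]

hits = retrieve("box breathing for stress", knowledge_base)
print(hits[0])
```

The retrieved text is then folded into the model’s input alongside its memorised knowledge, which is how answers can cite sources newer than the training data.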

3. Information previously provided

AI platforms also have access to information you have previously supplied in conversations, or when signing up to the platform.

When you register for the companion AI platform Replika, for example, it learns your name, pronouns, age, preferred companion appearance and gender, IP address and location, the kind of device you are using, and more (as well as your credit card details).

On many chatbot platforms[6], anything you’ve ever said to an AI companion might be stored away for future reference. All of these details can be dredged up and referenced when an AI responds.
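
That kind of stored memory can be sketched as a simple searchable log. The class and method names here are invented for illustration, not any platform’s actual code:

```python
# Sketch of how a companion platform might store and recall what a user
# has said. Real systems are far more sophisticated, but the principle
# of keeping and searching past messages is the same.
class CompanionMemory:
    def __init__(self):
        self.messages = []

    def remember(self, text):
        """Store a message for later reference."""
        self.messages.append(text)

    def recall(self, keyword):
        """Return every stored message mentioning the keyword."""
        return [m for m in self.messages if keyword.lower() in m.lower()]

memory = CompanionMemory()
memory.remember("My name is Sam and I love hiking.")
memory.remember("Work has been stressful lately.")
past = memory.recall("hiking")
print(past)
```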

And we know these AI systems are like friends who affirm what you say (a problem known as sycophancy[7]) and steer conversation back to interests you have already discussed. This is unlike a professional therapist who can draw from training and experience to help challenge or redirect your thinking where needed.

What about specific apps for mental health?

Most people would be familiar with the big models such as OpenAI’s ChatGPT, Google’s Gemini, or Microsoft’s Copilot. These are general-purpose models: they are not limited to specific topics, nor trained to answer only particular kinds of questions.

But developers can also build specialised AIs trained to discuss specific topics such as mental health. Examples include Woebot and Wysa.

Some studies[8] show these mental health-specific chatbots may reduce users’ anxiety and depression symptoms[9], or improve engagement with therapy techniques such as journalling[10] by providing guidance. There is also some evidence that AI therapy and professional therapy can deliver equivalent mental health outcomes[11] in the short term.

However, these studies have all examined short-term use. We do not yet know what impacts excessive or long-term chatbot use has on mental health. Many studies also exclude participants who are suicidal or who have a severe psychotic disorder. And many studies are funded by the developers of the same chatbots, so the research may be biased.

Researchers are also identifying potential harms and mental health risks. The companion chat platform Character.ai, for example, has been implicated in an ongoing legal case over a user suicide[12].

This evidence all suggests AI chatbots may be an option to fill gaps where there is a shortage of mental health professionals[13], assist with referrals[14], or at least provide interim support for people between appointments or on waitlists.

Bottom line

At this stage, it’s hard to say whether AI chatbots are reliable and safe enough to use as a stand-alone therapy option.

More research is needed[15] to identify if certain types of users are more at risk of the harms that AI chatbots might bring.

It’s also unclear if we need to be worried about emotional dependence[16], unhealthy attachment, worsening loneliness, or intensive use[17].

AI chatbots may be a useful place to start when you’re having a bad day and just need a chat. But when the bad days continue to happen, it’s time to talk to a professional as well.

References

  1. ^ positive experiences (www.abc.net.au)
  2. ^ aren’t people (theconversation.com)
  3. ^ hallucination (theconversation.com)
  4. ^ getting details wrong (theconversation.com)
  5. ^ dedicated mental health chatbots (formative.jmir.org)
  6. ^ many chatbot platforms (ai.nejm.org)
  7. ^ a problem known as sycophancy (theconversation.com)
  8. ^ studies (bmcpsychiatry.biomedcentral.com)
  9. ^ reduce users’ anxiety and depression symptoms (www.tandfonline.com)
  10. ^ journalling (www.sciencedirect.com)
  11. ^ some equivalent mental health outcomes (humanfactors.jmir.org)
  12. ^ been implicated in ongoing legal case over a user suicide (theconversation.com)
  13. ^ shortage in mental health professionals (www.ranzcp.org)
  14. ^ referrals (transform.england.nhs.uk)
  15. ^ More research is needed (journals.sagepub.com)
  16. ^ emotional dependence (www.theguardian.com)
  17. ^ intensive use (www.sciencedirect.com)

Read more https://theconversation.com/do-you-talk-to-ai-when-youre-feeling-down-heres-where-chatbots-get-their-therapy-advice-257732
