We studied suicide notes to learn about the language of despair – and we're training AI chatbots to do the same

  • Written by David Ireland, Senior Research Scientist at the Australian e-Health Research Centre, CSIRO

The art of conversation in machines is still limited, but it improves with every iteration. As machines are developed to navigate complex conversations, there will be technical and ethical challenges in how they detect and respond to sensitive human issues.

Our work involves building chatbots for a range of uses in health care. Our system, which incorporates multiple algorithms used in artificial intelligence (AI) and natural language processing, has been in development at the Australian e-Health Research Centre[1] since 2014.

The system has generated several chatbot apps which are being trialled among selected individuals, usually with an underlying medical condition or who require reliable health-related information.

They include HARLIE[2] for Parkinson’s disease and Autism Spectrum Disorder[3], Edna[4] for people undergoing genetic counselling, Dolores for people living with chronic pain, and Quin for people who want to quit smoking.

Research[5] has shown that people with certain underlying medical conditions are more likely to think about suicide than the general public. We have to make sure our chatbots take this into account.

Siri often doesn’t understand the sentiment and context behind phrases. Screenshot/Author provided

We believe the safest approach to understanding the language patterns of people with suicidal thoughts is to study their messages. The choice and arrangement of their words, the sentiment and the rationale all offer insight into the author’s thoughts.

For our recent work[6] we examined more than 100 suicide notes from various texts[7] and identified four relevant language patterns: negative sentiment, constrictive thinking, idioms and logical fallacies.

Read more: Introducing Edna: the chatbot trained to help patients make a difficult medical decision[8]

Negative sentiment and constrictive thinking

As one would expect, many phrases in the notes we analysed expressed negative sentiment such as:

…just this heavy, overwhelming despair…

There was also language that pointed to constrictive thinking. For example:

I will never escape the darkness or misery…

The phenomenon of constrictive thoughts and language is well documented[9]. Constrictive thinking frames a prolonged source of distress in absolute terms.

For the author in question, there is no compromise. The language that manifests as a result often contains terms such as either/or, always, never, forever, nothing, totally, all and only.
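
To illustrate how a chatbot might flag this kind of language, here is a minimal keyword-based sketch in Python. The word list and scoring are our own simplification for illustration; a deployed system would combine several trained models rather than rely on keywords alone.

    import re

    # Absolute terms associated with constrictive thinking (illustrative
    # list drawn from the examples above, not a validated lexicon).
    CONSTRICTIVE_TERMS = {
        "always", "never", "forever", "nothing",
        "totally", "all", "only", "either",
    }

    def constrictive_score(message: str) -> float:
        """Return the fraction of words that are absolute/constrictive."""
        words = re.findall(r"[a-z']+", message.lower())
        if not words:
            return 0.0
        hits = sum(1 for word in words if word in CONSTRICTIVE_TERMS)
        return hits / len(words)

    print(constrictive_score("I will never escape the darkness or misery"))
    # 0.125 -- one constrictive term ("never") out of eight words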

Language idioms

Idioms such as “the grass is greener on the other side” were also common — although not directly linked to suicidal ideation. Idioms are often colloquial and culturally derived, with the real meaning being vastly different from the literal interpretation.

Such idioms are problematic for chatbots to understand. Unless a bot has been programmed with the intended meaning, it will operate under the assumption of a literal meaning.

Chatbots can make some disastrous mistakes if they’re not encoded with knowledge of the real meaning behind certain idioms. In the example below, a more suitable response from Siri would have been to redirect the user to a crisis hotline.

An example of Apple’s Siri giving an inappropriate response to the search query: ‘How do I tie a hangman’s noose it’s time to bite the dust’. Author provided
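
One way to guard against literal readings is to map known idioms to their underlying meaning before the rest of the pipeline sees the message. The sketch below assumes a hand-curated idiom table; the entries and the crisis-line response are illustrative only, and not the actual rules used by Siri or by our chatbots.

    # Hypothetical idiom table: surface form -> underlying intent.
    IDIOMS = {
        "bite the dust": "reference_to_dying",
        "kick the bucket": "reference_to_dying",
        "the grass is greener on the other side": "envious_comparison",
    }

    CRISIS_RESPONSE = ("It sounds like you may be going through a very "
                       "difficult time. You can call Lifeline any time "
                       "on 13 11 14.")

    def interpret(message: str) -> str:
        text = message.lower()
        for idiom, intent in IDIOMS.items():
            if idiom in text and intent == "reference_to_dying":
                return CRISIS_RESPONSE
        return "[no risk idiom detected; continue normal handling]"

    print(interpret("How do I tie a hangman's noose it's time to bite the dust"))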

The fallacies in reasoning

Words such as “therefore” and “ought”, along with their various synonyms, require special attention from chatbots. That’s because these are often bridge words between a thought and an action. Behind them is some logic consisting of a premise that reaches a conclusion, such as[10]:

If I were dead, she would go on living, laughing, trying her luck. But she has thrown me over and still does all those things. Therefore, I am as dead.

This closely resembles a common fallacy (an example of faulty reasoning) called affirming the consequent[11]. Below is a more pathological example, which has been called catastrophic logic[12]:

I have failed at everything. If I do this, I will succeed.

This is an example of a semantic fallacy[13] (and constrictive thinking) concerning the meaning of I, which changes between the two clauses that make up the second sentence.

This fallacy[14] occurs when the author expresses that they will experience feelings such as happiness or success after completing suicide — which is what “this” refers to in the note above. This kind of “autopilot” mode[15] was often described by people recounting their experience in interviews after attempting suicide.
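
Detecting the full logical structure of such reasoning is difficult, but flagging candidate bridge words and conditionals as cues for closer inspection is straightforward. The Python sketch below is a deliberately crude first step of this kind, our own illustration rather than the commonsense-reasoning machinery discussed in the next section.

    import re

    # Bridge words that often link a thought to a conclusion or an action.
    BRIDGE_WORDS = {"therefore", "ought", "thus", "hence", "consequently"}

    def flag_reasoning(message: str) -> dict:
        """Flag messages combining a conditional with a conclusion marker.

        A match is only a cue for closer inspection -- for instance, for
        the invalid pattern of affirming the consequent:
        "If P then Q. Q. Therefore P."
        """
        words = set(re.findall(r"[a-z]+", message.lower()))
        return {
            "has_bridge_word": bool(words & BRIDGE_WORDS),
            "has_conditional": "if" in words,
        }

    note = ("If I were dead, she would go on living, laughing, trying her "
            "luck. But she has thrown me over and still does all those "
            "things. Therefore, I am as dead.")
    print(flag_reasoning(note))
    # {'has_bridge_word': True, 'has_conditional': True}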

Preparing future chatbots

The good news is that detecting negative sentiment and constrictive language can be achieved with off-the-shelf algorithms and publicly available data. Chatbot developers can (and should) implement these algorithms.

Our smoking cessation chatbot Quin can detect general negative statements with constrictive thinking. Author provided
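
As one example of what an off-the-shelf check can look like, the snippet below uses the open-source VADER sentiment analyser. VADER is one publicly available option among many; we are not suggesting it is the algorithm used in the systems described above.

    # pip install vaderSentiment
    from vaderSentiment.vaderSentiment import SentimentIntensityAnalyzer

    analyzer = SentimentIntensityAnalyzer()
    scores = analyzer.polarity_scores("just this heavy, overwhelming despair")

    # 'compound' ranges from -1 (most negative) to +1 (most positive).
    if scores["compound"] <= -0.5:
        print("strongly negative sentiment detected:", scores)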

Generally speaking, the bot’s performance and detection accuracy will depend on the quality and size of the training data. As such, there should never be just one algorithm involved in detecting language related to poor mental health.

Detecting logic reasoning styles is a new and promising area of research[16]. Formal logic is well established in mathematics and computer science, but to establish a machine logic for commonsense reasoning that would detect these fallacies is no small feat.

Here’s an example of our system thinking about a brief conversation that included the semantic fallacy mentioned earlier. Notice it first hypothesises what “this” could refer to, based on its interactions with the user.

Our chatbots use a logic system in which a stream of ‘thoughts’ can be used to form hypotheses, predictions and presuppositions. But just like a human, the reasoning is fallible. Author provided
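
To give a flavour of what such a stream of ‘thoughts’ might look like, here is a toy Python sketch in which candidate referents for an ambiguous word like ‘this’ are kept and ranked by recency. This is our own hypothetical illustration; the actual logic system described above is considerably more sophisticated.

    from collections import deque

    class ReferentTracker:
        """Toy model: hypothesise what an ambiguous 'this' refers to."""

        def __init__(self, window: int = 5):
            # Only the most recent salient topics are kept as candidates.
            self.recent = deque(maxlen=window)

        def observe(self, topic: str):
            self.recent.append(topic)

        def hypotheses(self):
            # More recently mentioned topics get a higher score.
            ranked = list(self.recent)[::-1]
            return [(topic, 1.0 / (rank + 1))
                    for rank, topic in enumerate(ranked)]

    tracker = ReferentTracker()
    tracker.observe("quitting smoking")
    tracker.observe("seeing friends on the weekend")
    print(tracker.hypotheses())
    # [('seeing friends on the weekend', 1.0), ('quitting smoking', 0.5)]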

Although this technology still requires further research and development, it gives machines a necessary, albeit primitive, understanding of how words can relate to complex real-world scenarios (which is essentially what semantics is about).

And machines will need this capability if they are to ultimately address sensitive human affairs — first by detecting warning signs, and then delivering the appropriate response.

Read more: The future of chatbots is more than just small-talk[17]

If you or someone you know needs support, you can call Lifeline at any time on 13 11 14. If someone’s life is in danger, call 000 immediately.

References

  1. ^ Australian e-Health Research Centre (aehrc.csiro.au)
  2. ^ HARLIE (theconversation.com)
  3. ^ Autism Spectrum Disorder (theconversation.com)
  4. ^ Edna (pubmed.ncbi.nlm.nih.gov)
  5. ^ Research (pubmed.ncbi.nlm.nih.gov)
  6. ^ recent work (ebooks.iospress.nl)
  7. ^ texts (www.amazon.com)
  8. ^ Introducing Edna: the chatbot trained to help patients make a difficult medical decision (theconversation.com)
  9. ^ well documented (www.suicidology-online.com)
  10. ^ such as (www.goodreads.com)
  11. ^ affirming the consequent (en.wikipedia.org)
  12. ^ catastrophic logic (onlinelibrary.wiley.com)
  13. ^ fallacy (plato.stanford.edu)
  14. ^ This fallacy (pubmed.ncbi.nlm.nih.gov)
  15. ^ “autopilot” mode (www.amazon.com)
  16. ^ new and promising area of research (ebooks.iospress.nl)
  17. ^ The future of chatbots is more than just small-talk (theconversation.com)

Read more: https://theconversation.com/we-studied-suicide-notes-to-learn-about-the-language-of-despair-and-were-training-ai-chatbots-to-do-the-same-169828
