The Times Australia

AI is being used in social services – but we must make sure it doesn’t traumatise clients

  • Written by Suvradip Maitra, PhD Student, Australian National University

Late last year, ChatGPT was used by a Victorian child protection worker to draft documents. In a glaring error, ChatGPT referred to a “doll” used for sexual purposes as an “age-appropriate toy”. Following this, the Victorian information commissioner banned the use of generative artificial intelligence (AI) in child protection[1].

Unfortunately, many harmful AI systems[2] will not garner such public visibility. It’s crucial that people who use social services – such as employment, homelessness or domestic violence services – are aware they may be subject to AI. Additionally, service providers should be well informed about how to use AI safely.

Fortunately, emerging regulations and tools, such as our trauma-informed AI toolkit[3], can help to reduce AI harm.

How do social services use AI?

AI has captured global attention with promises of better service delivery. In a strained social services sector[4], AI promises to reduce backlogs, lower administrative burdens and allocate resources more effectively while enhancing services. It’s no surprise a range of social service providers are using AI in various ways.

Chatbots simulate human conversation using voice, text or images. These programs are increasingly used for a range of tasks. For instance, they can provide mental health support or offer employment advice. They can also speed up data processing or help quickly create reports.

However, chatbots can easily produce harmful or inaccurate responses. For instance, the United States National Eating Disorders Association deployed the chatbot Tessa to support clients experiencing eating disorders. But it was quickly pulled offline when advocates flagged Tessa was providing harmful weight loss advice[5].

Recommender systems use AI to make personalised suggestions. These could include targeted job or rental ads, or educational material, based on data available to service providers.

But recommender systems can be discriminatory, such as when LinkedIn showed more job ads to men than women[6]. They can also reinforce existing anxieties. For instance, pregnant women have been recommended alarming pregnancy videos on social media[7].

Recognition systems classify data such as images or text by comparing one dataset against another. These systems can complete many tasks, such as matching faces to verify identity or transcribing voice to text.

Such systems can raise surveillance[8], privacy[9], inaccuracy and discrimination[10] concerns. A homeless shelter in Canada[11] stopped using facial recognition cameras because they risked privacy breaches – it’s difficult to obtain informed consent from mentally unwell or intoxicated people using the shelter.

Risk-assessment systems use AI to predict the likelihood of a specific outcome occurring. Such systems have been used to calculate the risk of child abuse, long-term unemployment, or tax and welfare fraud.

Often data used in these systems can recreate societal inequalities, causing harm to already-marginalised peoples. In one such case, a tool in the US used for identifying risk of child mistreatment unfairly targeted poor[12], black and biracial families[13] and families with disabilities[14].

A Dutch risk assessment tool seeking to identify childcare benefits fraud was shut down for being racist[15], while an AI system in France faces similar accusations[16].

Read more: Algorithms that predict crime are watching – and judging us by the cards we've been dealt[17]

The need for a trauma-informed approach

Concerningly, our research shows using AI in social services can cause or perpetuate trauma for the people who use the services.

The American Psychological Association[18] defines trauma as an emotional response to a range of events, such as accidents, abuse or the death of a loved one. Broadly understood, trauma can be experienced at an individual or group level[19] and be passed down through generations. Trauma experienced by First Nations[20] people in Australia as a result of colonisation is an example of group trauma.

Between 57% and 75% of Australians[21] experience at least one traumatic event in their lifetime.

Many social service providers have long adopted a trauma-informed approach. It prioritises trust, safety, choice, empowerment, transparency, and cultural, historical and gender-based considerations. A trauma-informed service provider understands the impact of trauma and recognises signs of trauma in users.

Despite the allure of AI's often hyped[22] capabilities, service providers should be wary of abandoning these core principles.

Can social services use AI responsibly?

To reduce the risk of causing or perpetuating trauma, social service providers should carefully evaluate any AI system before using it.

For AI systems already in place, evaluation can help monitor their impact and ensure they are operating safely.

We have developed a trauma-informed AI assessment toolkit[23] that helps service providers to assess the safety of their planned or current use of AI. The toolkit is based on the principles of trauma-informed care, case studies of AI harms, and design workshops with service providers. An online version of the toolkit is about to be piloted within organisations.

By posing a series of questions, the toolkit enables service providers to consider whether risks outweigh the benefits. For instance, is the AI system co-designed with users? Can users opt out of being subject to the AI system?

It then walks service providers through practical considerations for using AI more safely.

Social services do not have to avoid AI altogether. But social service providers and users should be aware of the risks of harm from AI – so they can intentionally shape AI for good.

Read more https://theconversation.com/ai-is-being-used-in-social-services-but-we-must-make-sure-it-doesnt-traumatise-clients-248555
