Businesses can’t escape the AI revolution – so here’s how to build a culture of safe and responsible use

  • Written by Nicholas Davis, Industry Professor of Emerging Technology and Co-Director, Human Technology Institute, University of Technology Sydney

In November 2023, the estates of two now-deceased policyholders sued[1] US health insurer UnitedHealthcare for deploying what they allege is a flawed artificial intelligence (AI) system to systematically deny patient claims.

The issue – they claim – wasn’t just how the AI was designed. It was that the company allegedly also limited the ability of staff to override the system’s decisions, even if they thought the system was wrong.

They allege the company even went so far as to punish staff who failed to act in accordance with the model’s predictions.

Regardless of the eventual outcome of this case, which remains before the US court system, the claims made in the suit highlight a critical challenge facing organisations.

While artificial intelligence offers tremendous opportunities, its safe and responsible use depends on having the right people, skills and culture to govern it properly.

Read more: Beyond the hype: what workers really think about workplace AI assistants[2]

Getting on the front foot

AI is pervading businesses whether they like it or not. Many Australian organisations are moving quickly on the technology. Far too few are focused on proactively managing its risks.

According to the Australian Responsible AI Index 2024[3], 78% of surveyed organisations claim their use of AI is in line with the principles of responsible AI.

Yet, only 29% said they had implemented practices to ensure it was.

AI applications range from easily accessible general-use chatbots, such as ChatGPT, Gemini, Microsoft Copilot, Claude and Perplexity, to highly specialised software. Tada Images/Shutterstock[4]

Sometimes visible, sometimes not

In some cases, AI is a well-publicised selling point for new products, and organisations are making positive decisions to adopt it.

At the same time, these systems are increasingly hidden from view. They may be used by an upstream supplier, embedded as a subcomponent of a new product, or inserted into an existing product via an automatic software update.

Sometimes, they’re even used by staff on a “shadow” basis – out of sight of management.

AI is increasingly becoming embedded in all kinds of systems, making it hard to know where and how we rely on it. metamorworks/Shutterstock[5]

The pervasiveness – and often hidden nature – of AI adoption means that organisations can’t treat AI governance as merely a compliance exercise or technical challenge.

Instead, leaders need to focus on building the right internal capability and culture to support safe and responsible AI use across their operations.

What to get right

Research[6] from the University of Technology Sydney’s Human Technology Institute points to three critical elements that organisations must get right.

First, boards and senior executives need a sufficient understanding of AI to provide meaningful oversight.

This doesn’t mean they have to become technical experts. But directors need to have what we call a “minimum viable understanding” of AI. They need to be able to spot the strategic opportunities and risks of the technology, and to ask the right questions of management.

If they lack this expertise, they can seek training, recruit new members who have it, or establish an AI expert advisory committee.

Clear accountability

Second, organisations need to create clear lines of accountability for AI governance. These should place clear duties on specific people with appropriate levels of authority.

A number of leading companies are already doing this by nominating a senior executive with explicitly defined responsibilities. This is primarily a governance role, and it requires a unique blend of skills: strong leadership capabilities, some technical literacy and the ability to work across departments.

Third, organisations need to create a governance framework with simple and efficient processes to review their uses of AI, identify risks and find ways to manage them.

Above all, building the right culture

Perhaps most importantly, organisations need to cultivate a critically supportive culture around AI use.

What does that mean? It’s an environment where staff – at all levels – understand both the potential and the risks of AI and feel empowered to raise concerns.

Telstra’s “Responsible AI Policy” is one case study[7] of good practice in a complex corporate environment.

To ensure the board and senior management would have a good view of AI activities and risks, Telstra established an oversight committee dedicated to reviewing high-impact AI systems.

The committee brings together experts and representatives from legal, data, cyber security, privacy, risk and other teams to assess potential risks and make recommendations.

Importantly, the company has also invested in training all staff on AI risks and governance.

Appropriate AI training is necessary at every level of an organisation. Gumbariya/Shutterstock[8]

Bringing everyone along

The cultural element is particularly crucial because of how AI adoption typically unfolds.

Our previous research[9] suggests many Australian workers feel AI is being imposed on them without adequate consultation or training.

This doesn’t just create pushback. It can also mean organisations miss out on important feedback on how their staff actually use AI to create value and solve problems.

Ultimately, our collective success with AI depends not so much on the technology itself, but on the human systems we build around it.

This is important whether you lead an organisation or work for one. So, the next time your colleagues start discussing an opportunity to buy or use AI in a new way, don’t just focus on the technology.

Ask: “What needs to be true about our people, skills and culture to make this succeed?”

References

  1. ^ sued (www.reuters.com)
  2. ^ Beyond the hype: what workers really think about workplace AI assistants (theconversation.com)
  3. ^ Australian Responsible AI Index 2024 (www.fifthquadrant.com.au)
  4. ^ Tada Images/Shutterstock (www.shutterstock.com)
  5. ^ metamorworks/Shutterstock (shutterstock.com)
  6. ^ Research (www.uts.edu.au)
  7. ^ case study (www.uts.edu.au)
  8. ^ Gumbariya/Shutterstock (www.shutterstock.com)
  9. ^ previous research (www.uts.edu.au)

Read more: https://theconversation.com/businesses-cant-escape-the-ai-revolution-so-heres-how-to-build-a-culture-of-safe-and-responsible-use-246024
