AI weapons are dangerous in war. But saying they can’t be held accountable misses the point

  • Written by Zena Assaad, Senior Lecturer, School of Engineering, Australian National University

In a speech to the United Nations Security Council last month, Australia’s Minister for Foreign Affairs, Penny Wong, took aim at artificial intelligence[1] (AI).

While she said the technology “heralds extraordinary promise” in fields such as health and education, she also said its potential use in nuclear weapons and unmanned systems challenges the future of humanity:

Nuclear warfare has so far been constrained by human judgement. By leaders who bear responsibility and by human conscience. AI has no such concern, nor can it be held accountable. These weapons threaten to change war itself and they risk escalation without warning.

This idea – that AI warfare poses a unique threat – often features[2] in public calls to safeguard this technology. But it is clouded by various misrepresentations of both the technology and warfare.

This raises two questions: will AI actually change the nature of warfare? And is it really unaccountable?

How is AI being used in warfare?

AI is by no means a new technology; the term was coined in the 1950s[3]. It has since become an umbrella term that encompasses everything from large language models to computer vision to neural networks – all of which are very different.

Generally speaking, applications of AI analyse patterns in data to infer, from inputs such as text prompts, how to generate outputs such as predictions, content, recommendations or decisions. But the underlying ways these systems are trained are not always comparable[4], despite them all being labelled as “AI”.
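To make that concrete, here is a minimal sketch of the pattern-learning idea – written in Python with the scikit-learn library, both of which are illustrative choices rather than anything referenced in this article. A model is fitted to example inputs and outputs, then used to generate an output (a prediction) for a new input.

```python
# Illustrative only: a toy example of "analysing patterns in data to
# generate outputs". The data and the library choice are assumptions
# made for illustration.
from sklearn.linear_model import LogisticRegression

# Toy training data: each input is [hours_of_sunlight, rainfall_mm];
# each label records whether a crop thrived (1) or failed (0).
X = [[8, 20], [9, 15], [3, 80], [2, 90], [7, 30], [1, 100]]
y = [1, 1, 0, 0, 1, 0]

# "Training" is fitting a statistical pattern to these examples.
model = LogisticRegression().fit(X, y)

# The fitted model then generates an output for a new, unseen input.
print(model.predict([[6, 40]]))        # predicted class, e.g. [1]
print(model.predict_proba([[6, 40]]))  # class probabilities
```

The point of the sketch is that the "intelligence" is a fitted statistical pattern: change the training data or the model, and the outputs change with them.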

The use of AI in warfare ranges from wargaming simulations[5] used for training soldiers, through to the more problematic AI decision-support systems used for targeting, such as the Israel Defense Forces’ use of the “Lavender” system[6], which allegedly identifies suspected members of Hamas or other armed groups.

Broad discussions of AI in the military domain capture both of these examples, yet only the latter sits at the point of life-and-death decision making. It is this point that dominates most of the moral debates about AI in the context of warfare.

Is there really an accountability gap?

Arguments on who, or what, is held liable when something goes wrong extend to both civil and military applications of AI. This predicament has been labelled an “accountability gap”[7].

Interestingly, this accountability gap – which is fuelled by media reports[8] about “killer robots” that make life-and-death decisions in war – is rarely debated when it comes to other technologies.

For example, legacy weapons such as unguided missiles and landmines involve no human oversight or control during the deadliest part of their operation. Yet no one asks whether the missile or the landmine was at fault.

Similarly, the Robodebt scandal[9] in Australia involved misfeasance on the part of the federal government, not the automated system it relied on to tally debts.

So why do we ask if AI is at fault?

Like any other complex system, AI systems are designed, developed, acquired and deployed by humans. For military contexts, there is the added layer of command and control[10], a hierarchy of decision making to achieve military objectives.

AI does not exist outside this hierarchy. The idea of independent decision making on the part of AI systems is clouded by a misunderstanding of how these systems actually work – and of the processes and practices that led to a system being used in a given application.

While it’s correct to say that AI systems cannot be held accountable, it’s also superfluous. No inanimate object can be, or ever has been, held accountable in any circumstance – be it an automated debt-recovery system or a military weapon system.

The argument about a system’s accountability is neither here nor there because, ultimately, decisions – and the responsibility for those decisions – always sit at the human level.

It always comes back to humans

All complex systems, including AI systems, exist across a system lifecycle[11]: a structured and systematic process of taking a system from initial conception through to its ultimate retirement.

Humans make conscious decisions at every stage of that lifecycle: planning, design, development, implementation, operation and maintenance. These decisions range from technical engineering requirements through to regulatory compliance and operational safeguards.

What this lifecycle structure creates is a chain of responsibility[12] with clear intervention points.

This means, when an AI system is deployed, its characteristics – including its faults and limitations – are a product of cumulative human decision making.
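To illustrate, here is a minimal sketch – in Python, with hypothetical stage names and roles drawn from the lifecycle stages above, not from any real military process – of how a lifecycle forms a chain of responsibility: each stage carries a named human decision-maker, so a deployed system’s characteristics trace back to specific people.

```python
# Illustrative only: a toy record of human decisions across a system
# lifecycle. The roles and decisions are hypothetical examples.
from dataclasses import dataclass

@dataclass
class Decision:
    stage: str     # lifecycle stage, e.g. "design"
    decider: str   # the human role responsible at this stage
    decision: str  # what was consciously chosen

# A chain of responsibility: every stage has a human attached to it.
lifecycle = [
    Decision("planning",       "programme manager", "approve requirements"),
    Decision("design",         "lead engineer",     "select training data"),
    Decision("development",    "dev team lead",     "sign off test results"),
    Decision("implementation", "acquisition body",  "accept the system"),
    Decision("operation",      "commander",         "authorise use in context"),
    Decision("maintenance",    "sustainment lead",  "approve model updates"),
]

# When something goes wrong, accountability is a lookup over this chain,
# not a question addressed to the system itself.
for d in lifecycle:
    print(f"{d.stage:>14}: {d.decider} -> {d.decision}")
```

However the stages are named in practice, the structure is the same: an intervention point at every stage, with a person responsible for it.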

AI weapon systems used for targeting are not making decisions on life and death. The people who consciously chose to use those systems in that context are.

So when we talk about regulating AI weapon systems, really what we’re regulating are the humans involved in the lifecycle of those systems.

The idea of AI changing the nature of warfare clouds the reality of the roles humans play in military decision making. While this technology has presented challenges and will continue to do so, those challenges always seem to come back to people.

References

  1. ^ took aim at artificial intelligence (www.foreignminister.gov.au)
  2. ^ features (news.un.org)
  3. ^ term originally coined in the 1950s (theconversation.com)
  4. ^ are not always comparable (fpf.org)
  5. ^ wargaming simulations (www.act.nato.int)
  6. ^ Israel Defence Force’s use of the “Lavender” system (time.com)
  7. ^ “accountability gap” (uxmag.medium.com)
  8. ^ media reports (news.harvard.edu)
  9. ^ Robodebt scandal (www.abc.net.au)
  10. ^ command and control (apps.dtic.mil)
  11. ^ system lifecycle (testbankdeal.com)
  12. ^ chain of responsibility (heinonline.org)

Read more https://theconversation.com/ai-weapons-are-dangerous-in-war-but-saying-they-cant-be-held-accountable-misses-the-point-266458
