UN fails to agree on 'killer robot' ban as nations pour billions into autonomous weapons research

  • Written by James Dawes, Professor of English, Macalester College

Autonomous weapon systems – commonly known as killer robots – may have killed human beings for the first time ever[1] last year, according to a recent United Nations Security Council report on the Libyan civil war[2]. History could well identify this as the starting point of the next major arms race, one that has the potential to be humanity’s final one.

The United Nations Convention on Certain Conventional Weapons[3] debated the question of banning autonomous weapons at its once-every-five-years review meeting in Geneva Dec. 13-17, 2021, but didn’t reach consensus on a ban[4]. Established in 1983, the convention has been updated regularly to restrict some of the world’s cruelest conventional weapons, including land mines, booby traps and incendiary weapons.

Autonomous weapon systems are robots with lethal weapons that can operate independently, selecting and attacking targets without a human weighing in on those decisions. Militaries around the world are investing heavily[5] in autonomous weapons research and development. The U.S. alone budgeted US$18 billion[6] for autonomous weapons between 2016 and 2020.

Meanwhile, human rights and humanitarian organizations[7] are racing to establish regulations and prohibitions on such weapons development. Without such checks, foreign policy experts warn that disruptive autonomous weapons technologies will dangerously destabilize current nuclear strategies, both because they could radically change perceptions of strategic dominance, increasing the risk of preemptive attacks[8], and because they could be combined with chemical, biological, radiological and nuclear weapons[9] themselves.

As a specialist in human rights[10] with a focus on the weaponization of artificial intelligence[11], I find that autonomous weapons make the unsteady balances and fragmented safeguards of the nuclear world – for example, the U.S. president’s minimally constrained authority to launch a strike[12] – more unsteady and more fragmented. Given the pace of research and development in autonomous weapons, the U.N. meeting might have been the last chance to head off an arms race.

Lethal errors and black boxes

I see four primary dangers with autonomous weapons. The first is the problem of misidentification. When selecting a target, will autonomous weapons be able to distinguish between hostile soldiers and 12-year-olds playing with toy guns? Between civilians fleeing a conflict site and insurgents making a tactical retreat?

Killer robots, like the drones in the 2017 short film ‘Slaughterbots,’ have long been a major subgenre of science fiction. (Warning: graphic depictions of violence.)

The problem here is not that machines will make such errors and humans won’t. It’s that the difference between human error and algorithmic error is like the difference between mailing a letter and tweeting. The scale, scope and speed of killer robot systems – ruled by one targeting algorithm, deployed across an entire continent – could make misidentifications by individual humans, like a recent U.S. drone strike in Afghanistan[13], seem like mere rounding errors by comparison.

Autonomous weapons expert Paul Scharre uses the metaphor of the runaway gun[14] to explain the difference. A runaway gun is a defective machine gun that continues to fire after a trigger is released. The gun continues to fire until ammunition is depleted because, so to speak, the gun does not know it is making an error. Runaway guns are extremely dangerous, but fortunately they have human operators who can break the ammunition link or try to point the weapon in a safe direction. Autonomous weapons, by definition, have no such safeguard.

Importantly, weaponized AI need not even be defective to produce the runaway gun effect. As multiple studies on algorithmic errors across industries have shown, the very best algorithms – operating as designed – can generate internally correct outcomes that nonetheless spread terrible errors[15] rapidly across populations.

For example, a neural net designed for use in Pittsburgh hospitals identified asthma as a risk-reducer[16] in pneumonia cases; image recognition software used by Google identified Black people as gorillas[17]; and a machine-learning tool used by Amazon to rank job candidates systematically assigned negative scores to women[18].
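The pneumonia finding is a textbook case of confounding: a model fit to raw outcomes absorbs the effect of decisions that aren’t recorded in the data. A minimal sketch with synthetic data can illustrate the mechanism (all numbers here are invented for illustration, not drawn from the Pittsburgh study):

```python
import random

random.seed(0)

# Hypothetical setup: asthmatic pneumonia patients are routinely given
# aggressive treatment, which lowers their mortality. A system trained
# only on (asthma, outcome) pairs never sees the treatment variable.
patients = []
for _ in range(10_000):
    asthma = random.random() < 0.15
    # Confound: unrecorded aggressive care cuts mortality for asthmatics.
    mortality = 0.03 if asthma else 0.10
    patients.append((asthma, random.random() < mortality))

def death_rate(group):
    return sum(died for _, died in group) / len(group)

asthmatic = [p for p in patients if p[0]]
non_asthmatic = [p for p in patients if not p[0]]

# The naive estimate ranks asthma as protective, even though the lower
# death rate comes from the unmodeled treatment, not the disease.
print(f"death rate with asthma:    {death_rate(asthmatic):.3f}")
print(f"death rate without asthma: {death_rate(non_asthmatic):.3f}")
```

The model is “operating as designed” – the correlation it finds really is in the data – yet acting on its output (say, sending asthmatics home as low-risk) would be catastrophic. Scaled up to a targeting algorithm, the same internally correct logic can propagate a lethal error everywhere at once.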

The problem is not just that when AI systems err, they err in bulk. It is that when they err, their makers often don’t know why they did and, therefore, how to correct them. The black box problem[19] of AI makes it almost impossible to imagine morally responsible development of autonomous weapons systems.

The proliferation problems

The next two dangers are the problems of low-end and high-end proliferation. Let’s start with the low end. The militaries developing autonomous weapons now are proceeding on the assumption that they will be able to contain and control the use of autonomous weapons[20]. But if the history of weapons technology has taught the world anything, it’s this: Weapons spread.

Market pressures could result in the creation and widespread sale of what can be thought of as the autonomous weapon equivalent of the Kalashnikov assault rifle[21]: killer robots that are cheap, effective and almost impossible to contain as they circulate around the globe. “Kalashnikov” autonomous weapons could get into the hands of people outside of government control, including international and domestic terrorists.

The Kargu-2, made by a Turkish defense contractor, is a cross between a quadcopter drone and a bomb. It has artificial intelligence for finding and tracking targets, and might have been used autonomously in the Libyan civil war to attack people. Ministry of Defense of Ukraine, CC BY[22][23]

High-end proliferation is just as bad, however. Nations could compete to develop increasingly devastating versions of autonomous weapons, including ones capable of mounting chemical, biological, radiological and nuclear arms[24]. The moral dangers of escalating weapon lethality would be amplified by escalating weapon use.

High-end autonomous weapons are likely to lead to more frequent wars because they will weaken two of the primary forces that have historically prevented and shortened wars: concern for civilians abroad and concern for one’s own soldiers. The weapons are likely to be equipped with expensive ethical governors[25] designed to minimize collateral damage, using what U.N. Special Rapporteur Agnes Callamard has called the “myth of a surgical strike[26]” to quell moral protests. Autonomous weapons will also reduce both the need for and the risk to one’s own soldiers, dramatically altering the cost-benefit analysis[27] that nations conduct when launching and maintaining wars.

Asymmetric wars – that is, wars waged on the soil of nations that lack competing technology – are likely to become more common. Think about the global instability caused by Soviet and U.S. military interventions during the Cold War, from the first proxy war to the blowback experienced around the world today[28]. Multiply that by every country currently aiming for high-end autonomous weapons.

Undermining the laws of war

Finally, autonomous weapons will undermine humanity’s final stopgap against war crimes and atrocities: the international laws of war. These laws, codified in treaties reaching as far back as the 1864 Geneva Convention[29], are the international thin blue line separating war with honor from massacre. They are premised on the idea that people can be held accountable for their actions even during wartime, that the right to kill other soldiers during combat does not give the right to murder civilians. A prominent example of someone held to account is Slobodan Milosevic[30], former president of the Federal Republic of Yugoslavia, who was indicted on charges of crimes against humanity and war crimes by the U.N.’s International Criminal Tribunal for the Former Yugoslavia.

But how can autonomous weapons be held accountable? Who is to blame for a robot that commits war crimes? Who would be put on trial? The weapon? The soldier? The soldier’s commanders? The corporation that made the weapon? Nongovernmental organizations and experts in international law worry that autonomous weapons will lead to a serious accountability gap[32].

To hold a soldier criminally responsible[33] for deploying an autonomous weapon that commits war crimes, prosecutors would need to prove both actus reus and mens rea, Latin terms describing a guilty act and a guilty mind. This would be difficult as a matter of law, and possibly unjust as a matter of morality, given that autonomous weapons are inherently unpredictable. I believe the distance separating the soldier from the independent decisions made by autonomous weapons in rapidly evolving environments is simply too great.

The legal and moral challenge is not made easier by shifting the blame up the chain of command or back to the site of production. In a world without regulations that mandate meaningful human control[34] of autonomous weapons, there will be war crimes with no war criminals to hold accountable. The structure of the laws of war, along with their deterrent value, will be significantly weakened.

A new global arms race

Imagine a world in which militaries, insurgent groups and international and domestic terrorists can deploy theoretically unlimited lethal force at theoretically zero risk at times and places of their choosing, with no resulting legal accountability. It is a world where the sort of unavoidable algorithmic errors[35] that plague even tech giants like Amazon and Google can now lead to the elimination of whole cities.

In my view, the world should not repeat the catastrophic mistakes of the nuclear arms race. It should not sleepwalk into dystopia.

This is an updated version of an article[37] originally published on September 29, 2021.

References

  1. ^ killed human beings for the first time ever (www.npr.org)
  2. ^ report on the Libyan civil war (undocs.org)
  3. ^ Convention on Certain Conventional Weapons (www.un.org)
  4. ^ didn’t reach consensus on a ban (www.reuters.com)
  5. ^ investing heavily (www.newsweek.com)
  6. ^ budgeted US$18 billion (www.scientificamerican.com)
  7. ^ humanitarian organizations (www.stopkillerrobots.org)
  8. ^ increasing the risk of preemptive attacks (www.rand.org)
  9. ^ combined with chemical, biological, radiological and nuclear weapons (foreignpolicy.com)
  10. ^ specialist in human rights (scholar.google.com)
  11. ^ weaponization of artificial intelligence (muse.jhu.edu)
  12. ^ authority to launch a strike (wwnorton.com)
  13. ^ U.S. drone strike in Afghanistan (www.reuters.com)
  14. ^ the runaway gun (wwnorton.com)
  15. ^ generate internally correct outcomes that nonetheless spread terrible errors (brianchristian.org)
  16. ^ asthma as a risk-reducer (www.pulmonologyadvisor.com)
  17. ^ identified Black people as gorillas (www.usatoday.com)
  18. ^ systematically assigned negative scores to women (www.reuters.com)
  19. ^ black box problem (jolt.law.harvard.edu)
  20. ^ contain and control the use of autonomous weapons (www.popularmechanics.com)
  21. ^ Kalashnikov assault rifle (www.npr.org)
  22. ^ Ministry of Defense of Ukraine (commons.wikimedia.org)
  23. ^ CC BY (creativecommons.org)
  24. ^ mounting chemical, biological, radiological and nuclear arms (cpr.unu.edu)
  25. ^ ethical governors (smartech.gatech.edu)
  26. ^ myth of a surgical strike (news.un.org)
  27. ^ cost-benefit analysis (www.jstor.org)
  28. ^ blowback experienced around the world today (dx.doi.org)
  29. ^ Geneva Convention (www.law.cornell.edu)
  30. ^ Slobodan Milosevic (www.britannica.com)
  31. ^ Sign up for The Conversation’s science newsletter (theconversation.com)
  32. ^ accountability gap (www.hrw.org)
  33. ^ criminally responsible (digitalcommons.du.edu)
  34. ^ meaningful human control (blogs.icrc.org)
  35. ^ algorithmic errors (www.amazon.com)
  36. ^ Sign up today (memberservices.theconversation.com)
  37. ^ article (theconversation.com)

Read more https://theconversation.com/un-fails-to-agree-on-killer-robot-ban-as-nations-pour-billions-into-autonomous-weapons-research-173616
