UN fails to agree on 'killer robot' ban as nations pour billions into autonomous weapons research

  • Written by James Dawes, Professor of English, Macalester College

Autonomous weapon systems – commonly known as killer robots – may have killed human beings for the first time ever[1] last year, according to a recent United Nations Security Council report on the Libyan civil war[2]. History could well identify this as the starting point of the next major arms race, one with the potential to be humanity’s last.

The United Nations Convention on Certain Conventional Weapons[3] debated the question of banning autonomous weapons at its once-every-five-years review meeting in Geneva Dec. 13-17, 2021, but didn’t reach consensus on a ban[4]. Established in 1983, the convention has been updated regularly to restrict some of the world’s cruelest conventional weapons, including land mines, booby traps and incendiary weapons.

Autonomous weapon systems are robots with lethal weapons that can operate independently, selecting and attacking targets without a human weighing in on those decisions. Militaries around the world are investing heavily[5] in autonomous weapons research and development. The U.S. alone budgeted US$18 billion[6] for autonomous weapons between 2016 and 2020.

Meanwhile, human rights and humanitarian organizations[7] are racing to establish regulations and prohibitions on such weapons development. Foreign policy experts warn that, without such checks, disruptive autonomous weapons technologies will dangerously destabilize current nuclear strategies, both because they could radically change perceptions of strategic dominance, increasing the risk of preemptive attacks[8], and because they could be combined with chemical, biological, radiological and nuclear weapons[9] themselves.

As a specialist in human rights[10] with a focus on the weaponization of artificial intelligence[11], I find that autonomous weapons make the unsteady balances and fragmented safeguards of the nuclear world – for example, the U.S. president’s minimally constrained authority to launch a strike[12] – more unsteady and more fragmented. Given the pace of research and development in autonomous weapons, the U.N. meeting might have been the last chance to head off an arms race.

Lethal errors and black boxes

I see four primary dangers with autonomous weapons. The first is the problem of misidentification. When selecting a target, will autonomous weapons be able to distinguish between hostile soldiers and 12-year-olds playing with toy guns? Between civilians fleeing a conflict site and insurgents making a tactical retreat?

Killer robots, like the drones in the 2017 short film ‘Slaughterbots,’ have long been a major subgenre of science fiction. (Warning: graphic depictions of violence.)

The problem here is not that machines will make such errors and humans won’t. It’s that the difference between human error and algorithmic error is like the difference between mailing a letter and tweeting. The scale, scope and speed of killer robot systems – ruled by one targeting algorithm, deployed across an entire continent – could make misidentifications by individual humans, like a recent U.S. drone strike in Afghanistan[13], seem like mere rounding errors by comparison.
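
The difference is partly one of correlation, not just volume. Below is a minimal, purely illustrative Python simulation; every rate and count in it is an invented assumption, not a figure from this article or the cited report. A thousand human operators make independent mistakes, while a thousand weapons running one shared targeting algorithm fail in lockstep whenever its blind spot is triggered.

```python
import random

random.seed(1)

N_UNITS = 1_000    # weapons or operators per deployment (assumed figure)
ERROR_RATE = 0.01  # chance any single targeting judgment is flawed (assumed)
TRIALS = 1_000     # simulated deployments

worst_human = worst_algo = 0
for _ in range(TRIALS):
    # Humans err independently: mistakes stay scattered and uncorrelated.
    human_errors = sum(random.random() < ERROR_RATE for _ in range(N_UNITS))
    # One shared algorithm errs in lockstep: if its blind spot is
    # triggered, every unit running it misidentifies at once.
    algo_errors = N_UNITS if random.random() < ERROR_RATE else 0
    worst_human = max(worst_human, human_errors)
    worst_algo = max(worst_algo, algo_errors)

print(f"worst deployment, {N_UNITS} independent humans: {worst_human} errors")
print(f"worst deployment, one shared algorithm: {worst_algo} errors")
```

The expected number of errors is identical in the two cases; what changes is the worst case, which is the scale, scope and speed problem in miniature.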

Autonomous weapons expert Paul Scharre uses the metaphor of the runaway gun[14] to explain the difference. A runaway gun is a defective machine gun that continues to fire after a trigger is released. The gun continues to fire until ammunition is depleted because, so to speak, the gun does not know it is making an error. Runaway guns are extremely dangerous, but fortunately they have human operators who can break the ammunition link or try to point the weapon in a safe direction. Autonomous weapons, by definition, have no such safeguard.

Importantly, weaponized AI need not even be defective to produce the runaway gun effect. As multiple studies on algorithmic errors across industries have shown, the very best algorithms – operating as designed – can generate internally correct outcomes that nonetheless spread terrible errors[15] rapidly across populations.

For example, a neural net designed for use in Pittsburgh hospitals identified asthma as a risk-reducer[16] in pneumonia cases; image recognition software used by Google identified Black people as gorillas[17]; and a machine-learning tool used by Amazon to rank job candidates systematically assigned negative scores to women[18].
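
The asthma result is worth unpacking, because that model was working exactly as designed. The following is a minimal synthetic sketch in Python, using invented rates and a generic logistic regression rather than the Pittsburgh system or its data, showing how an omitted confounder produces the error: asthmatic patients received more aggressive care, so they died less often, so a model fit on outcomes alone learns asthma as protective.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 50_000

# Synthetic pneumonia cohort; every rate below is invented for illustration.
asthma = rng.random(n) < 0.15
# Confounder: asthmatic patients are flagged as high risk and routed to
# intensive care far more often than everyone else.
intensive_care = rng.random(n) < np.where(asthma, 0.90, 0.20)
# Ground truth: asthma raises baseline mortality; intensive care lowers it.
p_death = 0.10 + 0.04 * asthma - 0.09 * intensive_care
died = rng.random(n) < p_death

# Fit mortality on the recorded condition alone, omitting the treatment:
# the view a triage model trained on raw hospital outcomes would have.
model = LogisticRegression().fit(asthma.reshape(-1, 1), died)
print(f"learned asthma coefficient: {model.coef_[0][0]:+.3f}")
# The coefficient comes out negative: the model 'correctly' learns that
# asthmatics died less often in this data and would triage them as low
# risk. Internally consistent, clinically disastrous.
```

Nothing in the fitting procedure is broken; the danger sits in the gap between the data the model sees and the world it is deployed into.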

The problem is not just that when AI systems err, they err in bulk. It is that when they err, their makers often don’t know why they did so, and therefore don’t know how to correct them. The black box problem[19] of AI makes it almost impossible to imagine morally responsible development of autonomous weapons systems.

The proliferation problems

The next two dangers are the problems of low-end and high-end proliferation. Let’s start with the low end. The militaries developing autonomous weapons now are proceeding on the assumption that they will be able to contain and control the use of autonomous weapons[20]. But if the history of weapons technology has taught the world anything, it’s this: Weapons spread.

Market pressures could result in the creation and widespread sale of what can be thought of as the autonomous weapon equivalent of the Kalashnikov assault rifle[21]: killer robots that are cheap, effective and almost impossible to contain as they circulate around the globe. “Kalashnikov” autonomous weapons could get into the hands of people outside of government control, including international and domestic terrorists.

The Kargu-2, made by a Turkish defense contractor, is a cross between a quadcopter drone and a bomb. It has artificial intelligence for finding and tracking targets, and might have been used autonomously in the Libyan civil war to attack people. Ministry of Defense of Ukraine, CC BY[22][23]

High-end proliferation is just as bad, however. Nations could compete to develop increasingly devastating versions of autonomous weapons, including ones capable of mounting chemical, biological, radiological and nuclear arms[24]. The moral dangers of escalating weapon lethality would be amplified by escalating weapon use.

High-end autonomous weapons are likely to lead to more frequent wars because they will decrease two of the primary forces that have historically prevented and shortened wars: concern for civilians abroad and concern for one’s own soldiers. The weapons are likely to be equipped with expensive ethical governors[25] designed to minimize collateral damage, using what U.N. Special Rapporteur Agnes Callamard has called the “myth of a surgical strike[26]” to quell moral protests. Autonomous weapons will also reduce both the need for and risk to one’s own soldiers, dramatically altering the cost-benefit analysis[27] that nations conduct when launching and sustaining wars.

Asymmetric wars – that is, wars waged on the soil of nations that lack competing technology – are likely to become more common. Think about the global instability caused by Soviet and U.S. military interventions during the Cold War, from the first proxy war to the blowback experienced around the world today[28]. Multiply that by every country currently aiming for high-end autonomous weapons.

Undermining the laws of war

Finally, autonomous weapons will undermine humanity’s final stopgap against war crimes and atrocities: the international laws of war. These laws, codified in treaties reaching as far back as the 1864 Geneva Convention[29], are the international thin blue line separating war with honor from massacre. They are premised on the idea that people can be held accountable for their actions even during wartime, that the right to kill other soldiers during combat does not give the right to murder civilians. A prominent example of someone held to account is Slobodan Milosevic[30], former president of the Federal Republic of Yugoslavia, who was indicted on charges of crimes against humanity and war crimes by the U.N.’s International Criminal Tribunal for the Former Yugoslavia.


But how can autonomous weapons be held accountable? Who is to blame for a robot that commits war crimes? Who would be put on trial? The weapon? The soldier? The soldier’s commanders? The corporation that made the weapon? Nongovernmental organizations and experts in international law worry that autonomous weapons will lead to a serious accountability gap[32].

To hold a soldier criminally responsible[33] for deploying an autonomous weapon that commits war crimes, prosecutors would need to prove both actus reus and mens rea, Latin terms describing a guilty act and a guilty mind. This would be difficult as a matter of law, and possibly unjust as a matter of morality, given that autonomous weapons are inherently unpredictable. I believe the distance separating the soldier from the independent decisions made by autonomous weapons in rapidly evolving environments is simply too great.

The legal and moral challenge is not made easier by shifting the blame up the chain of command or back to the site of production. In a world without regulations that mandate meaningful human control[34] of autonomous weapons, there will be war crimes with no war criminals to hold accountable. The structure of the laws of war, along with their deterrent value, will be significantly weakened.

A new global arms race

Imagine a world in which militaries, insurgent groups and international and domestic terrorists can deploy theoretically unlimited lethal force at theoretically zero risk at times and places of their choosing, with no resulting legal accountability. It is a world where the sort of unavoidable algorithmic errors[35] that plague even tech giants like Amazon and Google can now lead to the elimination of whole cities.


In my view, the world should not repeat the catastrophic mistakes of the nuclear arms race. It should not sleepwalk into dystopia.

This is an updated version of an article[37] originally published on September 29, 2021.

References

  1. ^ killed human beings for the first time ever (www.npr.org)
  2. ^ report on the Libyan civil war (undocs.org)
  3. ^ Convention on Certain Conventional Weapons (www.un.org)
  4. ^ didn’t reach consensus on a ban (www.reuters.com)
  5. ^ investing heavily (www.newsweek.com)
  6. ^ budgeted US$18 billion (www.scientificamerican.com)
  7. ^ humanitarian organizations (www.stopkillerrobots.org)
  8. ^ increasing the risk of preemptive attacks (www.rand.org)
  9. ^ combined with chemical, biological, radiological and nuclear weapons (foreignpolicy.com)
  10. ^ specialist in human rights (scholar.google.com)
  11. ^ weaponization of artificial intelligence (muse.jhu.edu)
  12. ^ authority to launch a strike (wwnorton.com)
  13. ^ U.S. drone strike in Afghanistan (www.reuters.com)
  14. ^ the runaway gun (wwnorton.com)
  15. ^ generate internally correct outcomes that nonetheless spread terrible errors (brianchristian.org)
  16. ^ asthma as a risk-reducer (www.pulmonologyadvisor.com)
  17. ^ identified Black people as gorillas (www.usatoday.com)
  18. ^ systematically assigned negative scores to women (www.reuters.com)
  19. ^ black box problem (jolt.law.harvard.edu)
  20. ^ contain and control the use of autonomous weapons (www.popularmechanics.com)
  21. ^ Kalashnikov assault rifle (www.npr.org)
  22. ^ Ministry of Defense of Ukraine (commons.wikimedia.org)
  23. ^ CC BY (creativecommons.org)
  24. ^ mounting chemical, biological, radiological and nuclear arms (cpr.unu.edu)
  25. ^ ethical governors (smartech.gatech.edu)
  26. ^ myth of a surgical strike (news.un.org)
  27. ^ cost-benefit analysis (www.jstor.org)
  28. ^ blowback experienced around the world today (dx.doi.org)
  29. ^ Geneva Convention (www.law.cornell.edu)
  30. ^ Slobodan Milosevic (www.britannica.com)
  32. ^ accountability gap (www.hrw.org)
  33. ^ criminally responsible (digitalcommons.du.edu)
  34. ^ meaningful human control (blogs.icrc.org)
  35. ^ algorithmic errors (www.amazon.com)
  37. ^ article (theconversation.com)

Read more https://theconversation.com/un-fails-to-agree-on-killer-robot-ban-as-nations-pour-billions-into-autonomous-weapons-research-173616
