
Killer robots could escape regulation

August 20, 2019

It's no longer the stuff of science fiction: Wars could soon be decided by lethal autonomous weapons systems. But a concerted effort to ban "killer robots" through international law at the UN appears to have hit a wall.

Image: Platform-M combat robot (imago/ITAR-TASS)

UN Secretary-General António Guterres has labeled lethal autonomous weapons systems (LAWS) "politically unacceptable and morally repulsive," and has called for them to be banned under international law. "Consider the consequences if an autonomous weapons system can independently select people as targets and attack them," Guterres has repeatedly told the international community, whose representatives are currently debating a ban on what are colloquially termed "killer robots" in Geneva. Agreement, however, is not in sight. In fact, as one of the participants told DW, the talks are making no progress.

Many armies around the world are already testing weapons that combine artificial intelligence and robotics to form potentially lethal technology. When machines fight independently on the battlefield, according to military logic, the lives of soldiers are spared. Supporters argue that unlike humans, machines do not get tired and are less prone to making mistakes or decisions based on prejudice or emotion. This, they say, could help avoid civilian casualties.

Read more: Resistance to killer robots growing

Violation of international law

According to the generally accepted definition of the International Committee of the Red Cross, autonomous weapons select their targets and engage them independently; no soldier is involved and no trigger is pulled. Guided weapons that loiter in the air until their target is in a favorable position before attacking already exist.

On the battlefields of the future, scientists say, it could be algorithms, not humans, that decide over life and death. That, however, would violate international humanitarian law, which stipulates that attacks must clearly distinguish between combatants and civilians. Critics argue that autonomous weapons systems cannot currently be equipped to make decisions in accordance with that requirement, which is why a human should always remain "in the loop."

Read more: Artificial intelligence — are machines taking over?

Image: An international coalition of activists, along with 28 states, is calling for a ban on 'killer robots' (Reuters/A. Hilse)

Difficult talks

That requirement for human control has been under discussion for five years at the UN in Geneva, within the framework of the Convention on Certain Conventional Weapons. In 1995, the Convention succeeded in banning blinding laser weapons before they were ever used in war. Opponents of autonomous weapons hoped for a similar outcome, but the current talks have been sluggish.

Pioneering countries in the field of autonomous weapons systems, namely Russia, the United States and Israel, reject a binding ban under international law. These military heavyweights face a group of 28 states demanding such a ban, among them Austria, the group's only European Union member. The group's backing from civil society is strong and continues to grow: 113 nongovernmental organizations from more than 50 countries support the international Campaign to Stop Killer Robots, as do the European Parliament and 21 Nobel Peace Prize laureates.

Read more: 10 things to know about 'killer robots'

Condemnation on paper only

Germany has not joined either camp. The government made a clear commitment in its coalition agreement, stating: "We reject autonomous weapon systems that are beyond man's control. We want to outlaw them worldwide." All the same, the German delegation in Geneva considers a binding ban under international law to be unenforceable at present, arguing that US and Russian opposition is too great, as is the danger that the negotiations could fail completely. The German government fails to mention that it, too, has an interest in autonomous weapons systems, including as part of the Franco-German Future Combat Air System (FCAS).

Two more years of negotiations?

With the current talks at risk of failure, a new proposal suggests continuing the negotiations for another two years, in the hope that by then, as the vague formulation reads, a "normative framework" will have been agreed. That is far removed from the binding ban demanded by the 28 states.

The current proposal's "nebulous formulations" are disappointing, said Thomas Küchenmeister, German spokesman for the Campaign to Stop Killer Robots. The necessity of human control in the use of armed force is being "played down," he told DW. Küchenmeister expressed concern that the talks in Geneva "will never lead to a binding ban on autonomous weapons."

The critics of autonomous weapons systems are not ready to give up, however.

When agreement on banning anti-personnel mines and cluster munitions could not be reached at the UN in Geneva, activists campaigned internationally for treaties that were eventually signed in Ottawa (1997) and Oslo (2008) respectively. Today, both are part of international law.
