
Statement on International Humanitarian Law, CCW meeting on lethal autonomous weapons systems

Delivered by Bonnie Docherty, Senior Researcher

Thank you, Madame Chairwoman.

Human Rights Watch appreciates the discussions we had yesterday and today on the international humanitarian law implications of lethal autonomous weapons systems. With all due respect, however, we want to make clear that there are legal experts whose views differ from those represented on the panel. We see the weapons’ inability to comply with existing law as a major problem and the adoption of new law as the best solution.

We would like to make four points in response to the session.

First, we believe that there is little prospect that fully autonomous weapons could comply adequately with international humanitarian law (IHL). In an era where combatants blend with the civilian population, distinction depends on understanding an individual’s intentions rather than recognizing a uniform. Humans can understand subtle cues, such as tone of voice or body language, because they can identify with another individual as a fellow human being. A robot could miss or misconstrue such cues. In addition, the proportionality test requires using human judgment to weigh civilian harm and military advantage in a complex situation. It would be difficult for a robot ever to replicate such judgment. It is important to note that the relevant proportionality decision would be made at the time of attack, not when the commander deployed the robot. At the time of attack, there would be no human in the loop.

Some people contend that fully autonomous weapons could be used lawfully in narrow circumstances or that the IHL requirement to take all feasible precautions would limit their use to situations in which they were the most humane option. These arguments assume that militaries will always comply with IHL. They ignore the fact that some parties have little regard for the law and that others, while generally compliant, might be tempted to use the weapons in dire circumstances once they are in their arsenals.

Second, the Martens Clause is crucial to this debate because it applies in cases where a means or method of war is not covered by specific provisions of international law. Such is the case for fully autonomous weapons, and therefore the Martens Clause should be taken into account. We appreciate the clarity on the relevance of the Martens Clause provided yesterday by the ICRC. Fully autonomous weapons threaten to contravene both prongs of the Martens Clause. They raise concerns under the principles of humanity, which have been defined as requiring compassion and the ability to protect. As machines, robots could not exhibit compassion, a key emotional safeguard that restrains killing, and as already mentioned, they could not adequately protect civilians under IHL. They also seem likely to run counter to the dictates of public conscience. The prospect of ceding life-and-death decisions to machines on the battlefield shocks the conscience of many. The adoption of CCW Protocol IV on Blinding Lasers was driven in large part by concern that the proposed weapons would violate the Martens Clause. CCW states should look to that protocol as precedent.

Third, we urge states to conduct Article 36 weapons reviews and to do so in a transparent manner. Such reviews alone, however, are not sufficient to address the problems associated with fully autonomous weapons. Many states do not conduct weapons reviews at all, and few, if any, do so transparently. Article 36 reviews could nevertheless complement a ban on these weapons by helping to identify when technology was crossing the line into prohibited development.

Finally, existing IHL is not adequate to address the host of problems raised by fully autonomous weapons. A new legally binding instrument—whether a CCW protocol or stand-alone treaty—is needed to bring clarity about how the law applies in this context and to cover development and production as well as use.

We believe that a ban, not existing law or new regulation, is the best solution. That has been the case for other weapons that cause serious humanitarian harm, including those banned under the CCW. A ban would be easier to enforce because of its clarity, and it is the only way to stigmatize the weapons, which can influence even states not party to abide by the prohibition. It would also help prevent misuse by keeping the weapons out of the arsenals of states and non-state actors.

The ban should be preemptive, like that of CCW Protocol IV. A wait-and-see approach could lead to an arms race, proliferation, and deployment of the technology before its legal challenges have been resolved. Furthermore, the more states invest in a new technology, the less likely they will be to give it up.

In conclusion, I want to refer you to our new publication, released this week, called “Advancing the Debate on Killer Robots.” It expands on the points I discussed today and responds in depth to 12 of our critics’ arguments, including many of those raised in this session.

Thank you.
