
Statement on International Humanitarian Law, CCW meeting on lethal autonomous weapons systems

Delivered by Steve Goose, Director

Thank you, Madam Chairperson,

Let me start by saying that we do not see the relevance of Professor Jensen’s history lesson, the century-old, abandoned proposals to ban submarines and to ban the use of balloons as weapons platforms. We think it would be more instructive to look at the weapons bans that were adopted during the past 20 years.

In 1995, this body adopted a preemptive ban on blinding laser weapons. The blinding laser protocol shows that it is both desirable and feasible to achieve a preemptive international prohibition – despite initial opposition to the ban that was much stronger than what we have seen thus far to fully autonomous weapons.

When we first started calling for a ban on antipersonnel landmines, military lawyers and others said that a ban was both unnecessary and unachievable, and that existing international humanitarian law was sufficient. When we first started calling for a ban on cluster munitions, military lawyers, including notably Bill Boothby, said that a ban was both unnecessary and unachievable, and that existing IHL was sufficient. But in a very short time, those views were rejected, and comprehensive ban treaties were adopted in 1997 and 2008, highly successful treaties that have saved hundreds of thousands of lives.

We think there will be a similar trajectory for fully autonomous weapons, that legal objections to a ban will fade away as humanitarian, moral, ethical, and international security concerns hold sway, and countries will agree on a comprehensive ban in a matter of a few years.

As Human Rights Watch has stated many times in the past, we believe it is highly unlikely that fully autonomous weapons could be used in compliance with international humanitarian law or international human rights law. In particular, basic requirements such as distinction and proportionality demand human judgment, something that machines are unlikely to ever attain.

It is unfortunate that none of the panelists addressed the issue of accountability, but we were pleased that several states did so in their comments. As we noted on Monday, Human Rights Watch has released a new report looking at accountability, which concludes that assigning personal accountability for the actions of fully autonomous weapons would be virtually impossible in most situations. We consider this accountability gap one of the key reasons why fully autonomous weapons must be banned preemptively.

We had also hoped to hear some mention of the Martens Clause, even though it is on the agenda tomorrow. We are glad that Greece referred to it a few minutes ago. We believe that there is already considerable evidence that fully autonomous weapons are seen as contrary to the dictates of public conscience and the principles of humanity, and are becoming more so with each passing day, as the implications of the weapons are better known and understood.

Two of the panelists, as well as several states over the past few days, have cited the possible utility and lawfulness of using fully autonomous weapons only in very limited circumstances, for very specific missions. But it is inconceivable that, once developed and fielded, fully autonomous weapons would be limited to carefully prescribed roles. This is especially true in light of the reality that once fielded by any one nation, others will surely follow suit, including nations with little respect for international law. The history of warfare shows that once acquired, weapons will be used in any way they possibly can, not just in a limited, controlled way.

Finally, it is important to note that fully autonomous weapons are not just an IHL issue, or even primarily an IHL issue. The weapons do not just invoke IHL concerns, but a wide range of dangers. Perhaps first and foremost are ethical and moral concerns, and the conviction that humans should not cede life and death decisions on the battlefield, or in domestic law enforcement situations, to machines.

Among others, there are also deep concerns about proliferation, including to human rights abusers, and about the impact on international security, with the specter of a global robotic arms race.

Given this plethora of concerns, it would be irresponsible to take a wait-and-see approach to fully autonomous weapons. CCW states parties have a chance to fix this potentially calamitous problem before widespread harm is done. We ask you to seize the opportunity and to do so urgently.

Thank you.
