Statement on International Humanitarian Law, CCW meeting on lethal autonomous weapons systems

Delivered by Bonnie Docherty, Senior Researcher

Thank you, Mr. Chairman.

Fully autonomous weapons raise a host of moral, legal, security, and technological concerns, but I will focus my remarks today on those related to international humanitarian law (IHL).

As indicated by its title and preamble, the CCW aims to promote compliance with IHL and in particular provisions related to the protection of civilians. To live up to its stated goal, the CCW must address the possibility of a new generation of weapons technology that would apply force without meaningful human control and thus directly challenge IHL in several ways.

First, fully autonomous weapons would face significant obstacles to complying with the principles of distinction and proportionality. For example, these systems would lack the human judgment necessary to determine whether expected civilian harm outweighs anticipated military advantage in ever-changing and unforeseen combat situations.

Second, the use of fully autonomous weapons would lead to a gap in individual criminal responsibility for war crimes. Commanders are responsible for the actions of a subordinate if they knew or should have known the actions would be unlawful and did not prevent or punish them. It would be legally challenging, and unfair, to hold commanders liable for the unforeseeable actions of a machine operating outside their control.

Third, fully autonomous weapons raise serious concerns under the Martens Clause, a provision of IHL that sets a moral baseline for judging emerging technologies. Fully autonomous weapons would undermine the principles of humanity because they would be unable to apply compassion or human judgment to decisions to use force. Widespread opposition from experts and the general public shows that the weapons would also run counter to the dictates of public conscience.

The challenges posed to IHL have been widely discussed during CCW meetings over the past five years. Now it is time for states to act on these concerns, whether in the CCW or another forum. Like many states in this room, we believe that the solution is a new legally binding instrument that ensures meaningful human control is maintained over the use of force.

We have repeatedly heard the refrain “existing IHL is adequate” in debates about other weapons systems. But we have also seen that the international community often recognizes the necessity of new law to complement and strengthen IHL. The CCW and other disarmament treaties demonstrate that adopting such law is feasible and effective. 

While existing IHL establishes fundamental rules regarding civilian protection, accountability, and ethical considerations, it was not designed for situations in which life-and-death decisions are delegated to machines. 

The fact that states have been debating the challenges this revolutionary technology poses under IHL for so long underscores the need for more clarity in the law. A new protocol or stand-alone treaty would provide such legal clarity. It would set a high standard that would bind states parties. By stigmatizing the weapons systems, it could also influence the actions of states not party and even non-state actors.

Such a treaty should ensure that meaningful human control over the use of force is maintained. It should prohibit the development, production, and use of weapons systems that select and engage targets without meaningful human control.

By enshrining the need for meaningful human control, the instrument would address the range of problems raised by fully autonomous weapons. For example, with regard to IHL, meaningful human control would ensure humans can apply judgment to the selection and engagement of targets, be held accountable for their actions, and uphold the principles of humanity and dictates of public conscience.

The exact term and its definition can be worked out during negotiations, but meaningful human control could encompass different elements, including but not limited to: (1) necessary time for human deliberation and action; (2) sufficient access to information about the context in which force might be used and the machine’s process for selecting and engaging target objects; and (3) predictable, reliable, and transparent technology.

The discussions this week, and at previous CCW meetings, show that states have largely agreed on the need for human control. This convergence provides the common ground needed to take the next step. High Contracting Parties to the CCW should approve a negotiating mandate at the November 2019 meeting and adopt a new protocol by the end of 2020.

Thank you.
