Human Rights Watch delivered five statements at the Convention on Conventional Weapons Group of Governmental Experts meeting on lethal autonomous weapons systems in March 2019.
Statement on International Humanitarian Law, 26 March 2019
Delivered by Bonnie Docherty, Senior Researcher
Thank you, Mr. Chairman.
Fully autonomous weapons raise a host of moral, legal, security, and technological concerns, but I will focus my remarks today on those related to international humanitarian law (IHL).
As indicated by its title and preamble, the CCW aims to promote compliance with IHL and in particular provisions related to the protection of civilians. To live up to that stated goal, the CCW must address the possibility of a next generation of weapons technology that would apply force without meaningful human control and thus directly challenge IHL in several ways.
First, fully autonomous weapons would face significant obstacles to complying with the principles of distinction and proportionality. For example, these systems would lack the human judgment necessary to determine whether expected civilian harm outweighs anticipated military advantage in ever-changing and unforeseen combat situations.
Second, the use of fully autonomous weapons would lead to a gap in individual criminal responsibility for war crimes. Commanders are responsible for the actions of a subordinate if they knew or should have known the actions would be unlawful and did not prevent or punish them. It would be legally challenging, and unfair, to hold commanders liable for the unforeseeable actions of a machine operating outside their control.
Third, fully autonomous weapons raise serious concerns under the Martens Clause, a provision of IHL that sets a moral baseline for judging emerging technologies. Fully autonomous weapons would undermine the principles of humanity because they would be unable to apply compassion or human judgment to decisions to use force. Widespread opposition from experts and the general public shows that the weapons would also run counter to the dictates of public conscience.
The challenges posed to IHL have been widely discussed during CCW meetings over the past five years. Now it is time for states to act on these concerns—whether in the CCW or another forum. Like many states in this room, we believe that the solution is a new legally binding instrument that ensures meaningful human control is maintained over the use of force.
We have repeatedly heard the refrain “existing IHL is adequate” in debates about other weapons systems. But we have also seen that the international community often recognizes the necessity of new law to complement and strengthen IHL. The CCW and other disarmament treaties demonstrate that adopting such law is feasible and effective.
While existing IHL establishes fundamental rules regarding civilian protection, accountability, and ethical considerations, it was not designed for situations in which life-and-death decisions are delegated to machines.
The fact that states have been debating the challenges this revolutionary technology poses under IHL for so long underscores the need for more clarity in the law. A new protocol or stand-alone treaty would provide such legal clarity. It would set a high standard that would bind states parties. By stigmatizing the weapons systems, it could also influence the actions of states not party and even non-state actors.
Such a treaty should ensure that meaningful human control over the use of force is maintained. It should prohibit the development, production, and use of weapons systems that select and engage targets without meaningful human control.
By enshrining the need for meaningful human control, the instrument would address the range of problems raised by fully autonomous weapons. For example, with regard to IHL, meaningful human control would ensure humans can apply judgment to the selection and engagement of targets, be held accountable for their actions, and uphold the principles of humanity and dictates of public conscience.
The exact term and its definition can be worked out during negotiations, but meaningful human control could encompass different elements, including but not limited to: (1) necessary time for human deliberation and action; (2) sufficient access to information about the context in which force might be used and the machine’s process for selecting and engaging target objects; and (3) predictable, reliable, and transparent technology.
The discussions this week—and at previous CCW meetings—show that states have largely agreed on the necessity for human control. This convergence provides the common ground needed to take the next step. High Contracting Parties to the CCW should approve a negotiating mandate at the November 2019 meeting and adopt a new protocol by the end of 2020.
Statement on Options for Future Work, 27 March 2019
Delivered by Mary Wareham, Human Rights Watch for the Campaign to Stop Killer Robots
For the Campaign to Stop Killer Robots, our preferred option for addressing the humanitarian and international security challenges posed by fully autonomous weapons—or “LAWS”—is for states to negotiate a legally-binding instrument to prohibit weapons systems that can select and engage targets without meaningful human control. The treaty should enshrine the principle that states should maintain meaningful human control over the use of force.
There are multiple advantages and benefits to creating such a ban treaty or protocol. A treaty would help satisfy mounting ethical, legal, humanitarian, operational, technical and other concerns raised by fully autonomous weapons. Here are nine reasons why a treaty is necessary:
1. To enhance and strengthen existing international humanitarian and human rights law. A new treaty would build on those areas of law and eliminate any doubts that fully autonomous weapons are incapable of abiding by the fundamental principles of international humanitarian and human rights law. These weapons fundamentally differ from other weapons and raise unique challenges. A treaty can unambiguously address the application of existing law to these weapons.
2. To clarify states’ obligations and make explicit the requirements for compliance. A new treaty would standardize rules across countries. Even states that do not immediately join the treaty would be inclined to abide by its prohibition due to the stigma associated with removing meaningful human control from weapons systems and the use of force. A binding, absolute ban on fully autonomous weapons would also be easier to enforce than a complex series of rules and regulations because it would be simpler and clearer and reduce the need for case-by-case determinations.
3. To make the illegality of fully autonomous weapons clear, especially in countries that do not conduct Article 36 legal reviews of new or modified weapons. A new treaty would also help to resolve the shortcomings of Article 36 reviews, which are only conducted by approximately 30 states, as reviewers follow varying standards, reviews can be narrow in their scope, and reviews are never publicly released.
4. To facilitate agreement on the legal definition of fully autonomous weapons and, in so doing, establish what is unacceptable about autonomy in weapons systems.
5. To help stop development before it goes too far and thereby avert an arms race and prevent proliferation, including by states with little regard for international humanitarian law or by non-state armed groups. The new treaty should prohibit not only use, but also development and production of fully autonomous weapons.
6. To close the accountability gap raised by fully autonomous weapons. There are currently insurmountable legal and practical obstacles that would, in most cases, prevent holding anyone responsible for unlawful harms caused by fully autonomous weapons. A treaty prohibiting killer robots could lead to national implementation laws criminalizing violations of the treaty, thereby facilitating enforcement.
7. To address the far-reaching moral and ethical objections raised over fully autonomous weapons, most notably their lack of judgment and empathy, threat to human dignity, and absence of moral agency.
8. To satisfy rising calls for regulation from states, industry, and civil society. This would help meet public expectations that governments will act preventively and address emerging technologies that raise a host of concerns.
9. To ensure continued research and development of beneficial civilian applications of new and emerging technologies including robotics and artificial intelligence, by providing clarity to tech companies and the financial institutions and investment communities that support them, and by ensuring their work is not tainted by the stigmatizing impact of autonomous weapons.
If states are serious about the Convention on Conventional Weapons being the appropriate forum to tackle concerns raised by fully autonomous weapons, then they should commit to move to a negotiating mandate at the Meeting of High Contracting Parties in November. The CCW is a flexible framework convention that was intended to prohibit or restrict the use of certain weapons which may be deemed to be excessively injurious or to have indiscriminate effects. A new protocol could confirm the areas of convergence captured by the “possible guiding principles” contained in the final report of the last annual meeting, which affirmed that “human responsibility for decisions on the use of weapon systems must be retained.”
There is, of course, precedent for a ban treaty, including ones negotiated outside of United Nations auspices. In the past, responsible states found it necessary to supplement existing legal frameworks for weapons that by their nature posed significant humanitarian threats, such as biological weapons, chemical weapons, antipersonnel mines, and cluster munitions. There is also precedent for such a preemptive ban in CCW Protocol IV prohibiting laser weapons designed to permanently blind human soldiers.
The Campaign to Stop Killer Robots cannot support alternative approaches that fall short of new international law, such as political declarations, guidelines, codes of conduct, compendiums of military “best practices,” and questionnaires. We highly doubt that such measures will satisfy public concerns.
This Group of Governmental Experts should agree to recommend that the CCW move to a negotiating mandate and not simply roll the current one over and consider options again. There is no time or money to waste on inconclusive talks that lead nowhere. If the CCW cannot deliver a negotiating mandate in 2019—after six years of work—it is time to look elsewhere.
Statement on Options for Future Work, 27 March 2019
Delivered by Steve Goose, Director, Arms Division
Thank you for the floor, Mr. Chairman.
Human Rights Watch is a co-founder of the Campaign to Stop Killer Robots.
The only viable option is a legally-binding instrument, one that comprehensively prohibits the development, production, and use of fully autonomous weapons, or Lethal Autonomous Weapon Systems, and one that requires meaningful human control over critical combat functions.
The partial measures, or more accurately the baby steps, that have been proposed are not justified after six years of work. The lack of ambition and of urgency on the part of some states is shameful.
In fact, there is widespread support for a legally-binding instrument and for a ban on Lethal Autonomous Weapon Systems. The vast majority of states in this room support moving to negotiation of a legally-binding instrument. Only a very small number of states have expressed opposition to a legally-binding instrument.
Some of those states that have expressed opposition seem to be looking for a green light to develop and field fully autonomous weapons. They not only reject the notion of a red light for their efforts, they also reject even a yellow caution light.
In the past, we heard loud and insistent proclamations that there was no need for new law, and certainly not for a ban, on antipersonnel mines and on cluster munitions. Yet, many of those proclaiming the loudest changed their views, participated in negotiations of a legally-binding instrument outside of the CCW, and joined the Mine Ban Treaty and the Convention on Cluster Munitions.
If there is anything that demands legally-binding measures, it is autonomous weapons. This is because of their novel and unique character, and because of their far-reaching implications, including changing the very nature of warfare. Lesser measures simply will not suffice to address the many potential dangers of fully autonomous weapons.
A non-legally-binding political declaration has been touted by some as a useful interim measure, as a step toward a legal instrument. This may have made sense four years ago, but not now. Moreover, based on the way CCW usually operates, one can confidently predict that consideration of a political declaration would involve negotiation of every word, would take years to conclude, and would be the end point. There would be no further action on Lethal Autonomous Weapon Systems in the CCW.
It appears that some states are thinking of substituting additional deliberations on the 10 Guiding Principles agreed to last year for the notion of a political declaration. But this would suffer the same downsides as a political declaration, most notably that it would not be legally-binding. Some have advocated further discussion of the principles, others have said to build on them, and still others have said to “operationalize” them. I would welcome clarity on what such operationalization would entail.
In any event, this would likely result in a continuation of the “talk shop” approach that has dominated the past five years. It is unlikely to produce a concrete outcome or to have any real impact.
We appreciate the efforts to enhance, strengthen, and universalize Article 36 on weapons reviews. This is an admirable and worthwhile goal. But, as many states have said, this is not enough in and of itself to address the issue. Others have pointed to the small number of states that carry out such reviews and to the complete lack of transparency by all states. Moreover, this is not the right place for a thorough and comprehensive examination of Article 36. The task of this GGE is to deal with Lethal Autonomous Weapons Systems, and not to have that effort turned into consideration of weapons reviews. That should be a separate undertaking.
Mr. Chairman, we urge CCW High Contracting Parties to adopt a negotiating mandate at the November annual meeting. We would hope that would result in a new Protocol VI that prohibits fully autonomous weapons and requires meaningful human control over the use of force.
We have strongly supported the CCW’s work on this issue since 2013, and we have sincerely hoped for a successful outcome in this forum. But if High Contracting Parties are unable to agree to a negotiation mandate in November, other paths must be explored, such as the UN General Assembly or an independent process like the Ottawa Process on landmines and the Oslo Process on cluster munitions.
Statement on Guiding Principles, 29 March 2019
Delivered by Bonnie Docherty, Senior Researcher
Thank you, Mr. Chairman.
We appreciate the discussions that you have led this week and the efforts to bring more substance to the debate and to identify points of convergence.
We believe, however, that the agenda for this afternoon raises a procedural concern. Discussing the guiding principles instead of the range of proposals for the way ahead has the potential to prejudice the debate. In our view, there is no consensus that the guiding principles are the best path forward.
With regard to substance, the principles are an inadequate response to the dangers presented by the prospect of weapons systems that could select and engage targets without meaningful human control. They are inadequate for many reasons, including the following:
• Principles on the applicability of international humanitarian law (IHL) (Principle 1) and weapons reviews (Principle 4) simply restate international law.
• Other principles restate uncontroversial points that this body has agreed on—for example, that policy measures should not interfere with peaceful uses of technology (Principle 8) and that the use of emerging technologies should comply with IHL (Principle 7).
• The lack of clarity and precision in the language will confuse the issue and muddy accepted understandings of international law. For example, "responsibility" is used sometimes to refer to legal responsibility (Principle 3) and sometimes to what has widely been referred to in this room as human control (Principle 2).
• The principles implicitly acknowledge that there are risks to lethal autonomous weapons systems. For example, they reference “risk assessments” (Principle 6), the need to ensure legal responsibility (Principle 3), and certain security and technical concerns (Principle 5). But the list of concerns is neither complete nor clearly articulated.
• Most important, the principles mention “crafting potential policy measures” but they do not themselves represent policy measures that would address any of these concerns. There is nothing in the guiding principles that indicates that a potential goal is a legally binding instrument, despite the fact that the majority of states in the room have called for one.
• Principle 10 notes that “CCW is an appropriate framework for dealing with the issue.” If that is the case, we believe it should deal with the issue in the way CCW was designed to operate—that is, through a new protocol that addresses the humanitarian concerns presented by certain weapons systems.
Statement on The Way Forward, 29 March 2019
Delivered by Mary Wareham, Human Rights Watch for the Campaign to Stop Killer Robots
Mr. Chair, we appreciate the way in which you have chaired this meeting and acknowledge the way in which delegates have deepened their understandings of the concerns raised by lethal autonomous weapons systems. It shows how having draft text can focus deliberations, but it’s a shame the documents are merely summaries of the various sessions of the meeting.
However, yet again, we are dismayed that a small group of states has limited the ambition of the majority, and we are concerned by the lack of urgency in achieving a meaningful result from these diplomatic talks on lethal autonomous weapons systems.
Many of the 90 states participating in this week’s United Nations meeting on these weapons expressed their firm desire to move to negotiate a new treaty to prohibit or restrict these weapons systems. Such a treaty is widely seen as necessary to enshrine the principle that states should maintain meaningful human control over the use of force.
As you know, calls to ban killer robots are multiplying rapidly, and more than 4,500 artificial intelligence experts have called for a new treaty to prohibit lethal autonomous weapons systems in various open letters since 2015. They include Yoshua Bengio, Yann LeCun, and Geoffrey Hinton, who were this week awarded the Turing Award, the most prestigious prize in the field of computer science. We welcome the fitting recognition of their important contributions to this field and their active support for our common goal of a new treaty to ban killer robots.
It’s clear that a majority of states want to do the right thing, but the calls from some states for guiding principles, declarations, guidelines, codes of conduct, compendiums of military “best practices,” questionnaires, and more committees are not the answer. Such measures will not satisfy public concerns.
I will be frank. There is rising concern that these Convention on Conventional Weapons talks on lethal autonomous weapons systems are a way for military powers to try to placate civil society, distract public attention, and manage media expectations rather than seriously address the challenges such weapons pose for humanity.
We say this because it is the states opposing any move to create a new treaty who are investing significant funds and effort into developing weapons systems with decreasing human control over the critical functions of selecting and engaging targets.
I want to remind everyone how this meeting opened at the beginning of this week with an appeal from the UN Secretary-General Antonio Guterres to prohibit lethal autonomous weapons systems, which he called “morally repugnant and politically unacceptable.” As he stated: “the world is watching, the clock is ticking.”
So the Campaign to Stop Killer Robots will be back in August for the next CCW meeting, but our faith in this forum is rapidly dissipating. Therefore, we will be deepening and expanding our engagement in capitals around the world, and we will also be present at the United Nations General Assembly later this year.