Thank you Mr. Chairperson,

Human Rights Watch is a co-founder of the Campaign to Stop Killer Robots, and Mary Wareham of Human Rights Watch is the global coordinator of the Campaign. We are calling for a preemptive prohibition on the development, production, and use of fully autonomous weapons (known here as lethal autonomous weapons systems) – and it would be our preference to achieve that ban in the form of a new CCW Protocol VI.

I would like to highlight today what I am calling the top five fallacies of the CCW discussions on lethal autonomous weapon systems. These are interrelated fallacies, and they have the regrettable effect—and possibly the intention—of slowing progress.

Fallacy #1: We don’t know what we are talking about; we still don’t know what LAWS are or what we are really discussing.

It is somewhat astonishing that some states continue to say this after three years of consideration and dozens of expert presentations. In fact, there is a common understanding in the room, and it has been clearly articulated by many. We are talking about future weapons systems that, once initiated, would use sensors and artificial intelligence to operate without meaningful human control, selecting and engaging targets on their own rather than having a human make targeting and kill decisions for each individual attack. There is naturally still some lack of clarity about what might or might not constitute a lethal autonomous weapons system, but there is a solid understanding of the concept, and we certainly know in general what we are talking about.

Fallacy #2: We must have an agreed-upon definition before we can proceed further.

There has been much hand-wringing about the lack of a precise definition of lethal autonomous weapons systems, or of meaningful human control.  But this is the normal state of affairs at this stage of international discussions on a weapons issue. I have been involved in seven different international negotiations on weapons in the past 22 years, inside and outside of the CCW, and definitions are almost always the last thing agreed to.  That is because definitions are so important.  They determine what is captured in a treaty and what is not. They largely determine how strong or how weak a treaty is going to be.

What is important is that there is a shared understanding of the concept of lethal autonomous weapons systems—which, as I noted, already exists—and that a working definition is developed from that shared understanding. We already have working definitions provided by the ICRC, the United States, and others, so this should not be a difficult task.

Fallacy #3: It is premature to move to formal discussions or to negotiations.

The fact is that CCW states parties have already done as much preparatory work on this issue as they had done on landmines, blinding lasers, explosive remnants of war, mines other than antipersonnel mines, or cluster munitions when they decided to move to formal work on those issues.  More informal work would be like being stuck in place while the wheels spin round and round. In that regard, we were very glad to hear in yesterday’s general statements that many, perhaps most, states spoke in favor of creating a Group of Governmental Experts and moving to formal work.

Fallacy #4: One week of discussions per year is an adequate and fruitful approach.

These sessions have been valuable in educating states parties and developing a common base of knowledge about lethal autonomous weapons systems and the dangers that they may pose. But one week per year of panel discussions is not a sustainable strategy for dealing effectively with this far-reaching issue that affects the future of humanity. Yet in the general statements, not a single state addressed the matter of how much time needs to be devoted to LAWS next year or in the coming years. Four weeks in 2017 would be a good target. The track record of the CCW shows that states need that kind of time in order to work seriously on an issue, and to have the prospect of any sort of meaningful outcome. States Parties have devoted that much time and more in a single year to other issues, and need to do so for lethal autonomous weapons systems as well.

Fallacy #5: Existing international humanitarian law is sufficient to deal with LAWS.

This is a refrain frequently heard by those of us working on weapons issues. We heard the same thing with respect to antipersonnel landmines, cluster munitions, and blinding lasers. One could make the same argument for every weapon that has been banned, including those three as well as chemical weapons, biological weapons, and poison gas. But in each case, states recognized the very real benefit of having additional law beyond existing IHL.

This is especially true for fully autonomous weapons, which would represent a new category of weapons that could change the way wars are fought and pose serious risks to civilians.  As such, they demand new, specific law that clarifies and strengthens existing international humanitarian law. And only with new, specific law is it possible to get the stigmatization effect that is so crucial in dealing with weapons of ill-repute.  Finally, it is worth noting that if existing IHL is indeed sufficient, there is no need to have these discussions.

Mr. Chairperson, distinguished delegates,

Human Rights Watch has prepared a new memorandum for CCW delegates that focuses on meaningful human control, and it is available at the back of the room and online. It discusses the moral and legal importance of control and shows countries’ growing recognition of the need for humans to remain in charge of the critical functions of selecting and firing on targets. It also examines rules requiring control in different areas of the law and how they could inform how the term is understood in the context of autonomous weapons. Bans on mines, biological weapons, and chemical weapons show the value disarmament law has placed on control of weapons.

Mr. Chairperson, distinguished delegates,

The drumbeat for a ban on fully autonomous weapons is getting louder and louder and cannot be ignored. We were pleased to hear Algeria and Costa Rica add their names to the growing list of countries calling for a preemptive ban. Over 3,000 artificial intelligence experts have now signed the open letter calling for a preemptive ban. The issue was addressed at the World Economic Forum in Davos, where a senior official from the major weapons manufacturer BAE Systems called for a ban. It was recently addressed for the first time at the Munich Security Conference. Support for a ban continues to grow steadily.

Yet technology is racing ahead in spite of the drumbeat. Notably, in recent weeks, there have been numerous articles citing US defense officials talking about the “Third Offset Strategy” and its focus on ever-greater autonomy in weapons, including the possibility of fully autonomous weapons.

States must act now. As provided for in the CCW LAWS mandate, it is crucial that states parties agree this week on concrete recommendations for the Five-Year Review Conference in December. We believe the key recommendations should be that states agree at the Review Conference on a mandate that establishes a Group of Governmental Experts, and that the GGE be tasked with initiating formal negotiations on lethal autonomous weapons systems, negotiations that should be aimed at a prohibition. Four weeks should be set aside for this work.

Only in this way can the CCW act with the urgency and broad vision required by an issue of such monumental importance.

Thank you.  