Conventional Weapons Meeting Provides Opportunity for Action
November 13, 2013

(Geneva) – Governments should agree this week to begin international discussions in 2014 on fully autonomous robot weapons, with a view to a future treaty banning the weapons, Human Rights Watch said today.

Human Rights Watch, together with the Harvard Law School International Human Rights Clinic, issued a report making the case for a pre-emptive ban to government delegates attending the annual meeting in Geneva of the Convention on Conventional Weapons (CCW).

“As technology races ahead, governments need to engage now in intensive discussions on the potential dangers of fully autonomous weapons,” said Mary Wareham, arms division advocacy director at Human Rights Watch and coordinator of the Campaign to Stop Killer Robots. “Deliberations about killer robots need to include nongovernmental groups, and be underpinned by a clear sense of urgency and purpose if they are to result in concrete action.”

Fully autonomous weapons – also called “lethal autonomous robots” or “killer robots” – have not yet been developed, but technology is moving toward increasing autonomy. Such weapons would select and engage targets without further intervention by a human.

“Most fundamentally, an international ban is needed to ensure that humans will retain control over decisions to target and use force against other humans,” said Wareham.

In recent months, fully autonomous weapons have gone from an obscure issue to one commanding worldwide attention. Since May, 34 governments have made their first public statements on fully autonomous weapons: Algeria, Argentina, Austria, Belarus, Belgium, Brazil, Canada, China, Costa Rica, Cuba, Ecuador, Egypt, France, Germany, Greece, India, Indonesia, Iran, Ireland, Italy, Japan, Mexico, Morocco, Netherlands, New Zealand, Pakistan, Russia, Sierra Leone, South Africa, Spain, Sweden, Switzerland, United Kingdom, and United States. All nations that have spoken out have expressed interest in, and concern about, the challenges and dangers posed by fully autonomous weapons.

France, as chair of the next meeting of the Convention on Conventional Weapons, will propose a mandate to add fully autonomous weapons to the convention’s work program in 2014. The states parties to the treaty, meeting in Geneva, are expected to make a decision on the matter on November 15.

Human Rights Watch supports any action to urgently address fully autonomous weapons in any forum. Agreement to consider the subject in the framework of the Convention on Conventional Weapons would be a positive step. Human Rights Watch was centrally involved in creating the treaty’s 1995 protocol banning blinding lasers, which is a pertinent example of a weapon being pre-emptively banned before it was fielded or used.

Human Rights Watch urges nations to adopt a mandate in this forum that is broad enough to address the range of issues surrounding the development, production, and use of fully autonomous weapons. Discussion of automated and autonomous capabilities in weapon systems is needed to understand where to draw the line to ensure meaningful human control over targeting and attack decisions.

In November 2012, Human Rights Watch and the Harvard Law School International Human Rights Clinic issued “Losing Humanity: The Case against Killer Robots,” a 50-page report outlining numerous legal, ethical, policy, and other concerns with fully autonomous weapons. A “Questions and Answers” document issued on October 21, 2013, clarifies some of the issues raised in the report, while the new report rebuts the argument that existing international humanitarian law is sufficient to deal with fully autonomous weapons.

Many governments are developing their policies on fully autonomous weapons. The United States Defense Department issued a directive on November 21, 2012, that, for now, requires a human being to be “in-the-loop” when decisions are made about using lethal force, unless department officials waive the policy at a high level.

The US policy directive, while positive, is not a comprehensive or permanent solution to the potential problems posed by fully autonomous systems, Human Rights Watch said. The policy of self-restraint it embraces may also be hard to sustain if other nations begin to deploy fully autonomous weapons systems.

Human Rights Watch is coordinating the Campaign to Stop Killer Robots, an international coalition of civil society groups that began its campaign in April. The campaign is working for an international treaty, as well as national laws and other measures to pre-emptively ban weapons that would be able to select and attack targets without any human intervention.

Representatives from the Campaign to Stop Killer Robots, including Human Rights Watch, a founding member, will present their concerns about fully autonomous weapons at an event at the United Nations on November 13, 2013.
