(Geneva) – The United States should endorse the call by a United Nations (UN) special rapporteur to halt fully autonomous robotic weapons. These weapons, once activated, can select and engage targets without further intervention by a human.
For the first time, countries will debate the challenges posed by fully autonomous weapons, sometimes called “killer robots,” at the United Nations Human Rights Council in Geneva on May 29, 2013.
“The UN report makes it abundantly clear that we need to put the brakes on fully autonomous weapons, or civilians will pay the price in the future,” said Steve Goose, arms director at Human Rights Watch. “The US and every other country should endorse and carry out the UN call to stop any plans for killer robots in their tracks.”
Professor Christof Heyns, the UN special rapporteur on extrajudicial, summary or arbitrary executions for the Office of the High Commissioner for Human Rights, has prepared a 22-page report on lethal autonomous robotic weapons to be delivered to the Human Rights Council’s 23rd session on May 29. The council will then consider how to act on the report’s recommendations, including its call for nations to institute an immediate moratorium on these weapons and to work toward an international agreement addressing the many concerns identified in the report.
In acknowledgement of the challenges that fully autonomous weapons would pose, the US Department of Defense issued a directive on November 21, 2012, that, for now, requires a human being to be “in-the-loop” when decisions are made about using lethal force. For up to 10 years, Directive Number 3000.09 generally allows the Defense Department to develop or use only fully autonomous systems that deliver nonlethal force, unless department officials waive the policy at a high level. In effect, the directive constitutes the world’s first moratorium on lethal fully autonomous weapons.
While a positive step, the directive is not a comprehensive or permanent solution to the potential problems posed by fully autonomous systems, Human Rights Watch said. The policy of self-restraint it embraces may also be hard to sustain if other nations begin to deploy fully autonomous weapons systems.
Over the past decade, the expanded use of unmanned aerial vehicles, or drones, has dramatically changed warfare, bringing new humanitarian and legal challenges, Human Rights Watch said. The UN report acknowledges that “robots with full lethal autonomy have not yet been deployed,” although there is little transparency about their research and development. The report lists several robotic systems with various degrees of autonomy and lethality that are in use by the US, Israel, South Korea, and the UK. Other nations with high-tech militaries, such as China and Russia, are also believed to be moving toward systems that would give full combat autonomy to machines.
On November 19, 2012, Human Rights Watch and the Harvard Law School International Human Rights Clinic released “Losing Humanity: The Case Against Killer Robots,” a 50-page report outlining their numerous legal, ethical, policy, and other concerns about fully autonomous weapons.
Human Rights Watch is the initial coordinator of the Campaign to Stop Killer Robots, a new international coalition of civil society groups that is working to preemptively ban lethal robot weapons that would be able to select and attack targets without any human intervention. This prohibition should be achieved through an international treaty, as well as through national laws and other measures, to enshrine the principle that decisions to use lethal force against a human being must always be made by a human being.
“It is possible to halt the slide toward full autonomy in weaponry before moral and legal boundaries are crossed, but only if we start to draw the line now,” Goose said. “Initial public reaction to the possible development of fully autonomous weapons has been one of great concern. These weapons should be rejected as repugnant to the public conscience.”