Swift Action Needed to Prevent Fully Autonomous Weapons
(London) – Civil society will lead the way to press governments to ban fully autonomous weapons, Human Rights Watch said today at the launch of the global Campaign to Stop Killer Robots. These potential future weapons, sometimes called “killer robots,” would be able to select and attack targets without any human intervention.
“Lethal armed robots that could target and kill without any human intervention should never be built,” said Steve Goose, Arms Division director at Human Rights Watch. “A human should always be ‘in-the-loop’ when decisions are made on the battlefield. Killer robots would cross moral and legal boundaries, and should be rejected as repugnant to the public conscience.”
Human Rights Watch is the initial coordinator of the Campaign to Stop Killer Robots, a new international coalition of nongovernmental organizations that calls for a preemptive and comprehensive ban on fully autonomous weapons. The prohibition should be achieved through an international treaty, as well as through national laws and other measures.
Over the past decade, the expanded use of unmanned armed vehicles, or drones, has dramatically changed warfare, bringing new humanitarian and legal challenges. Now rapid advances in technology are permitting the United States and other nations with high-tech militaries, including China, Israel, Russia, and the United Kingdom, to move toward systems that would provide greater combat autonomy to machines. If one or more countries choose to deploy fully autonomous weapons, others may feel compelled to abandon policies of restraint, leading to a robotic arms race.
“Many militaries are pursuing ever-greater autonomy for weaponry, but the line needs to be drawn now on fully autonomous weapons,” Goose said. “These weapons would take technology a step too far, and a ban is needed urgently before investments, technological momentum, and new military doctrine make it impossible to stop.”
The UN special rapporteur on extrajudicial, summary or arbitrary executions, Professor Christof Heyns, is to deliver his report on lethal autonomous robots to the Human Rights Council at its session in Geneva beginning May 27, 2013. The report is expected to contain recommendations for government action on fully autonomous weapons.
On November 19, 2012, Human Rights Watch and the Harvard Law School International Human Rights Clinic released “Losing Humanity: The Case Against Killer Robots,” a 50-page report outlining the organizations’ numerous legal, ethical, policy, and other concerns with fully autonomous weapons. “Losing Humanity” found that fully autonomous weapons would be unlikely to meet the key provisions of international humanitarian law in the most common contemporary battlefield environments, and that their use would create an accountability gap as it would be unclear who would be legally responsible for a robot’s actions.
The report also details how lethal autonomous robots would undermine other checks on the killing of civilians. Fully autonomous weapons could not show human compassion for their victims, for example, and autocrats could abuse them by directing them against their own people.
In acknowledgment of the challenges such weapons would pose, the US Department of Defense issued a directive on November 21, 2012, that, for now, requires a human being to be “in-the-loop” when decisions are made about using lethal force. For up to 10 years, Directive Number 3000.09 generally allows the Department of Defense to develop or use only fully autonomous systems that deliver nonlethal force, unless department officials waive the policy at a high level. In effect, it constitutes the world’s first moratorium on lethal fully autonomous weapons. While a positive step, the directive is not a comprehensive or permanent solution to the potential problems posed by fully autonomous systems.
The Campaign to Stop Killer Robots includes several nongovernmental organizations centrally involved in the successful efforts to ban antipersonnel landmines, cluster munitions, and blinding lasers. Its members collectively have a wide range of expertise in robotics and science, aid and development, human rights, humanitarian disarmament, international law and diplomacy, and the empowerment of women, youth, and people with disabilities. The campaign is building a network of civil society contacts in countries including Canada, Colombia, Egypt, Germany, Japan, the Netherlands, New Zealand, Pakistan, the United Kingdom, and the United States.