US Policy on Autonomy in Weapons Systems is First in the World
April 16, 2013

(Washington, DC) – Temporary US restrictions on lethal fully autonomous weapons should be strengthened and made permanent. Fully autonomous weapons, sometimes called “killer robots,” would be able to select and attack targets on their own without any human intervention.

In acknowledgement of the challenges such weapons would pose, the US Department of Defense issued a directive on November 21, 2012, that, for now, requires a human being to be “in-the-loop” when decisions are made about using lethal force. This was the department’s first public policy on autonomy in weapons systems and the first policy announcement by any country on fully autonomous weapons. 

“This policy shows that the United States shares our concern that fully autonomous weapons could endanger civilians in many ways,” said Steve Goose, Arms Division director at Human Rights Watch. “Humans should never delegate to machines the power to make life-and-death decisions on the battlefield. US policy should lay the basis for a permanent, comprehensive ban on fully autonomous weapons.”

A briefing paper by Human Rights Watch and the Harvard Law School International Human Rights Clinic reviews the content of the new directive and notes that it is a positive step. For up to 10 years, Directive Number 3000.09 generally allows the Department of Defense to develop or use only fully autonomous systems that deliver non-lethal force. In effect, it constitutes the world’s first moratorium on lethal fully autonomous weapons.

However, the directive contains significant loopholes and is not an adequate solution to the potential problems posed by fully autonomous systems, Human Rights Watch said. The policy can be waived by high-level department officials. It will also last for only up to 10 years unless renewed within five years. The department could alternatively cancel its policy within that same five-year period.

“The US policy is an important step in the right direction, but it clearly leaves the door open to future acquisition and use of lethal fully autonomous weapons,” Goose said.

Over the past decade, the expanded use of armed unmanned aerial vehicles, or drones, has dramatically changed warfare, bringing new humanitarian and legal challenges. While not the subject of Human Rights Watch’s call for a ban on fully autonomous weapons, these semi-autonomous systems are evidence of rapid advances in technology. The US and other nations with high-tech militaries, including China, Israel, Russia, and the United Kingdom, are moving toward systems that would give greater combat autonomy to machines. If one or more countries choose to deploy fully autonomous weapons, others might feel compelled to follow suit.

“In light of the dangers fully autonomous weapons pose and the chilling prospect of a robotic arms race, all nations should impose an immediate moratorium, paving the way for a legal ban,” Goose said. “If global rules aren’t established, and other nations begin to develop these systems, the US may not stick with its policy.”

The US policy directive was released two days after Human Rights Watch and the Harvard Law School International Human Rights Clinic released “Losing Humanity: The Case Against Killer Robots,” a 50-page report outlining the organizations’ numerous legal, ethical, policy, and other concerns with fully autonomous weapons.

“Losing Humanity” found that fully autonomous weapons would be unlikely to meet the key provisions of international humanitarian law in the most common contemporary battlefield environments. In addition, their use could create an accountability gap, as it would be unclear who could be held legally responsible for a robot’s actions. The report also details how lethal autonomous robots would undermine other checks on the killing of civilians. Fully autonomous weapons could not show human compassion for their victims, for example, and autocrats could abuse them by directing them against their own people.

Human Rights Watch will be the initial coordinator of a Campaign to Stop Killer Robots, a new international coalition of nongovernmental organizations calling for a preemptive and comprehensive ban on fully autonomous weapons. The prohibition should be achieved through an international treaty, as well as through national laws and other measures.

The Campaign to Stop Killer Robots will be launched in London on Tuesday, April 23, with a 10:30 a.m. news briefing at the Frontline Club.