• Fully autonomous weapons, also known as "killer robots," would be able to select and engage targets without human intervention. They do not yet exist, but several countries are developing them, and high-tech militaries have already deployed precursor systems. Some experts predict that fully autonomous weapons could be operational in 20 to 30 years. These weapons would be incapable of meeting international humanitarian law standards, including the rules of distinction, proportionality, and military necessity. They would not be constrained by the capacity for compassion, which can provide a key check on the killing of civilians. Fully autonomous weapons also raise serious questions of accountability, because it is unclear who should be held responsible for any unlawful actions they commit. Human Rights Watch calls for a preemptive prohibition on fully autonomous weapons.

    Human Rights Watch is a founding member of the Campaign to Stop Killer Robots, and currently serves as the campaign’s global coordinator.

  • The United Kingdom’s Taranis combat aircraft, whose prototype was unveiled in 2010, is designed to strike distant targets, “even in another continent.” While the Ministry of Defence has stated that humans will remain in the loop, the Taranis exemplifies the move toward increased autonomy.

    Governments should preemptively ban fully autonomous weapons because of the danger they pose to civilians in armed conflict.

Reports

Killer Robots

  • Mar 27, 2014
    We write to urge you to vote in favor of the Human Rights Council Resolution on ensuring use of remotely piloted aircraft or armed drones in counter-terrorism and military operations in accordance with international law, including international human rights and humanitarian law, A/HRC/25/L.32.
  • Jan 6, 2014
    Tech has been turned against human rights – or so it seemed from Edward Snowden’s revelations last year. A big challenge for 2014 will be to utilize new tools and tactics for positive change – while reining in the efforts of those who are thinking just as hard about how to use tech to steal, spy or stifle dissent.
  • Nov 15, 2013
    An agreement on November 15, 2013, to begin international discussions on fully autonomous robot weapons is the beginning of a process that should conclude in a treaty banning these weapons, Human Rights Watch said today. Governments attending a weapons meeting in Geneva have agreed to begin international discussions in May 2014 on these weapons, which would select and engage targets without further human intervention.
  • Nov 14, 2013
    Steve Goose, executive director of the arms division of Human Rights Watch, and Mary Wareham, coordinator of the Campaign to Stop Killer Robots and advocacy director of the arms division of Human Rights Watch, delivered statements on fully autonomous weapons at the Convention on Conventional Weapons (CCW) Meeting of States Parties in Geneva.
  • Nov 13, 2013
    Governments should agree this week to begin international discussions in 2014 on fully autonomous robot weapons, with a view to a future treaty banning the weapons, Human Rights Watch said today.
  • Nov 13, 2013
    International attention to the subject of fully autonomous weapons has grown rapidly over the past year. These weapons, also called “lethal autonomous robots” or “killer robots,” would be able to identify and fire on targets without meaningful human intervention. Although they do not yet exist, they have generated widespread concern about their implications for the protection of civilians and combatants from unlawful attacks during armed conflict.
  • Oct 21, 2013
    All governments should support international talks to address the threat posed by fully autonomous robotic weapons, Human Rights Watch said today. Human Rights Watch and the Harvard Law School International Human Rights Clinic on October 21, 2013, issued a question-and-answer document about the legal problems posed by these weapons.
  • Oct 21, 2013
    Advances in artificial intelligence (AI) and other technologies will soon make possible the development of fully autonomous weapons, which would revolutionize the way wars are fought. These weapons, unlike the current generation of armed drones, would be able to select and engage targets without human intervention. Military officials in the United States and other technologically advanced countries generally say that they prefer to see humans retain some level of supervision over decisions to use lethal force, and the US Defense Department has issued a policy directive embracing that principle for the time being.
  • Oct 17, 2013
    The use of armed drones over the past decade has dramatically changed warfare and brought new humanitarian and legal challenges. But unmanned aerial vehicles still have a human pilot—backed up by a team of people—who decides what to target and when to fire.
  • May 30, 2013
    It is clear that many countries are moving toward systems that would give full combat autonomy to the high-tech machines of the future.