• Fully autonomous weapons, also known as "killer robots," would be able to select and engage targets without human intervention. Precursors to these weapons, such as armed drones, are being developed and deployed by nations including China, Israel, South Korea, Russia, the United Kingdom, and the United States. It is doubtful that fully autonomous weapons could meet international humanitarian law standards, including the rules of distinction, proportionality, and military necessity, and they would threaten the fundamental right to life and the principle of human dignity. Human Rights Watch calls for a preemptive ban on the development, production, and use of fully autonomous weapons.

    Human Rights Watch is a founding member and serves as global coordinator of the Campaign to Stop Killer Robots.

  • The next generation of weapons in military arsenals could be "killer robots," machines that would select and destroy specific targets without further human intervention. But if a robot broke the law, who would be held responsible? Would the programmer, manufacturer, military commander, or robot end up in court?
    Programmers, manufacturers, and military personnel could all escape liability for unlawful deaths and injuries caused by fully autonomous weapons, or "killer robots," Human Rights Watch said in a report released today. The report was issued in advance of a multilateral meeting on the weapons at the United Nations in Geneva.