International attention to the subject of fully autonomous weapons has grown rapidly over the past year. These weapons, also called “lethal autonomous robots” or “killer robots,” would be able to identify and fire on targets without meaningful human intervention. Although such weapons do not yet exist, they have generated widespread concern about their implications for the protection of civilians and combatants from unlawful attacks during armed conflict.
This memorandum calls on states parties to the Convention on Conventional Weapons (CCW) to take up this challenge by adopting a mandate to discuss the issue in 2014 with an eye to negotiating a new protocol as quickly as possible.
This memorandum argues that existing international humanitarian law is insufficient to deal with fully autonomous weapons and provides four principal reasons why a supplementary legally binding instrument is needed:
- Fully autonomous weapons raise the kinds of concerns under the Martens Clause—which requires weapons to meet the “principles of humanity” and “dictates of public conscience”—that have justified the creation of past treaties.
- While existing international humanitarian law focuses on use, a new convention could also address development and production.
- Such a convention could help close the accountability gap associated with fully autonomous weapons.
- The precautionary principle, which is applicable to this situation, allows for action to prevent harm despite remaining scientific uncertainty about that harm.
The memorandum concludes by spelling out why new law should take the form of a preemptive ban and laying out recommendations for global and domestic action.