Statement on International Humanitarian Law, CCW meeting on lethal autonomous weapons systems

Delivered by Bonnie Docherty, Senior Researcher

Thank you, Mme. Chairperson.

One of the fundamental principles of international law is that an individual should be held responsible for actions that violate it. International humanitarian law and international criminal law include an obligation to prosecute war crimes. International human rights law establishes a right to a remedy, and ensuring individual responsibility for abuses is a state obligation.

The weapons we are discussing this week—lethal autonomous weapons systems, which have no humans in the loop—threaten to undermine that foundational norm. They would have the potential to create an accountability gap because under most circumstances a human could not be found legally responsible for unlawful harm caused by the weapons. Operators, programmers, and manufacturers would all escape liability.

A human could presumably be held criminally responsible if he or she intentionally created or used a fully autonomous weapon in order to commit a war crime. But a more likely and troublesome scenario would be one in which the robot unexpectedly committed an unlawful act.  

The robot could not itself be held liable because machines cannot be punished. There would also be no human to hold directly responsible for the decision to attack because the robot, not a human, made the decision.  

Finally, indirect or command responsibility would be difficult to impose. Command responsibility holds commanding officers indirectly responsible for subordinates’ actions if they knew or should have known their subordinates committed or were going to commit a crime and they failed to prevent the crime or punish the subordinates. If a robot acted in an unforeseeable way after being deployed, a commander could not have known in advance what it would do. It would be unfair and legally challenging to hold a commander responsible in such a situation. At the same time, a commander could not be held liable for failing to punish an unpunishable machine.

Holding a manufacturer or programmer civilly liable for the unanticipated acts of a fully autonomous weapon would also be difficult. In the United States, for example, defense contractors are generally granted immunity for harm caused by their products. And victims would generally find it impractical or too costly to sue in a foreign court.

This accountability gap relates directly to the issue of meaningful human control that has been discussed in depth this week.  If humans abdicated their control over the use of force, they would give machines the power to make life-and-death determinations.  As a result, there would be no human to hold responsible if the weapon unlawfully killed civilians. 

The loss of accountability would undermine deterrence of future bad acts.  It would also leave victims and their relatives without the satisfaction that someone paid for the suffering they experienced.

The legal and moral imperative to hold individuals responsible for unlawful acts thus demands that there always be human control over the selection and engagement of targets in individual attacks.

There is a straightforward solution to maintain control and prevent an accountability gap: prohibiting lethal autonomous weapons systems, which many delegations in the room understand as weapons that operate outside of human control.

Thank you.