Statement to the Convention on Conventional Weapons Informal Meeting of Experts on Lethal Autonomous Weapons Systems - Other Areas of International Law

Delivered by Bonnie Docherty, Senior Researcher, Arms Division, and Senior Clinical Instructor at Harvard Law School's International Human Rights Clinic

Thank you, Madame Chairperson.

This session raises two areas of great concern to Human Rights Watch: the accountability gap created by fully autonomous weapons and the human rights implications of the weapons.

Serious doubts exist about whether there could be meaningful accountability for the actions of a fully autonomous weapon. While a human could presumably be held criminally responsible if he or she intentionally created or used a fully autonomous weapon to commit a war crime, a more likely and troublesome scenario would be one in which the robot unexpectedly committed an unlawful act.

A robot could not itself be held liable in such a situation. Because it cannot suffer like a human, it could not be punished in the same way. At the same time, there would be no human to hold directly responsible for the decision to attack, and indirect liability would be difficult to achieve.

Command responsibility holds commanding officers indirectly responsible for subordinates’ actions if they knew or should have known their subordinates committed or were going to commit a crime and they failed to prevent the crime or punish the subordinates. In the unexpected situation I just described, however, the commander could not foresee and thus not prevent the violations in question, and he or she could not punish the robot after the fact.

An alternative would be to try to hold the programmer or manufacturer civilly liable for the unanticipated acts of a fully autonomous weapon. But tort law too would likely fail to ensure accountability. In the United States, for example, defense contractors are generally granted immunity for harm caused by their products.

Even without a legal gap, there are practical problems with holding programmers and manufacturers accountable. In particular, civil suits are generally brought by victims and, especially in cases of armed conflict, it is unrealistic to think all victims would have the resources or adequate access to obtain justice.

The use of fully autonomous weapons would thus create a potentially insurmountable accountability gap. The lack of criminal or civil consequences would interfere with deterrence. A failure to punish would leave victims and their relatives without the satisfaction that someone paid for the suffering they experienced.

The human rights implications of fully autonomous weapons are equally problematic.
Although we are heartened by states’ many references to human rights law this week, the debate about fully autonomous weapons so far has focused on their use in armed conflict and questions about their ability to comply with international humanitarian law. But the weapons could easily be adapted for use in law enforcement operations. Such use would trigger international human rights law, which applies during war and peace.

In a new report released Monday, Human Rights Watch finds that fully autonomous weapons threaten to violate the foundational rights to life and a remedy and to undermine the underlying principle of human dignity.

The right to life has been described as the “supreme right” on which all others are based. It prohibits the arbitrary deprivation of life, and it only allows killing if it meets three cumulative requirements: it must be necessary to protect human life, constitute a last resort, and be applied in a manner proportionate to the threat. Each of these prerequisites for lawful force involves qualitative assessments of specific situations. Yet there is little prospect that robots could be developed to have certain human qualities—such as judgment and the ability to identify with humans—that facilitate compliance with the three criteria.

Human rights law also establishes a right to a remedy in order to deter future unlawful acts and punish past ones. The accountability gap I have already described would infringe on this right.

Finally, fully autonomous weapons could undermine the principle of dignity, which implies that everyone has a worth deserving of respect. As inanimate machines, fully autonomous weapons could not truly comprehend or respect either the value of individual life or the significance of its loss. Allowing them to make determinations to take life away would thus conflict with the principle of dignity.

For more information on our findings regarding the human rights implications of fully autonomous weapons, I refer you to our report Shaking the Foundations, which is available at the back of the room or from a member of the Human Rights Watch delegation.

In conclusion, the lack of accountability and the threat to human rights associated with fully autonomous weapons bolster the case for an absolute and preemptive ban on the development, production, and use of these weapons.

Thank you.