Thank you, Mr. Chairman.
Human Rights Watch is a co-founder of the Campaign to Stop Killer Robots, and Mary Wareham of Human Rights Watch is the global coordinator of the Campaign.
We are pleased that the GGE has shifted its focus to options for the way forward on lethal autonomous weapons systems (LAWS). For five years, states have highlighted the host of problems with these weapons, including legal, moral, accountability, technical, and security concerns. It is time to move on and take action. As Brazil noted, the world is watching and there are high expectations for the CCW to produce a strong, concrete outcome.
Human Rights Watch supports the proposal for a mandate to begin negotiations in 2019 on a legally binding instrument to require meaningful human control over the critical functions of lethal autonomous weapons systems. Such a requirement is effectively the same as a prohibition on weapons that lack such control.
We were pleased to hear so many states—the vast majority of states—express support for a legally binding instrument prohibiting lethal autonomous weapons systems. We hope that High Contracting Parties set aside significant time in 2019 to fulfill that mandate—at least four weeks, so that the negotiations could be concluded within one year.
Several states have said the CCW’s discussions should focus on the compliance of lethal autonomous weapons systems with international law and particularly international humanitarian law. We agree that compliance with rules of proportionality and distinction is critical, and we question whether this technology could comply.
But another provision of international humanitarian law must also be considered. The Martens Clause—which appears in the Geneva Conventions, Additional Protocol I, and the preamble of the CCW—creates a legal obligation for states to consider moral implications when assessing new technology. The clause applies when there is no specific existing law on a topic, which is the case with lethal autonomous weapons systems, also called fully autonomous weapons.
The Martens Clause requires in particular that emerging technology comply with the principles of humanity and the dictates of public conscience. As we have outlined in a new report distributed this week, fully autonomous weapons would fail this test on both counts.
The principles of humanity require humane treatment of others and respect for human life and dignity. Weapons that lack meaningful human control over their critical functions would be unable to comply with these principles.
Fully autonomous weapons would lack compassion, which motivates humans to minimize suffering and killing. They would also lack the legal and ethical judgment necessary to determine the best means for protecting civilians on a case-by-case basis in complex and unpredictable combat environments.
As inanimate machines, fully autonomous weapons could also not appreciate the value of human life and the significance of its loss. They would base life-and-death determinations on algorithms, objectifying their human targets—whether civilians or combatants. They would thus fail to respect human dignity.
The development of weapons without meaningful human control would also run counter to the dictates of public conscience. In national and regional group statements, a majority of states at CCW have called for the negotiation of a legally binding instrument on lethal autonomous weapons systems. Many have expressly called for a prohibition on the weapons. Virtually all states have stressed the need to maintain human control over the use of force. Collectively, these statements provide evidence that the public conscience favors human control and objects to fully autonomous weapons.
Experts and the general public have reached similar conclusions. As was discussed in yesterday’s side event sponsored by the Campaign to Stop Killer Robots, thousands of AI and robotics researchers along with companies and industry representatives have called for a ban on fully autonomous weapons. Traditional voices of conscience—faith leaders and Nobel Peace Laureates—have echoed those calls, expressing moral outrage at the prospect of losing human control over the use of force. Civil society and the ICRC have also emphasized that law and ethics require human control over the critical functions of a weapon.
In conclusion, the rules of law and morality demand the negotiation of a new legally binding instrument on fully autonomous weapons. An assessment of the technology under the Martens Clause shows there is a gap in international law that needs to be filled. Concerns related to the principles of humanity and dictates of public conscience show that the new instrument should ensure that meaningful human control over the use of force is maintained and the development, production, and use of fully autonomous weapons are prohibited.