On October 8, the Nobel Prize in Physics was awarded to Professors Geoffrey E. Hinton and John J. Hopfield in recognition of their contributions to the field of artificial intelligence. In a few weeks, in New York, the 193 member states of the United Nations General Assembly will vote on a resolution on autonomous weapons systems, also known as killer robots. Many of these systems rely on artificial intelligence, and Hinton has been sounding the alarm about their dangers for over a decade.
Autonomous weapons are systems that, once activated, can select a target and use lethal force based solely on sensor processing, without human intervention. These systems raise fundamental ethical, humanitarian, legal, operational, moral, and security concerns. The prospect of their use and proliferation is alarming and poses a serious threat to international humanitarian law and the protection of civilians. Entrusting a machine with the power of life and death over a human being crosses an unacceptable moral line.
It is highly doubtful that autonomous weapons can comply with the laws of war, in particular when it comes to distinguishing combatants from non-combatants, or to determining whether the expected civilian harm of an attack outweighs the anticipated military advantage, a case-by-case assessment that requires human judgment. Civilians are likely to pay the highest price. Moreover, in cases of violations of international law, it would be legally challenging, if not unjust, to hold human operators criminally accountable for actions taken by autonomous weapons systems that they could neither foresee nor control.
To avert these dangers, Hinton, Hopfield, and thousands of other AI experts and scientists, along with civil society organizations including ours, have been calling for preventive action to ban and regulate autonomous weapons systems. And the matter is urgent.
Current conflicts dramatically illustrate how wars are becoming increasingly digitized and accelerated—from the Israeli military’s use of the “Gospel” and “Lavender” targeting systems in Gaza, to armed drones using laser-guided bombs and other munitions in Burkina Faso and Ethiopia, to loitering munitions used in Ukraine, Nagorno-Karabakh, and Libya. These military investments in autonomy and other emerging technologies are pushing humanity down a dangerous slippery slope.
In the past, nations have acted together to prohibit chemical and biological weapons, blinding lasers, anti-personnel landmines, and cluster munitions. These humanitarian disarmament treaties have stigmatized any use of these weapons by any actor under any circumstances. Even non-signatories to these treaties have come into line by ending or drastically reducing their use, thereby saving countless civilian lives.
Since 2013, the challenges raised by autonomous weapons have been discussed at length, initially at the UN Human Rights Council, and then at the Convention on Certain Conventional Weapons (CCW) in Geneva. France, which identified these emerging dangers early, played a significant role in driving the discussions. Today, a minority of states at the CCW, including Russia and India, are blocking proposals to negotiate new international law on autonomous weapons systems. The CCW's consensus-based decision-making process allows a single country to prevent any agreement.
To overcome this impasse, Austria and other countries are proposing talks at the UN General Assembly in New York, a forum that includes more states than the CCW. This framework would also make it possible to address the broader human rights, proliferation, security, and other issues raised by autonomous weapons, in addition to compliance with the laws of war. Progress in this forum is far less likely to be thwarted by a handful of states opposed to any regulation.
The General Assembly presents a significant opportunity for states to push toward negotiations on an international treaty that would address the many concerns raised by autonomous weapons systems. In the face of the dangers highlighted by scientists, legal experts, and human rights organizations, and given the urgency of containing the automation of violence in war and beyond, we urge France to stand with those working for better protection of civilians. The government should fully support efforts to address autonomous weapons systems through the UN General Assembly, with a view to the swift adoption of a treaty.
Signatories:
Patrice Bouveret: Director, Observatoire des armements
Bénédicte Jeannerod: France Director, Human Rights Watch
Sylvie Brigot: Director, Amnesty International France