Thank you, Chair.
I speak on behalf of Human Rights Watch, a co-founding member of the Stop Killer Robots campaign. The Martens Clause provides one existing ethical framework against which to assess autonomous weapons systems. The clause states that, when no specific law on a topic exists, civilians and combatants remain protected by the principles of humanity and the dictates of public conscience. Autonomous weapons systems would pose significant threats to both of these prongs.
A widely accepted provision of international humanitarian law, the Martens Clause creates a legal obligation to take into account ethical concerns. It is thus one of several reasons discussions of autonomous weapons systems should encompass ethics.
First, the principles of humanity require the humane treatment of others and respect for human life and human dignity.
People are motivated to treat each other humanely because they feel compassion and empathy for their fellow humans. Autonomous weapons systems, by contrast, would not be sentient beings capable of feeling emotions such as compassion. In addition, ethical and legal judgment gives people the means to minimize harm; it enables them to make considered decisions based on an understanding of a particular context. Autonomous weapons systems would lack the capacity to interpret complex, rapidly changing contexts.
As discussed in earlier sessions, the principle of human dignity, a fundamental ethical and legal principle, recognizes that all people have inherent worth. Autonomous weapons systems would be unable to understand the value of a human life and the significance of its loss. In addition, they would dehumanize people because their algorithms would treat people as data sets when selecting targets.
Second, autonomous weapons systems would raise ethical concerns because they would run afoul of the dictates of public conscience. Those dictates refer to shared moral guidelines that shape the actions of states and individuals. The use of the term “conscience” indicates that the dictates are based on a sense of morality, a knowledge of what is right and wrong. Faith leaders, scientists and technologists, civil society members, governments, and members of the general public have all spoken against delegating life-and-death decisions to autonomous weapons systems, many of them explicitly citing ethical problems as a key motivation.
The dangers autonomous weapons systems present to both the principles of humanity and the dictates of public conscience underscore the need for a new legally binding instrument on these systems. A strong instrument would fill the legal gap that triggers the Martens Clause and address the range of issues raised at this meeting, including human rights, humanitarian, security, technological, and ethical concerns.
In closing, we urge states to build on this week’s constructive work. The high level of participation and the substantive statements delivered at this meeting demonstrate the value of these consultations. There is clearly an appetite for additional UNGA consultations. We urge states to consider another UNGA resolution later this year that is even more ambitious, setting the goal of negotiating a new international treaty in 2026.
Thank you.