This week, Germany convened the first major disarmament meeting for governments and civil society to be held entirely online. Representatives from more than 70 countries logged on to participate in the two-day meeting on lethal autonomous weapons systems, also known as fully autonomous weapons or killer robots.
The meeting’s goal was to explore the international framework and commitments needed to address mounting concerns over the dangers of removing meaningful human control from the use of force.
Scientists, roboticists, and artificial intelligence experts have long warned of the dangers posed by permitting machines to select and engage targets without further human intervention. That sentiment is now widely shared. In opening the Berlin Forum, German Foreign Minister Heiko Maas said, “letting machines decide over life and death of human beings goes against ethical standards and undermines human dignity.”
Human Rights Watch concurs with the minister's view that permitting fully autonomous weapons is "a red line we should never cross." A new international treaty banning such weapons is the only logical way to prevent that disastrous development.
In her remarks to the Berlin Forum, Human Rights Watch senior researcher Bonnie Docherty made the case for a treaty prohibiting weapons systems that autonomously select and engage targets, which present fundamental moral and legal challenges. Such a legally binding instrument must establish a general obligation for states to maintain meaningful human control over the use of force.
Initiatives like the Berlin Forum are helping the international community lay the collective groundwork for such a treaty. The online meeting also shows how governments are innovating and adapting to the COVID-19 pandemic. Digital diplomacy of this kind is essential to sustaining multilateral dialogue and advancing the effort to protect humanity from serious threats, including killer robots.