Over the past decade, concerns have deepened over the increasing prospect of weapons systems that, once activated, would select and attack targets without further human intervention.
The Campaign to Stop Killer Robots, co-founded by Human Rights Watch in 2012, has been working to prohibit autonomous weapons systems that could have a devastating impact on civilians in armed conflict. The campaign has released a 23-minute documentary entitled “Immoral Code,” which explores the risks of permitting machines to make life-and-death decisions.
The film shows how human decision-making is shaped by individual moral codes that stem from our background, upbringing, values, and beliefs. It argues that delegating these decisions to automated machines raises fundamental ethical, legal, and moral concerns.
“One of the fundamental objections to the use of machines making the decision to use force, or who to kill or target, is that they can never be able to understand the value of human life,” says Dr. Thompson Chengeta, an international law expert and lead campaigner on Africa for the Campaign to Stop Killer Robots, in the film. “That is a value judgment that only humans can understand.”
Pressure is mounting on governments to open negotiations on a new international treaty that would require meaningful human control over the use of force and prohibit the development and use of autonomous weapons systems that lack such control.
A 2021 report by Human Rights Watch and the International Human Rights Clinic at Harvard Law School called for strengthening existing international humanitarian law to address mounting concerns over autonomy in weapons systems.
Governments need to recognize the urgency of this challenge and move from discussion to negotiation of a legally binding treaty to ban killer robots.
As the Stop Killer Robots website warns: “Technology should be used to empower all people, not to reduce us – to stereotypes, labels or just a pattern of 1’s and 0’s.”