You don’t need to be any kind of expert to understand the problem of “killer robots.” Machines that decide who lives and who dies – through an algorithm of some sort, perhaps – are terrifying on their own.
We often think of such automated weapon systems only in the context of wars and how they might be deployed on the battlefield. However, their use in peacetime also poses a major threat.
It’s something science fiction writers have been warning us about for decades.
For example, in his 1950s dystopian classic Fahrenheit 451, Ray Bradbury describes a horror called “the Mechanical Hound.” It’s an eight-legged, spider-like machine with powerful sensory receptors and a long needle in its snout, filled with anesthetic. It’s used in domestic policing to hunt, and even kill, humans, specifically dissidents, and once it’s let loose, off it goes on its own.
It was science fiction then, but today, three quarters of a century later, all the technology to build such a beast seems to exist. So – given there are always repressive governments seeking new ways to impose their will and tech companies eager to profit from them – dystopian dogs may be just around the corner. (See also: Black Mirror episode “Metalhead”)
Indeed, the future is now, as explained in a new report from Human Rights Watch and Harvard Law School’s International Human Rights Clinic. Technological advances and military investments are spurring the rapid development of “killer robots,” with multiple threats to human rights, including the right to life.
We’re talking about autonomous weapons systems, operating without meaningful human control.
Once activated, they would rely on software, often using algorithms, to process input from sensors, such as camera images, radar signatures, and heat shapes, along with other data, to identify a target. After finding a target, they would fire or release their payload without the need for approval or review by a human operator.
That means a machine rather than a human would determine where, when, and against what force is applied.
Bradbury’s needle-nosed Mechanical Hound is only one of a thousand potential horrors.
But Bradbury was also clear why he wrote science fiction: “People ask me to predict the future, when all I want to do is prevent it. Better yet, build it.”
And the good news is, some governments are looking to do just that: build a different future.
Next month will see the first ever United Nations General Assembly meeting on autonomous weapons systems. It follows the efforts of the Stop Killer Robots campaign and the call by more than 120 countries for the adoption of a new international treaty on autonomous weapons systems.
Yes, getting countries around the world to agree on anything these days can be a bit like herding cats. But the threat of dystopian dogs is strong motivation.