News

Killer robots might just be something you shouldn’t be complacent about

It sounds like something out of a dodgy sci-fi movie rather than real life.

When the second-highest-ranking officer in the US military warns you about killer robots, you’ll no doubt want to take note.

General Paul Selva warned against using autonomous weapons systems that humans could lose control of. Instead, he said politicians should keep the “ethical rules of war” in place.

On Tuesday Selva told a Senate Armed Services Committee hearing: “I don’t think it’s reasonable for us to put robots in charge of whether or not we take a human life.”

Military (Csaba Krizsan/AP/PA Images)

However, that doesn’t mean Selva expects America’s adversaries to show the same restraint. He said the US should research the technology in order to work out how to defend against it.

He said it “doesn’t mean that we don’t have to address the development of those kinds of technologies and potentially find their vulnerabilities and exploit those vulnerabilities”.

This isn’t the first time that a prominent figure has publicly warned of the unknown dangers of autonomous weapons systems. In July last year an open letter signed by the likes of Stephen Hawking, Elon Musk and Noam Chomsky warned against the potential dangers of using artificial intelligence as weapons.

US soldiers (Massoud Hossaini/AP/PA Images)

It read: “We believe that AI has great potential to benefit humanity in many ways, and that the goal of the field should be to do so.

“Starting a military AI arms race is a bad idea, and should be prevented by a ban on offensive autonomous weapons beyond meaningful human control.”

This is in keeping with Selva’s warning this week that the military should keep “the ethical rules of war in place lest we unleash on humanity a set of robots that we don’t know how to control”.

Potentially killer robots that no one knows how to control? Now that doesn’t sound like an attractive proposition at all.