The international community must impose a moratorium on robot weapons, a UN expert told the world body’s top human rights forum Thursday, warning that they could enable war crimes to go unpunished.
“Human rights requires that human beings should in one way or another retain meaningful control over weapons of war,” Christof Heyns said in a debate at the UN Human Rights Council on lethal autonomous robots, or LARs.
As a result, nations should “declare and implement national moratoria on the production, assembly, transfer, acquisition, deployment and use of LARs until a framework on the future of LARs has been established”.
“Their deployment may be unacceptable because no adequate system of legal accountability can be devised,” said Heyns, the UN’s special rapporteur on extrajudicial, summary or arbitrary executions.
Such weapons are a step up from drones, unmanned aircraft controlled from distant bases, which have stirred controversy notably over civilian casualties when the United States uses them to kill alleged Al-Qaeda militants.
“While drones have a ‘human in the loop’, who takes the decision to deploy lethal force, the new technology of LARs involves an on-board computer that takes these decisions on its own,” explained Heyns.
“No state is currently using fully autonomous weapons that would classify as LARs, but the technology is already available,” he said.
Washington, a leader in robotic weapons development, last November imposed a 10-year human-control requirement.
Heyns commended that, but warned that history showed that once weapons exist, it is all too easy to give in to the temptation to use them.
Despite speeding up decision-making, robots would not read the nuances of war as humans do, and would be capable of neither cruelty nor compassion, he said.
“Humans can, based on their understanding of the bigger picture, know that a more lenient approach is called for in a specific situation, for instance, when a soldier is surrendering,” he noted.
“War without reflection is mechanical slaughter,” he added.
Heyns highlighted a raft of concerns under the laws of war and human rights.
“These include questions of whether robots will make it easier for states to go to war, and the extent to which they can be programmed to comply with the requirements of international humanitarian law, especially distinction and proportionality,” he said.
“LARs can potentially be also used by repressive governments to suppress internal domestic opponents,” he warned.