UN human rights expert calls for a moratorium on lethal autonomous robots
“While drones still have a ‘human in the loop’ who takes the decision to use lethal force, LARs have on-board computers that decide who should be targeted,” stressed United Nations Special Rapporteur Christof Heyns.
“The possible introduction of LARs raises far-reaching concerns about the protection of life during war and peace,” Mr. Heyns noted during the presentation of his latest report* to the UN Human Rights Council. “If this is done, machines, and not humans, will take the decision on who lives or dies,” he said.
“This may make it easier for States to go to war; and raises the question whether they can be programmed to comply with the requirements of international humanitarian law, especially the distinction between combatants and civilians and collateral damage,” he explained.
“Beyond this, their deployment may be unacceptable because no adequate system of legal accountability can be devised for the actions of machines,” the independent human rights expert noted, based on his analysis of potential violations of the rights to life and human dignity should the use of LARs materialize.
In his report, Mr. Heyns urges the UN Human Rights Council to call on all States “to declare and implement national moratoria on the production, assembly, transfer, acquisition, deployment and use of LARs, until a framework on the future of LARs has been established.” He also invites the UN High Commissioner for Human Rights to convene or to work with other UN bodies to convene a High Level Panel on LARs to articulate this framework.
“War without reflection is mechanical slaughter,” the UN expert on summary executions said. “In the same way that the taking of any human life deserves as a minimum some deliberation, a decision to allow machines to be deployed to kill human beings deserves a collective pause worldwide.”
For the Special Rapporteur, the time is ripe for a thorough and cool-headed global reflection in order “to ensure that not only life itself but also the value of life and human dignity is protected in the long run.”
“If deployed, LARs will take humans ‘out of the loop,’” Mr. Heyns warned. In his view, “States find this technology attractive because human decision-making is often much slower than that of robots, and human thinking can be clouded by emotion.”
“At the same time, humans may in some cases, unlike robots, be able to act out of compassion or grace and can, based on their understanding of the bigger picture, know that a more lenient approach is called for in a specific situation,” he underscored.
The Special Rapporteur stressed that there is now an opportunity to pause collectively, and to engage with the risks posed by LARs in a proactive way, in contrast to other revolutions in military affairs, where serious reflection mostly began after the emergence of new methods of warfare. “The current moment may be the best we will have to address these concerns,” he said.
The new report provides specific recommendations to the UN system, regional and inter-governmental organizations, and States, as well as to developers of robotics systems, NGOs, civil society, human rights groups and the International Committee of the Red Cross.
The Special Rapporteur on extrajudicial, summary or arbitrary executions, Christof Heyns (South Africa), is a director of the Institute for International and Comparative Law in Africa and Professor of Human Rights Law at the University of Pretoria, where he has also directed the Centre for Human Rights, and has engaged in wide-reaching initiatives on human rights in Africa. He has advised a number of international, regional and national entities on human rights issues. Mr. Heyns’ research interests include international human rights law and human rights law in Africa. Learn more: http://www.ohchr.org/EN/Issues/Executions/Pages/SRExecutionsIndex.aspx
(*) Read the full report: http://daccess-dds-ny.un.org/doc/UNDOC/GEN/G13/127/76/PDF/G1312776.pdf?OpenElement or http://ap.ohchr.org/documents/dpage_e.aspx?si=A/HRC/23/47
Watch the Special Rapporteur on our YouTube Channel: http://youtu.be/LEInsrT8cHU