"Campaign to Stop Killer Robots." That may sound like a clique of conspiracy theorists or the title of a summer B movie, but it's actually an alliance of human rights groups raising legal and ethical concerns about people's willingness to cede life-and-death decisions to computers.
Who is responsible if an armed robot fails to distinguish between civilians and combatants when unleashing lethal force against a target that meets its programmed criteria?
And how, skeptics wonder, can a "fully autonomous weapon" be taught to recognize soldiers attempting to surrender or those already wounded and no longer a threat?
If national military forces can rely on machines to take on the front-line hazards of armed combat, will that reduced risk of human casualties remove an important deterrent to waging war?
The Campaign to Stop Killer Robots was joined Thursday by a diverse array of peace advocates and diplomats at a session of the U.N. Human Rights Council in Geneva.
"Their deployment may be unacceptable because no adequate system of legal accountability can be devised and because robots should not have the power of life and death over human beings," the United Nations' watchdog on extrajudicial killings, Christof Heyns, told the council.
In calling for U.N. member nations to freeze development of robotic weapons "while the genie is still in the bottle," Heyns warned of the risk of rapidly advancing technology outpacing political and moral consideration of unintended consequences.
In a 22-page report submitted to the U.N. rights forum, Heyns detailed the precursors to "fully autonomous weapons" already in operation:
-- Soldier-robots patrol the demilitarized zone between North and South Korea, and though remotely commanded by humans now, the programmed sentinels from Samsung Techwin are equipped with an automatic option.
-- Israel's Harpy combat drone is designed to detect, attack and destroy radar emitters and suppress enemy air defenses.
Existing drone technology has already stirred plenty of controversy and strained relations between the United States, its foremost developer and user, and the countries where drone strikes have been carried out.
Uniting the international community on ground rules for fully autonomous weapons is likely to prove at least as challenging as weighing the pros and cons of drone use, but legal experts contend it isn't beyond the realm of possibility.
There is already significant recognition among technologically advanced countries that there should be limits on the degree to which computerized systems can take action without human involvement, said Bonnie Docherty, a Harvard Law School lecturer and senior instructor at its International Human Rights Clinic. The clinic co-wrote a report with Human Rights Watch calling for a preemptive ban on fully autonomous weapons.
Docherty pointed to the U.S. Defense Department directive issued in November, which requires a human role in decisions to use lethal force, as one sign of that recognition.
Steve Goose, arms division director at Human Rights Watch, told journalists covering the U.N. meeting in Geneva this week that several governments have expressed willingness to take the lead in getting a global moratorium on lethal robotics in place.
The burgeoning alliance against "killer robots" is hopeful that world leaders can be brought together on the need for keeping humans in control.
"There is a good chance of success because we are trying to act preemptively, to prevent states from investing so much in this technology that they don't want to give it up," said Docherty.
M. Ryan Calo, a University of Washington law professor with expertise in robotics and data security, notes that there are upsides to robotic warfare, like the speed at which computers can make decisions and their ability to approach problem-solving in ways that are beyond humans.
"There's a reason why 75% of trading is now by high-speed algorithms," Calo said of the sales of stocks and commodities. "But humans tend to disproportionately trust the recommendations of computers."
James Cavallaro, a law professor and director of Stanford's International Human Rights and Conflict Resolution Clinic, disputes the notion that technology moves too fast to be bridled by the often glacial pace of international treaty drafting and ratification.
"People say that weaponized drones are already in use, that the cat's out of the bag," Cavallaro said. But the international community has shown that it can evaluate a weapon already in use and still decide to prohibit it, he argued.
The same power of retroactive evaluation has brought about a global covenant renouncing the use of land mines, which have inflicted horrendous civilian casualties in conflict-ravaged regions worldwide, he added.
"It's a progression, in terms of weapons systems that reduce human engagement," Cavallaro said. "And obviously robotics and weaponized drone aircraft that can make decisions about what is a danger are the final step on a very dangerous continuum."