In Theory: Exploring the ethical concerns surrounding ‘killer robots’

The Christian Science Monitor reports that a planned inaugural meeting of the United Nations to address the development of "killer robots" was called off, while an ambiguous policy at the Pentagon remains in place.

The current Defense Department directive reads, "Autonomous and semi-autonomous weapon systems shall be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force."


However, "human judgment" in modern conflict may become obsolete in a world where machines make decisions faster than people.

Among the weapons in development are Britain's Taranis, a stealth drone that would carry out its own missions, and a submarine drone built by the U.S. Navy that operates autonomously and is "expected to be outfitted with weapons at some point."


In the article, staff writer Laurent Belsie writes, "Experts worry that the technology will soon cross a line where machines, rather than humans, decide when to take a human life."

Q. How do you feel about using machines equipped with artificial intelligence to engage in war? Should the U.S. be concerned with ethical questions surrounding "killer robots" even as other nations forge ahead with their development?

In addressing this question we must first be mindful that war is always a messy business of last resort. Though war in certain cases can be justified, it is always a tragedy when a man made in God's image must kill another man equally created. The decision to take a human life in war is always a matter of ethical concern, no matter the method. Each must be carefully considered in light of the ultimate goal: victory with the fewest possible human casualties and zero civilian deaths.

I don't believe artificial intelligence should be allowed to make instantaneous "kill or not kill" decisions. The actions of humans, whether combatant or not, cannot be so quickly judged with appropriate accuracy. Would a machine kill an enemy who's trying to surrender? Or a noncombatant who somehow resembles a soldier? A human must always be directly responsible for the taking of a life and also given the opportunity to show mercy whenever possible. Moral decisions must always be made by moral agents.


Pastor Jon Barta



The existence of killer robots using artificial intelligence to conduct warfare, with or without human guidance, is abhorrent to me. Yet as depressing as this new technical development is, it is no more horrible than sending our living soldiers with their human intelligence into wars with no end in sight, let alone any hope of victory.

All of humanity needs to be deeply concerned with the clear and present danger of autonomous killing machines. The new U.N. Group of Governmental Experts on Lethal Autonomous Weapon Systems has canceled its inaugural meeting, but I hope they will press on and heed the call of many organizations, and of 126 founders and CEOs of robotics and artificial intelligence companies worldwide, for the U.N. to protect the world from killer robots.

But machines are no scarier than humans. As one robotics engineer understates it, "Unfortunately, humanity has a rather dismal record in ethical behavior in the battlefield."

Our unmanned weaponized drones, operated from thousands of miles away by humans in both the CIA and the DOD, have rained death from the sky on terrorists, suspected terrorists and innocent civilians alike in numerous countries for years. Now ISIS is itself using inexpensive off-the-shelf drones, equipped with small bombs or grenades. So far over a dozen Iraqi government soldiers have been killed and more than 50 wounded. It is only a matter of time until United States troops in Iraq, Syria, Afghanistan and the four other countries in which we are actively at war — can you name them all? — are killed by remote-controlled toys.

Roberta Medford





I do not believe the evolution of combat machinery can be stopped. Man is always at war with man, and the ways of war are ever modernizing. I hate that we live in a world where perhaps our children's generation may see the end of days by virtue of its technology. As a Christian, I accept God's assessment of us: that we are all sinners in need of the Savior. But even those of us who throw our allegiance to God still remain fumbling sinners and angel wannabes. We are called "saints," yet we are still fallen human beings struggling with our sin nature, and still prone to err.

I say this to point out that our fellow sinners outside the church restrain their sin only insofar as they are willing to abide by governmental laws or follow some bendable personal codes. There is no stopping this majority sinner bloc from creating all the Armageddon-level war machines that we see currently arising. If one nation "pledges" not to use totally hands-free weaponry, you can bet that one of its hands is behind its back with fingers crossed, and if good nations do not maintain superior firepower, then rogue and malevolent regimes will take over the world. Yet even the so-called "good" nations are not truly good, are they? They are only "so" good, perhaps ethically superior to others, but still completely run by sinners.

The attractive aspect of autonomous weaponry is that it will complete its mission without human equivocation. The scary aspect of autonomous weaponry is that it will complete its mission without human equivocation. I have GPS in my car; sometimes it tells me to turn left into a wall that it didn't know was there. That's part of the problem with technology: cleaning up its messes and fixing unforeseen problems. You can't fix what you don't yet know to be a problem, and a killer robot can cause a lot of problems. I think people fantasize that if we let robots fight for us, people will get to live rather than die on the battlefield. Unfortunately, the battlefield may just be coming soon to a town near you.

Rev. Bryan A. Griem