Robots at War: Scholars Debate the Ethical Issues
By Don Troop
Noel Sharkey, a professor of artificial intelligence and robotics at the University of Sheffield, in England, says that while the Geneva Conventions require that new weapons systems be tested during their development to ensure that they won't inadvertently harm civilians, there is no such requirement for systems that are used for other purposes, like surveillance. That was the role of Predator drones until the terrorist attacks of 2001, after which the CIA and the Air Force equipped them with Hellfire missiles.
Since most unmanned systems can quickly be weaponized, Mr. Sharkey fears that is precisely what would happen if America suddenly found itself in a new war. "It's called military necessity," he says. "We've got this facility, and we're engaged in a war. We'll stick the weapons on."
Mr. Sharkey argues that lethal autonomous systems will never attain the proficiency of humans in following such "just war theory" cornerstones as distinction and proportionality. The principle of distinction establishes that active combatants are the only legitimate targets of attack. Civilians, including children and the elderly, are to be excluded, as are combatants who are wounded or have surrendered. When it is impossible to fully protect noncombatants, the principle of proportionality requires that any loss of life be proportional to the direct military advantage that one expects to gain.