There are some kinds of war that the Geneva Conventions did not anticipate. No, I’m not talking about George W. Bush’s idea that the ethics of the Geneva Conventions suddenly become outdated when a nation decides that it really, really wants to defeat its enemies. I’m talking about the use of robots as military weapons.
Theoretically, robots might be able to comply with guidelines like the Geneva Conventions better than humans, given that they lack the psychological reactions that lead to irrational acts of violence. However, the purely rational algorithms of robots could also leave them unable to correctly assess the dynamics of human situations, and such blindness to human psychology could itself lead to robot atrocities in wartime. A professor of artificial intelligence and robotics explains, “I can imagine a little girl being zapped because she points her ice cream at a robot to share.”
This isn’t some kind of abstract Isaac Asimov thought experiment. It’s a real ethical problem in the present day. Some basic military robots are already in use, and new, more advanced generations of fighting robots are being designed.
This summer, the American Department of Defense solicited bids from technology firms to develop software and sensor systems for teams of robots that, with a human handler, could hunt people down the way a pack of dogs hunts down a fox. The proposal reads, “This topic seeks to merge these research areas and develop a software/hardware suit [sic] that would enable a multi-robot team, together with a human operator, to search for and detect a non-cooperative human subject.”
New Scientist points out that, although the prototypes requested this summer would not have weapons, and would require a human controller, there’s no reason that subsequent iterations of the idea could not be armed and autonomous.
The most ethical decision when it comes to war is not to have any. Likewise, the most ethical choice when it comes to robots as weapons of war is to decide not to build them in the first place. The time has come for an international treaty banning robotic weaponry. However, I don’t expect to see such a treaty come into effect. After all, huge arsenals of nuclear weapons are still regarded as perfectly legal.