
Military Seeks Packs of Robots to Hunt Humans Down

There are some kinds of war that the Geneva Conventions did not anticipate. No, I’m not talking about George W. Bush’s idea that the ethics of the Geneva Conventions become suddenly outdated when a nation decides that it really, really wants to defeat its enemies. I’m talking about the use of robots as military weapons.

Theoretically, robots might be able to comply with guidelines like the Geneva Conventions better than humans, given that they lack the psychological reactions that lead to irrational acts of violence. However, the purely rational algorithms of robots could also leave them unable to correctly assess the dynamics of human situations, and such blindness to human psychology could itself lead to robot atrocities in wartime. A professor of artificial intelligence and robotics explains, “I can imagine a little girl being zapped because she points her ice cream at a robot to share.”

This isn’t some kind of abstract Isaac Asimov thought experiment. It’s a real ethical problem in the present day. Some basic military robots are already in use, and new, more advanced generations of fighting robots are being designed.

This summer, the American Department of Defense solicited bids from technology firms to develop software and sensor systems to be used in teams of robots that, with a human handler, could hunt people down, like a pack of dogs hunts down a fox. The proposal reads, “This topic seeks to merge these research areas and develop a software/hardware suit that would enable a multi-robot team, together with a human operator, to search for and detect a non-cooperative human subject.”

New Scientist points out that, although the prototypes requested this summer would not have weapons, and would require a human controller, there’s no reason that subsequent iterations of the idea could not be armed and autonomous.

The best ethical decision when it comes to war is not to have any. Likewise, the most ethical choice when it comes to the use of robots as weapons of war is to decide not to build them in the first place. The time has come for an international treaty banning robotic weaponry. However, I don’t expect to see such a treaty come into effect. After all, huge arsenals of nuclear weapons are still regarded as perfectly legal.

2 comments to Military Seeks Packs of Robots to Hunt Humans Down

  • tom

    Well, what do you know – we’re going towards the Terminator scenario. Geez, we haven’t even gotten to the Mad Max stage yet and already they’re workin’ on terminators! Slow down! One bad movie at a time!

  • Elle

    Well, if you want to get technical, we have been using robots for a few generations of “smart” weapons already. This is just taking it to the next logical level.

    The previous and current generations of “fire and forget” weapons could be considered robotic in nature. Once an operator aims the weapon and enters the targeting information, the weapon autonomously seeks out the target it has been sent after. Systems such as the Predator and Global Hawk UAVs also have built-in autonomous routines for things like piloting, navigation, targeting and evasion.

    The future has been here for a while now. Most people just haven’t paid attention.
