In a development that lends considerable credence to the movement, the United Nations Human Rights Commission has issued a report endorsing the ideals of the "Stop Killer Robots" campaign.
Basically, they'd rather not have their fate decided by killer robots either, so they're calling for a worldwide moratorium on murderous robot rampages. And if that sounds like something we needn't bother with, guess again, because the killer robots the UN is talking about aren't some sci-fi concept like Cylons or Terminators. They're real and could even be lurking somewhere above us at this very moment.
The killer robots we're talking about, at least for now, are drones — semi-autonomous drones that could be capable of carrying out their missions with minimal human control. While the U.S. is actually leading the charge in the development of such drones, that doesn't mean that Americans are any safer than anyone else.
"Computer controlled devices can be hacked, jammed, spoofed, or can be simply fooled and misdirected by humans."
That chilling reminder is courtesy of roboticist Noel Sharkey, a member of the Campaign to Stop Killer Robots and chairman of the International Committee for Robot Arms Control.
The UN report notes a number of robots already deployed and capable of taking lethal action. They are fielded not only by the U.S., but also by Israel and Britain, and along the demilitarized zone between North and South Korea as well. Since these robots process data much faster than people can and are capable of acting on their own, human controllers could already be considered "out of the loop" when it comes to understanding the reasoning behind a strike launched by their robots.
Needless to say, the UN report has been applauded by the Campaign to Stop Killer Robots, as well as by other humanitarian groups following the issue. Whether the UN's report will halt — or even slow — the rise of the machines remains to be seen. All we know is that if Arnold shows up at DVICE's doorstep, we're gonna go with him. We want to live.