Wednesday, April 20, 2011

Robotics? Really?

It is essential that, before unmanned systems become ubiquitous (if it is not already too late) that we consider this issue and ensure that, by removing some of the horror, ... that we do not risk losing our controlling humanity and make war more likely.

  - UK Ministry of Defence report
When your step 1 is to toss out Asimov's First Law of Robotics, where do you go from there?
Sunday's Guardian website pointed out a new report from the UK's Ministry of Defence (PDF), outlining important considerations that are being neglected in the rush to robotize war and weaponry. The report details the current state of the technology and some of the legal issues that come with remotely flying a piece of equipment over people and property. Most compelling, however, is the section on moral and ethical issues that begins on page 5-8.
In addition to the quote above, and the associated issue that if you're not risking anyone, you don't have to convince anyone of the righteousness of the mission, there's an interesting set of questions about how to identify combatants and the battlefield when the weapons are operated remotely:
The concept of fighting from barracks as it has been termed raises a number of interesting areas for debate. Is the Reaper operator walking the streets of his home town after a shift a legitimate target as a combatant? Would an attack by a Taliban sympathiser be an act of war under international law or murder under the statutes of the home state? Does a person who has the right to kill as a combatant while in the control cabin cease to be a combatant that evening on his way home? More broadly, do we fully understand the psychological effects on remote operators of conducting war at a distance?
And perhaps most disturbing is the subject of war robot autonomy:
There is also an increasing body of discussion that suggests that the increasing speed, confusion and information overload of modern war may make human response inadequate and that the environment will be 'too complex for a human to direct' and this has already been exemplified by the adoption (described above in the legal section) of autonomous weapon systems such as C-RAM. The role of the human in the loop has, before now, been a legal requirement which we now see being eroded, what is the role of the human from a moral and ethical standpoint in automatic systems?
The italics are original to the report, but the idea that humans can make wars that are too complicated to be humanly manageable caused my brain to impose its bold and underscore.
Wired's Danger Room reminds us that 1 in 50 troops in Afghanistan is a robot (which sounds sexier than it is, but still).
