
Regulate armed robots before it's too late

Published 10 March 2009

Unmanned machines now carry out more and more military and police missions. Soon these robots may be allowed to make autonomous life-and-death decisions: when to shoot, and at whom. A philosopher argues that we should be more mindful of the ethical implications of this trend

There are two trends we note when it comes to robotics and the military. The first is clear to see: unmanned systems in the air, on the ground, and under water are taking on many of the missions — from reconnaissance to aerial attacks to bomb disposal — which were typically carried out by trained crews and complex machines. The second trend is more subtle yet more significant: allowing these unmanned machines to make life-and-death decisions, that is, to decide, on their own and without human intervention, when to open fire and at whom (see, for example, “Decision-making Killer Robots to be Used by Armies — and Terrorists,” 28 February 2008 HS Daily Wire; “Robot Wars Are a Reality, So We Should Develop Rules to Govern Them,” 20 August 2007 HS Daily Wire; “Military Robotics Moves Forward,” 3 March 2009 HS Daily Wire; and “Terminating the Terminators: Anti-robot Defense Company Launched,” 18 September 2008 HS Daily Wire).

A. C. Grayling, a philosopher at Birkbeck, University of London, says we should think long and hard about this trend. The danger of not doing so is that “we only wake up to the need for forethought when in the midst of a storm created by innovations that have already overtaken us.”

Robot sentries are already positioned on the borders of South Korea and Israel. Remote-controlled aircraft mount missile attacks on enemy positions. Other military robots now do much more than defuse bombs or detect landmines: they shoot at enemy soldiers (although, so far, under human control) and penetrate deep into enemy territory. Police forces are looking to acquire miniature Taser-firing robot helicopters. In South Korea and Japan the development of robots for feeding and bathing the elderly and children is already advanced. Some vacuum cleaners sense their autonomous way around furniture, and a driverless car has already negotiated its way through Los Angeles traffic.

“In the next decades, completely autonomous robots might be involved in many military, policing, transport and even caring roles,” Grayling writes. “What if they malfunction? What if a programming glitch makes them kill, electrocute, demolish, drown, and explode, or fail at the crucial moment?”

Grayling points out that most thinking about the implications of robotics tends to take sci-fi forms: robots enslave humankind, or beautifully sculpted humanoid machines have sex with their owners and then post-coitally tidy the room and make coffee. The “real concern lies in the areas to which the money already flows: the military and the police,” he says.

A confused controversy arose in early 2008 over the deployment in Iraq of three SWORDS armed robotic vehicles carrying M249 machine guns (see HS Daily Wire of 3 August 2007; 10 September 2007; and 11 April 2008). The manufacturer of these vehicles said the robots were never used in combat and that they were involved in no “uncommanded or unexpected movements.” Predator drones have been mounting missile attacks in Afghanistan and Pakistan, and there are at least another dozen military robot projects in development. “What are the rules governing their deployment?” Grayling asks. “How reliable are they? One sees their advantages: they keep friendly troops out of harm’s way, and can often fight more effectively than human combatants. But what are the limits, especially when these machines become autonomous?” He concludes:

The civil liberties implications of robot devices capable of surveillance involving listening and photographing, conducting searches, entering premises through chimneys or pipes, and overpowering suspects are obvious. Such devices are already on the way. Even more frighteningly obvious is the threat posed by military or police-type robots in the hands of criminals and terrorists.

There needs to be a considered debate about the rules and requirements governing all forms of robot devices, not a panic reaction when matters have gone too far. That is how bad law is made, and on this issue time is running out.

For more on the next phase in robotics:

  • “Decision-making Killer Robots to be Used by Armies — and Terrorists,” 28 February 2008 HS Daily Wire
  • “Robot Wars Are a Reality, So We Should Develop Rules to Govern Them,” 20 August 2007 HS Daily Wire
  • “Military Robotics Moves Forward,” 3 March 2009 HS Daily Wire
  • “Terminating the Terminators: Anti-robot Defense Company Launched,” 18 September 2008 HS Daily Wire