Human rights group seeks ban on autonomous 'killer robots'
A new forward-looking campaign hopes to shape public perception and ensure that the concept of armed drones that attack targets without human input never becomes a reality.
The New York-based Human Rights Watch has announced the creation of an international coalition, which it hopes can push for a global treaty for a pre-emptive ban on artificially intelligent weapons before they can be fielded in battle.
These “killer robots,” as the group calls them, are not yet being operated by any army in the world, though the pace at which drone technology and robotics are advancing has led to speculation that they could be developed within the next few decades.
According to Human Rights Watch, which cites a report it co-authored with Harvard Law School in November of 2012, a fully autonomous machine that could select and fire upon targets of its own volition could be available within 20 years, if not sooner.
That report, entitled “Losing Humanity: The Case Against Killer Robots,” outlines the legal, ethical and policy concerns surrounding the development of self-directed weaponry. While in certain contexts advanced weaponry already possesses some semblance of “intelligence,” such as guided missiles or drones capable of adapting to battlefield conditions, present technology still requires human input, and in that sense involves basic humanitarian law.
Leading the Campaign to Stop Killer Robots is Jody Williams, a
Nobel Peace laureate for her efforts to enact a ban on
anti-personnel landmines. She believes that exposure of the issue
will bring about public opposition.
"As people learn about our campaign, they will flock to it.
The public conscience is horrified to learn about this possible
advance in weapons systems. People don't want killer robots out
there,” Williams said in comments to the BBC.
"Normal human beings find it repulsive," she added.
Already the use of semi-autonomous weapons, such as aerial drones, has led to mounting criticism over the legal processes behind both the selection of foreign targets and the weapons' unilateral use on foreign soil. More recently in the US, there has been a heated debate over whether they should be used on American citizens at all.
Current semi-autonomous technology at least requires a human operator, and in theory does not kill without proper authorization. As Human Rights Watch points out, the development of machines equipped with artificial intelligence presents a number of issues that fall outside conventional law, and might even lead to an international arms race not unlike the one currently under way in the field of drone technology.
For its part, a directive issued by the US Department of Defense
in 2012 and quoted by the BBC has stated that weapons with a degree
of autonomy "shall be designed to
allow commanders and operators to exercise appropriate levels of
human judgment over the use of force."
In March of 2012, Lord Astor of Hever - the UK's
parliamentary undersecretary of state for defense - stated that the
Ministry of Defence "currently
has no intention of developing systems that operate without human
intervention."
A 2011 market study by the Teal Group estimated that worldwide
spending on drones will reach $94 billion over the coming decade,
with countries such as China looking to quickly close the gap with
US technology.