The UK is opposing an international ban on so-called “killer robots” at a United Nations conference in Geneva this week, which will examine the future of lethal autonomous weapons systems (Laws).
There is currently no internationally agreed definition of what constitutes a lethal autonomous weapons system, and the absence of a legal framework governing such weapons will be a main topic of debate at the conference. The UK is among some 120 countries attending the Convention on Certain Conventional Weapons meeting in Geneva from April 13 to 17.
The Foreign Office told the Guardian: “At present, we do not see the need for a prohibition on the use of Laws, as international humanitarian law already provides sufficient regulation for this area.
“The United Kingdom is not developing lethal autonomous weapons systems, and the operation of weapons systems by the UK armed forces will always be under human oversight and control. As an indication of our commitment to this, we are focusing development efforts on remotely piloted systems rather than highly automated systems.”
One of the main criticisms of these systems is that, by their nature, they lack human control. Laws are operated by computers, enabling drones and other weapons to select targets and carry out attacks without direct human intervention.
At the convention, complex battlefield maps will be used to demonstrate how difficult it could be for an automated weapon to accurately select and distinguish between targets.
Andrew Smith of Campaign Against Arms Trade told RT: “The development of killer robots is very concerning. A comprehensive, pre-emptive prohibition on these weapons is urgently needed. Weaponry without meaningful human control is the ultimate expression of militarism and risks decisions about life and death being made by machines.”
Many human rights groups have come forward to call for a legal framework and ban on these weapons systems. Last week, Human Rights Watch and Harvard Law School released a report calling for the creation of a new protocol to outlaw fully autonomous weapons.
“No accountability means no deterrence of future crimes, no retribution for victims, no social condemnation of the responsible party,” said Bonnie Docherty, senior Arms Division researcher at Human Rights Watch and the report’s lead author. “The many obstacles to justice for potential victims show why we urgently need to ban fully autonomous weapons.”
Employment of a lethal autonomous weapons system with unknown consequences is likely not to be attractive to politicians - Quintana #CCWUN
— Stop Killer Robots (@BanKillerRobots) April 14, 2015
If an automated weapon attacked a civilian target because of a system error or misidentified target, the weapon's owner could not be held responsible, as no legal framework currently covers such cases.
Elizabeth Quintana from UK lists range of global security threats, incl. cyber attacks, non-state armed groups #CCWUN
— Stop Killer Robots (@BanKillerRobots) April 14, 2015
“A fully autonomous weapon could commit acts that would rise to the level of war crimes if a person carried them out, but victims would see no one punished for these crimes,” said Docherty, who is also a lecturer at the Harvard Law School clinic. “Calling such acts an ‘accident’ or ‘glitch’ would trivialize the deadly harm they could cause.”
Timeline of 20-30 years for development of fully autonomous weapons seems far but they may be here much sooner - Stuart Russell #CCWUN
— Stop Killer Robots (@BanKillerRobots) April 13, 2015
Human Rights Watch is a co-founder of the Campaign to Stop Killer Robots and serves as its coordinator. This international coalition of more than 50 nongovernmental organizations is calling for a pre-emptive ban on the development, production, and use of fully autonomous weapons.