Human Rights Watch is highlighting the dangers of killer robots and calling for a ban on fully-autonomous weapons ahead of the fifth UN weapons review conference in Geneva next week.
The organization has published a report, Making the Case: The Dangers of Killer Robots and the Need for a Preemptive Ban, ahead of the Review Conference of the Convention on Conventional Weapons (CCW), which begins December 12.
The CCW is made up of 121 countries, which will decide by consensus on December 16 whether deliberations on lethal autonomous weapons systems (LAWS) will continue in 2017 and, if so, how these talks will progress.
Fully-autonomous weapons can independently seek, identify, and attack targets. HRW has voiced concern that some member countries, including the United States, the United Kingdom, China, Israel, Russia, and South Korea, are developing weapons systems with growing levels of autonomy.
The Pentagon allocated US$18 billion of its latest budget to develop systems and technologies that could form the basis of fully-autonomous weapons, according to The New York Times.
US Deputy Secretary of Defense Robert Work said this was in response to China and Russia’s development of killer robots. Work said last year that it was important for the US to "dominate" machine learning and artificial intelligence to counteract the emerging threats posed by those countries.
Bonnie Docherty, senior arms researcher at Human Rights Watch, will present the report at the UN offices in Geneva on December 14. It comes on the back of the Campaign to Stop Killer Robots, launched by HRW and other NGOs more than three years ago.
The campaign has resulted in three informal meetings on lethal autonomous weapons by CCW states since 2014.
“Governments should ensure that humans retain control over whom to target with their weapons and when to fire,” Docherty said.
“The success of past disarmament treaties shows that an absolute prohibition on fully-autonomous weapons would be achievable and effective,” Docherty added.
The report, co-published with Harvard Law School’s International Human Rights Clinic, examined “the legal, moral, security, and other dangers of killer robots” and concluded that a ban is the only option for addressing all of the concerns.
It recommended the adoption of an international treaty prohibiting the development, production and use of fully-autonomous weapons, noting this would help clarify existing international humanitarian law in relation to the matter.
It also advised that formal discussions begin and that national policies prohibiting such weapons be established.
The report responded to 16 key arguments made against banning lethal autonomous weapons, including the assertion that regulation would be more effective than an all-out ban.
“A binding, absolute ban on fully-autonomous weapons would reduce the chance of misuse of the weapons, would be easier to enforce, and would enhance the stigma associated with violations,” the report contended.
Researchers also argued that fully-autonomous weapons would put civilians and soldiers at risk because they would lack emotions such as compassion and a resistance to killing.
A University at Buffalo research team published a paper last month suggesting that a ban on "killer robots" may be only a temporary solution and that the focus should shift from technological developments to the cultural and political motivations behind them.