3 Jun, 2021 13:01

Rise of the killer robot: An autonomous drone hunted down a human target in Libya last year & no one is talking about it
The prospect of AI-led warfare is inching ever closer if events in Libya last year are anything to go by. Despite calls for a ban, killer robots present a terrifying, apocalyptic vision of lawless conflict in the future.

When I was studying international human rights law at university almost a decade ago, I recall our lecturer saying that experts expected militaries to fully develop killer robot technology within the next 20 years. Those predictions seem to have been fairly accurate so far, as recent developments suggest we are at least halfway down that path, if not further.

According to a 548-page United Nations report seen by the New Scientist, a “lethal” weaponized drone may, in fact, have “hunted down a human target” without being told to, i.e. completely autonomously. Allegedly, a Kargu-2 quadcopter took it upon itself to seek out a human target during a skirmish between Libyan government forces and those loyal to rival Khalifa Haftar in Libya in March last year.

The Kargu, known as a ‘loitering drone,’ uses machine learning-based object recognition to identify and engage targets. It also has swarming capabilities, which allow up to 20 drones to work together.

Recent reports also note that the Kargu, directed to detonate on impact, was operating in a mode that required no human controller when it targeted one of Haftar’s soldiers as he tried to retreat. While it is technically legal under the rules of armed conflict to attack a retreating soldier, at least two immediate questions come to mind.

Firstly, can a killer drone differentiate between a surrendering soldier and a combatant who is simply retreating? And, secondly, why on earth would we trust a drone to make that distinction autonomously in the first place? Could you even surrender to a drone if you wanted to?


People seem to forget that war has rules. It can’t just be a matter of shrugging our shoulders and saying, ‘well, we were going to kill these poor bastards anyway, so why not keep our own troops and pilots safe in the process?’ If we are to take the law of armed conflict seriously, we need to ask ourselves whether we truly believe fully autonomous weapons are even capable of complying with international humanitarian law. Hell, even Israel (dubbed by some the world’s most “moral army”) is accused of breaching these international rules on a routine basis. If the world’s “most moral army” is in the business of destroying disability centres, refugee camps and international media headquarters, I shudder to think what a self-governing robot could be capable of.

If you don’t believe me, just take a look at this recent excerpt from the Bulletin of the Atomic Scientists, which brilliantly outlines the flaws in placing our reliance on this technology: “OpenAI, a world-leading AI company, developed a system that can classify an apple as a Granny Smith with 85.6 percent confidence. Yet, tape a piece of paper that says ‘iPod’ on the apple, and the machine vision system concludes with 99.7 percent confidence the apple is an iPod. In one case, AI researchers changed a single pixel on an image, causing a machine vision system to classify a stealth bomber as a dog. In war, an opponent could just paint ‘school bus’ on a tank or, more maliciously, ‘tank’ on a school bus and potentially fool an autonomous weapon.”
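For readers who want to see the mechanics, here is a minimal sketch of the single-pixel idea using a toy linear classifier built with NumPy. Everything in it, the classifier, its weights and the ‘image’, is a hypothetical stand-in invented for illustration, not the actual vision systems described above; real attacks on deep networks use more elaborate optimization, but the principle, that one tiny, targeted change can flip a confident decision, is the same.

```python
import numpy as np

# Toy demonstration of a "single pixel" attack. The classifier, weights
# and image below are made-up stand-ins for illustration only; real
# attacks on deep networks use gradient-based optimization.

rng = np.random.default_rng(0)

n_pixels = 64                        # a tiny 8x8 "image", flattened
w = rng.normal(size=n_pixels)        # weights of a toy linear classifier
b = 0.0

def predict(x):
    """Classify: 1 ("stealth bomber") if score > 0, else 0 ("dog")."""
    return int(x @ w + b > 0)

# Build an image the classifier labels class 1 with a modest margin.
x = rng.normal(size=n_pixels) * 0.1
x += w * (0.5 - x @ w) / (w @ w)     # shift so the score is exactly 0.5
assert predict(x) == 1

# The attack: alter only the single most influential pixel, just enough
# to push the score across the decision boundary.
i = int(np.argmax(np.abs(w)))        # pixel with the largest weight
x_adv = x.copy()
x_adv[i] -= np.sign(w[i]) * (0.5 / abs(w[i]) + 1e-6)

print(predict(x), predict(x_adv))    # prints "1 0": one pixel flipped it
```

The change needed here is small precisely because the classifier leans heavily on a handful of inputs; deep networks show the same brittleness at far higher dimensionality, which is what the Bulletin’s examples exploit.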

As far as we know, the incident in Libya is the first time a drone has attacked a human without receiving instructions to do so. Unsurprisingly, rights groups such as Human Rights Watch have called for an end to “killer robot” technology, campaigning for a ban on the development, production and use of these weapons. The UN has also debated a ban on this technology in the past, with Secretary-General Antonio Guterres calling the machines “morally repugnant.”

Even Elon Musk has reportedly called for a ban on the future development of this technology. That’s right: bitcoin-tampering, union-despising, flamethrowing maniac Elon Musk actually has a sound opinion on what is a very serious and terrifying issue. (But, given his past, don’t be surprised if this opposition lapses once he obtains the sole rights to develop the technology or something.)

Unfortunately, it is unlikely the US or any other party developing this technology will heed these calls and warning signs anytime soon. Within the last decade, the US military has on numerous occasions tested a brain implant that allows a human operator to control up to three drones simultaneously by thought alone. According to the Marine Corps Times, the US military has been looking to advance a “swarm of suicide drones,” which would give a single operator control over 15 suicide drones with “minimal operator burden.” The UK military has invested significantly in this area as well.


According to the Guardian, more than 80,000 surveillance drones and almost 2,000 attack drones are expected to be purchased around the world within the next decade. As it turns out, attack drones are not cheap, so Americans will have to keep sitting on waiting lists for healthcare and affordable housing: the US appears to be the leading purchaser of military drone technology, and that is not expected to change anytime soon.

As I’ve queried in the past, what exactly are we to do when these systems are hacked or manipulated by cyber criminals, foreign rivals, terrorist entities and the like? Terror groups like Islamic State have been weaponizing drones for quite some time. Five years ago, Paul Scharre, senior fellow and director of the 20YY Future of Warfare Initiative at the Center for a New American Security, released a report warning of a very severe “novel risk of mass fratricide, with large numbers of weapons turning on friendly forces.” Yet no one is listening.

At the end of the day, the US is hardly concerned with the fate of terror groups such as Islamic State. The real reason drone technology is being advanced in this frightening direction is to confront one enemy and one enemy only: China.

Or perhaps it’s just a coincidence that, only two months ago, the US ran an Agile Reaper exercise with the Marines, providing close air support as US Marines made amphibious landings simulating “island hopping” (the World War II Pacific strategy of invading one island after another), a strategy which, as Forbes noted, “would only seem relevant for a conflict with China.”


The statements, views and opinions expressed in this column are solely those of the author and do not necessarily represent those of RT.
