Weaponized AI in fiction: A killer robot from the hit science-fiction 'Terminator' franchise. Credit: 20th Century Fox.

It’s being called a turning point in military history.

A United Nations report about a March 2020 skirmish in the Libyan military conflict says a drone known as a lethal autonomous weapons system, or LAWS, may have attacked humans for the first time without being instructed to do so, media outlets reported.

The 548-page report, published in March 2021, claimed that the AI drone, a Kargu-2 quadcopter produced by Turkish military tech company STM, attacked retreating soldiers loyal to Libyan General Khalifa Haftar, NPR.org said.

But the report, by the UN Security Council’s Panel of Experts on Libya, does not say explicitly that the LAWS killed anyone.

“If anyone was killed in an autonomous attack, it would likely represent an historic first known case of artificial intelligence-based autonomous weapons being used to kill,” Zachary Kallenborn wrote in the Bulletin of the Atomic Scientists.

Over the course of 2020, the UN-recognized Government of National Accord pushed the Haftar Affiliated Forces (HAF) back from the Libyan capital, Tripoli, and the drone may have been operational since January 2020, the experts noted.

“Logistics convoys and retreating HAF were subsequently hunted down and remotely engaged by the unmanned combat aerial vehicles or the lethal autonomous weapons systems such as the STM Kargu-2,” the UN report noted.

Kargu is a “loitering” drone that uses machine learning-based object classification to select and engage targets, according to STM, and also has swarming capabilities to allow 20 drones to work together, The Independent reported.

“The lethal autonomous weapons systems were programmed to attack targets without requiring data connectivity between the operator and the munition: in effect, a true ‘fire, forget and find’ capability,” the experts wrote in the report.

Many robotics and AI researchers, along with prominent figures such as Elon Musk, Stephen Hawking and Noam Chomsky, have called for a ban on “offensive autonomous weapons,” such as those with the potential to search for and kill specific people based on their programming, The Independent reported.

It may not exactly look like a ‘Terminator’ robot, but it can kill, and do so autonomously, in league with up to 20 other drones. The Kargu-2 quadcopter is armed with an explosive charge. Credit: Kargu.

Experts have cautioned that the datasets used to train these autonomous killer robots to classify and identify objects such as buses, cars and civilians may not be sufficiently complex or robust, and that the artificial intelligence (AI) system may learn the wrong lessons.
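
To make that worry concrete, here is a minimal sketch, using scikit-learn and entirely synthetic two-feature data, of a classifier that performs well under the narrow conditions it was trained on and fails badly once conditions shift. None of this is based on the Kargu-2 or any real targeting system; the features and class labels are hypothetical stand-ins.

```python
# Toy sketch of "learning the wrong lesson" under distribution shift.
# All data is synthetic; the two features are hypothetical stand-ins for
# whatever signals a real vision system might extract.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500

# Training data gathered under narrow conditions: "vehicle" (class 0) and
# "person" (class 1) happen to occupy well-separated regions.
X_train = np.vstack([
    rng.normal(loc=[2.0, 2.0], scale=0.5, size=(n, 2)),  # vehicles
    rng.normal(loc=[0.0, 0.0], scale=0.5, size=(n, 2)),  # people
])
y_train = np.array([0] * n + [1] * n)

clf = LogisticRegression().fit(X_train, y_train)

# Test data gathered under shifted conditions (different lighting, terrain,
# sensor angle): the same classes now sit in different regions.
X_test = np.vstack([
    rng.normal(loc=[0.5, 1.0], scale=0.5, size=(n, 2)),  # vehicles
    rng.normal(loc=[1.5, 1.5], scale=0.5, size=(n, 2)),  # people
])
y_test = np.array([0] * n + [1] * n)

print("accuracy on training-like data: %.2f" % clf.score(X_train, y_train))
print("accuracy after distribution shift: %.2f" % clf.score(X_test, y_test))
```

On the data it was trained on, the model scores near 100 percent; on the shifted data it is wrong most of the time, because the boundary it learned reflected the quirks of the training set rather than anything intrinsic to the two classes.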

They have also warned of the “black box” problem in machine learning, in which the decision-making process of an AI system is often opaque, posing a real risk that fully autonomous military drones strike the wrong targets for reasons that remain difficult to unravel.
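
The opacity itself is easy to demonstrate. In the sketch below, again purely synthetic and with no connection to any real targeting system, a small neural network learns a hidden rule and renders a verdict on a new input, yet the only “explanation” available for that verdict is a pile of learned weights.

```python
# Minimal sketch of the "black box" problem: a small neural network learns a
# hidden rule and issues classifications, but the only "explanation" it can
# offer is its raw learned weights. Purely synthetic; hypothetical throughout.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 10))
y = (X[:, 0] * X[:, 1] > 0).astype(int)  # the hidden rule the net must learn

net = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=2000,
                    random_state=0).fit(X, y)

x_new = rng.normal(size=(1, 10))
print("decision:", net.predict(x_new)[0],
      "(confidence %.2f)" % net.predict_proba(x_new).max())

# The closest thing to a rationale: more than a thousand opaque numbers.
n_params = sum(w.size for w in net.coefs_) + sum(b.size for b in net.intercepts_)
print("learned parameters:", n_params)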

Kallenborn, a national security consultant specializing in unmanned aerial vehicles (UAVs), believes there is a greater risk of something going wrong when several such autonomous drones communicate and coordinate their actions, as in a drone swarm, The Independent reported.

“Communication creates risks of cascading error in which an error by one unit is shared with another,” Kallenborn wrote in The Bulletin.
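
That cascade can be made concrete with a back-of-the-envelope simulation. The figures below are entirely hypothetical, a 20-drone swarm with an assumed 5 percent per-drone error rate, but they illustrate the mechanism: drones that judge independently almost never all err at once, while drones that adopt a single shared judgment fail together whenever that judgment is wrong.

```python
# Back-of-the-envelope Monte Carlo sketch of cascading error in a swarm.
# All numbers are hypothetical: 20 drones (the swarm size reported for the
# Kargu-2) and an assumed 5% chance that any one drone misclassifies a target.
import numpy as np

rng = np.random.default_rng(2)
N_DRONES = 20
P_ERROR = 0.05
TRIALS = 100_000

# Independent mode: every drone classifies the target on its own.
indep = rng.random((TRIALS, N_DRONES)) < P_ERROR

# Cascading mode: one lead drone classifies and broadcasts the result,
# and the rest of the swarm simply adopts it.
cascade = rng.random(TRIALS) < P_ERROR

print("P(whole swarm errs), independent judgments: %.6f" % indep.all(axis=1).mean())
print("P(whole swarm errs), shared judgment:       %.3f" % cascade.mean())
```

Under independent judgments a simultaneous swarm-wide error is vanishingly rare (0.05 to the 20th power, effectively never in 100,000 trials); once the judgment is shared, the whole swarm goes wrong about one time in twenty.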

Drone warfare itself is not new.

Military forces and rebel groups have used remote-controlled aircraft to carry out reconnaissance, target infrastructure and attack people. The US in particular has used drones extensively to kill militants and destroy physical targets.

Azerbaijan used armed drones to gain a major advantage over Armenia in recent fighting for control of the Nagorno-Karabakh region.

Just last month, the Israel Defense Forces reportedly used drones to drop tear gas on protesters in the occupied West Bank, while Hamas launched loitering munitions — so-called kamikaze drones — into Israel.

What’s new about the incident in Libya, if confirmed, is that the drone involved had the capacity to operate autonomously, with no human controlling it: essentially a “killer robot,” formerly the stuff of science fiction and movies like ‘Terminator.’

A global survey commissioned by the Campaign to Stop Killer Robots last year found that a majority of respondents — 62% — said they opposed the use of lethal autonomous weapons systems.

Sources: The Independent, NPR.org, Bulletin of the Atomic Scientists