Tech

UN reports a killer drone hunted down humans all by itself

The STM Kargu-2 drone was deployed by Turkey to attack retreating soldiers in Libya.

Police patrol commandos use a drone to search an area in Zhoushan, Zhejiang province, China, April 22, 2021. Costfoto/Barcroft Media via Getty Images

The United Nations says that Turkey deployed a fully autonomous drone to hunt down and attack Libyan soldiers. In a report on the March 2020 incident, the UN referred to the drone, the STM Kargu-2, as a “lethal autonomous weapon.”

The report goes on to explain that the Kargu-2 was programmed to attack logistics convoys and retreating forces “without requiring data connectivity between the operator and the munition.” In other words, the drone relied not on human control but on its own image processing and machine learning capabilities to identify and pursue targets.

It’s unclear whether the drone actually caused any casualties.

Killer robots — Of course, someone had to program the drone to specify its targets. But that’s not reassuring, because the computer vision systems these drones depend on are far from reliable. From the air, it’s hard to believe the drone can consistently distinguish between civilians and security forces. We’ve already seen wrongful arrests stemming from AI-powered mistaken identity, which is cause enough for concern; here, a mistake is potentially lethal.

The UN and other organizations have protested the proliferation of fully autonomous weapons, or “killer robots,” on the grounds that they would violate the principle of distinction under international humanitarian law, which requires parties to an armed conflict to distinguish at all times between the civilian population and combatants.

STM describes the Kargu-2 as a “rotary wing attack drone that has been designed for asymmetric warfare or anti-terrorist operations.” STM

Drone proliferation — Killer robots could also run afoul of the principle of proportionality, attacking targets indiscriminately precisely because they’re autonomous and easy to deploy at a distance. Drones in particular have become a major concern: mass-market adoption has made them incredibly cheap, and their autonomous flight systems are now good enough to navigate largely on their own.

What’s more, it’s not just national security forces deploying drones. Gangs and terrorist groups have repeatedly staged attacks by strapping explosives to off-the-shelf drones. It’s a dystopian world indeed if anyone can deploy an autonomous killer machine against their enemies.

“Killer robot proliferation has begun,” tweeted researcher Max Tegmark. “It's not in humanity's best interest that cheap #slaughterbots are mass-produced and widely available to anyone with an axe to grind. It's high time for world leaders to step up and take a stand.” The UN and Human Rights Watch have unsuccessfully tried to pass international bans on autonomous attack drones.

Libya at the time of the 2020 attack was in the midst of a civil war; experts say Turkey intervened for various reasons, including a bid to expand its influence in the Middle East. Whatever the motive, this shouldn’t be allowed to set a dangerous precedent.