The UN Panel of Experts on Libya has just published a report1,2 denouncing the total ineffectiveness of the arms embargo imposed on Libya in 2011 by the Security Council. A previous report by UN experts had notably denounced Turkey's violations of the embargo and the behavior of one of its warships escorting a cargo ship suspected of violating the embargo. This Turkish warship behaved aggressively towards a French frigate attempting to inspect the suspected cargo ship. NATO had concealed the results of its investigation3 into this incident because of the geostrategic importance of Turkey, which benefits from this situation in general, and in the Eastern Mediterranean in particular, in defiance of international maritime law.

But in the 500 pages of the Panel of Experts' latest report, Laurent Lagneau highlighted a much more worrying element4: the Government of National Accord used Turkish Kargu-2 drones in autonomous mode against Haftar Affiliated Forces5,6. In other words, autonomous weapons using artificial intelligence have been used to kill in a mode that excludes human intervention.

In its 2019 report on the emergence of autonomous weapons7, the Dutch NGO PAX had already denounced this drone, produced by the Turkish state-owned company STM, because of its use of artificial intelligence (STM describes the use of machine learning algorithms). Turkey thus becomes the first country to have used drones capable of finding, tracking and killing people without any human intervention.

The classic dynamics of emerging risk apply here: a technological innovation carried out by some actors creates knowledge deficits among the other actors, in particular among those in charge of producing rules and laws, which delays action to fill the regulatory gaps created by the innovation. Moreover, the poor circulation of information about this innovation, notably in the major media, creates an information deficit for the public, which in practice does not encourage the political decisions that would, in this case, make it possible to reach an agreement banning autonomous weapons. Last, the ethical question is essential: who can reasonably consider it ethical to build and disseminate killer robots capable of autonomously killing humans? In 2019, António Guterres declared such killer robots "politically unacceptable" and "morally repugnant"8.

In 2012, a coalition of NGOs formed to call for a ban on autonomous weapons and launched the "Stop Killer Robots" campaign. At the end of 2012, Human Rights Watch published the report "Losing Humanity: The Case against Killer Robots". Then, in 2015, Stuart Russell, Director of the Center for Intelligent Systems at Berkeley, launched an open letter calling for a ban on autonomous weapon systems. This letter was signed by 30,000 people, including 4,500 researchers in robotics or artificial intelligence.

At the UN, the Convention on Certain Conventional Weapons (CCW) has devoted several meetings to the threat posed by the development of killer robots, and several states have called for negotiating a ban on weapons systems operating without human control, but states such as the United States and Russia obstruct it. The French stance appears ambiguous, even though Emmanuel Macron has said he is "dead against" killer robots. The French experts who attended the August 2019 CCW meeting appear to adopt a restrictive definition of lethal autonomous weapons systems, which could create controversy if it excluded the Turkish autonomous drones used in Libya from its scope and encouraged some actors to allege that lethal autonomous weapons systems "do not exist"9.

There is a major perception gap here, since this French assertion is frontally contradicted by the report of the UN Panel of Experts on Libya, which states: "retreating HAF were subsequently hunted down and remotely engaged by the unmanned combat aerial vehicles or the lethal autonomous weapons systems such as the STM Kargu-2".

Human Rights Watch also reports that the Geneva CCW meeting scheduled for August 2020 was canceled due to the pandemic, with Reaching Critical Will pointing out that, for other matters, UN meetings could have been held online.

On a global scale, with regard to the risk of dissemination of lethal autonomous weapons systems, the situation is marked by significant conflict: several states oppose a ban, probably for fear of losing an arms race that has started precisely because no such ban exists. Eric Schmidt, the former Google CEO, provides exactly this rationale in his latest report to President Biden, urging him to reject any ban so as to be ready to face China or Russia from 2025. In addition, the economic interests of military-industrial complexes, especially in the United States, are weighty enough to affect political decisions. Faced with these stances, civil-society mobilisations suffer from a relative lack of media coverage, given the importance of the stakes, which suggests the need for reflection on the responsibility, priorities and ethics of the media industry.

3 LAGNEAU, Laurent. Libye : Un rapport de l’ONU accable la marine turque pour l’incident avec la frégate française Courbet. Zone Militaire. 17 March 2021.
4 LAGNEAU, Laurent. Un rapport de l’ONU confirme l’utilisation de systèmes d’armes létaux autonomes turcs en Libye. Zone Militaire. 17 March 2021.
5 French version of the same passage: Final report of the Panel of Experts on Libya established pursuant to Security Council resolution 1973 (2011); the official English text is quoted in note 6.
6 "Logistics convoys and retreating HAF were subsequently hunted down and remotely engaged by the unmanned combat aerial vehicles or the lethal autonomous weapons systems such as the STM Kargu-2 (...) and other loitering munitions. The lethal autonomous weapons systems were programmed to attack targets without requiring data connectivity between the operator and the munition: in effect, a true “fire, forget and find” capability."
Final report of the Panel of Experts on Libya established pursuant to Security Council resolution 1973 (2011)
7 SLIJPER, Frank. Slippery Slope, The arms industry and increasingly autonomous weapons. Peace organisation PAX. 11 November 2019.
8 "Autonomous machines with the power and discretion to select targets and take lives without human involvement are politically unacceptable, morally repugnant and should be prohibited by international law."
António Guterres. 25 March 2019.
9 "It must be specified, however, that such systems do not exist to date and would, in any event, be of only limited operational interest given the impossibility for human command to control them." Convention on Certain Conventional Weapons (CCW). Meeting of the Group of Governmental Experts. Geneva, 20-21 August 2019. Statement by France on characterisation.