From Libya to Ukraine: the taboo on the use of autonomous weapons has fallen

On March 27, 2020, according to what was documented in a UN report, an autonomous weapon was used for the first time: a swarm of drones launched in Libya by forces loyal to the then Prime Minister al-Sarraj against those of General Haftar.

Unlike similar weapons already seen at work in the past, the STM Kargu-2 drones supplied by Turkey to the Tripoli government were completely autonomous: equipped with artificial intelligence software, they could attack targets “without communication between the operator and the munition”, as the UN wrote.

Those who hoped it would remain an isolated case were disappointed: following Russia's invasion of Ukraine, autonomous weapons began to appear in that theater of war as well, including (as reported by the NGO Stop Killer Robots) the KUB-BLA drone used by Russian forces; for their part, Ukrainian forces have used Turkish Bayraktar TB2 drones, which have some autonomous capabilities.

Although 40 nations have called for a ban on autonomous weapons and others have asked for regulation (and although the UN Secretary General has described them as “morally repugnant”), the fear is that Pandora’s box has been opened and that there is no turning back. Is that so? “The taboo has definitely fallen,” explained Mariarosaria Taddeo, lecturer in digital ethics at the Oxford Internet Institute (who will address these issues at the Trieste Next festival on 22-24 September). And again: “Until recently it was insisted that these weapons were still at an experimental level, and instead we have seen how heavily they have been used in recent times. The risk is also that Ukraine, while we hope the war will end as soon as possible, will become a territory for the experimentation of autonomous weapons. For instance, China could provide Russia with its technologies in order to test them, and in the meantime collect the large amounts of data relating to a battle context that are necessary to train these tools.”


Compared to the weapons we saw at work first in Libya and today in Ukraine, however, a doubt remains: are they completely autonomous or semi-autonomous weapons, in which an operator supervises them remotely? “These are weapons that can often be used in both ways,” Taddeo told us. “For an external observer, it is impossible to state whether a model has been used autonomously or semi-autonomously. It is an important issue, because this nuance complicates the application of a ban on autonomous weapons, should we ever get there.”

Yet it is precisely this difference that has been put at the center of many debates on the subject: even the United States, which opposes a ban, has a directive that explicitly requires lethal autonomous weapons to “be designed to allow commanders and operators to exercise appropriate levels of human judgment.” It is the policy known as Human in the Loop, which requires that a human being always be involved in the decision-making process of these tools. Isn’t that a crucial difference? “It is mostly a distraction,” the Oxford Internet Institute professor replied. “That is not the solution. The first reason is moral: is it legitimate to delegate to machines a role as important as the choice to take a human life? Furthermore, when we launch an autonomous weapon we cannot guarantee 100% that it will act only under certain conditions and cause only certain consequences. With AI this is not possible, because we are talking about systems that reason in a probabilistic way and that develop new behaviors based on the environment in which they act. Moreover, we cannot even guarantee the principle of distinction, which requires us always to distinguish between belligerents and the civilian population, and which international law obliges us to respect. The Human in the Loop principle becomes a bit of a scapegoat, also given the speed at which these weapons operate and the highly refined skills that would be required of each individual operator.”


Put like this, it seems that the only solution is a complete ban on any instrument of war equipped with artificial intelligence: “I believe that, at the moment, weapons capable of attacking humans cannot be used while maintaining morality in war,” Taddeo reflected. “If technology were then to change and become 100% predictable, or if we were able to distribute moral responsibilities, and a whole series of other very complex conditions were met, then I could also change my mind. I believe instead that a different argument can be made for non-lethal weapons, used only to destroy objects, which could have a role in war contexts. For the moment, however, we are too distracted by other topics and we are not working on good regulation.”

In this regard, at what point is the work on a regulation of autonomous weapons? “It is a very difficult path, also because of the polarized debate. In any case, a general framework of 11 principles has been defined to guide states in the adoption of autonomous weapons. This is not a road toward a ban, but toward regulation. However, given the use that has been made of these weapons in Ukraine, if we could at least adopt those principles it would already be a step forward. But it is very difficult.”
