Autonomous weapons are the third revolution in warfare. After the introduction of firearms in the fourteenth century and of nuclear bombs in 1945, they promise to change once again the way conflicts are fought. They are armed robots that can operate independently, with no human involved in their decisions, because they are controlled by artificial intelligence systems. For now, these are mainly kamikaze drones, also known as loitering munitions. Since they carry no crew, they produce no casualties among the soldiers who deploy them; they search for specific targets and destroy strategic objectives. They solve a technical problem: they can strike an enemy hidden from observation, they can slip through openings in defenses, and they can remain on standby, circling an area of interest, waiting for the right moment.
They have already been used in the confrontation between Russia and Ukraine. The Ukrainian armed forces announced on their Facebook page at the end of February that they had deployed them. These are Turkish-made Bayraktar TB2 drones, which can take off and fly on their own, although they still need an operator to decide where to drop the bombs they carry. The TB2 is 6.5 meters long, with a wingspan of 12 meters; it can fly at an altitude of 5,000 meters at a speed of over 200 kilometers per hour, can carry a load of 150 kilos, and has an ISR payload that acquires real-time imagery for reconnaissance and surveillance. Turkey has already used it in Syria and against the Kurds, and the Azerbaijani army used it against the Armenians in the Nagorno-Karabakh war. Ukraine bought it in 2019 for $69 million, and it already saw service in the Donbass against Russian separatists last October. Kiev has also proposed to Turkey that it be co-produced in a local factory.
Russia is also in the game. It has even declared artificial intelligence a priority. It is one of the countries that spends the most on defense, just behind the US and China, and its Ministry of Defense has created a special department to develop this type of vehicle. The KUB, its drone already in operation, was developed by Kalashnikov and the Zala Aero Group. It can be launched from a platform; once in the air, it reaches the designated area and then dives onto its target in a vertical trajectory, crashing along with the warhead it is meant to detonate. It is ideal for taking out tanks. Thanks to its AI recognition systems, a single photo is enough to teach it to identify its target. Proof that it has been put to work arrived in March, when one also turned up in Kiev.
However, we are in the midst of a global confrontation, so China could decide to offer Russia new devices, with the added advantage of obtaining direct data on how they perform on the ground. On the other side, Biden has promised to send Ukraine 100 Switchblades, small drones already used in Afghanistan.
Russia could also decide to make the Uran-9 UCGV operational, an armed tracked vehicle that carries two robots, one dedicated to reconnaissance and the other to fire support, both already seen in Syria. They are equipped with cannons and anti-tank missiles.
Thanks to autonomous weapons, one might hope that future wars will produce no more victims, only destruction. They can, however, be very ruthless. Drones can recognize a single person by their face, or by the cell phone in their pocket. And they can kill selectively even on a mass scale, for example on the basis of ethnicity. Unlike nuclear bombs, the principle of deterrence, by which no one uses a weapon for fear of retaliation, does not apply here. It is therefore to be expected that they will become increasingly important and widespread, because they guarantee precision and less involvement of human soldiers. Yet because they make their own choices, they can still make mistakes, especially on smoke-filled, debris-strewn terrain; in that case it would be difficult to assign liability, which could lie with the software rather than with its owners. The result would be a process of dehumanization that could lead to even greater damage.
Civil society groups and researchers have raised the alarm several times. In 2016 the Future of Life Institute launched an appeal to head off the risks of this technology, signed among others by Elon Musk. The United Nations, too, within the Convention on Certain Conventional Weapons, discussed a ban, or at least restrictions, but the talks at the end of 2021 failed to reach any agreement. Many countries refused to accept any limits, and the Russian and Ukrainian delegates also clashed on the issue, eventually asking for the debate to be cancelled.