Ukraine has become a laboratory for the new “smart” weapons that worry the UN

Artificial intelligence is starting to proliferate in the defense sector, and no army can afford to do without it anymore. While Ukraine has become an experimental battlefield for AI-enhanced drones, the UN is trying to regulate their use.


A Ukrainian soldier from the Main Intelligence Directorate, April 11, 2024. (GENYA SAVILOV / AFP)

Where does the development of autonomous weapons stand today? Should we be worried about the proliferation of artificial intelligence? Deepfakes, disinformation, electoral interference, exam fraud: the risks are very real. But if there is one area where AI arouses fear, it is weaponry. Almost all major armies already use, or at least invest in, artificial intelligence, and the war in Ukraine has become an open-air laboratory for AI, not without worrying the UN.

In Ukraine, drones boosted by AI

The Ukrainian government says so itself: “We are today a kind of training ground for the use of artificial intelligence,” claims the Minister of Digital Transformation, Mykhaïlo Fedorov. Ukraine, at war since the Russian invasion, makes extremely active use of AI in its operations, generally via drones “boosted” with artificial intelligence.

These drones make it possible to monitor the enemy more effectively. The Ukrainians cross-reference the data collected using very modern algorithms, the better to exploit information on Russian positions. Thanks to AI, the drones can also identify targets despite camouflage, and even generate an artificial image to reconstruct what was invisible.

This AI can also be used to act: a process that guides the drone to its target on its own, can predict the trajectory of the targeted element, and may even handle the final strike unaided.

Making up for the shortage of missiles?

Ukrainian troops are not yet fully equipped, and not everywhere, but these improvements are arriving gradually. Minister Mykhaïlo Fedorov says that around twenty Ukrainian companies are developing AI specifically for these drones. British drones have also already reached certain units, though this requires training time for the soldiers themselves.

While several experts on the issue believe these new tools are not enough to compensate for the shortage of missiles, they could have a real impact and bring a “new dynamic” to the battlefield. The head of the NATO military committee even claims that “Ukraine’s use of drones combined with artificial intelligence could be more effective than Russian artillery fire.” Russia, which is ramping up its own drone production, unsurprisingly remains very secretive about how it is incorporating artificial intelligence. In Ukraine, in any case, the authorities are speaking openly of a massive deployment of these AI-connected drones in the coming weeks.

UN concerns about autonomous weapons

In the fall of 2023, UN Secretary-General Antonio Guterres, together with the International Committee of the Red Cross, published an appeal asking states one thing: to prohibit, or at least restrict, the use of AI-led autonomous weapons. A machine deciding on its own, through its algorithm, to attack human beings is a moral red line that must not be crossed, said the UN chief. This would require, for example, a ban treaty, as with anti-personnel mines. The sooner the better: Antonio Guterres would like the text to be ready by 2026.

In December 2023, the United Nations General Assembly took a step in this direction, broadly adopting a resolution that recognizes the risks autonomous weapons systems pose to global security. There were 152 “yes” votes, including that of the United States, which had been dragging its feet until then. Among the four “no” votes were India and Russia; the abstainers included China and Israel, to name but a few.

The Israel case

Israel already uses artificial intelligence extensively in the war in Gaza. It is anything but science fiction. The Israeli army itself acknowledges using an AI that can rapidly produce a large number of targets for the army. Code name: Habsora, “gospel” in Hebrew. But there are many others. With Lavender, for example, the system estimates how likely a Palestinian is to belong to Hamas or Islamic Jihad by analyzing their behavior on social networks, their contacts, and their changes of address. The margin of error is 10%, according to the Israeli media outlets that revealed its existence. Given the human toll of the war, this obviously raises many questions.

But let us be clear: 100% autonomous weapons, such as drones that decide to eliminate a target without any human ever intervening, do not exist. At least, not officially. Technically, it is entirely feasible; the technology is there. But no country has any interest in being seen as the first to break the status quo.
