Facial recognition, a controversial tool to identify dead Russian soldiers

“We don’t give up on ours.” This is one of the slogans of Kremlin propaganda, which presents the war in Ukraine as an operation in support of Russian-speaking minorities in Ukraine. Yet since the start of the conflict, Kyiv has claimed that Moscow refuses to recover the remains of its fallen soldiers. “The Russian authorities do not want these bodies,” Ukraine’s Deputy Prime Minister, Iryna Vereshchuk, told the Guardian* in March. It took until June 4 for the two countries to publicly acknowledge a first exchange of bodies, although others appear to have taken place less officially, according to Ukrainian media*.


Countries at war have an obligation to search for, recover and identify the bodies of victims “without delay”, as provided for in the Geneva Conventions*. But Moscow has not updated its official death toll (1,351 dead) since March 25, and it does not provide Ukraine with a list of its missing either. However, these bodies can be difficult to identify: they may have been damaged by fighting or decomposition, or the soldiers may have carried no identity documents. To give them a name, Ukraine has chosen to turn to a controversial technology: facial recognition.

“We are using artificial intelligence to search the social media accounts of dead Russian soldiers based on photos of their bodies,” the Minister of Digital Transformation, Mykhailo Fedorov, explained in March on Telegram (in Ukrainian). “Those who have access to this software can enter a photo of a Russian soldier, and it will return the most similar images with a link to their source, such as a social media profile,” explains Théodore Christakis, a researcher specializing in artificial intelligence at the University of Grenoble Alpes.
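To make the general idea concrete, here is a minimal, purely illustrative sketch of such a “photo in, similar profiles out” search: a face is converted into a numerical vector (an “embedding”) and compared with the vectors of previously collected photos, each linked to its source page. The embedding function, the gallery and the URLs below are placeholders for illustration, not Clearview’s actual system.

```python
# Toy illustration of a face-search workflow: embed a query photo, then rank
# a gallery of previously collected photos by similarity, returning source links.
import hashlib
import numpy as np


def embed_face(image_bytes: bytes, dim: int = 128) -> np.ndarray:
    """Placeholder embedding: a deterministic vector derived from the image bytes.
    A real system would run a trained face-recognition model here."""
    seed = int.from_bytes(hashlib.sha256(image_bytes).digest()[:8], "big")
    vec = np.random.default_rng(seed).standard_normal(dim)
    return vec / np.linalg.norm(vec)


# Hypothetical gallery: embeddings of publicly collected photos, each kept
# together with the page (e.g. a social network profile) it came from.
gallery = [
    {"url": "https://example.social/profile/123", "embedding": embed_face(b"photo-a")},
    {"url": "https://example.social/profile/456", "embedding": embed_face(b"photo-b")},
]


def search(query_image: bytes, top_k: int = 5) -> list[dict]:
    """Return the gallery entries most similar to the query face,
    ranked by cosine similarity (vectors are already L2-normalised)."""
    q = embed_face(query_image)
    scored = [
        {"url": item["url"], "score": float(q @ item["embedding"])}
        for item in gallery
    ]
    return sorted(scored, key=lambda r: r["score"], reverse=True)[:top_k]


print(search(b"photo-of-unknown-soldier"))
```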

The soldiers’ names still have to reach their families in Russia. From the beginning of the conflict, Kyiv set up several platforms to gather the names and images of dead or captured Russians identified by the software: a site called Look For Yours (in Russian), an associated Telegram channel and a telephone line called “Come back alive from Ukraine”, which had already received several hundred calls as of February 27, according to the Ukrainian Parliament*. The Look For Yours site was quickly blocked in Russia, but it can still be reached through a VPN.

The Ukrainian authorities are also directly contacting the families they have been able to trace, Mykhailo Fedorov explained on CNN*. Less formal groups, such as the pro-Ukraine hacker network “IT Army”, are also notifying victims’ families via social media, the Moscow Times* reports.

The Minister of Digital Transformation, whose services did not respond to franceinfo’s requests, explains on CNN that this approach has two objectives: “give families an opportunity to find the bodies”, but also “show them that there is a real war [in Ukraine], fight Russian propaganda, show them that they are not as strong as they are told on TV and that people are really dying here”. By revealing the human cost of the conflict to families, Kyiv undermines the narrative repeated since February 24 by the Kremlin, which still uses the euphemism “special operation” and refuses to communicate the number of victims in its ranks.

Biometric data has already been used in war zones such as Iraq and Afghanistan, but “the development of facial recognition has made it possible to use it in Ukraine more than in any other conflict”, says Christine Dugoin-Clément, a researcher at the CAPE Europe think tank and a specialist in Ukraine. “It has become more efficient and easier to use, and the algorithms have access to far more data on the internet to compare faces than they did a few years ago,” she adds.

Behind this development is, in particular, an American company well known in the sector: Clearview AI. “Clearview massively collects images that are publicly available on the internet and on social networks to feed its database, against which it can then compare the faces submitted to it,” describes Théodore Christakis. In February, the company touted a catalog of 10 billion images, including more than two billion taken from Russia’s favorite social network, VKontakte, according to Reuters*.
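The ingestion side described here can be sketched in the same illustrative spirit: publicly reachable photos are turned into face embeddings and stored together with the URL they came from, so that a later match can point back to a source profile. The URLs, the embedding step and the storage format below are assumptions for the sake of the example, not a description of Clearview’s pipeline.

```python
# Toy illustration of building a face index from scraped public photos.
import hashlib
import json
import numpy as np


def embed_face(image_bytes: bytes, dim: int = 128) -> list[float]:
    """Placeholder embedding (deterministic vector from the image bytes);
    a real pipeline would run face detection + a face-recognition model."""
    seed = int.from_bytes(hashlib.sha256(image_bytes).digest()[:8], "big")
    vec = np.random.default_rng(seed).standard_normal(dim)
    return (vec / np.linalg.norm(vec)).tolist()


# Hypothetical batch of (source URL, image bytes) pairs collected from public pages.
scraped = [
    ("https://example.social/profile/123/photo.jpg", b"image-bytes-1"),
    ("https://example.social/profile/456/photo.jpg", b"image-bytes-2"),
]

# One record per photo, keeping the source URL so a match can be traced back
# to the page it was collected from.
index = [{"source_url": url, "embedding": embed_face(img)} for url, img in scraped]

with open("face_index.json", "w") as f:
    json.dump(index, f)

print(f"Indexed {len(index)} photos")
```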

The company’s chief executive, Hoan Ton-That, says he made its commercial software available to the Ukrainian authorities free of charge, out of altruism. “I remember seeing videos of captured Russian soldiers and Russia claiming they were actors,” he explained to the New York Times*. “I thought if Ukrainians could use Clearview, they could get more information to verify their identities.” Contacted by franceinfo, Hoan Ton-That says that as of June 17, more than 40,000 searches had been carried out by more than 500 Ukrainians trained and authorized to use his application.

A redemptive good deed for a controversial company. Clearview has already been sanctioned by numerous data protection authorities, including in Europe, the United Kingdom, Australia and Canada, on the grounds that the massive collection of images, converted into biometric data without their owners’ consent, violates internet users’ rights. Facial recognition systems are also accused of racist biases: matching errors (“false positives”) are 10 to 100 times more common for Asian and African-American faces than for white faces, according to a 2019 American study*. “The war in Ukraine is also an opportunity for Clearview to rehabilitate its image,” notes Théodore Christakis.

More generally, facial recognition is not yet 100% reliable. “The older the photos in the databases, the more people have changed, so the less easy the match will be,” explains Jean-Luc Dugelay, a professor of digital security at the Eurecom engineering school. Anything that alters the face can affect the reliability of this software, the facial recognition specialist points out: time, but also, in the case of Russian soldiers, death, injury or decomposition.

Speaking to franceinfo, Hoan Ton-That says he is not aware of any case in which the identification returned by the software was wrong. But “there is no independent scientific feedback on the quality of their technology,” points out Jean-Luc Dugelay. “The risk is misidentifying a person and telling the wrong family that their loved one is dead,” Albert Fox Cahn, founder of the Surveillance Technology Oversight Project, an organization that campaigns against surveillance, told the Guardian*.

A mistaken identification could have even more serious consequences, because Ukraine does not use this technology only on corpses. “It also uses it to identify prisoners, at entry points into Ukraine to detect possible Russian spies, or to put a name to Russian soldiers whose crimes have been filmed,” lists Théodore Christakis, which Hoan Ton-That confirms to franceinfo. Mismatches could then become a matter of life and death.

Even when the identification of Russian soldiers is correct, the Ukrainian authorities’ approach draws criticism. “It is also part of a battle to demoralize the enemy and its civilian population,” emphasizes Théodore Christakis. It could also prove counterproductive: to prove to families that one of their relatives is dead, the Ukrainian authorities end up showing images of the corpse, at the risk of fueling their hatred of Ukraine. On CNN*, Mykhailo Fedorov explains that “80% of the responses from the families are: ‘We are going to come to Ukraine ourselves and kill you, you deserve what is happening to you.’”

Some also fear that this use of facial recognition will serve as a precedent for extending the technology, in Ukraine and elsewhere. “War zones are often used as testing grounds, not just for weapons but also for surveillance tools that are later deployed on civilian populations for law enforcement purposes,” activist Evan Greer, who heads the NGO Fight for the Future, explained to the New York Times*. Théodore Christakis argues that “Clearview will not be able to be used as widely in Europe as in Ukraine, because the data protection authorities have set very strong limits”. But Kyiv did not expect to use it either before the war, as Mykhailo Fedorov pointed out in March on Telegram: “We started doing things that we couldn’t imagine a month ago.”

* All links followed by an asterisk lead to content in English.


