Staging
The serious face lit by a perfect light despite the darkness, posed dramatically like Jesus on the cross, Donald Trump is escorted by police. The image is striking… but it is fake. “Faked montages are nothing new,” observes Martin Gibert, a researcher in the ethics of artificial intelligence at the University of Montreal. In the pre-computer era, you had to cut up images and stitch them together, usually with rather crude results. Image-editing software like Photoshop made it possible to refine such works. And now, all you have to do is type a description of the image you want into a generator which, drawing on trillions of images taken from the Internet, produces a fairly realistic montage. These new artificial intelligence tools – MidJourney, Stable Diffusion, DALL-E and others – currently fascinate creators as much as they worry ethicists.
Hands
However sophisticated they may be, these artificially intelligent generators are not yet fully mature. Study their output closely and you find their weak point: hands. These are often deformed or have an abnormal number of fingers, as a closer look at the photo suggests (note, in particular, the hands of the police officer in the background). “It seems difficult for these tools to create correct hands,” says Martin Gibert. But it is only a matter of time before these programs fix such flaws. “In a way, it’s good that this image exists and that it’s badly done, because it lets us see how the image algorithm works.”
Head
The trained eye of Martin Benoît, a photography teacher at the Cégep du Vieux Montréal, settles on another element of the image. “Trump’s head-to-neck connection is questionable, which initially led me to believe it was a really bad photomontage where someone took a real photo of an arrest and just replaced the head, created a tie and adjusted colors and brightness,” he says. The former president’s flawless hairstyle is also suspect. “Images created by artificial intelligence all have more or less the same difficulty in properly generating hair, which is a complex structure to reproduce realistically,” he adds.
Police officer
Some tools create the images; other tools dissect them. Using FotoForensics, which detects manipulations in an image, Martin Benoît noticed different compression rates on certain elements of the picture, in particular on the tie, a sign that different files were combined. “This detection software is not infallible,” he notes, but it does provide clues. Another suspicious element: the low resolution of the circulated image, which, notably, leaves the police officers’ badges illegible. “It’s very convenient to provide a poor-quality image, because it hides the AI’s ‘errors’,” says Martin Benoît.
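The compression check Martin Benoît describes is, in essence, error-level analysis: re-save the JPEG at a known quality and compare it pixel by pixel with the original, since regions pasted in from a different file tend to recompress differently and stand out as brighter areas. Here is a minimal sketch of the idea in Python with the Pillow library (the function name and parameters are illustrative, not FotoForensics’ actual code):

```python
import io
from PIL import Image, ImageChops

def error_level_analysis(image: Image.Image, quality: int = 90) -> Image.Image:
    """Return the per-pixel difference between `image` and a re-saved JPEG copy.

    Brighter regions in the result recompressed differently, which can hint
    that they came from a different source file.
    """
    original = image.convert("RGB")
    # Re-save the image as a JPEG at a fixed quality, in memory.
    buf = io.BytesIO()
    original.save(buf, format="JPEG", quality=quality)
    buf.seek(0)
    resaved = Image.open(buf)
    # The absolute per-pixel difference is the error-level map.
    return ImageChops.difference(original, resaved)

# Usage sketch:
# img = Image.open("suspect.jpg")
# ela_map = error_level_analysis(img)
# ela_map.save("ela_map.png")  # inspect bright regions by eye
```

As the article notes, such tools give clues rather than proof: heavy overall recompression, as in a low-resolution repost, flattens exactly the differences this check relies on.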
Image Source
All observers agree that as the tools improve, it will become harder and harder to distinguish true from false. “It reminds us of the importance of journalists’ work,” says Martin Gibert of UdeM. The source of the image is crucial. Was it published by a credible media outlet? Or by another recognized authority? This photomontage, for example, was posted on the Twitter account @TheInfiniteDude, a member of a group that explores the potential of artificial intelligence. For Martin Gibert, one must ask who benefits from the circulation of a faked image. “Because it is particularly useful for platforms that derive advertising revenue from clicks…”