The oversight board of Meta (Facebook, Instagram) announced on Tuesday that it was taking up two cases concerning fake pornographic images of female public figures, two months after the "deepfakes" scandal involving Taylor Swift.
The two cases were selected "to assess whether Meta's policies and their enforcement are effective in addressing (the problem of) sexual images generated by artificial intelligence (AI)", the board, nicknamed the "supreme court", said in a press release.
Set up by the social media giant and made up of independent members, the board is responsible for ruling on thorny content moderation issues.
The rise of generative AI, which automates the production of sophisticated content, has given new impetus to the phenomenon of "deepfakes", in particular manipulated and sexualized images depicting women, used for intimidation or harassment.
The first case chosen by the board involves an AI-generated image of a nude woman posted on Instagram, "resembling an Indian public figure," the statement said.
A user had complained that the Californian company did not remove the image.
"Meta determined that its decision to leave the content up was in error and removed the post for violating its rules on bullying and harassment," the board noted.
The second case concerns an image posted on a Facebook group dedicated to AI content creation, showing "a naked woman with a man groping her breasts". The woman "resembles an American public figure," who is also named in the caption.
Meta had removed the image and added it to a content bank, part of its enforcement system that automatically finds and removes images already flagged by employees as violating its rules.
Last January, a fake pornographic image of American superstar Taylor Swift was viewed 47 million times on X (formerly Twitter) before being deleted by the social network, around fifteen hours after being posted online.
The affair sparked outrage among her fans, many public figures and even the White House.
According to a 2019 study by Dutch AI company Sensity, 96% of fake videos online are non-consensual pornography and most of them depict women, famous or not.