One of these images has been viewed more than 47 million times on the social network. It remained online for more than 17 hours before being deleted.
General indignation. Fake pornographic images of singer Taylor Swift, created using generative artificial intelligence (AI), were widely shared on the social network X (formerly Twitter). One of them was viewed more than 47 million times and, according to American media, remained on X for more than 17 hours before being deleted. Representatives of the American singer, who has already been the target of conspiracy theories, have not yet commented.
In response, X stated in a press release that it applies "a zero tolerance policy" toward the non-consensual publication of nude images. The platform pledged to "actively remove all identified images" of the singer and to "take appropriate action against the accounts that posted them."
Women and girls, main targets of harassment
The White House has also expressed concern on the subject. "We are alarmed by reports of the circulation of these false images," Karine Jean-Pierre, the spokesperson for the American executive, told the press, recalling the important role social networks play in content moderation. "Unfortunately, too often we know that lack of enforcement has a disproportionate impact on women and girls, who are the main targets of online harassment," she added.
The fact that such images this time target Taylor Swift, the world's second most-streamed artist on Spotify, could nonetheless help raise awareness of the problem among the authorities, given the indignation of her millions of fans. "The only 'good thing' about this happening to Taylor Swift is that she's big enough for a law to be passed to eliminate this. You guys are sick," Danisha Carter, an influencer with an audience of several hundred thousand people on social networks, posted Thursday on X.