Fake pornographic images of Taylor Swift spark outrage

(Washington) American politicians and fans of Taylor Swift expressed their indignation on Friday after fake pornographic images of the singer, created with generative AI, were widely shared in recent days on X and other platforms.


One of these images was viewed more than 47 million times on the social network. According to American media, it remained on X for more than 17 hours before being deleted.

Fake pornographic images (“deepfakes”) of famous women, which also target many private individuals, are nothing new.

But the spread of generative artificial intelligence (AI) tools risks producing an uncontrollable flood of degrading content, according to many activists and regulators.

The fact that the images this time target Taylor Swift, the second most-streamed artist in the world on Spotify, could nevertheless help draw the authorities' attention to the problem, given the outrage of her millions of fans.

“The only ‘good thing’ about this happening to Taylor Swift is that she is influential enough for a law to be passed to eliminate this. You are sick,” Danisha Carter, an influencer with an audience of several hundred thousand people across social networks, posted on X.

X is known for having less strict rules on nudity than Instagram or Facebook.

Apple and Google have the power to police content circulating in apps through the rules they impose on their mobile operating systems, but they have so far tolerated the situation on X.

In a statement, X said it has “a zero tolerance policy” on the non-consensual publication of nude images.

The platform said it was working to “remove all identified images” of the singer and to “take appropriate measures against the accounts that posted them.”

Representatives of the American singer have not yet commented.

“What happened to Taylor Swift is not new; women have been the target of fake images without their consent for years,” noted Democratic congresswoman Yvette Clarke, who has backed legislation to fight the phenomenon. “With advances in AI, creating these images is easier and cheaper.”

A 2019 study estimated that 96% of deepfake videos were pornographic in nature. According to the magazine Wired, 113,000 such videos were uploaded to major porn sites during the first nine months of 2023.

