Fake porn images of Taylor Swift created by artificial intelligence cause outrage in US

Taylor Swift, nominated for “Taylor Swift: The Eras Tour,” arrives at the 2024 Golden Globes. Photo: Jordan Strauss/Invision/AP

Fake pornographic images of Taylor Swift generated by artificial intelligence went viral on social media this Friday (26), causing outrage among the singer's fans and American politicians.

One of the images was viewed 47 million times on X, formerly Twitter, before being removed on Thursday (25). According to the American press, the post remained visible on the platform for about 17 hours.

These “deepfakes”, fake but extremely realistic pornographic images of celebrities, are nothing new. But activists and authorities fear that user-friendly tools that leverage generative artificial intelligence (AI) will unleash an uncontrollable avalanche of toxic or harmful content.

The attack on Swift, the second most-listened-to artist in the world on Spotify, could fuel debate about the phenomenon.

“The only silver lining of this happening to Taylor Swift is that she probably has enough power to get legislation passed to stop it. You are sick,” wrote influencer Danisha Carter on X.

The social network is one of the largest platforms for pornographic content in the world, according to some analysts, because its nudity policies are more flexible than those of Meta's networks, Facebook and Instagram.

In a statement, X clarified that “posting non-consensual nude (NCN) images” on its platform is strictly prohibited. “We have a zero-tolerance policy for this type of content.”

The platform, owned by mogul Elon Musk, said it was actively removing all identified images and taking appropriate action against the accounts responsible for posting those images.

Additionally, it stressed that it is “closely monitoring the situation to ensure any further violations are addressed immediately and the content is removed.”

Swift's representatives did not immediately respond to a request for comment from the Agence France-Presse news agency.

Yvette Clarke, a Democratic congresswoman from New York who sponsored legislation to combat fake pornographic photos, emphasized that “with advances in AI, creating deepfakes is easier and cheaper.”

For his part, Republican lawmaker Tom Keane warned that “AI technology is advancing faster than the necessary safeguards.” “Whether the victim is Taylor Swift or any young person in our country, we must put protective measures in place to counter this alarming trend,” he added.

According to a study cited by Wired magazine, 113,000 “deepfake” videos were uploaded to the most popular porn sites in the first nine months of 2023.