Microsoft Updates AI Tool That Generated Fake Explicit Photos of Taylor Swift


Microsoft has updated Designer, the artificial intelligence tool that generates images from text prompts and that was used to create fake explicit photos of Taylor Swift. The app no longer accepts sexually explicit terms in prompts to generate such images, including images of famous people. The pop star's fake pictures went viral: one of them drew over 47 million views. The platform X has blocked all searches for the artist.

The website 404 Media, which reported on the update, traced the AI-generated nude photos of Taylor Swift to the 4chan forum and a Telegram channel, where people used Designer to create images of celebrities.

How users got around the bans

Even before news of the Swift images spread on social media, Designer blocked content generation for prompts such as "Taylor Swift nude." However, users of the Telegram channel and 4chan found they could bypass the protections by misspelling the name and using words that were merely sexually suggestive rather than explicit. 404 Media was able to confirm that these loopholes have been closed with the recent update.

Microsoft's investigation

"We are investigating these reports and taking appropriate action to address them," a Microsoft spokesperson said shortly after the deepfakes were published. "Our Code of Conduct prohibits the use of our tools to create adult or non-consensual intimate content, and any repeated attempts to produce content that violates our policies may result in loss of access to the service. We have teams working directly on monitoring in line with our responsible AI principles, including content filtering and abuse detection, to create a safer environment for users," the Redmond company concluded.