
X suspends the account that posted AI porn of Taylor Swift – only for other accounts to repost it – as the same graphic images circulate on Facebook and Instagram

  • The pictures are also circulating on Facebook and Instagram
  • Reddit took action, deleting posts and banning a subreddit named “TaylorSwiftLewd”
  • They show the singer posing provocatively in Kansas City Chiefs gear

X has suspended an account that posted AI pornography of Taylor Swift – but several others have already surfaced with the same graphic images.

The highly graphic AI-generated images, which showed the singer posing provocatively in Kansas City Chiefs gear, sparked outrage among her fans on Thursday, with many calling for legal action.

The backlash led to the suspension of an X account that shared the images, but not before they were shared by dozens of other accounts.

The pictures are also circulating on Facebook and Instagram.

The new images show Swift in various sexualized poses and are reportedly from a website that publishes AI-generated pornographic images of celebrities.


The extremely graphic AI-generated images (not shown), which showed the singer posing provocatively in Kansas City Chiefs gear, sparked outrage among her fans


On Thursday morning, “Taylor Swift AI” was a trending topic on X, formerly known as Twitter.

Meanwhile, Reddit appears to have taken action against the fake images, deleting posts containing them and banning a subreddit called “TaylorSwiftLewd.”

Reddit doesn't allow intimate or sexually explicit photos of someone to be posted without their consent, but the images are still circulating on the site, as well as on 8chan.

Reddit, Meta and X have been contacted for comment on this story.

The AI images do not appear to be circulating on TikTok, which does not allow any nudity.



X, on the other hand, allows some nudity, making this type of content difficult to moderate.

The images in question have been seen but will not be published here.

They are the latest example of the dangerous rise in popularity of deepfake porn websites, where celebrities and others find their likenesses featured in explicit videos and photos without their permission.

Meta has recently taken steps it says will make its platforms safer for children, including banning teenagers under 18 from sending messages to strangers on Instagram or Facebook.

However, both Facebook and Instagram are currently flooded with the fake Swift pornography.

Non-consensual deepfake pornography is illegal in Texas, Minnesota, New York, Virginia, Hawaii and Georgia. In Illinois and California, victims can sue pornography creators in court for defamation.

“I need the entire adult Swiftie community to log on to Twitter, search the term ‘Taylor Swift AI,’ click on the ‘Media’ tab, and report every single AI-generated pornographic photo of Taylor they can see, because I am done with this nonsense. Get it together, Elon,” one angry Swift fan wrote.

“Man this is so inappropriate,” wrote another. While another said: “Whoever takes these AI pictures of Taylor Swift is going to hell.”

“Whoever is producing this trash needs to be arrested,” one user wrote, while another added: “What I saw is just absolutely disgusting and something like this should be illegal… we MUST protect women from things like this.”