Anger as extremely graphic AI images of Taylor Swift go viral, with outraged fans slamming the image makers for harassment and predatory behavior

Extremely graphic and suggestive AI-generated images of Taylor Swift are circulating online, themed around her Kansas City Chiefs fandom and sparking outrage among the singer's fans, who are demanding legal action.

This season, Swift adopted the Chiefs as her NFL team as she began dating star player Travis Kelce.

The new images show Swift in various sexualized poses. It is not clear where the images come from. On Thursday morning, “Taylor Swift AI” was a trending topic on X, formerly known as Twitter.

This publication has seen the images in question but will not publish them.

“Why isn’t this considered sexual assault? I can’t be the only one who finds this strange and uncomfortable. We're talking about a woman's body and face being used for something she would probably never allow or feel comfortable with. Why are there no regulations or laws to prevent this?” one fan tweeted.

Non-consensual deepfake pornography is illegal in Texas, Minnesota, New York, Virginia, Hawaii and Georgia. In Illinois and California, victims can sue pornography creators in court for defamation.

Swift is pictured leaving the Nobu restaurant after dinner with Brittany Mahomes, the wife of Kansas City Chiefs quarterback Patrick Mahomes

Brittany Mahomes, Jason Kelce and Taylor Swift react during the second half of the AFC Divisional Playoff game between the Kansas City Chiefs and the Buffalo Bills at Highmark Stadium

The images sparked outrage among Taylor Swift fans around the world

“I need the entire adult Swiftie community to log on to Twitter, search for the term 'Taylor Swift AI,' click on the 'Media' tab, and report every single AI-generated pornographic photo of Taylor they can see, because I'm sick and done with this nonsense. Get it together, Elon,” one angry Swift fan wrote.

“Man, this is so inappropriate,” wrote another, while a third said: “Whoever makes these AI pictures of Taylor Swift is going to hell.”

“Whoever is producing this trash needs to be arrested,” one person wrote, while another added: “What I saw is just absolutely disgusting, and something like this should be illegal… we MUST protect women from things like this.”

Explicit AI-generated material, which overwhelmingly harms women and children, is booming online at an unprecedented rate.

More than 143,000 new deepfake videos were posted online this year, more than every other year combined, according to an analysis by independent researcher Genevieve Oh shared with The Associated Press in December.

Affected families are desperate for solutions and are urging lawmakers to introduce strict protections for victims whose images are manipulated using new AI models, or by the array of apps and websites that openly advertise such services.

Advocates and some legal experts are also calling for federal regulation that provides uniform protections across the country and sends a strong message to current and potential perpetrators.

The problem with deepfakes is not new, but experts say it is only getting worse as the technology to make them becomes more available and easier to use.

The obscene images revolve around Swift's passion for the Kansas City Chiefs, which began after she started dating star player Travis Kelce

Biden speaks before signing an executive order regulating artificial intelligence (AI) in October 2023

Researchers are sounding the alarm this year about the explosion of AI-generated child sexual abuse material that uses depictions of real victims or virtual characters.

In June 2023, the FBI warned that it continues to receive reports from victims, both minors and adults, whose photos or videos were used to create explicit content that was shared online.

In addition to the states that already have laws in place, other states are considering their own legislation, including New Jersey, where lawmakers are currently drafting a bill that would ban deepfake porn and impose penalties, whether prison time, a fine or both, on those who spread it.

President Joe Biden signed an executive order in October that, among other things, called for banning the use of generative AI to produce child sexual abuse material or non-consensual “intimate images of real people.”

The order also directs the federal government to issue guidelines for labeling and watermarking AI-generated content to help distinguish between authentic and software-generated material.

Some are urging caution, including the American Civil Liberties Union, the Electronic Frontier Foundation and The Media Coalition, an organization that works with trade groups representing publishers, film studios and others, saying careful consideration is needed to avoid proposals that may conflict with the First Amendment.

“Some concerns about abusive deepfakes can be addressed under existing online harassment laws,” said Joe Johnson, an attorney with the ACLU of New Jersey.

“Whether at the federal or state level, there must be extensive conversation and stakeholder input to ensure that a bill is not overly broad and addresses the stated problem.”

Mani said her daughter created a website and founded a charity to help AI victims. The two have also been in talks with state lawmakers pushing New Jersey's bill and are planning a trip to Washington to advocate for more protections.

“Not every child, boy or girl, has the support they need to cope with this problem,” Mani said. “And they may not see the light at the end of the tunnel.”