The rapid online spread of deepfake pornographic images of Taylor Swift has led to renewed calls, from US politicians among others, to criminalize the practice of using artificial intelligence to synthesize fake but convincing explicit images.
Fake explicit images of the US pop star were shared widely across social media this week and seen by millions. One image hosted on X, which had earlier circulated on the messaging app Telegram, was viewed 47 million times before it was removed.
X said in a statement: “Our teams are actively removing all identified images and taking appropriate action against the accounts responsible for publishing these images.”
Yvette D. Clarke, a Democratic congresswoman for New York, wrote on X: “What happened to Taylor Swift is nothing new. Women have been targets of deepfakes for years [without] their consent. And [with] advances in AI, creating deepfakes is easier and cheaper. This is an issue both sides of the aisle, and even Swifties, should be able to come together to solve.”
Some individual US states have their own laws against deepfakes, but there is growing pressure to change federal law.
In May 2023, Democratic congressman Joseph Morelle introduced the Preventing Deepfakes of Intimate Images Act, which would make it illegal to share deepfake pornography without consent. Morelle said the images and videos “can cause irreparable emotional, financial and reputational damage – and unfortunately, women are disproportionately affected.”
In a tweet condemning the Swift images, he called them “sexual exploitation.” His bill has not yet become law.
Republican congressman Tom Kean Jr. said: “It is clear that AI technology is advancing faster than the necessary guardrails. Whether the victim is Taylor Swift or any young person across our country, we need to establish safeguards to combat this alarming trend.” He has co-sponsored Morelle's bill and introduced his own AI Labeling Act, which would require all AI-generated content (including more benign chatbots used in customer service, for example) to be labeled as such.
Swift has not spoken publicly about the images. Her US publicist had not responded to a request for comment at the time of publication.
Convincing deepfake video and audio have been used to impersonate some high-profile men, particularly politicians such as Donald Trump and Joe Biden, and artists such as Drake and The Weeknd. In October 2023, Tom Hanks warned his Instagram followers not to be taken in by an advertisement that used an AI likeness of him to promote a dental plan.
But the technology overwhelmingly targets women, and does so in sexually exploitative ways: a 2019 study by DeepTrace Labs, cited in the US bill, found that 96% of deepfake video content was non-consensual pornographic material.
The problem has worsened significantly since 2019. Fake pornography made by pasting a non-consenting person's face into an existing pornographic image with photo-editing software is a long-standing problem, but the sophistication of artificial intelligence has opened a new frontier: entirely new and extremely convincing images can now be generated from simple text prompts.
Prominent women are particularly at risk. In 2018, Scarlett Johansson spoke about the widespread fake pornography featuring her likeness: “I've unfortunately been down this road many, many times. The fact is, trying to protect yourself from the Internet and its depravity is largely a hopeless cause.”
The UK criminalized the sharing of non-consensual deepfake pornography under the Online Safety Act, which became law in 2023 and also banned sharing any explicit images taken without a person's consent, including so-called “downblousing” photos.
Dominic Raab, the then deputy prime minister, said when the measure was announced: “We must do more to protect women and girls from people who take or manipulate intimate photos in order to hound or humiliate them. Our changes will give police and prosecutors the powers they need to bring these cowards to justice and safeguard women and girls from such vile abuse.”