X has disabled searches for Taylor Swift on its platform to curb the spread of fake pornographic images featuring the singer's likeness that have been circulating on social media since last week.
Since last Sunday, searches for “Taylor Swift” on X have returned the error message “Oops, something went wrong.” X blocked the search term after promising to remove the deepfake AI-generated images from the platform and take “appropriate action” against accounts that shared them.
“Posting non-consensual nudity (NCN) images is strictly prohibited on X and we have a zero-tolerance policy towards such content,” the company said in a statement.
Still, some fake images of the pop star continue to circulate on the social network, with some malicious actors bypassing the search block by manipulating search terms, such as by inserting words between the entertainer's first and last name, CBS MoneyWatch found in its own tests of X's search function.
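The trick works because an exact-match block catches only the literal phrase it lists. The minimal sketch below is hypothetical (a plain string blocklist, not X's actual filtering code), but it shows why a query with a word wedged between the names slips through.

```python
# Hypothetical sketch of an exact-match search blocklist.
# This is NOT X's actual code; it only illustrates the weakness
# CBS MoneyWatch observed: matching the literal phrase misses
# queries with extra words inserted between the names.

BLOCKED_PHRASES = {"taylor swift"}  # assumed blocklist entry

def is_blocked(query: str) -> bool:
    """Return True only if the normalized query exactly matches a blocked phrase."""
    return query.strip().lower() in BLOCKED_PHRASES

print(is_blocked("Taylor Swift"))     # True  -> search returns the error page
print(is_blocked("Taylor AI Swift"))  # False -> search results still load
```

A more robust filter would match on individual tokens rather than the whole phrase, though that trade-off brings more false positives on unrelated searches.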
Reached for comment by CBS MoneyWatch, X replied: “Busy now, please check back later.”
The deepfake images drew 27 million views and about 260,000 likes in 19 hours last week, NBC News reported. They also spread to other social networks, including Reddit and Facebook.
The images' vast reach highlights an increasingly pressing question facing tech companies: how to remove deepfakes, or “synthetic media,” from their platforms. According to a recent report from cybersecurity firm Home Security Heroes, more than 95,000 deepfake videos were shared online in 2023, a 550% increase over 2019.