WASHINGTON, June 7 (Portal) – The Federal Bureau of Investigation has warned Americans that criminals are increasingly using artificial intelligence to create sexually explicit images to intimidate and blackmail victims.
In a warning issued this week, the bureau said it had recently seen an uptick in extortion victims reporting that they had been targeted with manipulated versions of innocuous images taken from online posts, private messages or video chats.
“The photos are then sent directly to the victims by malicious actors for sextortion or harassment,” the warning said. “Once the manipulated content has been circulated, victims can face significant challenges in preventing its continued sharing or having it removed from the internet.”
The bureau said the images were “lifelike” and that in some cases children had been specifically targeted.
The FBI did not name the program or programs used to generate the sexual images, but noted that technological advances “continuously improve the quality, customizability, and accessibility of artificial intelligence (AI) content creation.”
The bureau did not respond to a follow-up message Wednesday asking for details of the phenomenon.
Manipulating innocent images into sexually explicit ones is nearly as old as photography itself, but the release of open-source AI tools has made the process easier than ever. The results are often indistinguishable from real photographs, and in recent years a number of websites and social media channels have sprung up specializing in the creation and sharing of AI-generated sexual imagery.
Reporting by Raphael Satter; Edited by David Gregorio