About ten young girls in Spain have filed complaints after becoming victims of the distribution of fake nude images generated by an artificial intelligence application, a case that has sparked strong controversy in the country.
“We have received 11 complaints from underage victims,” a police spokeswoman in the southwestern town of Almendralejo told AFP.
The alleged perpetrators, some of whom have been identified, “manipulated photos of underage girls” to place their faces on other people’s naked bodies, she added.
According to another police source, these false images were created using an artificial intelligence (AI) application capable of creating very realistic photomontages.
An investigation has been opened into the alleged invasion of privacy. The prosecutor’s office told AFP that, given the age of the victims, charges of child pornography could also be brought.
According to Spanish media, around twenty young girls may have fallen victim to these manipulated photos.
The mother of one of the victims, Miriam Al Adib, denounced the “disgrace” on Instagram. “When I arrived home, one of my daughters told me with great disgust: ‘Look what they have done’ (…) They took a photo of her and edited it as if she were naked (…), with artificial intelligence.”
She said police had told her that the photos may have been posted on the OnlyFans platform or on “pornographic sites.”
Another mother told Spanish public television TVE that the images had been used as a blackmail tool to extort money from her daughter.
The malicious use of AI is a growing concern worldwide, notably its use to create non-consensual deepfake pornography.
According to a 2019 study by Dutch AI company Sensity, 96% of deepfake videos online are non-consensual pornography, and most of them feature women.