
‘It can destroy a life’: American teenagers fall victim to fake nude photos created by AI

Ellis, a 14-year-old from Texas, woke up one October morning to several missed calls and frantic messages on her phone. Fake nude photos of her were circulating on social media.


“They were photos of one of my best friends and me, taken from Instagram. Naked bodies had been added to them,” the student told AFP.

“I remember being very scared because these fake nude photos were circulating around me. And I’ve never done anything like that.”

Like several classmates at her school on the outskirts of Dallas, Texas, the teenager fell victim to hyper-realistic sexual montages (deepfakes), made by a fellow student without her consent and then shared on Snapchat.

With the popularization of artificial intelligence (AI), it has become easier to create these hyper-realistic photo and video montages, paving the way for their use to harass or humiliate.

“The girls cried and cried and cried, they were ashamed,” recalls Anna Berry McAdams, who was immediately alerted by her daughter Ellis.

“She said to me, ‘Mom, it looks like me,’ and I said, ‘But no, darling, I was with you when the photo was taken and I know that’s not the case,’” continues the fifty-year-old, who says she is “horrified” by the realism of the photos.

A growing phenomenon?

At the end of October, further “deepfakes” of a sexual nature were discovered in a high school in New Jersey, in the northeastern United States. An investigation has been launched to identify all victims and the perpetrator(s).

“I think this will happen more and more often (…) and that there are victims who don’t even know they are victims, that there are photos of them,” laments Dorota Mani, the mother of one of the identified victims, a 14-year-old student.

“More and more cases are emerging (…), but when it comes to sexual abuse, ‘revenge porn’ (the malicious disclosure of intimate images) or pornographic deepfakes, many people do not come forward and suffer in silence because they are afraid to bring the matter into the public eye,” explains Renée Cummings, a criminologist and artificial intelligence researcher.

While the scale of the phenomenon is impossible to estimate, “anyone with a smartphone and a few dollars can now create a deepfake,” notes the University of Virginia professor.

This is due to recent advances in, and the democratization of, generative AI, which can produce text, code, images and sound from a simple request in everyday language.

As a result, the hyper-realistic montages that once mainly tarnished the image of celebrities “who had stacks of photos and videos of themselves online” now threaten anyone, explains Hany Farid, a professor at the University of California, Berkeley.

“If you have a LinkedIn profile with a headshot, someone can create a sexual image of you,” continues the specialist in detecting digitally manipulated images, noting that these montages “mainly target women and young girls.”

Lack of legal framework

In the face of this growing threat, the school and justice systems appear to be overwhelmed.

In the eyes of the law, “even though your face was superimposed on a body, that body isn’t actually yours, so it’s not as if someone shared a nude of yours,” Ms. Cummings explains.

In the United States, no federal law criminalizes the creation and sharing of fake sexual images, and only a handful of states have specific legislation.

In late October, Joe Biden urged lawmakers to put safeguards in place, in particular to prevent “generative AI from producing child sexual abuse material or non-consensual intimate images of real people.”

While the responsibility of the creators of these images, who remain difficult to identify, is central, the companies behind the websites and software used, as well as the social networks that relay the content, must also be held to account, stresses Mr. Farid.

Because while these images are fake, “the trauma is very real,” adds Renée Cummings, describing people “who suffer from anxiety, panic attacks, depression or even post-traumatic stress after falling victim to pornographic deepfakes.” “It can destroy a life.”

In Texas, Ellis, who describes herself as a “sociable” and sporty teenager, says she is now “constantly afraid,” even though the student behind the deepfakes has been identified and temporarily suspended from school.

“I don’t know how many photos he made or how many people may have received them,” she explains, adding that she has asked to change schools.

Faced with this uncertainty, her mother is campaigning for these images to be recognized as “child pornography” and punished as such.

“It could affect them for the rest of their lives. It will never disappear from the internet. So if they apply to university one day, for example, who knows whether (these images) will resurface?” worries Anna Berry McAdams.