Child pornography is entering a new era

Police got quite a surprise when they searched Steven Larouche’s computer equipment.

Over the past few years, the 61-year-old Sherbrooke resident has built up a collection of child pornography. And not just any collection: one of the largest ever unearthed in the history of the Canadian justice system, a batch of more than 545,000 files.

“Every day people exchange videos of me when I was little and being raped in the most sadistic way possible,” said one of the victims heard by the Quebec court.

“They don’t know me, but they’ve seen me from every angle. They laugh at my shame and my pain.”

– A quote from a victim of Steven Larouche

Steven Larouche was sentenced to eight years in prison in April. The man, who studied computer science and worked in the field for a number of years, used state-of-the-art techniques, including artificial intelligence, to build his collection.

The Sherbrooke resident’s case illustrates a booming phenomenon that threatens to take up more and more of the time of justice systems around the world. Courts across Canada’s jurisdictions agree that the continued proliferation and growing ease of using the internet to commit sexual crimes against children is an evil that urgently needs to be eradicated, Judge Benoit Gagnon writes in his decision.


Several voices have echoed the judge’s in recent months: the new tactics used by cybercriminals could quickly spiral out of control.

But what are these tactics?

Robots in children’s shoes

Many researchers have tried to uncover the latest machinations of cyberpredators. The problem, as Christian Jordan Howell, a professor of criminology at the University of South Florida, explains, is that methods are evolving so rapidly that by the time academic papers are published, most of them are already outdated.

The studies analyze old data, mostly from police reports, he says. It can take three to ten years for these reports to become available, and another year to write and review the article. By the time the article is published, the cyberpredation techniques it describes are often no longer in use.

The professor teamed up with Eden Kamar, a cybersecurity doctoral student at the Hebrew University of Jerusalem, to try a different approach. To study the behaviour of cyberpredators who are still active, the researchers made direct contact with them. Or rather, their chatbots did.

They created fake profiles of girls aged 13-14, which they published on about 30 popular teen chat sites. Profile photos were sourced from the website This Person Does Not Exist, which uses artificial intelligence to create realistic images of people who don’t exist.

[Photos, 1 to 4: Examples of fictional young girls created by artificial intelligence with the This Person Does Not Exist website. Photo: Radio-Canada / This Person Does Not Exist]

The researchers trained their algorithms to respond to strangers who chatted with them. The bots never started a conversation themselves. When a stranger began talking to them, they routinely asked their age, gender and location, a common practice in chat rooms, and only continued the conversation if the person said they were 18 or older.
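The study does not publish the bots’ code, but the screening protocol described above can be summarized in a few lines. The sketch below is only a hypothetical illustration of that logic, written in Python; the class and function names are assumptions, not the team’s actual implementation. The bot never speaks first, asks a stranger’s age, gender and location once it is contacted, and keeps the exchange open for analysis only when the stranger claims to be an adult.

```python
# Hypothetical sketch of the screening logic described by the researchers.
# Names (ResearchBot, handle_message, etc.) are illustrative assumptions,
# not the study's actual code.
import re
from typing import Optional

ASL_PROMPT = "age, sex, location?"  # the routine opening question

class ResearchBot:
    def __init__(self):
        self.asked_asl = False
        self.engaged = False    # becomes True only for self-declared adults
        self.transcript = []    # kept for later analysis

    def handle_message(self, text: str) -> Optional[str]:
        """Reply to an incoming message; the bot never initiates contact."""
        self.transcript.append(("stranger", text))

        if not self.asked_asl:
            self.asked_asl = True
            self.transcript.append(("bot", ASL_PROMPT))
            return ASL_PROMPT

        if not self.engaged:
            age = self._extract_age(text)
            if age is None or age < 18:
                return None      # ignore unusable answers and self-declared minors
            self.engaged = True  # only self-declared adults are studied

        # From here on, short canned replies keep the conversation going
        # so the stranger's behaviour can be recorded.
        reply = "ok"
        self.transcript.append(("bot", reply))
        return reply

    @staticmethod
    def _extract_age(text: str) -> Optional[int]:
        match = re.search(r"\b(\d{1,2})\b", text)
        return int(match.group(1)) if match else None
```

Keeping the bot purely reactive, and ignoring anyone who claims to be a minor, is what lets the recorded behaviour be attributed to self-declared adults who chose to contact what they believed was a child.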

The messages flooded in quickly. In total, over a period of six months, the bots had nearly 1,000 conversations with people who identified themselves as adults.

“Almost all of these conversations resulted in some form of sexual abuse.”

– A quote from Eden Kamar, PhD student in cybersecurity at the Hebrew University of Jerusalem

The research team says that the constraints imposed when their experiment received ethics approval prohibit them from revealing the pseudonyms of the cyberpredators and the chat sites studied. However, Radio-Canada had access to screenshots of the conversations.

Some strangers very clearly expressed a desire to receive videos of the child performing a sexual act in exchange for money or a romantic relationship. Two in five strangers took a more subtle approach: they sent a link during the conversation.

In 19% of cases the links contained malware, and in 5% of cases they led to a phishing page. Among other things, these techniques can give access to children’s personal information, passwords and webcams.

More than 40% of the links led to the platform of the Norwegian company Whereby.

It’s a platform similar to Zoom, says Eden Kamar, and we found in our study that a predator could use Whereby to embed code that lets them turn on the child’s camera and record them without their consent.


Fake pictures, real problem

In Canada, the Cybertip program run by the Canadian Centre for Child Protection (CCPE) collects reports of child sexual exploitation. Reports of online luring have increased by 815% in the country over the past five years.

The dark web gives pedophiles the opportunity to come together as a community and step out of their anonymity, explains René Morin, spokesperson for the CCPE. There, they share all kinds of tips on how to use this or that new website or application to abuse children.

However, the exchange of child pornography usually takes place on the open web, he adds. On the dark web, the technologies that encrypt shared information slow videos down. Nobody wants to watch a video in slow motion.

Mr. Morin points out that if predators had found a flaw in the Whereby platform, they could have shared the tip on the dark web. Yet Cybertip has received only a handful of reports related to this platform in recent years.


The CCPE spokesman is particularly concerned about another phenomenon: the use of artificial intelligence to produce child pornography. This is one of the methods that allowed Sherbrooke resident Steven Larouche to amass so many files. More than 86,000 of them featured scenes inspired by real children, but entirely fake.

The man used deepfake techniques to realistically superimpose children’s faces onto other people’s bodies.

Judge Gagnon writes in his decision that the use of deepfake technology in criminal hands sends shivers down the spine. This type of software makes it possible to commit crimes that could involve virtually any child in our communities. A simple video clip of a child available on social networks, or a covert recording of children in a public place, could turn them into potential victims of child pornography, he adds.

“A new era of cybercrime is clearly beginning for the police.”

– A quote from Judge Benoit Gagnon

The same concern is shared by the Stanford Internet Observatory and the anti-sexual-exploitation organization Thorn, which recently produced a joint report on the subject.

In particular, they point to the abuse of Stable Diffusion, a machine learning tool similar to OpenAI’s DALL-E that generates images from text descriptions.

Unlike DALL-E, Stable Diffusion’s source code is public. As a result, users have been able to modify the code to remove the restrictions on creating pornographic material.

The researchers showed that these generated images can even be altered to make the faces resemble those of a specific person. In their example, they start from an artificial-intelligence-generated image of a young girl and modify it into a rejuvenated version of what looks like a photograph of actress Audrey Hepburn.

[Photos, 1 to 3: a fictional image of a girl created by Stable Diffusion and used by the researchers for their transformation example; the same image modified to give the girl physical features similar to Audrey Hepburn’s; and the final transformation rejuvenating the girl who resembles Audrey Hepburn. Photos: Radio-Canada / Courtesy of Stanford Internet Observatory]

One of the dangers, then, is revictimization, argues Rebecca Portnoff, head of data science at Thorn and co-author of the report. These technologies open the door to the production of new material featuring images of children who have already been abused. New pictures of the child then circulate on the internet while the young person is still trying to recover from the abuse.

The researcher notes that there is also a risk for the identification of real victims. It becomes extremely difficult to quickly sift out images of children who are actively in danger when they are drowned in an ever-growing volume of pornographic content.

In June, the FBI warned that more and more criminals are turning non-explicit photos and videos into pornographic material, which they distribute online to harass their victims.

Rebecca Portnoff, Director of Data Science at Thorn. Photo: Radio-Canada / Courtesy of Thorn

The power to act

In their experiment with fake profiles of young girls, Christian Jordan Howell and Eden Kamar programmed some of their chatbots to simulate active parental supervision and others to simulate weaker or no supervision at all.

Their finding is unequivocal: 92% of cyberpredators stopped their requests when the supposed victim told them that a parent was actively monitoring. Parents must fulfill their role as parents, argues Professor Howell, who also suggests covering webcams when they are not in use.

Eden Kamar, PhD student in cybersecurity at the Hebrew University of Jerusalem. Photo: Radio-Canada / Courtesy of Eden Kamar

Christian Jordan Howell, professor of criminology at the University of South Florida. Photo: Radio-Canada / Courtesy of Christian Jordan Howell

For their part, the Thorn organization and the CCPE have each developed a technological tool, based on vast databases of known child sexual abuse images, to flag when those images reappear online.
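The article does not detail how these tools work internally, but systems of this kind typically rely on robust image fingerprints (perceptual hashes) computed from already-identified material, against which new uploads are compared. The sketch below is a generic, hypothetical illustration of that principle using the open-source Python library imagehash; the fingerprint values, the distance threshold and the check_upload function are assumptions, not the CCPE’s or Thorn’s actual implementation.

```python
# Generic sketch of hash-based re-detection using the open-source
# "imagehash" library (pip install imagehash pillow). Illustrative only;
# this is not the CCPE's or Thorn's tool.
from PIL import Image
import imagehash

# Hypothetical fingerprints of already-identified images
# (placeholder hex values, not real data).
KNOWN_HASHES = [
    (imagehash.hex_to_hash("ffd8e0c0b0988c80"), "case-0001"),
    (imagehash.hex_to_hash("a1b2c3d4e5f60708"), "case-0002"),
]

MAX_DISTANCE = 6  # assumed tolerance for crops, resizes and re-encodings

def check_upload(path: str):
    """Return the matching case ID if the uploaded image resembles a known one."""
    upload_hash = imagehash.phash(Image.open(path))
    for known_hash, case_id in KNOWN_HASHES:
        # Subtracting two perceptual hashes gives their Hamming distance:
        # a small distance means the images are visually near-identical.
        if upload_hash - known_hash <= MAX_DISTANCE:
            return case_id
    return None
```

Because perceptual hashes tolerate small changes such as resizing or re-encoding, an image can be flagged even if it has been slightly altered before being shared again.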

However, all the experts polled by Radio-Canada agree that the digital giants also have a role to play.

Police forces are already overwhelmed with child sexual exploitation cases, and the influx of material will inevitably increase in the coming years, René Morin warns.

In particular, Mr. Morin regrets the lack of a regulatory framework for the digital space in Canada and in many regions of the world. By leaving the platforms to themselves, we now know that it is the Wild West and that these platforms have become gold mines for people who want to exploit children, he concludes.