Child abuse: Pedophiles reportedly use artificial intelligence to de-age celebrities

Pedophiles are allegedly using artificial intelligence (AI) to create realistic fake images of famous singers and actresses as children and sharing them on child abuse sites, according to a recent report.

“Earlier this year we warned that AI images could soon become indistinguishable from real images of children being sexually abused and that we could see a much wider spread of these images. We are now past that point,” Susie Hargreaves, chief executive of the UK’s Internet Watch Foundation (IWF), said on Wednesday, according to the BBC.

In its latest report, which the director described as a “nightmare come true,” the foundation says its researchers found a total of 11,108 AI-generated images on a single dark web child abuse forum within one month.

Of these, nearly 3,000 images were assessed as illegal because they depicted child sexual abuse, while 564 were classified as Category A, the most severe classification for this type of material, according to British media.

Researchers have also noticed new trends, including the de-aging of celebrities, particularly singers and actresses, to create realistic images that are indistinguishable from real ones without a trained eye, according to the BBC.

These pedophiles reportedly go so far as to create new images of real child victims of sexual abuse, based on photos already circulating on the forums.

They have also reportedly used photos of young, clothed agency models to place them in Category A sexual abuse scenes.

But even though the children in these generated photos were not directly involved in the content creation process, the foundation stressed the seriousness of such images, which normalize predatory behavior. They can also waste vital police resources when teams investigate children who do not exist, the IWF concluded.