Instagram’s algorithms make it easier to sell child pornography

According to a report by Stanford University and the Wall Street Journal (WSJ), Instagram, a subsidiary of Meta, is the main platform used by pedophile networks to promote and sell content depicting child sexual abuse.

“Large networks of accounts that appear to be operated by minors openly encourage the sale of” child sexual abuse content, researchers at the prestigious Silicon Valley university’s Cyber Policy Center said on Wednesday.

“Instagram is currently the premier platform for these networks, with features like content recommendation algorithms and messaging that help sellers connect with buyers,” they added.

Neither the pedophiles nor these networks need to show much ingenuity.

According to the WSJ, a simple search with keywords such as #pedowhore or #preteensex leads to accounts that use those terms to promote content depicting the sexual abuse of minors.

Often, these profiles “claim to be run by the children themselves and use overtly sexual aliases with words like ‘little bitch for you’,” the article states.

The accounts do not openly state that they sell these images, but they do offer “menus” of content, which in some cases allow buyers to request specific sexual acts.

Stanford researchers also discovered listings for videos of sodomy and self-harm. “Children are available for in-person ‘meetings’ at a certain price,” the article continues.

The report underscores the role played by the popular social network’s algorithms: a test account created by the business daily was “inundated with content that sexualizes children” after it clicked on a few such recommendations.

Meta did not immediately respond to a request from AFP.

According to the WSJ, the social media giant acknowledged problems within its security operations and said it had set up a “task force” to resolve the issue.

Last March, pension and mutual funds filed a complaint against Meta for “turning a blind eye to human trafficking and pedophile crime on its platforms.”

Instagram is also regularly accused by associations and authorities of not adequately protecting children from the risks of harassment, addiction and self-image problems.