TikTok’s algorithm pushes suicide and eating disorder videos, report finds

The algorithm of the social network TikTok is called into question in a report published on Wednesday, December 14, by the Center for Countering Digital Hate (CCDH) in the United States. Research conducted by the center shows how harmful content, including self-harm and eating disorder videos, is recommended by the social network’s algorithm to its young users.

To demonstrate the risks the social network poses to young people’s mental health, the American non-profit organization conducted a real-world experiment. CCDH researchers opened fake profiles of thirteen-year-old teenagers in the United States, the United Kingdom, Canada and Australia, giving some of them usernames that signaled a particular vulnerability to eating disorders – including, for example, the words “lose weight”.

Researchers then brought these accounts to life by “liking” videos dealing with these harmful topics, to see how TikTok’s algorithm would respond. Within minutes of joining the platform – in as little as 2.6 minutes – the algorithm was recommending suicide-related videos to them (razor blades, discussions about suicide and self-harm, etc.). It also suggested content on weight loss and eating disorders within eight minutes.

Opacity around how the algorithms work

“It’s like being stuck in a room full of distorted mirrors where you’re constantly being told you’re ugly, you’re not good enough, maybe you should kill yourself,” said the center’s president, Imran Ahmed, whose organization has offices in the United States and the United Kingdom. He added that the social network “literally sends the most dangerous messages to young people.”

The problem highlighted by the experiment lies above all in the way the algorithms that rule social networks operate. They work by identifying the topics and content that interest a user, who then receives more and more similar suggestions the more often they engage with them.
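As a purely illustrative sketch – TikTok’s actual recommendation system is not public, and the topic names and boost factor below are hypothetical – this kind of engagement feedback loop can be modeled in a few lines of Python:

```python
import random

# Hypothetical interest profile: each topic starts with equal weight.
weights = {"sports": 1.0, "music": 1.0, "weight_loss": 1.0}

def recommend():
    """Pick a topic to show, proportionally to its current weight."""
    topics = list(weights)
    return random.choices(topics, weights=[weights[t] for t in topics])[0]

def register_engagement(topic, boost=1.5):
    """Every 'like' or pause multiplies that topic's weight."""
    weights[topic] *= boost

# A user who engages with a single topic a handful of times
# quickly sees it dominate what the system recommends next.
for _ in range(20):
    shown = recommend()
    if shown == "weight_loss":
        register_engagement(shown)

print(weights)  # the engaged-with topic now far outweighs the others
```

The circularity described here comes from the two functions feeding each other: the more a topic is shown and engaged with, the more heavily it is weighted, and so the more it is shown again.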

The Chinese social network is in principle off-limits to users under the age of thirteen, and its official rules ban videos that encourage eating disorders or suicide. Two safeguards the study found to be ineffective.

Given this circular and opaque operation of the algorithms, adolescents and children are the most vulnerable: they spend more time on social networks, are exposed to very strong peer pressure and see harmful content proliferate there, according to Josh Golin, executive director of the NGO Fairplay, which advocates for more regulation of online content to protect children. “All of these harms are linked to the business model” of social networks, according to Mr. Golin, regardless of the platform.

TikTok questions study results and methodology

TikTok disputed the study’s findings and questioned its methodology in a statement released shortly after the report was published. The platform notes that the researchers did not use it the way typical users would, and claims that the results are therefore biased. The company also said that a user’s account name does not affect the kind of content they are shown.

“We regularly consult with health experts, remove violations of our policies, and provide access to supportive resources for anyone in need,” read the statement from TikTok, which is owned by ByteDance Ltd, a Chinese company now based in Singapore.

US users searching for eating disorder content on TikTok will typically receive a message offering mental health resources and contact information for the National Eating Disorders Association.

However, despite the platform’s efforts, CCDH researchers found that eating disorder content had garnered billions of views: 56 hashtags on the topic accounted for more than 13 billion views. They also noticed that young users were using coded language about eating disorders to evade content moderation.

A bill for stricter regulation is under consideration in the US Congress

“The amount of harmful content being offered to teenagers on TikTok shows that self-regulation has failed,” Mr. Ahmed said, arguing for the introduction of federal rules in the United States to force platforms to do more to protect children.
He also noted that the version of TikTok offered to Chinese audiences is more tightly regulated: it is designed to promote math and science content to younger users and limits the time 13- and 14-year-olds can spend on the app each day.

In the United States, a bill has been introduced in Congress that would impose new rules limiting the data that social media platforms can collect about young users. It would also create a new office within the Federal Trade Commission tasked with protecting the privacy of young social media users.

One of the bill’s sponsors, Democratic Massachusetts Sen. Edward Markey, said Wednesday he hoped lawmakers from both parties could agree on the need for stricter regulations.
“Data is the raw material that Big Tech uses to stalk, manipulate and traumatize our country’s young people every day,” he said.

Le Monde with AP