Image: Martin Bureau (Getty Images)
A small army of overworked content moderators is the public’s last line of defense against the flood of depraved and horrifying content uploaded to social media. Moderators help everyday users avoid the worst of the worst, but their constant exposure to humanity’s darkest impulses can take a serious toll on their mental health. Two former TikTok moderators now claim the fast-growing social media giant skimped on mental health treatment for moderators struggling to cope with that onslaught of digital nightmares.
“You’ll see people shot in the face,” one of the former moderators said in an interview with NPR. “And another video of a child being beaten made me cry for two hours straight.”
The lawsuit, filed Thursday, claims that TikTok and its parent company, ByteDance, violated California labor law by failing to adequately protect moderators from the emotional trauma caused by exposure to those images. The suit describes moderators as “the gatekeepers between the unfiltered, disgusting, and offensive content uploaded to the app and the hundreds of millions of people who use the app every day.” Specifically, the lawsuit accuses TikTok of negligence, negligent exercise of retained control, and violations of California’s unfair competition law.
“Defendants [TikTok and ByteDance] are aware of the negative psychological effects that viewing graphic and objectionable content has on moderators,” the lawsuit claims. “Despite this, defendants fail to implement acknowledged standards of care to protect moderators from harm.”
No content moderator can avoid dealing with some horrific material, but the lawsuit claims TikTok’s moderation practices are actually worse than those of other platforms. While other companies have adopted harm-mitigation measures recommended by industry groups, such as using filtering technology to distort images and providing mandatory counseling for moderators, the lawsuit says TikTok has not. “As a result, plaintiffs [the moderators] were exposed to thousands of graphic and objectionable videos, including graphic violence, sexual assault, and child pornography,” the lawsuit claims.
Former TikTok moderators Ashley Velez and Reece Young claim the failure to provide proper mental health support created an unsafe work environment. The two moderators say they were exposed to a laundry list of the most horrifying shit on the internet: videos of child pornography, rape, bestiality, murder, beheadings, and suicide crossed their desks.
Young reported witnessing a 13-year-old girl being executed by cartel members on camera. Velez told NPR that images and videos involving underage children made up the bulk of the disturbing content she was made to view. “Somebody has to suffer and see this stuff so nobody else has to,” Velez told NPR.
The lawsuit argues that the demanding productivity quotas imposed on the workers are “inconsistent with applicable standards of care.”
According to the lawsuit, moderators are told to review each video for no more than 25 seconds and to judge with greater than 80% accuracy whether it violates TikTok’s rules. Within those 25 seconds, the plaintiffs say, moderators have to weigh 100 possible tags that could be applied to label problematic content. Moderators work 12-hour shifts with a one-hour lunch break and two 15-minute breaks, according to the suit.
“By screening social media posts for objectionable content, content moderators are on the front lines of a war against depravity,” Steven Williams, one of the attorneys representing the TikTok moderators, said in a statement. “The psychological trauma and the cognitive and social disorders these workers face are serious, but they are being ignored, and the problems will only grow worse, for the company and for these individuals.”
TikTok did not respond to Gizmodo’s request for comment.
Austin, Texas, March 5: A content moderator works at Facebook’s office in Austin, Texas. Photo: Washington Post (Getty Images)
On top of that mountain of digital horrors, the lawsuit alleges moderators are regularly exposed to torrents of conspiracy theories and misinformation.
It’s worth noting that Velez and Young were both contractors who worked through the staffing firms Telus International and Atrium Staffing Services, respectively. Though the two moderators technically worked for separate companies, the lawsuit still seeks to hold TikTok and ByteDance accountable, claiming they set quotas, monitored the workers, and handled disciplinary actions. A Telus International spokesperson told NPR that the company does provide mental health counseling for its contractors, but Velez claims it was woefully inadequate. She said she got just 30 minutes with a counselor who appeared swamped with requests from other suffering moderators.
Through the lawsuit, the moderators’ lawyers hope to win financial compensation for Velez and Young and to pressure TikTok into offering mental health screening and treatment to its thousands of current and former content moderators.
The moderators, like many of their counterparts at rival tech companies, say they were required to sign nondisclosure agreements that prevent them from discussing the images they saw. After spending their days wading through humanity’s darkest depths, the workers have to bottle those stories up, unable to talk about them even with friends and family.
“They saw so many people that they didn’t seem to have time to actually help you with what you were suffering from,” Velez told NPR.
TikTok, like other major content providers, deploys artificial intelligence to catch the bulk of problematic content, but the relentless flood of potentially harmful material uploaded to the site means human moderators remain essential. These moderators are often independent contractors who generally work for lower wages, with less job security and fewer benefits than workers employed directly by tech companies.
Researchers from the University of Texas and St. Mary’s University published a paper last year surveying the academic literature on content moderators and found ample evidence that repeated exposure to harmful content can lead to PTSD and other forms of psychological harm.
“While it may be expected that moderation work can be unpleasant, there is an awareness today that repeated, prolonged exposure to specific content, coupled with limited workplace support, can significantly impair the psychological well-being of human moderators,” the researchers wrote.
In other cases, YouTube and Facebook moderators have reportedly been hospitalized for acute anxiety and depression after repeated exposure to such content. And unfortunately for everyone, the internet isn’t getting any less messed up. Just this week, the National Center for Missing &amp; Exploited Children announced that 29.3 million items of child sexual abuse material were removed from the internet last year. That’s a record, and a 35% increase over the amount of material removed a year earlier.
The mental health struggles plaguing content moderators across the tech industry have gained public attention in recent years thanks to a steady stream of revealing reports and other legal actions. Comparatively little has been written about TikTok’s moderators, but numerous media reports have documented the often shocking working conditions faced by Facebook and YouTube moderators.
Two years ago, Facebook settled a lawsuit brought by thousands of moderators for $52 million. The same law firm representing Velez and Young also represented those Facebook moderators. The settlement stemmed from a 2018 lawsuit in which Facebook moderator Selena Scola claimed she developed PTSD after viewing instances of rape, suicide, and murder on the job. The $52 million settlement was spread out among thousands of contractors, each of whom was entitled to at least $1,000 in compensation. A former YouTube content moderator likewise sued her employer in 2020, claiming she developed depression and symptoms associated with PTSD after viewing images of beheadings and child abuse. It’s no surprise that TikTok, one of the fastest-growing social media sites online, now finds itself on the receiving end of similar litigation.