
The horrors experienced by Meta's moderators: "I didn't know what people were capable of"


After repeatedly watching videos of suicides, murders, dismemberments and rapes, he had a panic attack and asked for help. This employee, who worked as a content moderator at a company that provides services to Meta, the owner of Facebook, Instagram and WhatsApp, was told to go to the "fun floor": a large games room on one of the floors of the Glòries Tower in Barcelona, where the Californian tech giant's content moderation offices are located. He sat in front of a ping pong table and stared into space. The fun floor didn't help him at all. On another occasion, his boss gave him permission to see the company psychologist two hours after another panic attack. She was on a different floor, the psychology floor. He spent more than half an hour talking to her and getting everything out. When he finished, she told him that his work was very important for society, that they were all heroes, and that he had to be stronger. And that his time was up.

Content moderators are responsible for keeping the Facebook wall and Instagram feeds clean and peaceful, platforms that millions of people use every day without knowing that this dark side exists. These employees are the ones who decide whether fake news or photos that break Meta's guidelines get published. But they are also the ones who have to deal with the most brutal content: view it, rate it, censor it and, if necessary, report it to the police.

In 2018, the company CCC Barcelona Digital Services set up shop on a dozen floors of the Glòries Tower. The announcement was very positively received by the Catalan authorities: the tech giant's subcontractor joined the list of innovative companies based in Barcelona and occupied part of a building that had just lost its bid to become the headquarters of the European Medicines Agency.

The company began hiring people, particularly young foreigners who spoke multiple languages, to moderate content from different markets. Last October, an investigation by La Vanguardia revealed the conditions under which these moderators work. Before that, the Generalitat's Labor Inspectorate had opened an investigation in 2021 and, the following year, imposed a fine of more than 40,000 euros on the company for deficiencies in the assessment and prevention of psychosocial risks in the workplace. In 2020, the company was acquired by the Canadian firm Telus International, which maintains that the allegations are false and that it has sufficient safety measures in place.

This employee joined in 2018 and remained until 2020, when he was placed on medical leave due to his mental health problems. The company and its mutual insurance provider classified it as a common illness. "We then applied for a change in the classification of the contingency, as his case fit squarely with that of a workplace accident. The National Social Security Institute agreed with us, the company appealed, and that is how the legal process began," explains Francesc Feliu, partner at the law firm Espacio Jurídico Feliu Fins, which specializes in health matters.

On January 12, Barcelona's Social Court No. 28 dismissed the company's claim and ruled that the sick leave should be classified as a workplace accident. It is the first ruling to recognize that a content moderator's mental illness was caused by their work. "Work stress is the sole, exclusive and undoubted cause" of the disorders, says the ruling, which can be appealed. Feliu has about 25 other workers waiting to have their illnesses recognized as workplace accidents, and in October he also filed a criminal complaint against the company over its lack of safety measures.

The employee requests anonymity because he is bound by a strict confidentiality obligation, and he would rather not talk about his feelings or very personal matters because the scars this job has left on him are still visible: he finds it hard to read news reports about the ruling because they make him relive what he saw. "But at least this will encourage more people to seek justice," he stresses.

When he started working at the company, he had no idea how violent the videos he would see were. "They tell you about it, but then you find that things are much, much worse..." he says. The lawyer explains that the work is well paid (about 2,400 euros gross per month, although there are salary differences between workers serving different markets, something another law firm has also challenged in court), that no experience or training is required, and that this attracts young foreigners: "They say: 'Look how cool, I work for Meta,'" explains Feliu. The affected worker points out that the illusion soon fades: "People are completely unaware of what is out there. Before I worked there, I assure you I didn't know what people were capable of."

The workers' suspicion: they are training an AI

Feliu explains that at that time ("conditions may have changed now," he notes) the content moderators with the best effectiveness ratings (employees were evaluated monthly) were moved to a high-priority section. That is, through one channel they continued to receive content of all kinds (ordinary posts, but also violent videos when they appeared), and through another channel they received only content (videos, photos or posts) involving suicides and terrorist attacks.

The worker in question ended up in this section: "When you see this all the time, you become more sensitive to everything. After a while I couldn't even look at a suicide note anymore," he explains. They had to adhere strictly to Meta's guidelines and often watch the videos to the end, several times, with the same content reviewed by different moderators. "For example, a live video of someone declaring that they wanted to commit suicide: you had to keep watching it, and you couldn't delete it or notify the police unless you saw something in the scene that suggested suicide, a weapon, an open window... Sometimes they suddenly pulled out the gun and shot themselves without anyone being able to do anything," he laments.

To remove a video, they had to justify the decision in detail: "You had to rate the video according to the worst thing that happened in it. If the video started with some kind of violence, we had to wait to see if something more serious came up, such as murder, dismemberment or sexual abuse, in order to classify it under the most serious category. If the most serious violence appeared at the beginning, the system would let you remove it then."

This way of working made them suspicious. "If you can see after 10 seconds that something is violent, why do you have to keep watching? You come to the conclusion that they are training artificial intelligence (AI), that they are cannon fodder," says Feliu. A spokesperson for the subcontractor, asked about this point, did not clarify whether such a project exists and referred questions to Meta.

Around 2,000 people work at the company, after last year's cuts at Meta reduced the subcontractor's workforce through a collective layoff. The works council did not respond to this newspaper's questions, and the company has appealed the ruling. In a statement, Telus says that "thanks to the comprehensive welfare program," sick leave fell to 14% of the workforce in December last year, and that only "between 1% and 2%" were on leave due to work-related mental illness.

The company says it has hired external medical support, that counselors are available to the team 24/7, that employees can request calming and emergency sessions if they see disturbing content, and that there is technology to blur videos or turn off the sound when needed. "Any claim that employees are constantly exposed to disturbing content eight hours a day is false," the statement says, insisting that employee well-being is a priority. During the trial, the company denied any connection between the worker's mental illness and his job, arguing that he had seen a psychologist at the age of 16.

The worker explains that during his shift there was a five-minute break every hour, during which he could not go outside to get some air because just going down in the elevator would have used up the time. The lunch break lasted 20 minutes, and there were activities such as yoga sessions and games, "but no dedicated supervision" for staff who reviewed about 400 pieces of content every day.

In addition, the rotating schedules (one week in the morning, another in the afternoon, another at night) disrupted his rest, "which was already difficult because of the nightmares." "25% of the staff were systematically on sick leave, not counting everyone who quit before going on leave," recalls Feliu, who believes that the ruling and those that may follow will push the company to change things: "Content moderators are essential for social networks, but so are their conditions."
