
Beheadings, Suicides, Rapes, Child Pornography: Moderators in Africa’s War on Meta

Trevin Brownie hasn’t forgotten his first day as a content moderator for Facebook, on the premises of a subcontractor in Nairobi, the Kenyan capital.


“My first video showed a man committing suicide. (…) A child of two or three was playing nearby. After the man had been hanging for about two minutes, the child understood that something was wrong,” says the 30-year-old South African, before describing how the child tried to save the man, his father.

“It made me sick. (…) It was like nausea, vomiting. But I kept doing my job,” he continues.

Between 2020 and 2023, he watched hundreds of violent, hateful videos every day and blocked them from being shared with Facebook users.

He worked in Nairobi for Sama, a California company that Meta – the parent company of Facebook, Instagram and WhatsApp – contracted to moderate Facebook content for sub-Saharan Africa between 2019 and 2023.

This continental “hub” employed hundreds of moderators from different African countries, recruited in particular for their knowledge of local languages.

Trevin Brownie says he has seen “hundreds of beheadings,” “organs ripped from bodies,” “rape and child pornography of the worst kind,” “child soldiers being prepared for war”…

“People do things to other people that I could never have imagined,” he says. “People have no idea of the vile videos (…) they are spared.”

Litigation

Trevin Brownie is a party to one of three cases pending in Kenya against Meta and Sama, formerly known as Samasource.

Along with 183 other former employees, he is challenging his dismissal from Sama, which has announced it will stop doing content moderation work. They are demanding compensation for salaries that were “insufficient and disproportionate to (…) the risk they were exposed to” and for the “damage caused to their mental health”.

This legal offensive began with a complaint filed in a Nairobi court in May 2022 by another former Sama content moderator, Daniel Motaung, who denounced “inhumane” working conditions, deceptive hiring methods, inadequate pay and a lack of psychological support.

Meta, which declined to comment on the details of the cases, told AFP that it requires its subcontractors to make psychological support available 24 hours a day, seven days a week.

Contacted by AFP, Sama said it was “unable” to comment on the ongoing cases.

Trevin Brownie is challenging his dismissal from Sama. Photo: Tony KARUMBA / AFP

Call center

Testimonies collected by AFP in late April from former Sama content moderators – who are among 184 plaintiffs contesting their firings – corroborate the facts alleged by Daniel Motaung.

Two of them, Amin and Tigist (first names have been changed), were hired by Sama in 2019 and said they responded to offers to work in call centers sent to them by acquaintances or recruitment firms.

It was only after signing their contracts, which included confidentiality clauses, that they discovered they would be working as content moderators.

Amin and Tigist did not protest, nor did they even consider leaving. “I had no idea what a content moderator was, I had never heard of it,” says Tigist, an Ethiopian hired for her knowledge of the Amharic language.

“Most of us didn’t know the difference between a call center and a content moderation center,” confirms Amin, who worked on the Somali “market”. But for “the group recruited after us, the job postings clearly mentioned content moderation,” he insists.

“Before they showed us the pictures on the first day of training, they (the trainers, editor’s note) reminded us that we had signed confidentiality clauses,” he says.

“During the training, they downplayed the content. What they showed us was nothing compared to what we would see,” he adds. “After that, the problems started.”

Trauma

For eight hours a day, content scrolled across their screens, each piece more shocking than the last.

“We don’t choose what we see, it happens randomly: suicide, violence, child sexual exploitation, nudity, incitement to violence…” says Amin.

They were given an “average turnaround time” of 55 to 65 seconds per video, they say, or between 387 and 458 “tickets” viewed per day. Working too slowly meant risking a reprimand or, ultimately, dismissal.

Meta, for its part, assured AFP in an email that content moderators “are not required to rate a set number of posts, have no quotas, and are under no obligation to make hasty decisions.” “We allow and encourage the companies we work with to give their employees the time they need to make a decision,” it said.

None of the three content moderators interviewed by AFP could have imagined the impact this work would have on them.

They have not consulted a psychologist or psychiatrist, for lack of funds, but all report symptoms of post-traumatic stress disorder and new difficulties in their social interactions or with loved ones.

Trevin Brownie says he is “afraid of children because of the child soldiers, because of the brutality I’ve seen committed by children,” and of crowded places, “because of all the videos of attacks that I’ve seen.” “I used to love parties,” he says. “I haven’t been to a club in three years now. I can’t, I’m scared.”

Amin, now slight of build, says he has seen the impact on his body, going from 96 kg when he started the job to “69-70 kg” today.

All of them describe having become numb to death and horror. “My heart has turned to stone,” Tigist sums up.

When contacted by AFP, neither Sama nor Meta would comment on the ongoing cases. Photo: ALAIN JOCARD / AFP

“Needed the money”

Meta told AFP it has “clear contracts with each of our partners that detail our expectations in a number of areas, including the availability of personalized advice and additional support for those exposed to more difficult content.”

“We require all companies we work with to provide 24/7 on-site support with trained doctors, on-call service and access to private health care from day one,” the company said.

According to the content moderators, the support provided by Sama’s “wellness counselors” fell far short. They describe vague interviews with no real follow-up, and question the confidentiality of the exchanges.

“It was no use. I’m not saying they weren’t qualified, but I think they weren’t qualified to deal with people who do content moderation,” Amin said.

Despite their trauma, they stayed because they “needed the money.”

With a salary of 40,000 shillings (270 euros) – and 20,000 extra for non-Kenyans – they earned almost three times the Kenyan minimum wage (15,200 shillings).

“From 2019 to today I have never had the opportunity to find another job elsewhere, although I have applied a lot. I had no choice. That’s why I stayed so long,” says Amin.

“Sacrifice”

To hold on, the moderators had to find “defense mechanisms,” explains Trevin Brownie.

Some turn to drugs, particularly cannabis, say the moderators interviewed.

Trevin Brownie, formerly a comedy lover, threw himself into horror films. “It was a way of blurring my reality. It allowed me to imagine that what I was dealing with (at work, editor’s note) wasn’t real, even though it was,” he reflects, explaining that he also developed an “addiction” to violent images.

“But one of our main defenses was that we believed in the importance of this work,” he adds. “I felt like I was hurting myself, but for the right reasons. (…) That the sacrifice was worth it for the good of society.”

“We are Facebook’s first line of defense, (…) like the police of social networks,” he explains, citing in particular the removal of ads for drug sales or the deletion of “targets” placed on people facing death threats or harassment.

“Without us, social networks cannot exist,” he adds. “No one would open Facebook if it were full of shocking content, drug sales, extortion, harassment…”

“We deserve better”

“It hurts and we sacrifice ourselves for our community, for the world. We deserve to be treated better,” says Tigist.

None of them would sign up for this job again. “My personal opinion is that no human should do such a thing. It’s not a human job,” explains Trevin Brownie. “Honestly, I wish artificial intelligence could do this job.”

Despite tremendous progress, he doubts that this will be possible in the near future.

“Technology plays and will continue to play a central role in our content verification operations,” Meta told AFP.

Until now, none of them had spoken about this work, not even with their families: because of the confidentiality clauses, but also because “nobody can understand what we’re going through”.

“For example, if people find out that I’ve seen pornography, they will judge me,” Tigist explains.

She kept her husband in the dark about her work and hid everything from her children: “I don’t want them to know what I did. I don’t even want them to imagine what I saw.”