Meta told it should revamp the moderation system for high-profile users like Donald Trump


Meta has been told its treatment of high-profile users like former US President Donald Trump has left dangerous content online that serves business interests at the expense of its human rights obligations.

A damning report released on Tuesday by the company’s oversight body — a “Supreme Court”-style panel created by the parent company of Facebook, Instagram and WhatsApp to rule on sensitive moderation issues — slammed the social media giant and demanded “significant” changes to its internal system for reviewing content from politicians, celebrities, and its business partners.

The panel, which began evaluating cases last year, provides independent judgments on high-profile moderation cases, as well as recommendations on specific policies; responsibility for acting on those recommendations lies with the tech giant’s policy chief, former UK Deputy Prime Minister Sir Nick Clegg.

The board was asked to investigate the system after The Wall Street Journal and whistleblower Frances Haugen revealed its existence last year and raised concerns that Meta favored elite figures.

Clegg also has until Jan. 7 to decide whether to allow Trump back on the platform following a separate board recommendation.

After a lengthy investigation spanning more than a year, the oversight board has requested that Meta scrutinize who is on the so-called “cross-check” list and be more transparent about its review procedures.

The report is one of the most thorough investigations into moderation issues at Meta yet, as the independent panel — made up of 20 journalists, academics and politicians — grappled with concerns it has little power to hold the company accountable.

It puts further pressure on CEO Mark Zuckerberg to ensure Meta’s content is fairly moderated, a month after he announced plans to lay off 11,000 employees amid falling revenue and growth.


Meta has already started to overhaul the system. In a Tuesday blog post, Clegg said it was originally designed to “double check cases where there might be a higher risk of failure or when the potential impact of failure is particularly severe.” He added that the company has now developed a more standardized system with further controls and annual reviews.

It remains unclear how many people are on the secret list. The Wall Street Journal, which first reported on the list, estimated there were 5.8 million users listed as of 2020. Meta has previously said it was 666,000 as of October 2021.

The system meant content posted by well-known figures like Trump and Elizabeth Warren would remain on the platforms until human moderators reviewed it, even if the messages would have been automatically removed had they been posted by a regular user.

On average it took five days for this human review to take place, with the content remaining on the platform during that time; in one case, the report found, the delay stretched to seven months.

Meta’s “own understanding of the practical implications of the program was lacking,” the board said, adding that the company had not assessed whether the system worked as intended.

The board also accused the company of providing “inadequate” responses during the probe, sometimes taking months to answer its questions.

The board referred to a Wall Street Journal report that detailed how Brazilian footballer Neymar posted non-consensual intimate pictures of another person to his Facebook and Instagram accounts, which garnered more than 50 million views before they were removed. According to Meta, this was due to a “delay in reviewing the content due to a backlog at the time.”

Thomas Hughes, director of the oversight body, said the Neymar incident was an example of how business partnerships could impact moderation processes.

“It raises concerns … about the relationships between individuals in the company and whether that might affect decision-making,” he said.

“There was probably a mingling of different interests within this cross-check process,” he added.

The report follows previous public tensions between the board and Meta, after the former accused the social media company of withholding information about the system in September 2021. Many see the oversight board as an attempt to create distance between company executives and difficult decisions surrounding free speech.

Meta now has 90 days to act on the recommendations.

© 2022 The Financial Times Ltd. All rights reserved. May not be redistributed, copied or modified in any way.