Meta thinks Facebook could use more “harmful misinformation.” [Updated]

The US continues to struggle with pandemic management. Where cases are currently rising, some cities and counties are considering reintroducing mask requirements, and many hospitals are facing chronic nursing shortages.

However, despite new concerns and a recent spike in daily deaths in the US and around the world, Meta is already considering what a return to normal might look like. That includes recent speculation that normality could mean it’s time to return to the company’s heyday, when health misinformation circulated freely in posts on Facebook and Instagram.

On Tuesday, Meta’s President of Global Affairs, Nick Clegg, wrote in a statement that Meta is considering whether Facebook and Instagram should continue to remove any posts promoting untruths about vaccines, masks, and social distancing. To help it decide, Meta is asking its oversight board to consider whether “current COVID-19 misinformation policies are still appropriate” now that the “extraordinary circumstances early in the pandemic” have passed and many “countries around the world” aspire to a more normal life.

Clegg says Meta began removing entire categories of information from the site for the first time during the pandemic, and that created tensions it’s now trying to resolve between two of the company’s values: protecting users’ “freedom of expression and safety.”

“We seek the opinion of the Oversight Board on whether Meta’s current actions to address COVID-19 misinformation remain appropriate under our policy on harmful health misinformation, or whether we should address this misinformation in other ways, for example by flagging or downgrading it, either directly or through our third-party fact-checking program,” Clegg wrote.

The oversight board has already taken up Meta’s request and is accepting public comments here. The board expects “a large volume of submissions.” Once the board has reviewed all submissions and issued its policy recommendation, Meta has 60 days to publicly respond and state how it will or will not act on the recommendations.

However, Meta is not required to abide by the oversight board’s decisions, and even if a move to less stringent content moderation is approved, critics will likely interpret this as Meta seeking cover so that any easing of restrictions is not perceived as an internal decision.

Why change policy now?

Clegg told The Verge that Meta is seeking advice from the oversight board now because “it can take months for the oversight board to produce an opinion,” and the company wants feedback so it can be “more prudent” about moderating content during future pandemics.

Long before Facebook changed its name to Meta, the company spent the year leading up to the pandemic “taking steps” to stop the spread of anti-vax misinformation. Those steps are similar to the ones Clegg now suggests it would be appropriate to return to. In 2019, the company began fact-checking more misinformation posts, limiting the reach of some, and banning misinformation ads.

Then the pandemic began, and research found that despite these moves, anti-vaccination content on Facebook increased and spread faster than official information to neutral audiences who had not yet formed an opinion on the COVID-19 vaccine. Bloomberg reported that this vaccine hesitancy was dangerously amplified during the pandemic, and that Facebook knew it was happening but was motivated by profits not to act quickly. One study showed that the pages with the greatest reach in neutral newsfeeds belonged to “people selling or profiting from misinformation about vaccines.”

Eventually, Congress investigated, and Facebook changed its name and then its policy, declaring that “some misinformation may result in an imminent risk of physical harm and we have a responsibility not to disseminate that content.” The company made it official policy to remove “misinformation on an unprecedented scale,” taking down 25 million pieces of content it would otherwise likely have left up under its free speech policies.

Now, Clegg says Meta has a duty to reconsider whether it acted prematurely by unilaterally deciding to remove all of those posts, so that the next time there’s a pandemic, there will be clearer guidance on how to properly balance freedom of expression against concerns about harmful misinformation. The idea is that Meta’s harmful misinformation policy should only be used to limit the spread of misinformation at times when official sources of information are scarce, as at the beginning of the pandemic, but not anymore.

Meta is basically asking the oversight body to consider: In an era when there are obvious official sources of information, should tech companies have less of an obligation to curb the spread of misinformation?

As more people prepare to mask up to limit transmission in the US, and with vaccine hesitancy still a driving force behind transmission, this question comes prematurely from a platform that has already shown how difficult it is to control the spread of misinformation, even with a total ban on harmful misinformation in place.

Meta did not immediately respond to Ars’ request for comment. (Update: A Meta spokesperson told Ars that “in accordance with our community standards, we remove misinformation in public health emergencies when public health authorities conclude that the information is incorrect and likely to contribute directly to the risk of imminent physical harm.” During the pandemic, “COVID-19 was declared a Public Health Emergency of International Concern (PHEIC),” so Meta “applied this policy to content containing claims related to COVID-19 that public health officials have determined are either ‘false’ or ‘likely to contribute to imminent physical harm.’” Now, the company is seeking input from the oversight board to “review current policies ahead of a potential future pandemic so that we can appropriately adjust those policies.” This month, a World Health Organization COVID-19 emergency panel convened and “concluded unanimously that the COVID-19 pandemic still meets the criteria of an extraordinary event that continues to adversely affect the health of the world’s population.”)