
According to the study, Facebook’s design makes it easier for anti-vaxxers to spread vaccine misinformation

A woman at an anti-vaccine protest in Barcelona in 2021. Carles Ribas (EL PAÍS)

Claims like “Vaccines kill more people than Covid” or “Vaccines contain baby tissue” are among the false statements shared by Facebook users since March 2020. To combat vaccine misinformation, Facebook removed such posts from the platform, but it could not eliminate the interest that this type of content generated on its social network. Anti-vaccine users took advantage of Facebook’s architecture to forge new paths to vaccine misinformation: their content became “more misinformative, more politically polarized, and more likely to appear in users’ news feeds,” according to a study published this Friday in the journal Science Advances.

The study’s researchers used CrowdTangle, a tool owned by Facebook’s parent company Meta, to download the company’s public data. They analyzed 200,000 posts from Facebook pages and groups created before the platform took down Stop Mandatory Vaccination, one of the largest anti-vaccination pages on the social network, in November 2020. On December 3, 2020, Mark Zuckerberg’s company announced that it would remove false claims about Covid vaccines, and the accounts that posted them, from the platform. Meta has since removed more than 27 million pieces of content, according to Bloomberg, although the company no longer wants to remove false Covid claims.

According to Statista, the world’s largest social networking platform, with 2.073 billion active users, is characterized by a high degree of flexibility in adapting content to users’ needs. Its “layered hierarchy” design, consisting of pages (top), groups (middle) and individual users (bottom), provides alternative routes to anti-vaccination content and makes it easier to access, the study says. “When a page’s content is deleted, the layered hierarchy allows users and groups to find similar (or even the same) content posted on another page,” David Broniatowski, a professor at George Washington University and the study’s lead author, explained to EL PAÍS. In other words, the administrators of these pages can share content with links to other pages.
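As a rough illustration of the mechanism the study describes (this is an invented toy model, not Meta’s actual data structures; all names are hypothetical), a few lines of Python show how content hosted at several layers survives the removal of a single page:

```python
from collections import defaultdict

class ToyFeedGraph:
    """Toy model of the 'layered hierarchy': the same content can be
    hosted by pages (top), groups (middle) or users (bottom) at once."""

    def __init__(self):
        # content_id -> set of sources (pages, groups, users) still hosting it
        self.hosts = defaultdict(set)

    def post(self, source: str, content_id: str) -> None:
        self.hosts[content_id].add(source)

    def remove_source(self, source: str) -> None:
        # Moderation takes down one source; copies elsewhere survive.
        for sources in self.hosts.values():
            sources.discard(source)

    def reachable(self, content_id: str) -> bool:
        return bool(self.hosts[content_id])

g = ToyFeedGraph()
g.post("page:StopMandatoryVaccination", "claim-123")
g.post("group:vaccine-skeptics", "claim-123")     # a group re-shares the same claim
g.remove_source("page:StopMandatoryVaccination")  # the page is taken down
print(g.reachable("claim-123"))                   # True: still reachable via the group
```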


It is there, on Facebook pages and groups, that the political information consumed by a fundamentally conservative audience is shared. Spanish researcher Sandra González-Bailón, of the University of Pennsylvania, explained it to EL PAÍS in August: “In terms of volume, there are many more users than pages or groups. However, if we remove pages and groups, there is less separation. There’s something intuitive about it, because you don’t choose your family members, but you do decide which pages and groups to follow. There is more self-selection. But it is also a decision of platform design and its controls. We see that pages and groups are creating more division instead of helping to build bridges,” she explains.

In addition to design, reactions can also promote anti-vaccination content. The more likes or angry reactions a Facebook post receives, the more likely it is to appear in other users’ news feeds because of its “meaningful social interaction,” the researchers said. When anti-vaccine content creators manipulate these reactions, they increase the visibility of their content on the network. Asked about the study, Professor David García of the University of Konstanz (Germany) points out that “reducing the value of the angry emoji in medical misinformation is a good idea, because we know that expressions of moral outrage increase polarization.” Although Facebook reduced the weight of angry reactions to zero in September 2020, the study reports that the rate of engagement with false content did not decrease compared to interactions before the removal policy.
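To picture how such reaction weighting might work (the weights below are invented for illustration; Facebook’s real ranking formula is not public), here is a minimal sketch:

```python
# Hypothetical weights; zeroing out "angry" mirrors what Facebook
# reportedly did in September 2020.
WEIGHTS = {"like": 1.0, "love": 1.0, "angry": 0.0}

def engagement_score(reactions):
    """Weighted sum of reaction counts; higher scores surface a post in more feeds."""
    return sum(WEIGHTS.get(kind, 0.0) * count for kind, count in reactions.items())

# A post flooded with coordinated angry reactions no longer gains visibility from them.
print(engagement_score({"like": 120, "angry": 500}))  # 120.0
```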

Rafael Rubio, an expert in disinformation at the Complutense University of Madrid, explains that Facebook’s algorithm favors disinformation because “it reduces the plurality of information received and multiplies the exposure to similar information.” To counter this, he proposes certain changes to the platform’s rules: “The complaints procedure can be improved, its transparency increased and the volume of news dissemination reduced, which directly affects the algorithm.”

The study and the experts agree that the social network needs to review its policies against misinformation to avoid risks, and they propose alliances between the major platforms to combat the problem. Broniatowski suggests that social media designers work together to develop a “building code” that promotes the health of everyone on the network, comparing their task to that of housing architects. “Building architects must balance a property’s design goals with code compliance to protect the people who use it,” he concludes.
