Study exposes gaps in Meta’s policies against COVID-19 misinformation


A new study published in Science Advances has revealed that the COVID-19 vaccine misinformation policies of Facebook, owned by Meta, were not effective in reducing the spread of such content on the platform.

The study, titled “The efficacy of Facebook’s vaccine misinformation policies and architecture during the COVID-19 pandemic,” was published on September 15, 2023.

The study, led by researchers at the George Washington University (GWU) in Washington, D.C., and Johns Hopkins University (JHU) in Baltimore – both in the United States – revealed that Facebook’s efforts were undermined by the core design features of the platform itself. The researchers argued that the platform needs to focus on its design and architecture.

The researchers include David A. Broniatowski (GWU), Joseph R. Simons (United States Department of Health and Human Services, Washington DC), Jiayan Gu (GWU), Amelia M. Jamison (JHU) and Lorien C. Abroms (GWU).

“Our results show that removing content or changing algorithms can be ineffective if it doesn’t change what the platform is designed to do – enabling community members to connect over common interests – in this case, vaccine hesitancy,” they said.

The research noted that misinformation on social media, particularly during a public health crisis like the COVID-19 pandemic, poses serious risks: it fosters distrust in science, undermines public health efforts, and can even incite civil unrest.

While attention is often placed on content and algorithms alone, the study argues that addressing misinformation requires a focus on design and architecture as well.

Despite Facebook’s efforts to remove anti-vaccine content during the pandemic, the study found that engagement with such content remained unchanged or even increased. 

In cases where content was not removed, the research found an uptick in links to off-platform, low-credibility sites and to misinformation on alternative social media platforms, especially within anti-vaccine groups. This content also became more misleading, containing sensationalist false claims about vaccine side effects, it noted.

The study also highlighted “collateral damage”: pro-vaccine content may have been inadvertently removed, leading to increased political polarization around vaccine-related topics. Anti-vaccine content producers were also found to be more effective at coordinating content delivery across Facebook.

The research compared Facebook’s architecture to a building, suggesting that the platform’s design inherently supports certain behaviours, making it challenging to balance public health and safety concerns. It emphasized the need to alter the platform’s architecture to address these issues effectively.

It proposed that social media platform designers collaborate to establish “building codes” informed by scientific evidence to reduce online harms. These codes would resemble the regulations governing building design, which prioritize public safety and well-being, and would involve partnerships between industry, government, and community organizations.

The study highlighted the importance of addressing platform design and architecture in combating misinformation on social media, particularly in critical public health contexts like the COVID-19 pandemic.


Nurudeen Akewushola is a fact-checker with FactCheckHub. He has authored several fact checks which have contributed to the fight against information disorder. You can reach him via [email protected] and @NurudeenAkewus1 on Twitter.
