Policies implemented at Facebook to stop the spread of misinformation reduced users’ interactions with vaccine misinformation, according to research published in Vaccine.
Researchers at George Washington University examined whether new policies Facebook established in 2019 to stop the spread of vaccine misinformation actually worked.
Researchers identified 172 anti- and pro-vaccine Facebook Pages and collected posts from these Pages for six months before and after the policy went into effect. The study found that Facebook’s March 2019 vaccine misinformation policy moderately curtailed the number of “likes” of anti-vaccine content on Pages on its platform.
“There is a growing consensus that health misinformation on social media platforms presents a critical challenge for public health, especially during the COVID-19 pandemic,” said Lorien Abroms, professor of Prevention and Community Health at GWU. “While new policies by social media companies are an important first step to stopping the spread of misinformation, it is also necessary to rigorously evaluate these policies to ensure they are effective.”
Researchers concluded that social media companies can take measures to limit the popularity of anti-vaccine content on their platforms.
“This research is a good first step in developing a process to evaluate the effectiveness of social media policies that are created to stop the spread of misinformation,” said Jiayan Gu, PhD student at GWU. “We are excited to continue this work and grow our understanding of how social media policy interventions can positively change online information sharing ecosystems.”