Facebook considers eliminating its COVID-19 misinformation policies, even as dangerous content still spreads on the platform
New data from Media Matters submitted to the Facebook Oversight Board reveals the continued prevalence of harmful COVID-19 misinformation
Meta has asked its oversight board to reevaluate Facebook’s COVID-19 misinformation policies and to weigh in on whether the company should continue to remove or label such content. Media Matters recently provided new data to the oversight board showing the prevalence of harmful COVID-19 content, which typically comes from right-leaning pages.
Under its current COVID-19 misinformation policies, Facebook removes misinformation that is “likely to directly contribute to the risk of imminent physical harm” and reduces the prevalence of other related misinformation that has been deemed false by third-party fact-checkers.
In June, Meta asked its oversight board to advise the company on whether to maintain its policies against dangerous COVID-19 misinformation, claiming that “the COVID-19 situation has evolved.” But massive amounts of inaccurate and harmful content still spread on the platform, transmission remains substantial in communities across the U.S., and many people are still at risk of getting very sick and/or developing long-term conditions if they contract COVID-19.
The company’s oversight board accepted the request last month, kicking off its advisory process and requesting public comments to inform its recommendations. (The oversight board’s policy advisory opinions are not binding; they require only that Meta provide a public response and follow-up actions.)
In response to the board’s request, Media Matters submitted a public comment, calling on the oversight board to consider the prevalence and harm of COVID-19 misinformation and encouraging the continued implementation and improved enforcement of the policies.
The board has asked respondents for comments on Meta’s COVID-19 misinformation policies and for recommendations on whether Meta should continue removing and/or labeling content on these grounds. If anything, the prevalence of COVID-19 and health misinformation on Meta’s platforms and the company’s failure to consistently and adequately enforce its policies against such misinformation warrant continued implementation and improved enforcement of those policies.
The development of vaccines and therapeutic treatments and the evolution of disease variants mean that COVID-19 is less deadly — just as Meta has claimed. Yet, there are still high levels of transmission in communities across the U.S. and people who are at risk of getting very sick and/or developing long-term effects if they contract COVID-19.
Throughout the pandemic, health misinformation played a role in how individuals responded to preventative measures, such as mask wearing, and COVID-19 vaccines. In fact, several studies have found a negative correlation between exposure to misinformation and protective behaviors. And some experts agree, including Food and Drug Administration Commissioner Robert Califf, who recently said that he believes “misinformation is now our leading cause of death.” With more than 3 billion people using its platforms and its vast societal influence, Meta should be dedicated to minimizing the risk of harm from misinformation.
Even when accurate information is available on Meta's platforms, it is often buried by the sheer volume of inaccurate and harmful content — typically from right-leaning pages — that the company has been unable to control.
Media Matters also informed the board about its new data, which reveals the prevalence of COVID-19 misinformation among right-leaning Facebook pages as different viral strains spread across the country. Media Matters compiled and analyzed nearly 2.9 million posts related to COVID-19 from news and politics pages since January 1, 2020, and found that:
- The proportion of posts with COVID-19 misinformation from right-leaning pages compared to posts about COVID-19 from all news and politics pages increased as new variants spread across the U.S.
- For right-leaning pages, the percentage of posts containing COVID-19 misinformation relative to all of their posts about COVID-19 increased from 3.4% during the height of the original COVID-19 outbreak and alpha strain to over 5.7% during the omicron strain.
- When compared to posts about COVID-19 from all news and politics pages, the percentage of posts containing COVID-19 misinformation from right-leaning pages increased from 0.6% during the height of the original strain and alpha variant to over 1% during the omicron strain. The shares of interactions earned by these posts followed a similar pattern.
- Posts with misinformation from right-leaning pages also earned more interactions on average, across strains, than posts about COVID-19 from either right-leaning pages generally or all news and politics pages.
- Between January 1, 2020, and April 30, 2021 — when the original COVID-19 strain and alpha variant were the dominant strains in the U.S. — right-leaning pages earned roughly 4,100 average interactions per post containing COVID-19 misinformation. During the same time frame, posts about COVID-19 from right-leaning pages earned nearly 2,400 average interactions per post, and posts about COVID-19 from all U.S. news and politics pages earned over 1,600 average interactions per post.
- Similar patterns were observed when the delta and omicron strains were the dominant strains in the U.S.
Even if the oversight board recommends that Facebook keep its COVID-19 policies, there is no guarantee the recommendation will be implemented: the board’s policy advisory opinions are not binding, regardless of CEO Mark Zuckerberg’s latest claim that the board “gets to make the final binding decision.” Additionally, there have been obvious problems with, and glaring loopholes in, Facebook’s previous implementation of board recommendations; the board itself has admitted that it lacks data “to verify progress on or implementation of the majority of recommendations.”
During the first quarter of 2022, the Facebook Oversight Board made 22 recommendations to the company. Of these recommendations, Meta said it would fully or partially implement only 10 of them, while still “assessing feasibility” on seven and refusing to take any action on five.
Meta’s request for nonbinding recommendations on whether to eliminate its COVID-19 misinformation policies is the latest example of the company’s willful ignorance about the harm of its platforms and the prevalence of that harm.