Three times Mark Zuckerberg misled Congress about extremism and misinformation on Facebook
What Facebook says it’s doing and what it is actually doing can sometimes be two very different things
On Thursday, Facebook CEO Mark Zuckerberg, Google CEO Sundar Pichai, and Twitter CEO Jack Dorsey testified before the House Energy and Commerce Committee during a hearing on “social media’s role in promoting extremism and misinformation.”
Overall, the hearing was pretty tame. Zuckerberg stood out, however, both for what he said and for what he didn't say when responding to questions about harmful content, radicalization, and whether Facebook has profited from ads containing false information.
When asked whether Facebook played a “leading role” in the January 6 insurrection, Zuckerberg placed blame entirely on the actual participants. At another point, Zuckerberg argued that Facebook was cited so frequently in charging documents related to the Capitol attack because the company worked closely with law enforcement to help identify many of the insurrectionists -- not necessarily because Facebook served as a staging ground. These are less statements of fact than deflections, and they occupy a murkier middle ground.
But Zuckerberg made three statements that were either outright false or missing such important context that they deserve their own individual fact checks.
Claim: Facebook removes harmful content
In Zuckerberg’s opening testimony, he said: “At Facebook, we do a lot to fight misinformation. We remove content that could lead to imminent real-world harm. We’ve built an unprecedented third party fact-checking program. And if something is rated false, then we add warning labels and significantly reduce its distribution.”
While Facebook has claimed this for some time, its enforcement has left much to be desired -- making the platform a veritable hotbed of often dangerous misinformation.
- During a November 17 Senate Judiciary Committee hearing, Zuckerberg testified that Facebook “performed well” during the 2020 election. But Media Matters research has shown that Facebook allowed the Trump campaign and right-wing media to spread viral misinformation and even profited off of it. [Media Matters, 11/20/20]
- As votes were being counted in the days following the 2020 election, Facebook became a hub for misinformation and a staging ground for protests calling for ballot counting to stop. [Media Matters, 11/5/20]
- Anti-mask Facebook groups rife with dangerous misinformation about the spread and prevention of COVID-19 flourished in 2020. Media Matters was able to identify at least 55 groups on the platform dedicated to anti-mask protests. [Media Matters, 7/9/20]
- Though it claimed to be “searching for and removing” misinformation and content praising the January 6 Capitol insurrection, Facebook nonetheless served as a haven for lies that “antifa” were behind the assault. [Media Matters, 1/11/21]
- Roughly a quarter of former President Donald Trump’s 2020 Facebook posts contained COVID-19 misinformation, election lies, or extreme rhetoric about his critics. [Media Matters, 2/18/21]
- Ahead of the January 6 Capitol insurrection, Facebook enabled right-wing media outlet The Epoch Times to spread misinformation about the election results. This lie formed the basis for the deadly “Stop the Steal” movement, but Facebook did almost nothing about it. [Media Matters, 3/1/21]
Claim: Facebook doesn’t allow ads that have been flagged as false to run on its platform
Asked by Rep. Jan Schakowsky (D-IL) about whether he believes that Facebook is exempt from liability for ads containing false information, Zuckerberg claimed that “any ad that has been fact-checked as false, we don’t allow to run as an ad.”
Zuckerberg’s response omits the important context that political ads are exempt from fact-checking. And even on ads that are subject to fact checks, Facebook has a spotty record of enforcing its policy against misinformation.
- Facebook ads and Instant Articles monetized a page that pushed plagiarized content and false news, including the Pizzagate and QAnon conspiracy theories. [Media Matters, 8/1/18]
- Facebook refused to take down a false Trump ad spreading misinformation about Ukraine after a request from Joe Biden's campaign. [Media Matters, 10/9/19]
- Facebook let a pro-Trump super PAC lie repeatedly about the Biden family -- even though its own fact-checkers had debunked the claim. [Media Matters, 1/22/20]
- Facebook allowed the Trump campaign to publish at least 529 ads with false claims of voter fraud, laying the groundwork to contest the 2020 election results. [Media Matters, 5/19/20]
- Facebook profited off of the QAnon conspiracy theory by allowing Q-related ads. [Media Matters, 7/22/20]
- Facebook allowed a pro-Trump PAC to run hundreds of ads with false information about Biden, earning over 5.7 million impressions. [Media Matters, 8/4/20]
- After Facebook claimed it was removing ads with voting misinformation, Media Matters was able to find at least 80 active ads on the platform that had earned more than 2 million impressions as of October 1 -- despite appearing to violate Facebook’s policy. [Media Matters, 10/1/20]
- Facebook allowed the Trump campaign to run ads touting free access to an unproven COVID-19 treatment after the president called it a “cure.” [Media Matters, 10/14/20]
- Facebook allowed the Trump campaign to run thousands of ads with manipulated photos of Biden in the months preceding the 2020 election. [Media Matters, 10/15/20]
Claim: Facebook stopped recommending political groups for users to follow
Rep. Adam Kinzinger (R-IL) asked Zuckerberg about studies that showed Facebook’s algorithms were actively promoting “divisive, hateful, and conspiratorial content,” to which Zuckerberg responded by explaining that the company had recently stopped recommending civic and political groups to users.
This isn’t necessarily a lie, but Zuckerberg’s response once again leaves out key context that is essential to understanding why these groups posed such a problem in the first place. Facebook groups and pages have served as planning hubs for extremist actions such as the January 6 attack on the U.S. Capitol. While Facebook has tried to prevent its groups from being used that way, its enforcement has lacked teeth.
For instance, in a March 17 update, Facebook said that it would “start to let people know when they’re about to join a group that has Community Standards violations, so they can make a more informed decision before joining” -- when the wiser course would seem to be simply removing groups that violate Community Standards. At another point in the hearing, Zuckerberg acknowledged that not recommending political or civic groups is unlikely to meaningfully decrease engagement, making it an odd point to tout.
- Dozens of Republican Party groups used Facebook to help organize bus trips to D.C. for the pro-Trump insurrection. [Media Matters, 1/12/21]
- A massive Facebook study on users’ doubt in vaccines found a small group appears to play a big role in pushing the skepticism and may be causing “substantial” harm. [The Washington Post, 3/14/21]
- “A new outside report found that Facebook has allowed groups — many tied to QAnon, boogaloo and militia movements — to glorify violence during the 2020 election and in the weeks leading up to the deadly riots on the U.S. Capitol in January.” [The Associated Press, 3/23/21]
- Another report found hundreds of far-right militias are still using Facebook pages and groups to promote violence after the Capitol insurrection. [BuzzFeed News, 3/24/21]