Facebook has been working hard to spin recent press around the company in its favor
Written by Camden Carter
Research contributions from Kayla Gogarty & E. Rosalie Li
Facebook has once again found itself trying to downplay and dismiss bad press stemming from bombshell whistleblower testimony, leaked documents, and recent reporting on how its products harm users and foster misinformation.
On September 13, The Wall Street Journal released the first installments of The Facebook Files, a nine-part series drawn from internal documents showing how Facebook’s platforms have harmed teens, proved ineffective at stopping the spread of vaccine misinformation, and struggled to rein in divisive content, among a host of other issues. The whistleblower behind these documents was later revealed to be Frances Haugen, who worked at Facebook for nearly two years as a product manager on a team tasked with countering election interference. Haugen testified in a Senate hearing on October 5, reiterating that during her time at the company, she saw a pattern of Facebook executives ignoring evidence of the harm caused by their products.
In the weeks since the release of the Wall Street Journal’s reporting and the resulting Senate hearing, Facebook and its spokespeople have been on the defensive, desperately trying to salvage their company’s already tarnished reputation. Here are a few narratives Facebook and its spokespeople have pushed, misrepresenting the company’s history and obscuring the harms it’s caused.
Discrediting whistleblower Frances Haugen
Haugen, who worked at Facebook for nearly two years, provided slides of Facebook’s internal research to back up her claims about the platform’s poor content moderation, incentivization of sensational content, and lack of transparency. Her statements are also consistent with Media Matters' own research, as well as the research of many other journalists and organizations.
Nevertheless, her former employer has repeatedly attacked her credibility on Twitter, led by Facebook Policy Communications Director Andy Stone:



On Sunday, October 10, Facebook’s Vice President of Global Affairs Nick Clegg made multiple television appearances, including on ABC’s This Week, in which he tried to write off Haugen’s statements while making other dubious claims about Facebook’s practices:
From the October 10, 2021, edition of ABC's This Week:
GEORGE STEPHANOPOULOS (HOST): How about going back to, having Instagram go back to the way it was before? Having posts show up in chronological order rather than ranked by an algorithm that focuses on engagement?
NICK CLEGG (VP OF GLOBAL AFFAIRS, FACEBOOK): So we do actually already give people on Facebook the option to just override the algorithm and see posts come in the order in which they’re presented, chronologically. In fact, we have gone further. There are new tools in recent months so you can in effect curate and compose your own news feed by picking out your favorite pages and so on. But here’s the thing. And I heard, I think, from Frances Haugen and her team that for them one of the central recommendations is that you just remove the algorithms that help rank the content, the order in which you see the content on Facebook. If you were just to sort of across the board remove the algorithm, first thing that would happen is people would see more, not less hate speech, more not less misinformation, more not less harmful content. Why? Because those systems precisely are designed like a great sort of giant spam filter to identify and deprecate and downgrade bad content.
Both Clegg and Instagram head Adam Mosseri have claimed that removing the content-ranking algorithm would increase the reach of harmful content rather than reduce it. While only Facebook knows exactly how its algorithms function, several studies and reports have shown that Facebook’s platforms promote plenty of harmful content and that its ranking systems may not be all that effective at reducing it.
During his appearance on NBC’s Meet the Press, Clegg also disputed Haugen’s statement that certain protections Facebook put in place prior to the 2020 election were removed immediately afterward. Clegg claimed that an example of Facebook’s continued protections is that the platform has permanently stopped recommending political groups to users. However, reporting by The Markup in January and again in June showed that the company did not follow through on this action, which Facebook CEO Mark Zuckerberg had initially promised in October 2020.
From the October 10, 2021, edition of NBC's Meet the Press:
CHUCK TODD (HOST): Welcome back. On Tuesday the Facebook whistleblower, Frances Haugen, told Congress that in the interest of making more money, Facebook eased some security safeguards just after the election, and that it ended up helping to incite the January 6 Capitol riot.
(VIDEO BEGINS)
FRANCES HAUGEN (FACEBOOK WHISTLEBLOWER): ... and Facebook changed those safety defaults in the run up to the election because they knew they were dangerous, and because they wanted that growth back, they wanted the acceleration on the platform back after the election, they returned to their original defaults.
(VIDEO ENDS)
TODD: Well, joining me now is Facebook's vice president for global affairs, Nick Clegg. Mr. Clegg, welcome to Meet the Press, sir.
NICK CLEGG: Morning.
TODD: Thank you. I want to get you to respond to that specific quote from Ms. Haugen but I also want to put something up that you wrote after her initial 60 Minutes appearance, you said, “This is also why the suggestion that is sometimes made that the violent insurrection of January 6 would not have occurred if it was not for social media is so misleading. Mature democracies in which social media use is widespread hold elections all the time, for instance Germany's election last week.” Now, two weeks ago, “without the disfiguring presence of violence.” I understand why you wrote that sentence, Mr. Clegg. But why put in the safeguards before the election, if you didn't think -- if you guys at Facebook didn't think you had a role in potentially inciting folks?
CLEGG: So just for folks who don't sort of follow this very closely, what we did in the run up to the election, where we put in obviously, because it was an exceptional election happening at the time of a pandemic, the obviously very stark polarization in this country, put in a number of exceptional measures, it's simply not true to say that we lifted those measures immediately. We in fact kept the vast majority of them, right through to the inauguration, and we kept some in place permanently so for instance we permanently now don't recommend civic and political groups to people.
Facebook’s official statement, written by its Director of Policy Communications Lena Pietsch and posted to Stone’s Twitter, also directly attacked Haugen’s credibility:

Claiming that Facebook has been asking for regulation
Facebook has been boasting of its calls for internet regulation as another tactic to offload blame in the aftermath of the Wall Street Journal reports and Haugen’s statements. The company has pushed the narrative that it has been calling for regulation for years but is also “not waiting on Congress” and has “made progress on issues like data portability, safety and security.” By doing so, Facebook reinforces the idea that it is already going above and beyond in making its platforms safe and secure for users. However, critics have pointed out that the regulations Facebook is calling for wouldn’t have a substantial impact on the platform.

On October 12, USA Today published an op-ed by Clegg, in which he laid out how Facebook has been “advocating for new rules for several years.” He beat the same drum during his appearances on NBC’s Meet the Press and ABC’s This Week on October 10.
As Sen. Amy Klobuchar (D-MN) discussed in the Senate hearing, tech lobbyists and the money they spend have been an impediment to passing legislation around these issues. She pointed out how this has hampered the antitrust subcommittee’s ability to pass legislation around consolidation, which has allowed companies like Facebook to act like “bullies in the neighborhood, buy out the companies that maybe could have competed with them, and added the bells and whistles.” (Additionally, Facebook has spent over $9.5 million on federal lobbying in just the first half of this year, calling into question the sincerity of its public support for greater internet regulation.)



Ultimately, the regulations Facebook claims to have been gunning so hard for are not those that would make the most impact. They are predominantly measures the company is already taking, centered primarily on content rather than advertising -- meaning they wouldn’t affect its bottom line. On its website, Facebook identifies the regulations it deems most necessary, and while these do include support for regulating political advertisements, the company has reportedly lobbied against similar legislation at the same time.
Claiming that Facebook works hard to address the issues with its platform
One of the most common themes in Facebook’s current PR blitz is an attempt to emphasize that it is working very hard and doing the best it can to address the issues brought to light by Haugen and The Wall Street Journal. These claims have been debunked by numerous experts who continually make recommendations on how Facebook could address the harm its platform causes -- recommendations the company ignores. This framing seems geared toward deflecting responsibility for the harm the platform has caused without making real commitments -- a tactic frequently used by Facebook, as we saw in the Senate hearing on September 30.
Despite these claims about all the hard work Facebook has put into addressing the issues with its platforms, the company does nothing to address core flaws such as its business model, which prioritizes user engagement over user safety.



On October 8, Stone retweeted a lengthy Twitter thread by David Gillis, a director of product design at Facebook, reiterating the same sentiment -- that Facebook is working hard and that we should be giving it more credit:

Additionally, as part of his media tour on October 10, Clegg made many similar statements implying that Facebook is doing the best it can to address very tricky issues -- which, let us not forget, it has also caused:
From the October 10, 2021, edition of NBC's Meet the Press:
NICK CLEGG (VP OF GLOBAL AFFAIRS, FACEBOOK): Hate speech, the prevalence of hate speech, the presence of hate speech on Facebook now stands at 0.05%. That means for every 10,000 bits of content you'll see on Facebook, only five will be hate speech. I wish we could bring it down to zero. We're not going to be able to do that. With a third of the world's population on our platforms, you'll see the good, bad and ugly of human nature on our platforms. Our job is to mitigate and reduce the bad and amplify the good. And I think those investments, the technology and the evidence of how little hate speech there is now compared to a few years ago shows we're moving in the right direction.
CHUCK TODD (HOST): I want to go to the issue of how to regulate Facebook. The founder and CEO wrote this, Mark Zuckerberg. He said, "Similar to balancing other social issues, I don't believe private companies should make all of the decisions on their own. That's why we have advocated for updated internet regulations for several years now. We're committed to doing the best work we can. At some level the right body to address trade offs between social equities is our democratically elected Congress." On one hand this is a very reasonable statement; on the other hand it sounds like Facebook isn't going to do much until Congress tells us what to do. Do you want Congress to write Facebook's moral and ethical code?
CLEGG: No, no, no. We're not advocating regulation to divest ourselves of our own responsibilities. Of course with the success of a big global platform like Facebook, comes accountability, scrutiny, criticism and comes responsibility. That's why we make those very considerable investments that I said. That's why we are being evermore transparent in how our systems operate so people can hold us to account. We're the first company, for instance, every 12 weeks to publish data on all the content we act on, that we remove, and subject to independent audit. There are certain things that no private company can do. Only lawmakers can pass federal privacy legislation. We don't have nationwide privacy legislation in this country which we clearly need. You do have it in other jurisdictions in Europe, but not here. Only lawmakers can pass legislation to strike the right balance so if people move data from one platform to the other, which is good for competition, you strike the right balance with the privacy safeguards which should be in place at the same time. That has to be enshrined in law. Only lawmakers can create a digital regulator which we believe would be a good thing.
So absolutely, you're right. We're not saying this is somehow a substitution for our own responsibilities, but there are a whole bunch of things that only regulators and lawmakers can do. And at the end of the day, I don't think anyone wants a private company to adjudicate on these really difficult tradeoffs between free expression on the one hand and moderating or removing content on the other. About which, as you know, there is fundamental political disagreement. The right thinks we take down too much content, we censor too much content. The left thinks we don't take down enough. In the end, we make the best judgment we possibly can. But we're caught in the middle in this political debate. In the end, lawmakers have to resolve that themselves.
Clegg’s last point reinforces the perception that Facebook is working to meet demands from users on both sides of the political aisle, and it implies that these demands are equally valid. Sure, the right may claim that conservative voices are being censored, but Media Matters' research and other reporting have consistently shown that not only is conservative content not censored on Facebook, it often gets greater reach than left-leaning content.
Claiming The Wall Street Journal misrepresented Facebook’s research
Facebook spokespeople have been repeating the claim that The Wall Street Journal mischaracterized the leaked Facebook internal research. This sentiment is both frustrating and unsurprising, considering Facebook has a record of carefully curating what information it chooses to release to the public.
In a Twitter thread on September 18, Clegg said that while Facebook should be held accountable, The Wall Street Journal’s series “contains deliberate mischaracterizations of what we are trying to do & confers egregiously false motives to our leadership & employees. Its central allegation is just plain false: that we systematically & willfully ignore research that is inconvenient.”
Stone made similar comments on Twitter:

The company did eventually provide the documents -- with “its own running commentary” -- about its findings on Instagram’s impact on teens. But many lawmakers and experts disagree with Facebook’s conclusion that its findings don't demonstrate a significant negative impact. Following the initial Wall Street Journal reporting, Instagram itself announced that it would pause development of Instagram Kids.
On October 5, Zuckerberg himself addressed the whistleblower statements in a Facebook post, which hit on many of the narratives already discussed, claiming that the statements Haugen made were “illogical,” that Facebook needs and wants regulation, and that the research was largely mischaracterized by The Wall Street Journal.
Pushing generally positive PR about Facebook
Throughout the bad news cycle, Facebook has taken the tried-and-true approach of simply plowing ahead, pushing positive stories about the company in front of its users.
Several high-level employees at Facebook tweeted out stories about the positive impact Facebook is having around the world.

Multiple official Facebook accounts also shared positive messages about the company.

Facebook has also continued to run paid ads on its own platform that push more good press in front of its users.

While Facebook continues to frantically spin the narrative and push a positive image, the reality is that the company has allowed its platform to foster division and instability for years by failing to adequately address issues such as hate speech, violence, election misinformation, and COVID-19 vaccine misinformation, among many others.
In the company’s early days, its team operated under Zuckerberg’s motto: “Move fast and break things.” While Facebook has since distanced itself from the phrase, it seems incapable of taking the desperately needed step of slowing down and fixing things, preferring to focus its energy on its public image rather than on the ways its platform is harming the public.