Facebook has repeatedly caved to bogus right-wing pressure

As Facebook refuses to institute real change in response to the Stop Hate For Profit campaign, it's worth looking back at just how much it has catered to the right over the years

Facebook executives Mark Zuckerberg and Sheryl Sandberg met with leaders of the Stop Hate For Profit campaign today. Following the meeting, members of the campaign were clear that Facebook offered “the same old talking points” and intended to take “no real action” in response to the massive campaign in which dozens of companies have pulled advertising from the platform due to Facebook profiting from the spread of demonstrable lies, hate, and racism on its platforms.

Civil rights groups have been clear about what they want from Facebook: an end to hate speech and harassment of Black users on the platform. Even after years of complaints, Facebook’s recent actions have been half-measures at most. When Facebook announced it would apply its anti-hate speech policy to ads, the NAACP blasted the company in a statement, saying that the promises will not “make a dent in the problem.” The statement concluded by noting that Facebook has “made apologies in the past. They have taken meager steps after each catastrophe where their platform played a part. But this has to end now.”

This inaction comes after years of Facebook giving tangible, meaningful concessions to bogus right-wing pressure campaigns. Here are some examples.

Update (8/6/20, 8/10/20, 11/24/20): This post has been updated with additional reporting from BuzzFeed News, NBC News, and The New York Times.

  • Trending Topics

    This is the origin story of Trumpism at Facebook. When Gizmodo published a flimsy report in May 2016 claiming that Facebook employees had “blacklisted” right-wing outlets and stories from the platform’s Trending Topics news section, the tech company sprang into action, quickly arranging a meeting with conservatives and a representative from Donald Trump’s campaign.

    The meeting was reportedly set up by Facebook executives and former Republican operatives Joel Kaplan and Katie Harbath.

    Even though an investigation revealed “no evidence of systematic political bias” in the decisions, the company fired the section’s editors anyway and replaced them with an algorithm that repeatedly surfaced fake and conspiratorial news until Facebook scrapped the section entirely in June 2018. Nowadays, we only know what’s trending on the platform thanks to apps like CrowdTangle -- which Facebook bought and now limits access to. And the answer to what is trending, more often than not, is right-wing sources.

  • Featuring partisan right-wing fact-checkers

    2016 saw an explosion of blatant hoaxes and fabrications on Facebook. After a pressure campaign, Zuckerberg said that he would allow fact-checkers on the platform in a limited capacity. Right-wing media immediately and repeatedly protested. In response to the complaints, Zuckerberg allowed two right-wing media organizations to serve as fact-checkers on the site: The Daily Caller and The Weekly Standard.

    While the Standard would later be shut down by its owner for being insufficiently pro-Trump, The Daily Caller’s Check Your Fact division continues to serve as a fact-checker on the platform to this day, and it serves a political purpose: When Trump went on a rant at a March campaign rally, downplaying the seriousness of the coronavirus outbreak and its potential to reach pandemic status as a “hoax” by Democrats, Check Your Fact used its status to get posts about his comments from Politico and NBC News marked as “false information.” No nonpartisan fact-checker, as far as I am aware, had a similar finding.

    Meanwhile, Facebook has shown that it will disregard fact checks when right-wing media complain about them. Recent reporting reveals that Facebook gave a special exemption from fact checks for climate crisis denial, and Facebook completely caved in late 2019 regarding a right-wing propaganda video about abortion, explicitly overruling medical professionals who fact-checked it.

    Zuckerberg is complicit in all of this. He misled Congress when asked about the fact-checking program in 2019. When Rep. Alexandria Ocasio-Cortez (D-NY) asked Zuckerberg why The Daily Caller is included in the Facebook fact-checking project given the publication is “well-documented with ties to white supremacists,” Zuckerberg demurred, saying that Facebook doesn’t appoint the fact-checkers. That’s just false: Facebook requires independent certification by the International Fact-Checking Network, but the company itself appoints the fact-checkers, not the IFCN; to wit, there were at the time 62 companies with that certification, but Facebook had allowed only six to serve as fact-checkers in the U.S.

  • Including Breitbart as a trusted news source

    In October 2019, Facebook included the far-right website Breitbart as part of a new initiative to curate news on the platform, despite supposed standards concerning misinformation and community guidelines against hate speech.

    To list all of Breitbart’s racism and ties to white supremacy would take all day. In addition to its white nationalism and hate speech, the site also failed in 2017 to prove that it was editorially independent from key Trump supporters.

    In 2018, Facebook’s Kaplan personally made an effort to “protect” Breitbart's account from being demoted in the news feed, and Zuckerberg himself agreed.

    In announcing the News Tab in 2019, Zuckerberg personally defended including Breitbart, saying that Facebook “needs to have a diversity of, basically, views in there. ... You want to have content that represents different perspectives but is doing so in a way that complies with the standards that we have for this.”

  • Giving up on fighting misinformation and instead boosting right-wing sources

    Just days ago, The Washington Post reported that in response to right-wing complaints, Facebook abandoned efforts to combat misinformation and instead focused on boosting right-wing sources in its news feed algorithm, specifically to “neutralize claims that it was biased against conservative publishers.” Former Facebook spokesperson Nu Wexler told the Post that Trump’s baseless claim of anti-conservative bias “succeeded in getting [platforms] to revise their rules for him.”

    Earlier in 2020, the Post reported that Kaplan personally defended keeping up a network of fake news sites, saying, “We can’t remove all of it because it will disproportionately affect conservatives,” and that prominent right-wing figures did not believe that the content they pushed was fake news. A former Facebook employee told the Post that Facebook does not know any other way to engage with Republicans other than to just give in, saying, “Facebook does not speak Republican. … This is what they know about Republicans: Tell them ‘yes’ or they will hurt us.”

    In May 2020, The Wall Street Journal reported that Facebook just gave up on efforts to make the platform less polarizing. The Journal reported that a Facebook researcher found in a 2016 case study that extremist content was thriving in private groups on the platform, and that the company’s own algorithms were responsible for “64% of all extremist group joins.” Facebook shelved the research and did nothing about it.

    A further finding in that Journal report was that there was more far-right content on Facebook than far-left content, which meant that efforts like reducing clickbait affected the right more than the left. (You still see that today in things like James O’Keefe’s recent Facebook exposés, which only prove -- if they’re all true, which, knowing O’Keefe, is a colossal assumption unlikely to pan out -- that Facebook’s rules impact the right more than the left.)

    A third part of the Journal report noted that Facebook’s algorithm favors hyperpartisan “super-sharers.” When a team proposed a change to favor content from more typical Facebook users, Kaplan once again intervened to argue against the proposal. Zuckerberg himself made the final call to implement the proposal but to “cut the weighting by 80%,” per the Journal. The report adds that “Zuckerberg also signaled he was losing interest in the effort to recalibrate the platform in the name of social good, they said, asking that they not bring him something like that again.”

  • Secret meetings with the right

    In late 2019, Zuckerberg met President Trump for dinner. We only learned of it much later, and due to the secrecy of both Trump’s White House and Facebook, there’s no record of what they discussed, aside from Trump’s occasional rant that he’s “No. 1 on Facebook.” Zuckerberg still refuses to disclose what the two talked about.

    Additionally, following the Trending Topics meeting in 2016, Zuckerberg has continued to have private meetings with right-wing media figures, including hosting Ben Shapiro in his own home. Facebook even took advice from a far-right figure who blamed gay marriage for hurricanes. There’s simply no track record like that on the left. Facebook also reportedly had a meeting in 2016 with former Trump campaign manager Corey Lewandowski that only came to light this year.

  • Making advertising decisions out of fear of angering Republicans

    Facebook is reportedly making advertising decisions out of fear of angering Republicans. The company has continued to allow narrow targeting of political advertising to individuals by “their home address, gender, education level, income, marital status, job or other characteristics,” even as Democrats warned that such targeting undermined transparency and eased the spread of disinformation. Google has limited targeting, and Twitter has banned political ads. The Washington Post’s report quotes former Facebook executive Alex Stamos: “I think Facebook is looking at their political advertising policies in explicitly partisan terms, and they’re afraid of angering Republicans.”

    In 2019, Joe Biden’s campaign sent a request to Facebook asking the company to remove a Trump campaign advertisement that falsely suggested the former vice president had offered Ukraine $1 billion in aid to fire a prosecutor investigating a company tied to his son. Facebook refused to take the ad down. Katie Harbath, the Facebook executive who made the decision, previously worked for the National Republican Senatorial Committee as well as current Trump lawyer Rudy Giuliani.

    In 2020, Facebook let the Trump campaign run ads with demonstrably false claims about voter fraud. The platform also reportedly let a pro-Trump super PAC lie repeatedly in ads, even though its own fact-checkers had previously debunked the claim.

  • Staying silent on Russian activity in 2016

    According to The Washington Post, Facebook knew about Russian ads and interference in the 2016 election, and its 2017 white paper on the topic avoided mentioning Russia -- allegedly in part to avoid angering the Trump administration.

  • Different reactions to calls for regulation

    Zuckerberg has taken a harsh stance toward calls from progressives like Sen. Elizabeth Warren (D-MA) to regulate Facebook, even railing against her in leaked audio.

    By contrast, he has been much more conciliatory toward Trump even as the Justice Department began an antitrust review of tech companies, including Facebook.

  • Special treatment for Ben Shapiro

    Judd Legum has been on top of this story from the beginning. He’s reported at length on secret networks amplifying content from Shapiro’s The Daily Wire. As Legum notes, Facebook has done nothing in response to that reporting, even as it “has taken down smaller and less coordinated networks that promoted liberal content.”

    When Legum uncovered a financial relationship between Mad World News and The Daily Wire, Facebook punished only Mad World News. Meanwhile, there is “evidence that The Daily Wire is engaged in similar deals with other large Facebook pages.”

  • Right-wing ties

    We’ve noted above that Zuckerberg has hired former Republican operatives Katie Harbath and Joel Kaplan to senior positions. In addition, earlier this year Facebook hired a former longtime Fox & Friends producer to head up video strategy for Facebook News.

    Facebook executive Nick Clegg also famously served as deputy prime minister in Conservative Prime Minister David Cameron’s coalition government in the United Kingdom, as leader of the Liberal Democrats.

    Facebook board member Peter Thiel also has a long history of right-wing activism.

  • No actual evidence of conservative censorship

    In 2019, after urging by right-wing media, Facebook issued a report regarding cries of anti-conservative bias on the platform. Notably, the report includes no actual data supporting those claims. Nor does the Wall Street Journal op-ed about the investigation penned by former Republican senator and Facebook apparatchik Jon Kyl.

    To the contrary, Media Matters has found in extensive studies that right-wing content outperforms other content on Facebook.

    [Charts: Facebook weekly interactions]

    Right-wing memes received the most engagement of all:

    [Chart: Facebook weekly interactions for memes]

  • Facebook’s special treatment of conservatives, exposed

    On August 6, 2020, BuzzFeed News reported that Facebook had fired an engineer who attempted to expose how the company had systematically intervened in fact checks of right-leaning content posted by the likes of Breitbart, Turning Point USA founder Charlie Kirk, former Fox Nation hosts Diamond & Silk, and right-wing propaganda network Prager University. According to the report, some Facebook employees “see it as part of a pattern of preferential treatment for right-wing publishers and pages, many of which have alleged that the social network is biased against conservatives.”

    A Facebook spokesperson acknowledged in a statement that “when a fact checker applies a rating, we apply a label and demotion. But we are responsible for how we manage our internal systems for repeat offenders.”

    BuzzFeed reported that one employee “said a partly false rating applied to an Instagram post from Charlie Kirk was flagged for ‘priority’ escalation by Joel Kaplan, the company’s vice president of global public policy.” This led to someone at Facebook calling PolitiFact to discuss Kirk’s post, apparently pushing to see if PolitiFact would change its rating. In the end, it did not.

    BuzzFeed also reported on retribution against Facebook employees who tried to speak out:

    Individuals that spoke out about the apparent special treatment of right-wing pages have also faced consequences. In one case, a senior Facebook engineer collected multiple instances of conservative figures receiving unique help from Facebook employees, including those on the policy team, to remove fact-checks on their content. His July post was removed because it violated the company’s “respectful communication policy.”

    After the engineer’s post was removed, the related internal “tasks” he’d cited as examples of the alleged special treatment were made private and inaccessible to employees, according to a Workplace post from another employee.

    “Personally this makes me so angry and ashamed of this company,” wrote the employee in support of their colleague.

    The engineer joined the company in 2016 and most recently worked on Instagram. He left the company on Wednesday. One employee on an internal thread seen by BuzzFeed News said that they received permission from the engineer to say that the dismissal “was not voluntary.”

    A journalist for a fact-checking organization also told BuzzFeed that conservatives regularly complain directly to Facebook:

    The internal evidence gathered by the engineer aligns with the experience of a journalist who works for one of Facebook’s US fact-checking partners. They told BuzzFeed News that conservative pages often complain directly to the company.

    “Of the publishers that don’t follow the procedure, it seems to be mostly ones on the right. Instead of appealing to the fact-checker they immediately call their rep at Facebook,” said the journalist, who declined to be named because they were not authorized to speak publicly. “They jump straight up and say ‘censorship, First Amendment, freedom.’”

    “I think Facebook is a bit afraid of them because of the Trump administration,” they added.

    Following publication of this bombshell report, a BuzzFeed journalist noted that Zuckerberg said during an all-staff call that he was cracking down on leakers.

    On August 7, NBC News' Olivia Solon reported additional details from inside Facebook, including that executives removed strikes issued to right-wing outlets and that current and former employees say the company gives special treatment to conservatives:

    The list and descriptions of the escalations, leaked to NBC News, showed that Facebook employees in the misinformation escalations team, with direct oversight from company leadership, deleted strikes during the review process that were issued to some conservative partners for posting misinformation over the last six months. The discussions of the reviews showed that Facebook employees were worried that complaints about Facebook's fact-checking could go public and fuel allegations that the social network was biased against conservatives.

    The removal of the strikes has furthered concerns from some current and former employees that the company routinely relaxes its rules for conservative pages over fears about accusations of bias.

    Two current Facebook employees and two former employees, who spoke anonymously out of fear of professional repercussions, said they believed the company had become hypersensitive to conservative complaints, in some cases making special allowances for conservative pages to avoid negative publicity.

    In The New York Times, Ben Smith reported that “the vast bulk of the posts getting tagged for being fully or partly false come from the right,” according to “two people close to the Facebook fact-checking process.”

    BuzzFeed News and NBC News reported last week that Facebook executives have acted in recent months on pleas from pro-Trump voices that they not be punished for misleading readers. It’s a sign of the pressure on the company — but also of a reality that Facebook won’t say aloud: The pro-Trump media is in the misinformation business with scale and energy that lacks parallel, and in part because simply repeating the president often means spreading misinformation.

    In fact, two people close to the Facebook fact-checking process told me, the vast bulk of the posts getting tagged for being fully or partly false come from the right. That’s not bias. It’s because sites like The Gateway Pundit are full of falsehoods, and because the president says false things a lot.

    That’s the messy political reality — not the sort of neat systemic answer that makes engineers comfortable. The global surge in misinformation isn’t a matter of code, or an eternal political truth, or the structure of information. It’s just how the social-media-fueled, right-wing populism of 2020 works. And while Google, Facebook and Twitter dance around to refuse saying it out loud for obvious regulatory reasons, it makes them look dishonest and, at times, as Mr. Frankel now says of his boss’s accommodations, “ridiculous.”

  • After the election, Facebook executives refused to implement products to slow the spread of misinformation

    On November 24, The New York Times reported that in the days after the presidential election, Facebook executives approved emergency measures to slow the spread of misinformation in the news feed but refused to make them permanent.

    [E]mployees proposed an emergency change to the site’s news feed algorithm, which helps determine what more than two billion people see every day. It involved emphasizing the importance of what Facebook calls “news ecosystem quality” scores, or N.E.Q., a secret internal ranking it assigns to news publishers based on signals about the quality of their journalism.

    Typically, N.E.Q. scores play a minor role in determining what appears on users’ feeds. But several days after the election, Mr. Zuckerberg agreed to increase the weight that Facebook’s algorithm gave to N.E.Q. scores to make sure authoritative news appeared more prominently, said three people with knowledge of the decision, who were not authorized to discuss internal deliberations.

    ...

    Some employees argued the change should become permanent, even if it was unclear how that might affect the amount of time people spent on Facebook. In an employee meeting the week after the election, workers asked whether the “nicer news feed” could stay, said two people who attended.

    Guy Rosen, a Facebook executive who oversees the integrity division that is in charge of cleaning up the platform, said on a call with reporters last week that the changes were always meant to be temporary. “There has never been a plan to make these permanent,” he said.
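
    The weighting change described in the excerpt above amounts to turning up one term in a ranking score. The sketch below is a purely hypothetical illustration of that idea -- the Post class, the field names, the values, and the scoring formula are all assumptions made for the example, not Facebook’s actual system -- showing how boosting the weight on a publisher-quality signal such as N.E.Q. changes which posts rise to the top.

    ```python
    # Hypothetical sketch: weighting a publisher-quality signal in a feed-ranking
    # score. Field names, values, and the formula are assumptions for illustration;
    # this is not Facebook's actual implementation.
    from dataclasses import dataclass

    @dataclass
    class Post:
        publisher: str
        engagement: float  # predicted engagement signal (assumed)
        quality: float     # publisher quality score, e.g. an N.E.Q.-like value in [0, 1]

    def rank(posts, quality_weight):
        """Order posts by a simple weighted sum of engagement and quality."""
        return sorted(posts, key=lambda p: p.engagement + quality_weight * p.quality,
                      reverse=True)

    posts = [
        Post("viral-misinformation-page", engagement=9.0, quality=0.1),
        Post("authoritative-outlet", engagement=6.0, quality=0.9),
    ]

    # Ordinary weighting: engagement dominates and the low-quality post ranks first.
    print([p.publisher for p in rank(posts, quality_weight=1.0)])
    # Emergency "nicer news feed" weighting: the quality term is boosted and the order flips.
    print([p.publisher for p in rank(posts, quality_weight=5.0)])
    ```

    In this framing, whether the “nicer news feed” stays is simply a question of whether the boosted weight remains in place or reverts, which is what employees were asking about in the excerpt above.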

    The Times also reported that Facebook had developed other measures to slow the spread of misinformation and hateful content, but executives vetoed them out of fear that they would disproportionately affect right-wing misinformation.

    [O]ther features employees developed before the election never were.

    One, called “correct the record,” would have retroactively notified users that they had shared false news and directed them to an independent fact-check. Facebook employees proposed expanding the product, which is currently used to notify people who have shared Covid-19 misinformation, to apply to other types of misinformation.

    But that was vetoed by policy executives who feared it would disproportionately show notifications to people who shared false news from right-wing websites, according to two people familiar with the conversations.

    Another product, an algorithm to classify and demote “hate bait” — posts that don’t strictly violate Facebook’s hate speech rules, but that provoke a flood of hateful comments — was limited to being used only on groups, rather than pages, after the policy team determined that it would primarily affect right-wing publishers if it were applied more broadly, said two people with knowledge of the conversations.

  • Citing internal “Hate Bait dashboard,” departing data scientist blasts Facebook for rewarding right-wing pages that spread hate

    A BuzzFeed News report on December 11 documented criticism of the company’s hate speech enforcement from a data scientist departing Facebook. In an internal memo written as a member of Facebook’s “Violence and Incitement” team, the data scientist cited an internal product called a “Hate Bait dashboard,” which showed that right-wing pages like Fox News and Breitbart drive hateful interactions on the platform. The data scientist blasted Facebook for having rewarded these pages “fantastically” for spreading “hateful vile comments.”