
Mark Zuckerberg

  • For Zuck's sake

    Blog ››› MELISSA RYAN

    Mark Zuckerberg has been sharing a lot this month. First, he posted that his “personal challenge” for 2018 is to fix the glaring and obvious problems for which he’s taken so much heat. Last week, he announced that he had directed Facebook’s product teams to change their focus from “helping you find relevant content to helping you have more meaningful social interactions.” Zuckerberg promised users that they’d see less content from “businesses, brands and media” and more content from “your friends, family and groups.” On Friday, Zuckerberg shared another major change: Facebook would improve the news that does get shared by crowdsourcing, via user surveys, which news sources are and aren’t trustworthy.

    The first change, a return to “meaningful interaction,” is one I can get behind. I’m all for anything that discourages fake news sites from monetizing on Facebook. I’ve long suspected that part of why these sites took hold in the first place was a lack of meaningful content available on our feeds. Less sponsored content and more pictures and videos from family and friends will greatly improve my Facebook experience. I suspect I’m not the only one.

    I’m also hopeful this change will move digital advocacy away from broadcasting and back to organizing. Given how Facebook groups have become such a crucial part of #TheResistance, I’m glad to hear they’ll be emphasized. I want to see more groups like Pantsuit Nation and the many local Indivisible groups that have formed in the last year. (Media outlets, fear not: Vox has also been building Facebook groups in addition to its pages.) Digital ads and acquisition shouldn’t be the only tools digital organizers use. Increased engagement should involve actually engaging folks rather than simply broadcasting to them.

    The second change, user surveys to determine what news people trust, is maddening. If you were going to design a system that could be easily gamed, this is how you’d do it. “Freeping” online polls and surveys is a longstanding tactic of the far right online, going back nearly 20 years. It’s in their online DNA, and they have groups of activists at the ready who live for this activity. Facebook isn’t handing authority over to its broader community but to an engaged group of users with an agenda. Even if the freeping weren’t inevitable, it’s pretty well established that there’s already no common ground when it comes to which news sources people with different political viewpoints trust.
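
    A quick way to see why survey-based trust scoring is so easy to game: surveys only count the people who bother to respond, so a motivated minority can make up an outsized share of respondents. Here is a minimal simulation of that mechanic (all numbers are hypothetical, purely illustrative):

    ```python
    import random

    random.seed(0)

    def trust_score(ratings):
        """Average 1-5 trust rating, as a naive survey might compute it."""
        return sum(ratings) / len(ratings)

    # Hypothetical numbers throughout. Most ordinary users ignore surveys:
    # suppose 1,000 of them respond, rating a fringe outlet 1-3 out of 5.
    organic = [random.choice([1, 1, 2, 2, 3]) for _ in range(1_000)]

    # A coordinated bloc of 500 activists who make a point of responding
    # and all rate the same outlet 5/5 -- the classic "freeping" pattern.
    brigade = [5] * 500

    print(f"organic respondents only: {trust_score(organic):.2f}")
    print(f"after the brigade votes:  {trust_score(organic + brigade):.2f}")
    # The bloc may be a sliver of Facebook's user base, but it is a third
    # of respondents here, dragging the outlet's score sharply upward.
    ```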

    The crux of the problem is that Facebook desperately wants to be seen as a neutral platform while Facebook’s users want it to keep inaccurate information off of Facebook. In his New Year’s post, Zuckerberg emphasized he believes technology “can be a decentralizing force that puts more power in people’s hands” while acknowledging that the reality might be the opposite. There’s a tension between his core beliefs and what Facebook users currently expect from the company. My sense is that’s a driving force behind attempting to pass the buck back to us.

    Facebook will only go as far as their users pressure them, especially in the US where regulation from the government will be minimal. If we want Facebook to take responsibility, we have to continually hold them accountable when things go wrong or when proposed solutions don’t go far enough. Mark Zuckerberg’s personal challenge is to fix what’s broken. Ours is to keep pressing him in the right direction.

    This piece was originally published as part of Melissa Ryan's Ctrl Alt Right Delete newsletter -- subscribe here

  • Facebook’s news feed changes could elevate fake news while harming legitimate news outlets

    Blog ››› ALEX KAPLAN


    Sarah Wasko / Media Matters

    New changes announced by Facebook to elevate content on its users’ news feed that is shared by friends and family over that shared by news publishers could wind up exacerbating Facebook’s fake news problem.

    Over the past year, Facebook has struggled to combat the spread of fake news and misinformation on its platform. On January 11, the social media giant announced that it would change the algorithm of its news feed so that it would “prioritize what [users’] friends and family share and comment on,” according to The New York Times. Facebook CEO Mark Zuckerberg, who was named Media Matters’ 2017 Misinformer of the Year, told the Times that the shift was “intended to maximize the amount of content with ‘meaningful interaction’ that people consume on Facebook.” Additionally, content from news publishers and brands will be given less exposure on the news feed. Facebook is also weighing adding some kind of authority component to its news feed algorithm, so that outlets considered more credible would get more prominence in the news feed.

    In the past year or so, Facebook has attempted to employ some measures in its effort to fight fake news, including its third-party fact-checking initiative. Though these efforts have so far been far from effective, the new changes threaten to undercut them even more.

    At least one study has shown that Facebook users are influenced by their friends and family members’ actions and reactions on the site. Last year, New York magazine reported on a study that found that “people who see an article from a trusted sharer, but one written by an unknown media source, have much more trust in the information than people who see the same article from a reputable media source shared by a person they do not trust.” With Facebook’s new changes, as the Times noted, “If a relative or friend posts a link with an inaccurate news article that is widely commented on, that post will be prominently displayed.”

    An additional point of concern is how this will exacerbate the problem of conservative misinformation specifically. Up until now, misinformation and fake news on social media have seemingly come from and been spread more by conservatives than liberals. And according to research conducted by Media Matters, right-wing communities on Facebook are much bigger than left-wing communities and mainstream distribution networks, and right-wing engagement also outpaces engagement in left-wing circles. These changes could thus mean that peer-to-peer promotion of right-wing misinformation is more likely to push fake news toward the top of people’s news feeds.

    The changes will also likely cause real harm to legitimate news outlets by burying their stories. The head of Facebook’s news feed admitted that some pages “may see their reach, video watch time and referral traffic decrease.” Smaller, less-known outlets, especially those that do not produce content on the platform (such as live videos), could face major financial losses from the move. Facebook’s head of news partnerships, Campbell Brown, also wrote to some major publishers that the changes would cause people to see less content from “publishers, brands, and celebrities,” but that “news stories shared between friends will not be impacted,” which could suggest that fake news might get promoted over content directly from legitimate news outlets.

    It’s conceivable that adding some kind of authority component that ensures “articles from more credible outlets have a better chance of virality” could help lessen this possibility. Such a move would be a welcome development, and Media Matters has recommended that Facebook include it in its algorithm. But the possible criteria that Facebook is currently considering to determine which publishers are credible -- such as “public polling about news outlets” and “whether readers are willing to pay for news from particular publishers” -- are vague and could be problematic to enforce. And The Wall Street Journal noted that Facebook was still undecided about adding the authority component; without it, the possible negative impact from these news feed changes could be even worse.

    It is possible that Facebook’s move to include “Related Articles” next to the posts that its fact-checking partners have flagged could override people’s tendency to believe what their peers share. And perhaps the algorithm that tries to stop the spread of stories the fact-checkers have flagged may decrease the spread of fake news. But it’s also possible that these new moves will undermine those initiatives, and that Zuckerberg’s aim to make users happier could also make them more misinformed.

  • Angelo Carusone explains why Facebook’s Mark Zuckerberg is the 2017 Misinformer of the Year

    Blog ››› MEDIA MATTERS STAFF

    Today, Media Matters for America named Facebook CEO Mark Zuckerberg as its 2017 Misinformer of the Year. The designation is presented annually to the media figure, news outlet, or organization that is the most influential or prolific purveyor of misinformation.

    Media Matters President Angelo Carusone explained why Mark Zuckerberg is the Misinformer of the Year:

    “We selected Mark Zuckerberg as the Misinformer of the Year because Facebook's actions in 2017 have been more of a public relations campaign than a deeper systemic approach to address the underlying causes of the proliferation of fake news and disinformation.

    I know that Facebook has the talent and knows how to implement some meaningful countermeasures. Instead of utilizing that talent, Zuckerberg has spent too much time downplaying the crisis and repeating his mistakes from 2016, like continually caving to right-wing pressure. There are some very basic things that Facebook can do to make meaningful dents in this problem -- and my hope for 2018 is that Mark Zuckerberg lets his team take those steps and more.”

    Here’s more about why Mark Zuckerberg earned the Misinformer of the Year designation:

    • Not only did Mark Zuckerberg allow Facebook to be used to mislead, misinform and suppress voters during the 2016 election, but he took active steps in an attempt to assuage right-wing critics that actually made the problem worse. He subsequently downplayed concerns about Facebook’s clear impact on the 2016 election. Instead of learning from those past mistakes, Zuckerberg has repeated them, continuing to act in a way designed to inoculate against or mollify right-wing critics, despite evidence that 126 million Facebook users saw election-related propaganda in 2016.

    • Mark Zuckerberg’s inaction and half-measures illustrate either his lack of recognition of the scope and scale of the crisis or his refusal to accept responsibility. After intense public pressure made Facebook’s misinformation problem impossible to ignore, Zuckerberg announced a series of toothless policy changes that are more public relations ploys than meaningful solutions. Notably, little effort has been made to improve its news feed algorithm so that Facebook is not turbocharging disreputable or extreme content simply because it has high engagement, or to grapple with the scores of Facebook-verified disinformation and fake news pages masquerading as news sites.

    • Facebook’s third-party fact-checking system doesn't stop fake news from going viral. Fact-checkers who have partnered with Facebook have voiced concerns about the company’s transparency and the effectiveness of its efforts, as Zuckerberg has largely refused to release Facebook’s data for independent review.

    • In yet another attempt to mollify right-wing critics, Zuckerberg’s Facebook partnered with disreputable right-wing media outlet The Weekly Standard, thus allowing an outlet that baselessly criticizes fact-checkers and undermines confidence in journalism into its fact-checking program.

    Media Matters is the nation’s premier media watchdog. Following the 2016 presidential election, Media Matters acknowledged that in order to fully carry out its mission, its scope of work must incorporate an analysis of the way digital platforms influenced the larger problem of misinformation and disinformation in the media landscape.

    Media Matters’ selection of Mark Zuckerberg as Misinformer of the Year reflects Zuckerberg’s failure to take seriously Facebook’s role as a major source of news and information, his failure to address Facebook’s role in spreading fake news, and his repetition of past mistakes that are actually making the problem worse.

    Previous Misinformers of the Year include Sean Hannity, Rush Limbaugh, Rupert Murdoch and News Corp., and Glenn Beck.

  • Misinformer of the Year: Facebook CEO Mark Zuckerberg

    Facebook's "personalized newspaper" became a global clearinghouse for misinformation

    Blog ››› MATT GERTZ


    Sarah Wasko / Media Matters

    In late August, as Hurricane Harvey crashed through the Texas coastline, millions of Americans searching for news on the crisis were instead exposed to a toxic slurry of fabrications. Fake news articles with headlines like “Black Lives Matter Thugs Blocking Emergency Crews From Reaching Hurricane Victims” and “Hurricane Victims Storm And Occupy Texas Mosque Who Refused To Help Christians” went viral on Facebook, spreading disinformation that encouraged readers to think the worst about their fellow citizens.

    When Facebook set up a crisis response page a month later -- following a mass shooting in Las Vegas, Nevada, that killed dozens and injured hundreds -- the company intended to provide a platform for users in the area to confirm they were safe and to help people across the country learn how to support the survivors and keep up to date on the events as they unfolded. But the page soon became a clearinghouse for hyperpartisan and fake news articles, including one which baselessly described the shooter as a “Trump-hating Rachel Maddow fan.”

    In Myanmar, an ethnic cleansing of the Muslim Rohingya minority population was aided by misinformation on Facebook, the only source of news for many people in the country. In India, The Washington Post reported, “false news stories have become a part of everyday life, exacerbating weather crises, increasing violence between castes and religions, and even affecting matters of public health.” In Indonesia, disinformation spread by social media stoked ethnic tensions and even triggered a riot in the capital of Jakarta.

    Throughout the year, countries from Kenya to Canada either fell prey to fake news efforts to influence their elections, or took steps they hoped would quell the sort of disinformation campaign that infected the 2016 U.S. presidential race.

    Last December, Media Matters dubbed the fake news infrastructure 2016’s Misinformer of the Year, our annual award for the media figure, news outlet, or organization which stands out for promoting conservative lies and smears in the U.S. media. We warned that the unique dangers to the information ecosystem meant “merely calling out the lies” would not suffice, and that “the objective now is to protect people from the lies.” We joined numerous experts and journalists in pointing to weaknesses in our nation’s information ecosystem, exposed by the presidential election and fueled by key decisions made by leading social media platforms.

    Twelve months later, too little has changed in the United States, and fake news has infected democracies around the world. Facebook has been central to the spread of disinformation, stalling and obfuscating rather than taking responsibility for its outsized impact.

    Media Matters is recognizing Facebook CEO Mark Zuckerberg as 2017’s Misinformer of the Year.

    He narrowly edges Larry Page, whose leadership of Google has produced similar failures in reining in misinformation. Other past recipients include the Center for Medical Progress (2015), George Will (2014), CBS News (2013), Rush Limbaugh (2012), Rupert Murdoch and News Corp. (2011), Sarah Palin (2010), Glenn Beck (2009), Sean Hannity (2008), ABC (2006), Chris Matthews (2005), and Bill O'Reilly (2004).

    Facebook is the most powerful force in journalism

    “Does Even Mark Zuckerberg Know What Facebook Is?” Max Read asked in an October profile for New York magazine. Rattling off statistics pointing to the dizzying reach and breadth of a site with two billion monthly active users, Read concluded that the social media platform Zuckerberg launched for his Harvard peers in 2004 “has grown so big, and become so totalizing, that we can’t really grasp it all at once.”

    Facebook’s sheer size and power make comparisons difficult. But Zuckerberg himself has defined at least one key role for the website. In 2013, he told reporters that the redesign of Facebook’s news feed was intended to “give everyone in the world the best personalized newspaper we can.” This strategy had obvious benefits for the company: If users treated the website as a news source, they would log on more frequently, stay longer, and view more advertisements.

    Zuckerberg achieved his goal. Forty-five percent of U.S. adults now say they get news on Facebook, far more than on any other social media platform, and the percentage of people using the website for that purpose is rising. The website is the country’s largest single source of news, and indisputably its most powerful media company.

    That goal of becoming its users' personalized newspaper was assuredly in the best interest of Facebook. The company makes money from its massive user base, so the value of any particular piece of content lies in whether it keeps people engaged with the website. (Facebook reported in 2016 that users spend an average of 50 minutes per day on the platform.) But ultimately, Zuckerberg’s declaration showed that he had put his company at the center of the information ecosystem -- crucial in a democracy because of its role in shaping public opinion -- but refused to be held accountable for the results.

    That failure to take responsibility exposes another key difference between Facebook and the media companies Zuckerberg said he wanted to ape. Newspapers face a wide variety of competitors, from other papers to radio, television, and digital news products. If a newspaper gains a reputation for publishing false information, or promoting extremist views, it risks losing subscribers and advertising dollars to more credible rivals and potentially going out of business. But Facebook is so popular that it has no real competitors in the space. For the foreseeable future, it will remain a leading force in the information ecosystem.

    Fake news, confirmation bias, and the news feed


    Sarah Wasko / Media Matters

    Facebook’s news feed is designed to give users exactly what they want. And that’s the problem.

    All content largely looks the same on the feed -- regardless of where it comes from or how credible it is -- and succeeds based on how many people share it. Facebook’s mysterious algorithm favors “content designed to generate either a sense of oversize delight or righteous outrage and go viral,” serving the website’s billions of users the types of posts they previously engaged with in order to keep people on the website.

    When it comes to political information, Facebook largely helps users seek out information that confirms their biases, with liberals and conservatives alike receiving news that they are likely to approve of and share.

    A wide range of would-be internet moguls -- everyone from Macedonian teenagers eager to make a quick buck to ideological true believers hoping to change the political system -- have sought to take advantage of this tendency. They have founded hyperpartisan ideological websites and churned out content, which they have then shared on associated Facebook pages. If the story is interesting enough, it goes viral, garnering user engagement that leads to the story popping up in more Facebook feeds. The site’s owners profit when readers click the Facebook story and are directed back to the hyperpartisan website, thereby driving up traffic numbers and helping boost advertising payouts. The more extreme the content, the more it is shared, and the more lucrative it becomes. Facebook did not create ideological echo chambers, but it has certainly amplified the effect to an unprecedented degree.

    Intentional fabrications packaged as legitimate news have become just another way for hyperpartisan websites to generate Facebook user engagement and cash in, launching outlandish lies into the mainstream. Users seem generally unable to differentiate between real and fake news, and as they see more and more conspiracy theories in their news feed, they become more willing to accept them.

    Facebook’s 2016 decision to bow to a conservative pressure campaign has accelerated this process. That May, a flimsy report claimed that conservative outlets and stories had been “blacklisted” by the Facebook employees who selected the stories featured in its Trending Topics news section, a feature that helps push stories viral. The notion that Facebook employees might be suppressing conservative news triggered an angry backlash from right-wing media outlets and Republican leaders, who declared that the site had a liberal bias. In an effort to defuse concerns, Zuckerberg and his top executives hosted representatives from Donald Trump’s presidential campaign, Fox News, the Heritage Foundation, and other bastions of the right at Facebook’s headquarters. After hearing their grievances over a 90-minute meeting, Zuckerberg posted on Facebook that he took the concerns seriously and wanted to ensure that the community remained a “platform for all ideas.”

    While Facebook’s own internal investigation found “no evidence of systematic political bias” in the selection or prominence of stories featured in the Trending Topics section, the company announced the following week that its curators would no longer rely on a list of credible journalism outlets to help them determine whether a topic was newsworthy, thereby removing a key method of screening the credibility of stories. And in late August, as the presidential election entered its stretch run, Facebook fired its “news curators,” putting Trending Topics under the control of an algorithm. The company promised that removing the human element “allows our team to make fewer individual decisions about topics.” That’s true. But the algorithm promoted a slew of fabricated stories from bogus sources in the place of news articles from credible outlets.

    This confluence of factors -- users seeking information that confirms their biases, sites competing to give it to them, and a platform whose craven executives deliberately refused to take sides between truth and misinformation -- gave rise to the fake news ecosystem.

    The result is a flood of misinformation and conspiracy theories pouring into the news feeds of Facebook users around the world. Every new crisis seems to bring with it a new example of the dangerous hold Facebook has over the information ecosystem.

    Obfuscation and false starts for enforcement

    Zuckerberg resisted cracking down on fake news for as long as he possibly could. Days after the 2016 presidential election, he said it was “crazy” to suggest fake news on Facebook played a role in the outcome. After an uproar, he said that he took the problem “seriously” and was “committed to getting this right.” Seven months later, after using his control of more than 50 percent of Facebook shares to vote down a proposal for the company to publicly report on its fake news efforts, the CEO defended the company’s work. Zuckerberg said Facebook was disrupting the financial incentives for fake news websites, and he touted a new process by which third-party fact-checkers could review articles posted on the site and mark them as “disputed” for users. This combination of small-bore proposals, halting enforcement, and minimal transparency has characterized Zuckerberg’s approach to the problem.

    Under Facebook’s third-party fact-checking system, rolled out in March to much hype, the website’s users have the ability to flag individual stories as potential “false news.” A fact-checker from one of a handful of news outlets -- paid by Facebook and approved by the International Fact-Checking Network at Poynter, a non-partisan journalism think tank -- may then review the story, and, if the fact-checker deems it inaccurate, place an icon on the story that warns users it has been “disputed.”

    This is not a serious effort at impacting an information infrastructure encompassing two billion monthly users. It’s a fig leaf that Facebook is using to benefit from the shinier brands of the outlets it has enlisted in the effort, while creating a conflict of interest that limits the ability of those news organizations to scrutinize the company.  

    The program places the onus first on users to identify the false stories and then on a small group of professionals from third parties -- including The Associated Press, Snopes, ABC News and PolitiFact -- to take action. The sheer size of Facebook means the fact-checkers cannot hope to review more than a tiny fraction of the fake news circulating on the website. The Guardian's reviews of the effort have found that it is unclear whether the flagging process actually impedes the spread of false information, as the “disputed” tag is often added only long after a story has already gone viral, and other versions of the same story can circulate freely without the tag. The fact-checkers themselves have warned that it is impossible for them to tell how effective their work is because Facebook won’t share information about their impact.

    Zuckerberg’s happy talk about the company’s efforts to demonetize the fake news economy also continues to ring hollow. According to Sheryl Sandberg, Facebook’s chief operating officer, this means the company is “making sure” that fake news sites “aren’t able to buy ads on our system.” It’s unclear whether that is true, since Facebook refuses to be transparent about what it’s doing. But whether the fake news sites buy Facebook ads or not, the same websites continue to benefit from the viral traffic that Facebook makes possible. They can even benefit from having their associated Facebook pages verified. (Facebook verifies pages for public figures, brands, and media outlets with a blue check mark to confirm it is “the authentic Page” or profile for the associated group or person, imbuing the page with what inevitably looks like a stamp of approval from the social media giant.)

    Facebook’s response to criticism of its political advertising standards has been more robust. During the 2016 presidential election, the company was complicit in what the Trump campaign acknowledged was a massive “voter suppression” effort. Trump’s digital team spent more than $70 million on Facebook advertising, churning out hundreds of thousands of microtargeted “dark” ads, which appeared only on the timelines of the target audience and did not include disclosures that they were paid for by the campaign. A hefty portion of those ads targeted voters from major Democratic demographics with negative messages about Hillary Clinton and was intended to dissuade them from going to the polls. Facebook employees embedded with the Trump team aided this effort, helping with targeting and ensuring the ads were approved through an automated system. But in October, the company received major blowback following the disclosure that Russian-bought ads had been targeted using similar strategies. Facebook subsequently announced that in the future, election ads would be manually reviewed by an employee to ensure they meet the company’s standards. The company plans to require disclosure of who paid for political ads, and to increase transparency by ensuring that users can view all ads a particular page has purchased. Those are meaningful steps, but Facebook should go further by making clear that the company opposes voter suppression and will instruct its employees not to approve ads intended for that purpose.

    It’s notable, of course, that Facebook’s effort to curb abuse of its political advertising came only after U.S. senators unveiled legislation requiring stricter disclosure for online political ads. The company took action in order to preempt a meaningful federal response. With no such pressure on offer with regard to fake news, Facebook has been left to its own devices, responding only as needed to quiet public anger at its failures. At every step, experts have warned that Facebook’s efforts to push back against fake news have been insufficient and poorly implemented. The company is doing as little as it can get away with.

    What can be done?


    Sarah Wasko / Media Matters

    Hoaxes and disinformation have always been a part of human society, with each new generation enlisting the era’s dominant forms of mass communication in their service. But Facebook’s information ecosystem and news feed algorithm have proven particularly ripe for abuse, allowing fake news purveyors to game the system and deceive the public. Those bad actors know that user engagement is the major component in ensuring virality, and they have engineered their content with that in mind, leading to a system where Facebook turbocharges false content from disreputable sources.

    Facebook could fight back against fake news by including an authority component in its algorithm, ensuring that articles from more credible outlets have a better chance of virality than ones from less credible ones. Facebook’s algorithm should recognize that real news outlets like The New York Times or CNN are more credible than websites that serve up deliberate fabrications, and respond accordingly, the way Google’s (admittedly imperfect) search engine does.
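
    In miniature, such an authority component could be as simple as scaling each story’s engagement score by a per-outlet credibility weight before ranking. The sketch below is an illustration under assumed weights and scoring, not a description of Facebook’s actual algorithm; the outlet names and numbers are hypothetical:

    ```python
    from dataclasses import dataclass

    # Illustrative credibility weights on a 0-1 scale; in practice these
    # might come from fact-checker track records, corrections policies, etc.
    CREDIBILITY = {
        "nytimes.com": 0.9,
        "cnn.com": 0.85,
        "hoax-mill.example": 0.1,
    }

    @dataclass
    class Story:
        title: str
        outlet: str
        engagement: float  # shares + comments + reactions, however weighted

    def ranking_score(story: Story) -> float:
        """Engagement scaled by source credibility (0.5 for unknown outlets)."""
        return story.engagement * CREDIBILITY.get(story.outlet, 0.5)

    stories = [
        Story("Outrage bait goes viral", "hoax-mill.example", 50_000),
        Story("City council passes budget", "nytimes.com", 8_000),
    ]

    # The fabrication mill has six times the raw engagement, but the
    # credibility weight keeps it from outranking the real story.
    for s in sorted(stories, key=ranking_score, reverse=True):
        print(f"{ranking_score(s):>8.0f}  {s.outlet:<20} {s.title}")
    ```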

    This will also require Facebook to stop conferring authority on pages that do not deserve it, by stripping verified tags from pages that regularly traffic in fake news.

    Facebook also has a serious problem with bots: software that mimics human behavior and cheats the company’s algorithm, creating fake engagement and sending stories viral. The company will need to step up its efforts to identify algorithmic anomalies caused by these bots and develop heightened countermeasures, which should include minimizing the impact of known bots on users.
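
    One simple flavor of the anomaly detection this implies, sketched on made-up traffic data (an assumption about approach, not a description of Facebook’s actual countermeasures): organic virality tends to ramp up gradually, while bot amplification often shows up as a sudden synchronized spike against a story’s own baseline.

    ```python
    from statistics import mean, stdev

    def flag_anomalies(hourly_shares, z_threshold=2.5, warmup=6):
        """Flag hours whose share count spikes far above the trailing baseline.

        A crude z-score test: each hour is compared against the mean and
        standard deviation of all the hours that came before it.
        """
        flagged = []
        for i in range(warmup, len(hourly_shares)):
            baseline = hourly_shares[:i]
            mu, sigma = mean(baseline), stdev(baseline)
            if sigma > 0 and (hourly_shares[i] - mu) / sigma > z_threshold:
                flagged.append(i)
        return flagged

    # Hypothetical traffic: steady organic sharing, then a bot-driven burst.
    traffic = [120, 135, 110, 140, 125, 130, 128, 122, 4_800, 5_100, 133]
    print(flag_anomalies(traffic))  # -> [8, 9]: the two spike hours
    ```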

    If Facebook can find a way to change its algorithm to avoid clickbait, as it has claimed, it should be able to do the same to limit the influence of websites that regularly produce fabrications.

    But algorithms alone won’t be enough to solve the problem. Facebook announced it would hire 1,000 people to review and remove Facebook ads that don’t meet its standards. So why hasn’t Zuckerberg done something similar to combat fake news? Why won’t Facebook, as one of the third-party fact-checkers suggested in an interview with The Guardian, hire “armies of moderators and their own fact-checkers” to solve that problem?

    Given the collapse of the news industry over the last decade, there is no shortage of journalists with experience at verifying information and debunking falsehoods. Facebook could hire thousands of them; train them; give them the actual data that they need to determine whether they are effective and ensure that their rulings impact the ability of individual stories to go viral; and penalize websites, associated Facebook pages, and website networks for repeat offenses.

    If Zuckerberg wants Facebook to be a “personalized newspaper,” he needs to take responsibility for being its editor in chief.

    There is a danger, of course, to having a single news outlet with that much power over the U.S. information ecosystem. But Facebook already has that power, and there are compelling arguments in favor of limiting it, either with government regulation or antitrust action.

    What’s clear is that Facebook will only act under pressure. Earlier this month, The Weekly Standard, a conservative magazine and regular publisher of misinformation, announced it had been approved to join Facebook’s fact-checking initiative. The magazine was founded by Bill Kristol, the former chief of staff to Vice President Dan Quayle, and is owned by right-wing billionaire Philip Anschutz. Stephen Hayes, The Weekly Standard’s editor-in-chief and the author of The Connection: How al Qaeda's Collaboration with Saddam Hussein Has Endangered America, praised Facebook for the decision, telling The Guardian: “I think it’s a good move for [Facebook] to partner with conservative outlets that do real reporting and emphasize facts.” Conservatives, including those at The Weekly Standard, had previously criticized the initiative, claiming the mainstream news outlets and fact-checking organizations Facebook partnered with were actually liberal partisans. Facebook responded by trying to “appease all sides.”

    Nineteen months after Facebook’s CEO sat down with conservative leaders and responded to their concerns with steps that inadvertently strengthened the fake news infrastructure, his company remains more interested in bowing to conservative criticisms than stopping misinformation.


    The very people who helped build Facebook now warn that it is helping to tear the world apart.

    Founding President Sean Parker lamented “the unintended consequences of a network when it grows to a billion or 2 billion people” during a November event. “It literally changes your relationship with society, with each other,” he said. “It probably interferes with productivity in weird ways. God only knows what it's doing to our children's brains."

    Chamath Palihapitiya, who joined Facebook in 2007 and served as vice president for user growth at the company, said earlier this month that he regrets helping build up the platform: “I think we have created tools that are ripping apart the social fabric of how society works.”

    Facebook’s current employees also worry about the damage the company is doing, according to an October New York Times report detailing “growing concern among employees.” Last week, Facebook’s director of research tried to allay some of these fears with a press release titled “Hard Questions: Is Spending Time on Social Media Bad for Us?”

    Mark Zuckerberg built a platform to connect people that has become an incredibly powerful tool to divide them with misinformation, and he’s facing increasing criticism for it. But he only ever seems interested in fixing the public relations problem, not the information one. That’s why he is 2017’s Misinformer of the Year.

  • Facebook partners with conservative misinformer The Weekly Standard on fact-checking

    The Weekly Standard becomes the only partisan organization tasked with fact-checking for Facebook

    Blog ››› MEDIA MATTERS STAFF

    Conservative news outlet The Weekly Standard has been approved by Facebook to partner in fact-checking "false news," a partnership that makes little sense given the outlet’s long history of making misleading claims, pushing extreme right-wing talking points, and publishing lies to bolster conservative arguments.

    The Weekly Standard’s history of publishing false claims on topics such as the 2012 attacks on diplomatic facilities in Benghazi, the Affordable Care Act, tax cuts, and the war in Iraq, among many others, raises doubts that Facebook is taking the challenge of fact-checking seriously.

    As The Guardian reports, The Weekly Standard is the first “explicitly partisan” outlet to partner with Facebook in its effort to fact-check fake news. The partnership raises concerns about giving a conservative opinion outlet with a history of misinformation unearned influence over the fact-checking process. From the December 6 report:

    A conservative news organization has been approved to partner with Facebook to fact-check false news, drawing criticisms that the social media company is caving to rightwing pressures and collaborating with a publication that has previously spread propaganda.

    The Weekly Standard, a conservative opinion magazine, said it is joining a fact-checking initiative that Facebook launched last year aimed at debunking fake news on the site with the help of outside journalists. The Weekly Standard will be the first right-leaning news organization and explicitly partisan group to do fact-checks for Facebook, prompting backlash from progressive organizations, who have argued that the magazine has a history of publishing questionable content.

    [...]

    “I’m really disheartened and disturbed by this,” said Angelo Carusone, president of Media Matters for America, a progressive watchdog group that published numerous criticisms of the Weekly Standard after the partnership was first rumored in October. “They have described themselves as an opinion magazine. They are supposed to be thought leaders.”

    Calling the magazine a “serial misinformer”, Media Matters cited the Weekly Standard’s role in pushing false and misleading claims about Obamacare, Hillary Clinton and other political stories.

  • Facebook CEO’s Immigration Reform Group Donated To Trump’s Transition To “Curry Early Favor” With Administration

    Latest Report Adds To Growing List Of Questionable Donations And Meetings With Conservatives

    Blog ››› CHRISTOPHER LEWIS

    Facebook CEO Mark Zuckerberg’s immigration reform lobby, FWD.us, donated $5,000 to President Donald Trump’s transition, according to a report from Politico.

    Despite a contentious history opposing Trump’s anti-immigrant policies, the group donated to Trump “hoping to curry early favor and help shape the incoming administration.” From Politico:

    But months later the nonprofit, founded by Facebook CEO Mark Zuckerberg wrote a $5,000 check to Trump’s presidential transition — the latest indication that it’s still business as usual for the tech industry in Washington despite the revulsion many Silicon Valley engineers and executives feel toward Trump.

    Hoping to curry early favor and help shape the incoming administration, FWD.us joined a handful of tech and telecom companies like AT&T, Microsoft and Qualcomm in funding Trump’s months-long transition operation, which raked in roughly $6.5 million through Feb. 15, according to a transition disclosure report filed last weekend and obtained by POLITICO on Thursday.

    [...]

    FWD.us has had a fractious history with Trump and some of his top lieutenants, dating back to well before the election. Jeff Sessions, now the U.S. attorney general, blasted the group and its founder, Zuckerberg, in a blistering anti-immigration speech from the Senate floor in 2014. When Trump, as a candidate in 2015, detailed his immigration policy blueprint, Schulte described the approach as “just wrong.” While he didn’t mention Trump by name, the FWD.us founder took aim at “anti-immigrant voices” that seek to “forcibly expel millions of immigrants, period.”

    Facebook and CEO Mark Zuckerberg have faced increasing criticism over their efforts to reach out to conservatives. Recently, Facebook donated more than $120,000 to the American Conservative Union’s annual event, the Conservative Political Action Conference (CPAC). In 2016, Facebook met with conservative leaders to listen to their complaints of anti-conservative bias in Facebook’s trending topics feature. Facebook subsequently fired its human editors in August.

  • Report: Facebook Continues To Placate Conservatives By Donating To CPAC

    Blog ››› MEDIA MATTERS STAFF

    The Daily Beast reports that Facebook donated more than $120,000 to the American Conservative Union’s annual event, the Conservative Political Action Conference (CPAC). Mark Zuckerberg’s donation comes after he held a meeting with conservative media personalities such as Glenn Beck and Fox’s Dana Perino following allegations that the website had been suppressing conservative views.

    During the meeting, Zuckerberg lauded President Donald Trump for having “more fans on Facebook than any other presidential candidate” and Fox News for driving “more interactions on its Facebook page than any other news outlet in the world.” Following the accusations of bias, Facebook laid off its entire editorial team and replaced it with an algorithm, a move which The Washington Post reported led to the rise and prominence of “fake news” trending on the website.

    According to The Daily Beast, Facebook continues to court conservatives with its “six-figure contribution to CPAC,” which includes a cash donation and “in-kind support.” From The Daily Beast:

    Sources with direct knowledge of the matter tell The Daily Beast that Facebook made a six-figure contribution to CPAC, the yearly conference for conservative activists which will feature President Donald Trump, White House advisor Steve Bannon, NRA president Wayne LaPierre, and other right-wing favorites.

    Facebook’s contribution is worth more than $120,000, according to our sources. Half of that is cash, and the other half is in-kind support for CPAC’s operations. Facebook will have a space at the conference for attendees to film Facebook Live videos, and will also train people on best practices for using the social network and Instagram.

    [...]

    The Wall Street Journal reported in October that Trump’s own Facebook posts fueled intense debate within the company about what kind of content was acceptable -- particularly his calls for a ban on Muslims from entering the U.S. Mark Zuckerberg himself had to determine that Trump’s posts were okay, according to the paper’s report. And The New York Times reported that after Trump won the election, some company employees worried the spread of racist memes and fake news on the site may have boosted his candidacy.

    “A fake story claiming Pope Francis—actually a refugee advocate—endorsed Mr. Trump was shared almost a million times, likely visible to tens of millions,” Zeynep Tufekci, an associate professor at the University of North Carolina who studies the social impact of technology, told the Times. “Its correction was barely heard. Of course Facebook had significant influence in this last election’s outcome.”

  • Facebook’s Trending Topics Adjustments Are Half-Measures

    Blog ››› MEDIA MATTERS STAFF

    In November, Facebook CEO Mark Zuckerberg promised to fix Facebook’s infestation of fake news. Today, Facebook announced changes to their trending topics section:

    Today we’re announcing three updates to Trending, a feature that shows people popular topics being discussed on Facebook that they might not see in their News Feed:

    • Trending topics will now feature a publisher headline below each topic name

    • An improved system to determine what is trending

    • Everyone in the same region will see the same topics

    These changes begin rolling out today and will be available to everyone in the US in the coming weeks. We’re listening to people’s feedback and will continue to make improvements in order to provide a valuable Trending experience.

    In response, Media Matters’ President Angelo Carusone released the following statement:

    At Media Matters, we have been doing a deep dive on the fake news ecosystem on Facebook and understand it better than anyone. Accordingly, we understand what will have an effect and what won’t. We’ll give credit where credit is due, but will also hold Facebook to its commitment to address its role in the proliferation of fake news.

    That said, today’s announced policy changes are at best a marginal improvement. While moving in the right direction, these half-measures will not stop the rampant lies spreading on the platform. We can’t forget that Facebook made the problem of fake news significantly worse when they acted on right-wing misinformation and fired all their human editors over the summer and let their algorithms get gamed.

    Ultimately, Facebook’s timidity in addressing this problem will prove bad for business and stands in in stark contrast to Snapchat’s bold actions earlier this week that were aimed at stopping the spread of lies and false impressions. So, we’re encouraged by the small improvements, specifically the changes to transparency in the trending topics section, but we await Facebook prioritizing truth in the same way their competitors have demonstrated.

    Earlier this week, The New York Times detailed the success of Snapchat’s “hard line” on misleading images, and BuzzFeed previously examined how the platform has worked to keep fake news out. You can read more of Media Matters detailed work on fake news here.

  • Mark Zuckerberg Commits To Fixing The Fake News Problem On Facebook

    Blog ››› ››› JOHN WHITEHOUSE

    Days after Media Matters launched a petition urging him to take action, Facebook CEO Mark Zuckerberg committed to working on the problem on fake news at Facebook, writing in a Facebook post that “I want you to know that we have always taken this seriously, we understand how important the issue is for our community and we are committed to getting this right.”

    There is mounting evidence about the scope of the crisis of fake news on Facebook. Teenagers in Macedonia ran content farms creating fake news that generated over hundreds of thousands of shares on Facebook during the presidential election. Fake news stories repeatedly trended and were shared across Facebook. A BuzzFeed analysis even found that fake news outperformed real news during the final three months of the election.

    Facebook has been slow to address the problem. They only removed fake news websites from their advertising network once Google did the same. Multiple former Facebook employees spoke out and confirmed that Facebook really did have the ability to address the problem but was choosing not to, with Zuckerberg calling the argument that fake news on Facebook influenced the election outcome “crazy.”

    But in a November 19th post, Zuckerberg agreed that this is a problem and that Facebook was working on ways to mitigate fake news. His full post is worth reading:

  • Facebook CEO Mark Zuckerberg Is Too Scared Of Being Accused Of “Bias” By Conservatives To Address Fake Stories

    Blog ››› ››› ANDREW LAWRENCE

    Tens of millions of Americans get their news from Facebook and an increasing amount of that news is fake. 

    In May, Facebook CEO Mark Zuckerberg met with right-wing media personalities over concerns that “many conservatives don’t trust that [Facebook] surfaces content without a political bias.” Following the meeting, Zuckerberg noted how important conservative engagement was to Facebook by stating, “Donald Trump has more fans on Facebook than any other presidential candidate. And Fox News drives more interactions in its Facebook page than any other news outlet in the world. It’s not even close.”

    Following the outcry by conservatives of political bias, Facebook adopted revised guidelines on its Trending Topics, promised its reviewers would undergo new training “that emphasized content decisions may not be made on the basis of politics or ideology,” and fired the 18 human editors it used to write descriptions of trending topics and ensure their accuracy.

    Facebook's response to the cries of “political bias” backfired on the company after just 72 hours when fake stories began trending, including a story about Fox News host (and Trump agitator) Megyn Kelly being fired from the network.

    The response from Facebook over conservatives’ concerns of “political bias” stand in contrast to the company’s response to reports that fake stories and hoaxes have taken over the News Feed of the platform, which according to the Pew Research Center, 61 percent of web-using Millennials and 39 percent of Baby Boomers use to get their political news.

    A recent study by Buzzfeed found “hyperpartisan political Facebook pages and websites are consistently feeding their millions of followers false and misleading information,” with one of the most egregious examples being a group of pro-Trump websites originating in Macedonia which were “playing a significant role in propagating” false and misleading pro-Trump articles. One of the Macedonians contacted for the story, a 17-year-old, said, “I started the site for a easy way to make money.”

    Zuckerberg initially downplayed the widespread problem of fake news on Facebook and its effect on the election, saying “it’s a very small amount of the content,” and calling it a “crazy idea” that the hoaxes influenced the election. Zuckerberg has since acknowledged that his platform has a problem with false stories, but rather than meeting with journalists to discuss solutions as he did with conservatives in May, Zuckerberg offered up the excuse that “identifying the truth is complicated,” and once again expressed concern that Facebook “find ways for our community to tell us what content is most meaningful.” Gizmodo reported that Facebook executives recently conducted a review of the News Feed process that would have eliminated fake and hoax stories, but that the plan was set aside due to concern that removing false stories would upset conservatives:

    According to two sources with direct knowledge of the company’s decision-making, Facebook executives conducted a wide-ranging review of products and policies earlier this year, with the goal of eliminating any appearance of political bias. One source said high-ranking officials were briefed on a planned News Feed update that would have identified fake or hoax news stories, but disproportionately impacted right-wing news sites by downgrading or removing that content from people’s feeds. According to the source, the update was shelved and never released to the public. It’s unclear if the update had other deficiencies that caused it to be scrubbed.

    “They absolutely have the tools to shut down fake news,” said the source, who asked to remain anonymous citing fear of retribution from the company. The source added, “there was a lot of fear about upsetting conservatives after Trending Topics,” and that “a lot of product decisions got caught up in that.”

    Recently Buzzfeed reported that despite fear of losing their jobs for and being warned about speaking to the press, “dozens” of Facebook employees have formed “an unofficial task force” to address the company’s role in spreading misinformation.

    Join Media Matters in asking Mark Zuckerberg and Facebook to fix their fake news problem by signing our petition.