Fact-checkers used by Facebook to debunk fake news pushed on the social media platform say that the company’s lack of transparency is “hurting their efforts,” according to a new report from Politico.
For months, researchers and other experts have called on Facebook and other social media platforms to share more of their data to help reveal how fake news spreads. A May analysis from The Guardian found that Facebook’s efforts to fight fake news had been “regularly ineffective” and appeared to be “having minimal impact.” Additionally, a group at Oxford University that studied fake news’ impact on the U.K.’s 2017 elections was able to examine only Twitter because Facebook “doesn’t allow for its data to be viewed.” At the time, one of the study’s authors called on the company to share more data with academics, nonprofits and tech companies. Yet just days before that study’s release, Facebook’s shareholders and board members, led by CEO Mark Zuckerberg, had rejected a proposal to publish a report on how fake news impacts the social media giant.
Since then, as The Washington Post reported, Facebook has admitted that “a Russian ‘troll farm’ with a history of pushing pro-Kremlin propaganda” spent at least $100,000 on political ads on the social media platform during the 2016 presidential campaign. The announcement confirms previous suspicions that Facebook could have information relevant to possible Russian collusion with the Trump campaign and the targeted use of fake news — something both congressional probes and special counsel Robert Mueller have been looking into — and Facebook has reportedly turned over some of that data to Mueller.
But in the September 7 article, Politico noted that the fact-checkers who have partnered with Facebook say the company has declined “to share any internal data” on fake news, which means “they have no way of determining whether the ‘disputed’ tags they’re affixing to ‘fake news’ articles slow — or perhaps even accelerate — the stories’ spread.” More data could also help them “prioritize the most important stories” to fact-check. From the article:
The fact-checkers enlisted by Facebook to help clear the site of “fake news” say the social media giant’s refusal to share information is hurting their efforts.
In December, Facebook promised to address the spread of misinformation on its platform, in part by working with outside fact-checking groups. But because the company has declined to share any internal data from the project, the fact-checkers say they have no way of determining whether the “disputed” tags they’re affixing to “fake news” articles slow — or perhaps even accelerate — the stories’ spread. They also say they’re lacking information that would allow them to prioritize the most important stories out of the hundreds possible to fact-check at any given moment.
Some fact-checkers are growing frustrated, saying the lack of information is undermining Facebook’s efforts to combat false news reports.
[Alexios] Mantzarlis [director of Poynter’s International Fact-Checking Network] said he would want to see data on how the length of time it takes to flag a false story affects its spread. It’s not hard to imagine other questions: Do certain types of stories require more immediate attention than others? Are there other types that may not be viral yet, but data has shown likely will be soon? When an article is flagged, how often do copy-cat versions with altered headlines pop up to replace it? Even knowing what types of headlines work best for fact-checking posts would be valuable, Mantzarlis says.
UPDATE: Twitter has also promised to hand over analysis of Russian activity on its platform to the congressional probes.