Facebook’s news feed changes could elevate fake news while harming legitimate news outlets
Written by Alex Kaplan
Facebook’s newly announced changes to its news feed, which will elevate content shared by friends and family over content shared by news publishers, could wind up exacerbating the platform’s fake news problem.
Over the past year, Facebook has struggled to combat the spread of fake news and misinformation on its platform. On January 11, the social media giant announced that it would change its news feed algorithm to “prioritize what [users’] friends and family share and comment on,” according to The New York Times. Facebook CEO Mark Zuckerberg, who was named Media Matters’ 2017 Misinformer of the Year, told the Times that the shift was “intended to maximize the amount of content with ‘meaningful interaction’ that people consume on Facebook.” Content from news publishers and brands, by contrast, will be given less exposure. Facebook is also weighing adding some kind of authority component to the algorithm so that outlets considered more credible would get more prominence in the news feed.
In the past year or so, Facebook has rolled out some measures meant to fight fake news, including its third-party fact-checking initiative. Those efforts have so far been far from effective, and the new changes threaten to undercut them even further.
At least one study has shown that Facebook users are influenced by their friends’ and family members’ actions and reactions on the site. Last year, New York magazine reported on a study that found that “people who see an article from a trusted sharer, but one written by an unknown media source, have much more trust in the information than people who see the same article from a reputable media source shared by a person they do not trust.” With Facebook’s new changes, as the Times noted, “If a relative or friend posts a link with an inaccurate news article that is widely commented on, that post will be prominently displayed.”
An additional concern is how the changes could exacerbate the problem of conservative misinformation specifically. Up until now, misinformation and fake news on social media have seemingly originated with and been spread more by conservatives than by liberals. And according to research conducted by Media Matters, right-wing communities on Facebook are much larger than left-wing communities and mainstream distribution networks, and right-wing engagement outpaces engagement in left-wing circles. The changes could thus make it more likely that peer-to-peer promotion of right-wing misinformation pushes fake news toward the top of people’s news feeds.
The changes will also likely cause real harm to legitimate news outlets by burying their stories. The head of Facebook’s news feed admitted that some pages “may see their reach, video watch time and referral traffic decrease.” Smaller, lesser-known outlets, especially those that do not produce content natively on the platform (such as live videos), could face major financial losses from the move. Facebook’s head of news partnerships, Campbell Brown, also wrote to some major publishers that the changes would cause people to see less content from “publishers, brands, and celebrities” but that “news stories shared between friends will not be impacted,” which could mean that fake news shared between friends gets promoted over content posted directly by legitimate news outlets.
It’s conceivable that adding some kind of authority component that ensures “articles from more credible outlets have a better chance of virality” could help lessen this risk. Such a move would be a welcome development, and Media Matters has recommended that Facebook include it in its algorithm. But the criteria Facebook is reportedly considering for determining which publishers are credible -- such as “public polling about news outlets” and “whether readers are willing to pay for news from particular publishers” -- are vague and could be difficult to enforce. And The Wall Street Journal noted that Facebook was still undecided about adding the authority component; without it, the negative impact of these news feed changes could be even worse.
It is possible that Facebook’s move to display “Related Articles” next to posts its fact-checking partners have flagged could override people’s tendency to believe what their peers share. And perhaps the algorithm that limits the spread of stories fact-checkers have flagged will reduce the spread of fake news. But it’s also possible that these new changes will undermine those initiatives, and that Zuckerberg’s aim to make users happier will also make them more misinformed.