Bombshell report: Facebook has known that it is fomenting extremism for years -- and refuses to stop
Written by John Whitehouse
A major report from The Wall Street Journal revealed that Facebook has known for years that its platform foments political polarization -- and that its top executives refused to implement proposed solutions to stop it.
This reporting dovetails with previous reports that Facebook refuses to enforce its own policies when enforcement might upset conservatives in the United States. The Journal’s new report specifically mentions concern within Facebook that proposed changes “would have disproportionately affected conservative users and publishers, at a time when the company faced accusations from the right of political bias.”
The report cites an internal Facebook presentation from 2016 which found that “64% of all extremist group joins are due to our recommendation tools”; the presentation stated bluntly that “Our recommendation systems grow the problem.”
To be clear, those accusations of bias are definitively not rooted in fact. Media Matters has extensively and repeatedly debunked claims that Facebook is biased against conservatives; furthermore, when the platform finally released its internal audit on the issue, the audit did not include a single concrete example of anti-conservative bias at Facebook.
The new report identifies Joel Kaplan, a Facebook executive who previously worked in President George W. Bush’s White House, as a key figure in nixing the proposed solutions. Kaplan also reportedly worked behind the scenes to advance Justice Brett Kavanaugh’s nomination to the Supreme Court.
The Wall Street Journal’s Jeff Horwitz and Deepa Seetharaman report:
Facebook had kicked off an internal effort to understand how its platform shaped user behavior and how the company might address potential harms. Chief Executive Mark Zuckerberg had in public and private expressed concern about “sensationalism and polarization.”
But in the end, Facebook’s interest was fleeting. Mr. Zuckerberg and other senior executives largely shelved the basic research, according to previously unreported internal documents and people familiar with the effort, and weakened or blocked efforts to apply its conclusions to Facebook products.
Facebook policy chief Joel Kaplan, who played a central role in vetting proposed changes, argued at the time that efforts to make conversations on the platform more civil were “paternalistic,” said people familiar with his comments.
Another concern, they and others said, was that some proposed changes would have disproportionately affected conservative users and publishers, at a time when the company faced accusations from the right of political bias.
Facebook revealed few details about the effort and has divulged little about what became of it. In 2020, the questions the effort sought to address are even more acute, as a charged presidential election looms and Facebook has been a conduit for conspiracy theories and partisan sparring about the coronavirus pandemic.
In essence, Facebook is under fire for making the world more divided. Many of its own experts appeared to agree—and to believe Facebook could mitigate many of the problems. The company chose not to.
The report also specifically identifies how hyper-partisan users wield disproportionate influence on the platform -- and how CEO Mark Zuckerberg and top executives gutted the proposed fix rather than implement it fully. (A rough sketch of this weighting dynamic follows the excerpt below.)
Under Facebook’s engagement-based metrics, a user who likes, shares or comments on 1,500 pieces of content has more influence on the platform and its algorithms than one who interacts with just 15 posts, allowing “super-sharers” to drown out less-active users. Accounts with hyperactive engagement were far more partisan on average than normal Facebook users, and they were more likely to behave suspiciously, sometimes appearing on the platform as much as 20 hours a day and engaging in spam-like behavior. The behavior suggested some were either people working in shifts or bots.
One proposal Mr. Uribe’s team championed, called “Sparing Sharing,” would have reduced the spread of content disproportionately favored by hyperactive users, according to people familiar with it. Its effects would be heaviest on content favored by users on the far right and left. Middle-of-the-road users would gain influence.
Mr. Uribe called it “the happy face,” said some of the people. Facebook’s data scientists believed it could bolster the platform’s defenses against spam and coordinated manipulation efforts of the sort Russia undertook during the 2016 election.
Mr. Kaplan and other senior Facebook executives pushed back on the grounds it might harm a hypothetical Girl Scout troop, said people familiar with his comments. Suppose, Mr. Kaplan asked them, that the girls became Facebook super-sharers to promote cookies? Mitigating the reach of the platform’s most dedicated users would unfairly thwart them, he said.
...
The debate got kicked up to Mr. Zuckerberg, who heard out both sides in a short meeting, said people briefed on it. His response: Do it, but cut the weighting by 80%. Mr. Zuckerberg also signaled he was losing interest in the effort to recalibrate the platform in the name of social good, they said, asking that they not bring him something like that again.
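The weighting dynamic the Journal describes is easy to see in miniature. The sketch below is purely illustrative and assumes nothing about Facebook’s actual systems: the function names, the log-scale dampening, and the blending scheme are all hypothetical stand-ins. It shows how a linear, engagement-based weight lets a 1,500-interaction “super-sharer” dwarf a 15-interaction user, how a Sparing-Sharing-style dampening could flatten that gap, and why cutting the fix’s weighting by 80% would leave the gap mostly intact.

```python
import math

def engagement_weight(interactions: int) -> float:
    """Naive engagement-based weight: influence grows linearly with activity."""
    return float(interactions)

def sparing_sharing_weight(interactions: int, strength: float = 1.0) -> float:
    """Hypothetical dampening of hyperactive accounts (illustrative only).

    strength=1.0 applies full log-scale dampening; strength=0.2 models a
    weighting "cut by 80%", which barely dents a super-sharer's influence.
    """
    naive = float(interactions)        # linear influence (status quo)
    damped = math.log1p(interactions)  # heavily dampened influence
    # Blend between the naive weight and the dampened weight.
    return (1 - strength) * naive + strength * damped

# Compare a typical user with a hyperactive "super-sharer":
for label, n in [("typical user", 15), ("super-sharer", 1500)]:
    print(f"{label:13s} naive={engagement_weight(n):7.1f} "
          f"full fix={sparing_sharing_weight(n, 1.0):5.1f} "
          f"80%-cut fix={sparing_sharing_weight(n, 0.2):7.1f}")
```

Under these toy assumptions, the super-sharer starts with 100 times the typical user’s influence; the full dampening shrinks that ratio to roughly 2.6 to 1, while the 80%-cut version leaves it near 96 to 1 -- which is why weakening the weighting rendered the fix largely toothless.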
The entire report is worth reading.