YouTube has allowed conspiracy theories about interference with voting machines to go viral
The platform has also made money from ads airing on some of the videos
Written by Alex Kaplan
YouTube has allowed users to share videos pushing false conspiracy theories alleging interference with voting machines and election computer systems. The videos have racked up millions of views, and the platform has profited from some of them.
Numerous false voter fraud conspiracy theories have circulated on social media since Election Day, including two claiming that the election’s technological infrastructure was hacked or tampered with. One theory falsely alleged that a supercomputer named “Hammer” and a computer program called “Scorecard” were used to alter vote counts (and that footage shown on CNN last year proves it); the director of the Cybersecurity and Infrastructure Security Agency at the Department of Homeland Security has called it “nonsense.” The other falsely alleged that Dominion Voting Systems, which makes election software used in multiple states, changed vote tallies because of “glitches”; in reality, the glitches were unrelated to the company’s software and did not change vote counts. (President Donald Trump has embraced this conspiracy theory.)
Though they are false, the claims have gained particular traction on YouTube. The platform has suggested that election misinformation does not violate its rules unless it directly encourages people to interfere with the vote count, and it says it is not algorithmically promoting inaccurate videos.
Media Matters used the tracking tool BuzzSumo to review videos posted between November 5 and November 12 that had at least 10,000 views and included “hammer” and “scorecard,” or “dominion,” in the title. We found 41 such videos with a combined total of nearly 3 million views; the videos have in turn drawn more than 200,000 combined Facebook engagements.
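For readers who want to see how the screening criteria fit together, they amount to a simple filter over exported search results. Here is a minimal sketch in Python, assuming a hypothetical CSV export (the file name `buzzsumo_export.csv` and the column names `title`, `published_date`, and `views` are illustrative assumptions, not BuzzSumo’s documented schema):

```python
import csv
from datetime import date

def matches_criteria(title: str, published: date, views: int) -> bool:
    """Apply the screening criteria described above: posted November 5-12, 2020,
    at least 10,000 views, and a title containing both "hammer" and "scorecard",
    or containing "dominion"."""
    t = title.lower()
    keyword_hit = ("hammer" in t and "scorecard" in t) or "dominion" in t
    in_window = date(2020, 11, 5) <= published <= date(2020, 11, 12)
    return keyword_hit and in_window and views >= 10_000

# Hypothetical export with columns: title, published_date (YYYY-MM-DD), views.
with open("buzzsumo_export.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

hits = [
    r for r in rows
    if matches_criteria(
        r["title"],
        date.fromisoformat(r["published_date"]),
        int(r["views"]),
    )
]

print(f"{len(hits)} videos matched; combined views:",
      sum(int(r["views"]) for r in hits))
```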
About 40% of the videos feature clips of segments from Fox News or its sister business network, Fox Business. YouTube had listed Fox News as an “authoritative” source “for election-related news and information queries” before the election, even though the network has a track record of amplifying election-related misinformation both on air and on its YouTube channels. All of the Fox clips feature Sidney Powell, the attorney for former national security adviser Michael Flynn, pushing the two election conspiracy theories. (At least one of the videos comes from an account that appears to support the QAnon conspiracy theory.)
A few of the videos also feature Trump’s personal lawyer Rudy Giuliani, who has been trying to contest the election for Trump, pushing the Dominion conspiracy theory.
At least eight of the videos also feature ads, meaning that both YouTube and the channels that have uploaded the videos have profited from election misinformation.
At least two of the videos (from the same channel) even featured merchandise for sale below the video player, sales from which YouTube may also benefit financially.
The proliferation of these conspiracy theories comes as YouTube has allowed voter fraud misinformation to spread nearly unchecked on its platform since the election. It also continues the platform’s repeated and serious failure to address concerns that YouTube and various channels it hosts are making money from misinformation. And it demonstrates, yet again, that there is an increasingly problematic YouTube-to-Facebook pipeline for the spread of misinformation on social media.