Update (1/14/21): After we published this piece, YouTube removed the ads from the videos identified here as monetized. Five of the videos mentioned in this report were also removed from the platform for violating its community guidelines.
Videos pushing false claims of voter fraud in the 2020 presidential election have continued to circulate on YouTube despite the platform’s promise to crack down on such content following the insurrection at the U.S. Capitol on January 6. Several of the videos are monetized.
Since losing reelection to President-elect Joe Biden, President Donald Trump and right-wing figures have pushed multiple false conspiracy theories alleging widespread voter fraud in the election. Those claims culminated in a January 6 rally where Trump repeated them, and pro-Trump rioters subsequently stormed the United States Capitol to prevent the Electoral College certification of Biden’s victory. The following day, YouTube announced that “due to the disturbing events that transpired yesterday, ... posting new videos with false claims in violation of our policies will now receive a strike,” meaning a temporary suspension. (By the time YouTube made that decision, it had already allowed videos pushing false voter fraud claims after the election to rack up millions of views, some of them monetized.)
However, using the tracking tools BuzzSumo and Raditube, a Media Matters review of YouTube videos uploaded on or after January 7 found videos with a combined total of just under 1.5 million views, each of which in some manner falsely suggests that voter fraud affected the election.
In one video, host Greg Hunter said that Congress “surely … know[s] this is massive fraud” that was “well orchestrated with seditious traitors inside the U.S. government and countries hostile to the United States.” The video carried ads, meaning both the channel and YouTube made money off of the false claim. Hunter’s channel also had another video claiming there was “massive well-documented fraud” in the election, and it also featured ads.
Another video that had ads was titled “Trump Supporters EXPLAIN How Election Was Stolen” and featured Trump supporters who had come to Washington, D.C., for what eventually became the insurrection. The video description featured the “#StopTheSteal” hashtag, referencing the movement that pushed the voter fraud claims and organized the January 6 riot.
In another video with ads, from widely followed YouTuber Tim Pool, a speaker claimed that Trump in his January 6 rally speech “emphasized very concrete pieces of evidence, Matt Braynard-type evidence, you know, out-of-state voters, dead voters, whatever.” (Braynard is a former Trump campaign staffer whose claims -- as well as claims of dead voters and out-of-state voters -- have not stood up to scrutiny.)
One video featured the host reading user chats claiming that “we won even bad media coverage for four years and lied. We won two elections but they just, you know, did what, more votes than voters, dead people” and that they “can’t believe Dominion was used to do us dirty,” a reference to the false claim that voting machine company Dominion Voting Systems created and stole votes for Biden. The video not only had ads but also had “super chats” -- where people can pay the account to have their comments featured and which YouTube takes a cut of.
Other content posted since the insurrection and YouTube’s announcement includes multiple videos alluding to a QAnon conspiracy theory that Italy helped cause voter fraud in the election; a video from a known conspiracy theorist falsely claiming that states illegally “changed the way that the election was carried out” and that this was evidence of “election fraud”; a video of radio host Sebastian Gorka claiming there was “incredible evidence of fraud”; and a video featuring a clip from a QAnon supporter claiming there was “real-time voter fraud” against Trump.
YouTube’s apparent failure to enforce this new crackdown comes as the platform has repeatedly allowed channels to monetize videos that violate its own rules, and has allowed the monetization of videos pushing misinformation more generally.