Rep. Raskin (D-MD) tells Google's CEO that YouTube is "being used to promote propaganda that leads to violent events"
MEDIA MATTERS STAFF
White supremacist YouTuber Mark Collett commended Fox’s Tucker Carlson for discussing the white supremacist talking point of “white genocide” on his prime-time show. The shoutout came during the December 5 edition of Collett’s weekly YouTube livestream called This Week on the Alt-Right. During the episode, Collett also took credit for his own role in mainstreaming the term.
Self-described 'White Nationalist' Mark Collett, who hosts a Youtube show on the Alt-Right, praises Tucker Carlson for using the term "white genocide" on his Fox News show. pic.twitter.com/C88aejXBy2
— Jason Campbell (@JasonSCampbell) December 6, 2018
Collett is a British neo-Nazi whose racist content thrives on YouTube and whose extremism has been amplified by American far-right figures, including Fox’s Laura Ingraham and white supremacist darling Rep. Steve King (R-IA). YouTube allows Collett to monetize his extremist content and profit from spreading white supremacist propaganda, and his December 5 livestream was no exception: The platform’s Super Chat feature allowed viewers to pay for their messages to be featured more prominently in the live chat.
White supremacists often push the false narrative of “white genocide” to propagandize about what they claim are fatal threats against white people, like immigration or demographic change. On his prime-time Fox show, Carlson often echoes white supremacist talking points and has become increasingly explicit in championing white grievances, earning accolades among white supremacists along the way.
On the October 1 edition of his Fox show Tucker Carlson Tonight, Carlson specifically fearmongered to his audience about the threat of white genocide by pushing a literal interpretation of an angry tweet written by a Georgetown University professor protesting the nomination of Supreme Court Justice Brett Kavanaugh.
A majority of viral caravan coverage on Facebook and YouTube came from right-leaning sources, which frequently pushed anti-immigrant disinformation
Since Central American migrants fleeing poverty and violence slowly began making their way toward the U.S. southern border in a series of caravans, right-wing sources have dominated viral caravan content and coverage on Facebook and YouTube. A Media Matters study of Facebook and YouTube between October 13 and November 19 found that a majority of the caravan content with the most interactions came from right-leaning sources.
Among all sources analyzed in this study, Fox News had the most top-engaged Facebook links and page posts as well as the most caravan-related YouTube videos with over 100,000 views. On air, the cable network dedicated over 23 hours to caravan coverage in the first two weeks after the first caravan set off on October 13, and its reports often spread anti-immigrant disinformation and conspiracy theories.
Similarly, viral right-leaning caravan coverage on Facebook was riddled with anti-immigrant false news. On YouTube -- where far-right misinformation thrives -- some of the right-leaning channels dominating caravan-related video content were news aggregators run by sources that we could not verify, and others featured “alt-right” and far-right personalities.
Of the 267 caravan-related Facebook posts with the most interactions, 171 (64 percent) were posted by right-leaning pages. Fifty-one of these posts came from Facebook pages without any political alignment (19.1 percent), and 45 came from left-leaning pages (16.9 percent).
During the 38 days analyzed, Fox News’ main Facebook page had by far the highest number of posts with high engagement related to the migrant caravan, with 42 such posts (the page with the second highest number, that of the right-leaning Daily Caller, had 18). Nine of the 13 pages with five or more viral posts related to the caravan came from right-leaning sources: Fox News, The Daily Caller, Ben Shapiro, Breitbart, Patriots United, ForAmerica, American Voices, Judicial Watch, and Conservative Tribune.
Viral content from right-leaning Facebook pages often depicted the migrant caravan as a violent invasion. The Facebook page American Voices, a channel on Facebook’s streaming service Facebook Watch, is run by the right-wing media outlet The Daily Caller and had multiple viral video posts that spread misinformation on the caravan and painted migrants as violent or criminal.
The most popular caravan post from American Voices, which has earned over 100,000 interactions and 5.8 million views, is a video that called the caravan a “potential crisis” and stoked fear about a supposed lack of defenses on the border. The video also misrepresented a fence on a specific part of the southern border as that area’s only “defense against the caravan.” Other viral videos from American Voices: falsely speculated that some members of the caravan weren’t from Central and South America; associated migrants and asylum seekers in the caravan with drug smugglers; and featured a clip of a Fox News guest calling the migrant caravan’s journey an “invasion and an act of war.”
Other viral posts from right-leaning pages spread baseless right-wing conspiracy theories about the nationality of members of the caravan and painted the caravan of migrants and asylum-seekers as an “invasion.”
Of the 278 most popular links on Facebook, 163 went to right-leaning websites (58.6 percent); only 14 links came from left-leaning websites (5.0 percent), and 101 came from websites without political alignment (36.3 percent).
As with Facebook pages, right-leaning websites made up the majority of domains with numerous top links to caravan-related content. Fox News once again topped caravan coverage on Facebook, with 32 top-performing links. Seven of the 12 domains with the most links in our study belonged to right-leaning outlets. The top right-leaning outlets were Fox News, Daily Wire, Breitbart, Western Journal, The Daily Caller, American Military News, and The Washington Times.
Some top links from right-leaning websites advocated for violence on the border against migrants and asylum-seekers, characterizing them as invaders. Glenn Beck penned an article titled “This is not a caravan, it’s an INVASION,” published both on his personal site and on his outlet The Blaze, in which he claimed that the caravan was a “political stunt” to provoke violence from the National Guard and Border Patrol. Links to both versions earned over 50,000 interactions on Facebook. In a Fox News op-ed, political contributor and former Speaker of the House Newt Gingrich wrote that the caravan was “attempting to invade and attack the U.S.,” and he called on the president and Congress to stop the “attack.” The op-ed earned almost 48,000 interactions on Facebook.
Other right-leaning websites pushed false information on the caravan. Five of the top links on Facebook included debunked claims from a Project Veritas video that alleged that former senatorial candidate Rep. Beto O’Rourke’s (D-TX) campaign was illegally giving campaign funds to help the caravan.
The right-wing group Judicial Watch had multiple top links on Facebook that spread anti-immigrant conspiracies, including: an article falsely stating that the caravan poses a “serious public health threat”; one calling members of the caravan “gangbangers”; another calling the caravan a “movement that’s benefiting human smugglers”; and one article speculating that ISIS terrorists could be part of the caravan. All of these articles earned over 40,000 interactions on Facebook, with the most popular post earning over 84,000 interactions.
Eighty-five of the 128 caravan-related videos with over 100,000 views on YouTube were posted by right-leaning channels (66.4 percent). Only 24 caravan-related videos from channels without political alignment (18.8 percent) and 19 videos from left-leaning channels (14.8 percent) earned over 100,000 views.
Fox News’ YouTube channel posted the highest number of top-viewed videos about the migrant caravan, with Fox Business’ channel coming in third. The channel with the second highest number of top-viewed videos was kylekuttertv, with 14 caravan-related videos earning over 100,000 views apiece. Kylekuttertv is an unverified news aggregation channel whose videos, typically 30 to 60 minutes long, compile mainstream, right-wing, and fringe YouTube news clips framed under far-right and conspiracy-theory narratives, which are detailed in the video titles and descriptions.
Another unverified right-leaning news aggregation channel, GLOBAL News, had multiple top-viewed videos. GLOBAL News and kylekuttertv have each earned tens of millions of views, and they paint themselves as nonpartisan channels, while almost exclusively mixing clips from local media outlets with right-wing commentary from outlets including Fox News, NewsMax TV, and One America News Network.
Far-right and “alt-right” sources also had top-viewed YouTube videos on the caravan. The channel belonging to the far-right Canadian outlet Rebel Media published three videos about the caravan that each earned over 100,000 views on YouTube. In one video, Rebel Media host Ezra Levant speculates about whether caravan members have “antifa-style or paramilitary-style” training, and then he goes on to say that migrants and asylum-seekers in the caravan are not claiming to be “refugees fleeing from danger” and are “just looking to, you know, get rich, I suppose.” In addition, numerous vloggers linked to the “alt-right” -- including Stefan Molyneux, James Allsup, and Tarl Warwick (known online as Styxhexenhammer666) -- all had top-viewed videos on YouTube in which they stoked fear about the caravan.
Other right-wing media figures also used YouTube to spread false news and conspiracy theories to stoke fear about immigrants. On his YouTube channel, former Fox host Bill O’Reilly falsely implied that George Soros funded the migrant caravan. In a video from The Blaze that earned over 500,000 views, Glenn Beck falsely stated that Venezuela financed the migrant caravan and then speculated that Cuban and Venezuelan spies and terrorists could be using the caravan as a “cover” to enter and attack the U.S.
We then individually reviewed all posts, links, and videos to flag irrelevant content -- content that had nothing to do with the migrant caravan, content from satire sources like The Onion or The Babylon Bee, and content that mentioned the migrant caravan only tangentially -- and excluded it from the study.
Researchers then reviewed sources and coded them as either “left-leaning,” “right-leaning,” or without political alignment. For Facebook page data, the source coded was the Facebook page. For links on Facebook, the domain of each link was coded. And for YouTube videos, the channel was coded.
Most sources had been previously coded as part of an earlier Media Matters study, and we used the previous political-alignment codes for those pages. For new sources, two researchers independently coded each link, Facebook page, and YouTube channel. We determined the ideological alignment of a source by considering the source’s name and published content. Sources that expressed opposition to President Donald Trump or focused on issues primarily aimed at liberals (e.g., protecting abortion rights, calling for action against gun violence, etc.) were coded as left-leaning. Sources that expressed support for Trump or focused on issues primarily aimed at conservatives (e.g., restricting abortion rights, downplaying gun violence, etc.) were coded as right-leaning. All right-wing and left-wing media outlets and organizations were automatically coded as right-leaning or left-leaning, respectively. Pages that did not have an ideological leaning in their content were coded as nonaligned. Coding conflicts were resolved between the two researchers with available information about the source’s political alignment.
Charts by Melissa Joskow.
As Election Day gets underway in the 2018 midterm elections, right-wing misinformation and hoaxes are targeting voters on social media platforms -- including Facebook, Twitter, and YouTube -- and via text messages. The right-wing misinformation campaigns include hoaxes about Democrats burning flags, lies about a gubernatorial candidate buying votes, and followers of the conspiracy theory QAnon fearmongering about violent anti-fascist groups targeting voters.
Here are some examples:
Alex Jones promoted conspiracy theories about noncitizen and dead Democratic voters on BitChute. During a broadcast published November 6 on BitChute, a YouTube alternative, Jones said that polling indicates a “major red wave” and claimed without evidence that “they have caught people from Texas to Maryland, Democrats organizing illegal aliens to have mailed to their address absentee ballots in the name of dead people still on the rolls,” asking, “Will the Democrats be able to steal another election?”
In Florida, some voters got a text from someone impersonating a campaign staffer for Democratic gubernatorial candidate Andrew Gillum. The text made misleading and false claims about Gillum’s campaign promises, including that he will "raise taxes on anyone making over $25,000 a year." As the Tampa Bay Times reported, Democrats have not proposed adding a state income tax (Florida does not currently have one), and Gillum in particular has “repeatedly said that he wouldn’t propose” one. The text also mischaracterized Gillum’s position that “there is a racial element to the application” of Florida’s “stand your ground” law, falsely claiming he called it “a racist ideology.”
A member of the Facebook group Drain The Swamp claimed that a report showed 1.7 million California voters were not registered.
A Twitter account posted a hoax video showing Democrats burning flags to celebrate a “blue wave.” From The Daily Beast:
One fake video that’s getting circulation on both Facebook and Twitter today purports to show CNN anchor Don Lemon laughing as Democrats burn flags in a celebration of the “blue wave.”
Twitter pulled the video from its site around 11:00 a.m. on Tuesday, although it’s still on Facebook.
The video, which claims to be a scene from CNN’s “Reliable Sources,” comes complete with a CNN-style chyron: "Dems celebrate 'Blue Wave' Burning Flags on Election Day." The original version of the video had been viewed nearly 55,000 times on Twitter since being posted Monday, and the tweet promoting it had been retweeted nearly 5,000 times.
The video appears to have been first posted by Twitter user “@RealDanJordan,” who said it was a reason to vote for Republican candidates.
The same Twitter account pushed memes telling men to skip voting in order to help Democrats.
A user of the neighborhood social network Nextdoor posted false voter information.
New frontier in the misinfo wars: a reader submits a tip about false voter information being spread on Nextdoor (!).
FYI: everyone votes today, R or D. pic.twitter.com/EMDHRDCZBj
— Kevin Roose (@kevinroose) November 6, 2018
Trolls claiming to be from the Russian Internet Research Agency have been spamming reporters offering to give an inside scoop on their operations.
Today in failed trolling attempts:
1. Spam journalists by claiming to be the Internet Research Agency
2. Set up a website for the “IRA American Department”
3. Promptly get suspended from Twitter pic.twitter.com/mqXiWsaQIl
— Casey Michel 🇰🇿 (@cjcmichel) November 5, 2018
Users of different social media platforms are attempting to revive a false claim from 2016 that billionaire philanthropist George Soros owns a specific brand of voting machines.
A member of the Facebook group Brian Kemp For Georgia Governor claimed without any proof that Georgia gubernatorial candidate Stacey Abrams is “buying votes.”
A 4chan account encouraged fellow users to post on Twitter a meme falsely claiming people can vote by text.
Conspiracy theorist “Q” encouraged supporters to be vigilant about voter fraud at the polls. On the anonymous message board 8chan, the anonymous poster known as “Q” encouraged supporters of the absurd “deep state” conspiracy theory to be vigilant about voter fraud at the polls. The conspiracy theorist pushed vague allegations of widespread voter fraud across the U.S. and stated that during the election, “uniformed and non-uniformed personnel will be stationed across the country in an effort to safeguard the public.”
A QAnon-themed YouTube channel posted a video echoing Q’s voter fraud conspiracy theories. As of this writing, the video had more than 43,300 views.
A pro-Trump Facebook page spread similar claims that fearmongered about election fraud. The page posted a screenshot from the original 8chan post that had been taken from that YouTube video:
In a QAnon Facebook group, one user claimed that voting machines in Pennsylvania were switching votes for non-Democratic candidates into votes for Democratic candidates.
Natalie Martinez, Timothy Johnson, and Melissa Ryan contributed research to this piece.
The NRA's state association for Nevada also endorses Joyce Bentley
Republican congressional candidate Joyce Bentley recently tweeted out a “QAnon” conspiracy theory video which claims that “the deep state” occupies “the highest levels of power" and the "cabal" includes members who are part of "a dark and deeply sinister death cult with a strong reliance on symbolism and numerology with levels of cruelty unimaginable to all right-thinking people."
The Nevada Republican Party’s website includes Bentley in its list of “our” Republican candidates and encourages people to vote for her. She’s also endorsed by the political arm of the Nevada Firearms Coalition, an independent organization that is “affiliated with and recognized by the National Rifle Association” as one of its state associations.
On October 17, Bentley tweeted out a YouTube video with the title “Q -- We Are The Plan” by the account “Storm Is Upon Us.” Both phrases are references to “QAnon” or “The Storm,” a sprawling and nonsensical conspiracy theory which claims that an anonymous government official with “Q” clearance has been leaving clues online about President Donald Trump’s actions against the so-called “deep state” and its alleged activities, including child trafficking. The “Storm Is Upon Us” account contains other videos dedicated to QAnon, including one video that says various political figures would have been "hung ... for treason" by the Founding Fathers.
Media Matters has documented right-wing media figures who have helped spread the conspiracy theory online. QAnon surfaced in the news during the summer when a man who was reportedly influenced by the conspiracy theory drove an armored truck to the Hoover Dam and blocked traffic on the nearby Mike O'Callaghan-Pat Tillman Bridge (he is now facing terrorism charges).
The video also includes violent and conspiratorial images; for example:
Media Matters has previously documented numerous Republican officials and candidates who have ties to conspiratorial media figures and social media groups. (In some instances, local Republican officials reversed their backing after news surfaced that the candidates were pushing conspiracy theories.) In July, a Twitter account for the Hillsborough County Republican Executive Committee of Florida tweeted out (and later deleted) a video promoting QAnon.
From "gurus" to extremist "influencers," the video site is a potent tool for ideologues
For the casual YouTube viewer -- someone who logs on once in a while to access cute kitten videos or recipe demonstrations -- it can be difficult to imagine that the video site is also a teeming cesspit of hate speech and a prime means of its transmission.
But a new study from think tank Data & Society and the earlier work of ex-YouTube engineer Guillaume Chaslot reveal the technical and social mechanisms underlying an inescapable truth: Thanks to an algorithm that prioritizes engagement -- as measured by the interactions users have with content on the platform -- and “influencer” marketing, YouTube has become a source of right-wing radicalization for young viewers.
An algorithm that incentivizes extreme content
YouTube’s recommendation algorithm dictates which videos rise to the top in response to search queries, and, after a video finishes playing, it populates the video player window with thumbnails recommending further content. According to a Wall Street Journal analysis, YouTube’s algorithm “recommends more than 200 million different videos in 80 languages each day.” These recommendations take into account what the viewer has already watched, but it’s all in the service of engagement, or, as the Journal’s Jack Nicas put it, “stickiness” -- what keeps the viewer on the site, watching. The longer viewers watch, the more ads they see.
But this has unintended consequences.
“They assume if you maximize the watch time, the results are neutral,” Guillaume Chaslot, a former Google engineer and creator of the YouTube algorithm analysis tool Algo Transparency, told Media Matters. “But it’s not neutral ... because it’s better for extremists. Extremists are better for watch time, because more extreme content is more engaging.”
In a way, it’s common sense -- videos that make inflammatory claims or show explosive images tend to grab viewers’ attention. And attention-grabbing videos -- those that cause viewers to watch more and longer -- rise up in the recommendation algorithm, leading more new viewers to see them in their list of recommended videos.
As the Journal’s analysis showed, viewers who began by viewing content from mainstream news sources were frequently directed to conspiracy theory-oriented content that expressed politically extreme views. A search for “9/11” quickly led Journal reporters to conspiracy theories alleging the U.S. government carried out the attacks. When I searched the word “vaccine” on YouTube using incognito mode on Google Chrome, three of the top five results were anti-vaccine conspiracy videos, including a video titled “The Irrefutable Argument Against Vaccine Safety,” a series titled “The Truth About Vaccines” with more than 1 million views, and a lecture pushing the debunked pseudo-scientific claim that vaccines are linked to autism.
Because YouTube’s algorithm is heavily guided by what has already been watched, “once you see extremist content, the algorithm will recommend it to you again,” Chaslot said.
The result is a tailor-made tool for radicalization. After all, once users have started exploring the “truth” about vaccines -- or 9/11, or Jews -- the site will continue feeding them similar content. The videos that auto-played after “The Truth About Vaccines” were, in order: “My Vaxxed child versus my unvaccinated child”; “Worst Nightmare for Mother of 6 Unvaxxed Children” (description: “The mother of 6 unvaccinated children visits the emergency room with her eldest daughter. Her worst nightmare becomes reality when her child is vaccinated without her consent”); and “Fully Recovered From Autism,” each with more than 160,000 views.
“By emphasizing solely watch time, the indirect consequence that YouTube doesn’t want to acknowledge is that it’s promoting extremism,” Chaslot said.
Chaslot emphasized that YouTube’s own hate speech policy in its Community Guidelines was unlikely to meaningfully curb the flourishing of extremist content. The primary issue: The algorithm, which controls recommendations, is utterly separate from the company’s content-moderation operation. The result is a fundamentally self-contradictory model; engagement alone controls the rise of a video or channel, independent from concerns about substance.
There’s also what Chaslot called “gurus” -- users who post videos that cause viewers to engage for hours at a time. As a result, even if their audiences begin as relatively small, the videos will rise up in the recommendation algorithm. The examples he provided were PragerU, a right-wing propaganda channel whose brief explainer videos have garnered some 1 billion views, and Canadian pop-antifeminist Jordan Peterson’s channel.
But the guru effect has the power to amplify far more troubling content, and, according to new research, far-right extremists have adapted to a world of recommendation algorithms, influencer marketing, and branding with ease and efficiency.
The sociopath network
YouTube isn’t just a sea of mindless entertainment; it’s also a rather ruthless market of individuals selling their skills, ideas, and, above all, themselves as a brand. YouTube’s Partner Program provides financial incentives in the form of shares of advertising revenue to “creators” who have racked up 4,000 hours of audience attention and at least 1,000 subscribers. For those who become authentic micro-celebrities on the platform, the viral-marketing possibilities of becoming a social-media “influencer” allow them to advertise goods and products -- or ideologies.
Becca Lewis’ groundbreaking new study from Data & Society catalogues the ways that ideological extremists have cannily adapted the same techniques that allow makeup vloggers and self-help commentators to flourish on the video site. The study, titled “Alternative Influence: Broadcasting the Reactionary Right on YouTube,” is an unprecedented deep dive into 81 channels that spread right-wing ideas on the site. Crucially, it also maps the intricate interconnections between channels, breaking down how high-profile YouTube figures use their clout to cross-promote other ideologues in the network. (Media Matters’ own study of YouTube extremists found that extremist content -- including openly anti-Semitic, white supremacist, and anti-LGBTQ content -- was thriving on the platform.)
Lewis’ study explores and explains how these extremists rack up hundreds of thousands or even millions of views, with the aid of a strong network of interconnected users and the know-how to stand out within a crowded field of competing would-be influencers.
The study provides a concrete look at the blurring of lines between popular right-wing YouTube content creators who are frequently hosted on conservative media outlets like Fox News -- such as Dave Rubin, Ben Shapiro, and Candace Owens -- and openly white supremacist content creators with smaller platforms. In many cases, Lewis found that these channels had invited the same guests who had appeared on other channels in the network, leading to the creation of “radicalization pathways.” Rubin, whose channel has 750,000 subscribers, was cited as an example for hosting the Canadian racist commentator Stefan Molyneux. “Molyneux openly promotes scientific racism, advocates for the men’s rights movement, critiques initiatives devoted to gender equity, and promotes white supremacist conspiracy theories focused on ‘White Genocide,’” Lewis writes. During Molyneux’s appearance on Rubin’s channel, Rubin failed to meaningfully challenge Molyneux’s ideas -- lending credibility to his more extreme worldview.
Rubin vehemently denied charges of his association with white supremacy on Twitter, but failed to refute the specifics of Lewis’ findings:
Despite Rubin’s assertion, Lewis’ study does not mention the word “evil.” What the study does make clear, however, are the ways in which web-savvy networks of association and influence have become crucial to the spread of extremist ideologies on the internet. The issue of racist, sexist, and anti-LGBTQ content is not limited to obscure internet fever swamps like 4chan and Gab; it is also playing out in a public and highly lucrative way on the web’s most popular video platform.
Conservative provocateur Ben Shapiro, named as an influencer in the network, also sought to discredit the study.
But Shapiro was separated from Richard Spencer by only one degree, not six: He has been interviewed by a right-wing YouTuber, Roaming Millennial, who had invited Spencer to share his views on her channel two months earlier.
“There is an undercurrent to this report that is worth making explicit: in many ways, YouTube is built to incentivize the behavior of these political influencers,” Lewis writes. “The platform, and its parent company, have allowed racist, misogynist, and harassing content to remain online – and in many cases, to generate advertising revenue – as long as it does not explicitly include slurs.”
Just last week, the extremist hate channel Red Ice TV uploaded a screed titled “Forced Diversity Is Not Our Strength,” promoting segregated societies. Hosted by gregarious racist Lana Lokteff, who has become a micro-celebrity in the world of white supremacists, the video asserts that “minorities and trans people” have had a negative impact on “white creativity.”
Red Ice TV has more than 200,000 subscribers. At press time, the “Forced Diversity” video had more than 28,000 views. Upon completion of Lokteff’s anti-diversity rant, YouTube’s auto-play suggested more Red Ice TV content -- this time a video fearmongering about immigrants -- thus continuing the automated cycle of hate.
Creators are profiting off hateful content
On August 6, YouTube removed the channel belonging to Infowars’ Alex Jones, citing violations of community guidelines.
"All users agree to comply with our Terms of Service and Community Guidelines when they sign up to use YouTube,” YouTube’s parent company, Google, said in a statement to CNBC. “When users violate these policies repeatedly, like our policies against hate speech and harassment or our terms prohibiting circumvention of our enforcement measures, we terminate their accounts.”
YouTube’s action came as numerous other tech companies, including Apple, Facebook, and Spotify, took action against Jones.
But for those who monitor the popular video streaming platform, it’s hard not to see YouTube’s move as a selective, belated, and inadequate action to quell the hate speech that currently thrives on the platform.
In a brief research survey, Media Matters found multiple channels with tens of thousands of subscribers -- and some videos with hundreds of thousands of views -- that seem to clearly violate YouTube’s terms of service about hate speech. These channels expose YouTube’s primarily youthful viewership to some of the vilest propaganda on the Internet, and they make a tidy profit to boot.
A Pew Research Center survey found that YouTube is the most popular social media platform among teens: 85 percent of 13- to 17-year-olds reported using YouTube, and 32 percent said it’s the platform they use most often. Meanwhile, in the last three years, Facebook usage among teenagers has fallen significantly. Pew also found YouTube to be most popular among 18- to 24-year-olds; 94 percent of respondents in that age group said they use the platform. The impact YouTube has on young people is not to be underestimated.
YouTubers with significant audiences can profit by creating content that draws advertisements. It’s difficult to say how much any individual creator makes, but Polygon estimates that a very large creator like Jake Paul -- who is in the top 100 channels in terms of number of subscribers -- makes $10 for every 1,000 views. While this rate is significantly lower for channels with smaller followings, channels that livestream their content -- a common practice among far-right YouTubers -- can earn additional income through “super chats.” Super chats allow viewers to pay to have their comments featured prominently. On a livestream, there is usually a constant flow of comments appearing along the side of the video, but super chat comments are placed in a bar at the top of the chat, and creators can react to or read them on air. The more a user pays, the longer their comment appears at the top.
Like super chats, donations to content creators can also come in through alternative servers that are not hosted by YouTube, like in the example below.
YouTubers also rely on Multi-Channel Networks (MCN), which provide a variety of services to YouTube creators, including aiding and increasing their monetization rates, expanding audiences, and, most importantly for extremists, appealing YouTube strikes, which are issued when YouTube reviewers are notified that content is in violation of community guidelines.
The impunity with which racists operate on the site -- and the profitability of their efforts -- make YouTube a potent ground for young people to be exposed to toxic ideologies. Or, as Zeynep Tufekci, a professor and expert in social networks, put it in a powerful editorial for The New York Times, “Given its billion or so users, YouTube may be one of the most powerful radicalizing instruments of the 21st century.”
Below is a sample of YouTube channels that Media Matters found to have violated YouTube’s terms of service, but that continue to profitably engage viewers by the tens or hundreds of thousands. In particular, these videos appear to violate YouTube’s policy against “content that ‘promotes violence against or has the primary purpose of inciting hatred against’ protected classes” -- including LGBTQ individuals, Jewish people, African-Americans, and other racial minorities.
Jesse Lee Peterson is a far-right radio host and media personality whose radio show The Jesse Lee Peterson Show airs on Newsmax TV and is reposted to his YouTube page, which currently boasts 135,000 followers. On another YouTube channel with over 159,000 followers, Peterson hosts his show The Fallen State, where he interviews activists, celebrities, and other public figures.
Peterson’s YouTube content contains a torrent of anti-Black, anti-gay, and misogynistic hate. In July 2018, he announced it was “white history month,” saying, “Happy white history month, white folks. This is your country, thank you -- I appreciate it.”
In a video titled “Most Blacks Are Mentally Retarded!” Peterson said Jim Crow laws were good for Black people because they helped their “mentality” and that “most Blacks today, as I mentioned, most Blacks today -- unlike the days when I was growing up -- are mentally ill, they’re mentally retarded.” He has compared the Ku Klux Klan to Black Lives Matter, describing the latter as “a Black, radical, evil, agitated organization that was founded by a bunch of Black lesbians and Black homosexuals.” In another video, Peterson described transgender people as “messed up,” “abnormal,” “confused” people who’ve “been traumatized.” He said refusing to recognize transgender people might help them “overcome their traumas.”
On The Jesse Lee Peterson Show, Peterson gave a platform to notorious neo-Nazi Andrew Auernheimer, also known as “weev,” to spew anti-Semitic, racist, and homophobic hate with zero pushback. In the video titled “WEEV! White Nationalism, Jews, Homosexuals, and Black people - Daily Stormer,” Auernheimer identified himself as a white nationalist and called the FBI “a Jewish terror organization.” He said he doesn’t live in the United States because it’s a country “full of whores and faggots and pornography and wickedness.” He called for America to become a white ethnostate because Black people are “the tools of Jewry” who “betrayed the values of all common decency [and] of morality.” He said, “Righteousness and color are equivalent, because segregation increases trust within a society.”
Jean-Francois Gariépy is a YouTube personality and former Duke University researcher who relies on his background in neuroscience to give credence to the bogus “race science” theories he pushes on YouTube -- a long-standing racist trope holding that “humankind is divided into separate and unequal races.” Gariépy has two channels, “JFG Livestreams,” which has 20,000 followers, and “Jean-Francois Gariépy,” which has 40,000 followers. His show The Public Space, which normally streams daily, features a cesspool of white supremacist guests including former Ku Klux Klan Grand Wizard David Duke, Nick Fuentes, Mark Collett, Richard Spencer, Mike “Enoch” Peinovich, and Vox Day.
Many of Gariépy’s videos are monetized through advertisement placements. Kelly Weill of The Daily Beast identified Gariépy as an advocate for a white ethnostate. She also described a legal battle he had with his ex-wife in which she alleged that he tried to kidnap their child. A separate lawsuit alleges that Gariépy had a sexual relationship with a 19-year-old autistic teenager and attempted to get her pregnant “for U.S. immigration purposes.”
Gariépy’s The Public Space recently streamed an episode titled “The Truth About German Racial Ideology” with Weronika Kuzniar, a cosplayer and proponent of Third Reich revisionism with multiple books for sale on Amazon who says she works to “De-Weaponize Third Reich History.” During her appearance on Gariépy’s YouTube channel, he described Adolf Hitler’s Mein Kampf as “pretty solid in terms of understanding basic issues of biology [and] basic issues of race.” Kuzniar responded by citing “a good book” that “denies that there is any anti-Semitism that can be detected in Hitler’s background.”
In a monetized stream he titled “A Discussion with Ryan Faulk about Race, IQ and Nationalism,” Gariépy and guest Ryan Faulk -- the founder of the white nationalist site The Alternative Hypothesis -- discussed the potential for violence that would be required to establish a white ethnostate in America. Faulk claimed that “from a historical sense,” the United States has always been a white country, and “the only real solution today is a full on partition of the United States” based on racial lines. Faulk conceded that “a violent civil war” might be a result of trying to achieve that goal. Gariépy endorsed the idea that violence is possible despite a “modern society that is very polite” because there is “within humans a capacity for violence that can express itself within a few days if people are in the right condition for violence.” They also discussed what Gariépy characterized as the “very mainstream idea” that “there is an observed phenotypical difference” in IQ levels between racial groups.
During a stream with “alt-right” leader Richard Spencer, Gariépy called white nationalism “great” and “a romantic vision and one that could be, even in pragmatic terms, a reality. That would be the only option in the future when the white race has lost so much power across western civilization.” Gariépy has also hosted Patrick Casey of the “alt-right” group Identity Evropa in a monetized stream where Casey said that “the best framework for … human civilization overall to be able to exist” is “a degree of separation between ethnic and racial groups.” In another monetized stream, the anti-feminist Lacey Lynn (who has also appeared on neo-Nazi YouTuber Mark Collett’s show, This Week on the Alt-Right) argued that the movement for women’s suffrage was an “anti-male, … communist, anti-family, anti-nation movement” and praised the “privilege that women had being under coverture” laws, which made women legally subordinate to their husbands.
Nick Fuentes, host of America First with Nicholas J Fuentes, is an “alt-right” online personality whose channel has 17,000 followers and streams approximately five days a week. He was previously fired from Right Side Broadcasting, an online pro-President Donald Trump outlet, after he called for the people who run CNN to be “arrested and deported or hanged.” He also was a participant in the Charlottesville, VA, Unite the Right rally last summer. In a YouTube stream titled “Embrace the State feat. Lucian Wintrich,” Fuentes described himself as an “authoritarian.” During the same stream, in a discussion about the film I Feel Pretty starring Amy Schumer, Wintrich called her a “fat, ugly slob” and Fuentes said she “should be a literal punching bag in some cases.”
During a recent stream titled “The Death of Mollie Tibbets,” Fuentes attacked Hispanic immigrants in the United States, saying, “The problem that we see is it’s the people -- it’s not the culture, it’s not their legal status, it’s not their paperwork; it’s who they are. It’s coursing through their blood, it’s their DNA. They’re different. Race is real. These people are different. They’re not European. It’s not arbitrary that they come from Mexico.” In a different stream, titled “White Identity Gaslighting,” Fuentes applauded the Trump administration for revoking passports from American citizens in Texas. He called this development a “big white pill,” meaning a reason for white supremacists to have hope, and called it one “of the more aggressive approaches to solving the demographic issues.”
In a stream titled “Who *owns* the Media? Hello,” a reference to a tweet from Elon Musk in which he asked the same question, Fuentes called whistleblower Chelsea Manning a “tranny freak” and said she is “mentally ill.”
Fuentes frequently hosts white supremacists on his show. This includes Matt Colligan, known online as “Millennial Matt,” who was a participant in the 2017 rally in Charlottesville and once waved a flag featuring a swastika during a Periscope stream with Lucian Wintrich. During a stream titled “THOT WARS,” Colligan denied the Holocaust, calling it “one of the greatest lies in history,” and said his goal was “to become a public Holocaust revisionist.” Other extremists seen on Fuentes’ America First include Identity Evropa’s Patrick Casey; white nationalist Douglass Mackey, A.K.A. “Ricky Vaughn”; conspiracy theorist and anti-Muslim carnival barker Laura Loomer; and fellow YouTuber Gariépy.
Mark Collett is a 37-year-old British far-right activist and author of the book The Fall of Western Man, which features chapter titles including “The Role of Feminism -- The Destruction of the Family Unit.” An open reactionary with extreme white supremacist views, he was once featured in a documentary called Young, Nazi and Proud. He was also acquitted in Britain on charges of inciting racial hatred after a television interview in which he called asylum seekers “cockroaches.”
Currently, Collett has a YouTube channel with 42,000 subscribers. Just last month, he featured David Duke in a livestream called This Week on the Alt-Right. Other recent videos include “The Jewish Question Answered in 4 Minutes,” “The Plot to Flood Europe with 200 Million Africans” (for which, as of this writing, YouTube has “disabled certain features” because the video was identified “as inappropriate or offensive to some audiences”), and “The Death of White America.”
“The Jewish Question Answered in 4 Minutes” includes graphics singling out and identifying journalists as Jews -- which surely violates YouTube’s Community Guidelines. The hate speech policy at YouTube prohibits content “that promotes violence against or has the primary purpose of inciting hatred against individuals or groups based on certain attributes, such as race or ethnic origin [and] religion.”
In addition, this explicitly anti-Semitic video posits that “Jews have attacked the glue that holds our communities together, with the aim of breaking up Western society” and that Jewish people “seek to strip … power from those of European descent.”
In another video, “The Holocaust: An Instrument of White Guilt,” Collett engages in a winking, coquettish flirtation with Holocaust denial, a classic abuse of the “just asking questions” format. He continually refers to the Holocaust as the “alleged extermination of 6 million Jews at the hands of the German people during World War II.” Ultimately, Collett bemoans the fact that “the Holocaust is the one historical event that cannot be questioned,” and ascribes this to “Zionist power.” At the video’s conclusion, he seems to suggest that the Holocaust was inspired by righteous forces: “The Holocaust is the most powerful tool in the promotion of a mindset that is foisted upon those of European descent in order to make them feel guilty for pursuing self-determination, to make them feel guilty for loving their own."
Another video with over 100,000 views blames Jewish people for the pornography industry.
In addition to explicitly anti-Semitic content, Collett also traffics in conspiracy theories about the cruel regime of Syrian President Bashar Assad (“Assad Didn’t Do It -- Faked Syrian Gas Attack”) and about LGBT individuals, whom he claims are seeking to “normalize paedophilia” through “debased degeneracy.” Although the latter video was flagged as “inappropriate,” it has garnered over 137,000 views.
Red Ice TV is an explicitly racist channel that boasts an impressive viewership: With 227,000 subscribers, its hosts claim to reach 1 million viewers a month. Most of their videos draw audiences in the tens of thousands.
The channel was founded by Henrik Palmgren and his wife, Lana Lokteff, far-right white supremacists whose content is consistently racist, anti-Semitic, and anti-immigrant.
The channel has hosted extremist Richard Spencer and featured Holocaust denier Kevin MacDonald discussing the “JQ” (Jewish Question).
Lokteff has received attention in the media as one of the few female faces of the “alt-right,” while her husband and cohost Palmgren took part in the infamous “Unite the Right” rally in 2017 in Charlottesville, VA, which resulted in the death of counterprotester Heather Heyer after a white nationalist drove a car into a crowd.
In a June video (which, as of this writing, has been flagged as “inappropriate” but is still accessible after a couple of clicks) titled “Why Interracial Relationships are Pushed on White Women,” with over 500,000 views, Lokteff stated, “I do not accept the promotion of interracial relationships, it is very targeted and promoted to white people… You should think your race is the most attractive.” Later in the video, she claimed that “a mulatto baby” was a “trendy” accessory for modern women -- “forget the purse.”
The channel continually stirs up fear about immigration -- calling immigration advocates “anti-white poison” -- and stokes the racial fears of a white, male audience.
Many commentators have noted the radicalizing effect viewing increasingly extreme content can have on viewers. YouTube’s ongoing decision to continue to allow channels that are in blatant violation of its terms of service while rewarding their extremist creators through monetary incentives is a dangerous abdication of responsibility on the part of the media giant.
This morning, the Senate Intelligence Committee questioned Facebook COO Sheryl Sandberg and Twitter CEO Jack Dorsey on Russian interference in the 2016 election. The House Energy and Commerce Committee is scheduled to question Dorsey about alleged anti-conservative bias on Twitter.
Of the Senate intelligence committee hearing, Media Matters President Angelo Carusone explained:
The tech industry’s failure to grapple with its roles in allowing -- and sometimes even enabling -- the fake news crisis and foreign interference in American elections is a national security crisis. The Senate intelligence committee is currently our best hope for getting some insight into the steps that tech companies have taken to address known problems. The committee is at least trying.
On balance, committee members have treated this issue with the gravity it warrants and have worked to give the public actionable information about election interference and manipulation of the information ecosystem.
It’s been two years though since the fake news crisis of 2016 -- and for the committee to keep its passing grade, it’s going to need to put more pressure on these platforms to not only address the problems we know about, but to start focusing on preventing the next fake news crisis that will be fueled by synthetic video and synthetic audio.
Of the House Energy and Commerce Committee hearing, Carusone added:
In contrast to their Senate colleagues, who are at least trying to stay focused on this national security crisis, the House Energy and Commerce Committee has turned its inquiry into an embarrassing partisan mess steeped in conspiracy theories and right-wing chicanery. House Republicans don’t seem at all concerned with understanding and preventing foreign interference and instead are more concerned with helping Trump’s 2020 campaign manager, Brad Parscale, work the refs so that they can cheat the system like they did in 2016.
These hearings should be focused on things that we know are real, like foreign intervention, bots, algorithmic manipulations and other cheating -- where a lot more needs to be done in order to neutralize those threats.
In 2016, right-wing efforts to game the refs led Facebook to make significant changes to its trending topics section that ended up greatly contributing to the amplification of fake news, as well as changes to its ad approval rules that helped the Trump campaign execute an aggressive voter suppression campaign. And baseless cries of bias no doubt contributed to Twitter’s inconsistent policy enforcement and inadequate response to its climate of harassment. So, Republicans on the House Energy and Commerce Committee working hand-in-hand with right-wing political activists to help work the refs is alarming and worthy of scorn.
Functioning democracy is actually at stake. Neither Twitter nor Congress should be wasting its time with this baseless and partisan bullshit.
For a brief time, searching for Tom Hanks or Steven Spielberg on the video site brought up baseless accusations
On the morning of July 30, if you were searching YouTube for Tom Hanks or Steven Spielberg -- wanting to learn a little about Hollywood royalty, or just to find that funny clip from Big you loved years ago -- you would have been in for an unpleasant surprise.
As NBC’s Ben Collins first pointed out on Twitter, the search results for Hanks and Spielberg were dominated by conspiracy theories alleging that both men -- along with other celebrities including Seth Green and Macaulay Culkin -- were pedophiles and part of a nefarious ring of Hollywood child predators that online conspiracy theorists had dubbed #Pedowood.
The videos that popped up upon searching for Spielberg and Hanks were low-quality, rambling, close-up shots, several made by Isaac Kappy, a minor actor who has spent the last week posting video-recorded rants on YouTube with titles like “Famous Actor Exposes Hollywood Pedophiles! Steven Spielberg, Tom Hanks And More! #Pizzagate.” Thanks to rapid dissemination on the message boards Reddit and 4chan, the videos garnered hundreds of thousands of views and shot up in the YouTube rankings, eclipsing interviews and movie clips featuring the stars.
The hashtag #Pizzagate included in the title of Kappy’s video is a reference to the Pizzagate conspiracy theory, which posits that prominent Democrats are running a child sex-slave ring out of a Washington, D.C., pizza restaurant. The conspiracy theory culminated in one adherent firing an automatic weapon inside the pizzeria. According to BuzzFeed, the newfound allegations of pedophilia against Hanks can be traced back to Twitter user Sarah Ruth Ashcraft, a prominent member of the QAnon conspiracy theory community, which grew out of Pizzagate and has mushroomed into baroque complexity. The ever-growing QAnon conspiracy theory, which is flexible enough to accommodate a wide range of events, asserts that a broad array of prominent figures with liberal leanings are part of an international child sex-slavery operation. The theory has hundreds of thousands of devotees on Reddit, YouTube, Facebook, and Twitter and countless dedicated blogs. (Roseanne Barr is a prominent believer in QAnon.) People are even showing up to Trump rallies dressed in "Q" apparel.
People lining up for the Trump rally in Tampa today. A lot of the chan anons might treat Q-Anon like a LARP, but by all appearances there are plenty of people who take it seriously irl. pic.twitter.com/uys7kmnAs1
— Travis View (@travis_view) July 31, 2018
Ashcraft, who frequently uses the hashtag #QAnon, has over 45,000 Twitter followers and uses her page to decry “Ritual Abuse, Mind Control, Child Porn, and Sex Trafficking,” focusing her ire on the alleged wrongdoings of celebrities like Hanks. (Since Ashcraft’s accusations against Hanks made headlines, and after BuzzFeed pointedly reached out to the social media company, her Twitter page has been restricted.)
After NBC’s Collins reached out to YouTube for comment, some of the conspiracy-theory videos dropped in search rankings for the celebrities. A spokesperson for YouTube told BuzzFeed, “We’re continuously working to better surface and promote news and authoritative sources to make the best possible information available to YouTube viewers.”
The hyperconnectivity of social media can make constructive messages spread fast -- and destructive falsehoods spread even faster. This latest incident is another powerful illustration of the ways in which social media can be gamed by conspiracy theorists. It’s an issue social networks have struggled to fully grasp; any suppression of conspiracy theorists’ pages, after all, lends credence to the notion that they are oppressed keepers of vital truths. Infowars’ Alex Jones was recently banned personally from Facebook for 30 days after the platform determined that several videos he shared violated community standards; Jones and his fan base reacted with predictable opprobrium and claims of censorship. But Facebook did not assert that Jones’ penchant for spreading baseless conspiracy theories was part of the rationale for the ban; instead, it focused on policies regarding hate speech and bullying. That, in turn, raised the question of why Infowars as a whole did not receive a ban.
Social media platforms that purport to be concerned with the spread of "fake news" must consider -- and contain -- conspiracy theories proactively, not just when journalists point them out. Left unchecked, those conspiracy theories have a direct connection to subsequent harassment and worse.