How YouTube facilitates right-wing radicalization

From “gurus” to extremist “influencers,” the video site is a potent tool for ideologues

Illustration by Sarah Wasko/Media Matters

For the casual YouTube viewer -- someone who logs on once in a while to watch cute kitten videos or recipe demonstrations -- it can be difficult to imagine that the video site is also a teeming cesspit of hate speech and a prime means of its transmission.

But a new study from think tank Data & Society and the earlier work of ex-YouTube engineer Guillaume Chaslot reveal the technical and social mechanisms underlying an inescapable truth: Thanks to an algorithm that prioritizes engagement -- as measured by the interactions users have with content on the platform -- and “influencer” marketing, YouTube has become a source of right-wing radicalization for young viewers.

An algorithm that incentivizes extreme content

YouTube’s recommendation algorithm dictates which videos rise to the top in response to search queries, and, after a video finishes playing, it populates the video player window with thumbnails recommending further content. According to a Wall Street Journal analysis, YouTube’s algorithm “recommends more than 200 million different videos in 80 languages each day.” These recommendations take into account what the viewer has already watched, but it’s all in the service of engagement, or, as the Journal’s Jack Nicas put it, “stickiness” -- what keeps the viewer on the site, watching. The longer viewers watch, the more ads they see.
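To make the incentive concrete, here is a deliberately simplified sketch -- not YouTube’s actual code, and with invented titles and numbers -- of what a ranker that optimizes watch time alone looks like: it surfaces whatever is predicted to hold attention longest, with no signal for accuracy or substance.

```python
# Toy illustration only -- not YouTube's system. The candidate pool and the
# predicted watch times are invented; the point is that a ranker optimizing
# engagement alone surfaces whatever holds attention longest.
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    predicted_watch_minutes: float  # the engagement signal the ranker optimizes

def rank_by_engagement(candidates: list[Video], top_n: int = 3) -> list[Video]:
    """Return the candidates expected to keep the viewer watching longest."""
    return sorted(candidates, key=lambda v: v.predicted_watch_minutes, reverse=True)[:top_n]

candidates = [
    Video("Measured news recap", 4.0),
    Video("Inflammatory conspiracy 'expose'", 11.5),  # extreme content tends to hold attention
    Video("Cute kitten compilation", 2.5),
]

for video in rank_by_engagement(candidates):
    print(video.title, video.predicted_watch_minutes)
```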

But this has unintended consequences.

“They assume if you maximize the watch time, the results are neutral,” Guillaume Chaslot, a former Google engineer and creator of the YouTube algorithm analysis tool Algo Transparency, told Media Matters. “But it’s not neutral ... because it’s better for extremists. Extremists are better for watch time, because more extreme content is more engaging.”

In a way, it’s common sense -- videos that make inflammatory claims or show explosive images tend to grab viewers’ attention. And attention-grabbing videos -- those that cause viewers to watch more and longer -- rise up in the recommendation algorithm, leading more new viewers to see them in their list of recommended videos.

As the Journal’s analysis showed, viewers who began by viewing content from mainstream news sources were frequently directed to conspiracy theory-oriented content that expressed politically extreme views. A search for “9/11” quickly led Journal reporters to conspiracy theories alleging the U.S. government carried out the attacks. When I searched the word “vaccine” on YouTube using incognito mode on Google Chrome, three of the top five results were anti-vaccine conspiracy videos, including a video titled “The Irrefutable Argument Against Vaccine Safety,” a series titled “The Truth About Vaccines” with more than 1 million views, and a lecture pushing the debunked pseudo-scientific claim that vaccines are linked to autism.

Because YouTube’s algorithm is heavily guided by what has already been watched, “once you see extremist content, the algorithm will recommend it to you again,” Chaslot said.
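Chaslot’s point can be illustrated with another toy sketch -- again, not YouTube’s system, and with invented topic labels and weights -- showing how a history-driven recommender turns a few clicks on extreme content into a self-reinforcing loop:

```python
# Toy illustration only -- not YouTube's code. The catalog, topic labels, and
# weights are invented. Each watch shifts the viewer's profile toward that
# video's topic, so similar videos score higher the next time recommendations
# are computed.
from collections import defaultdict

CATALOG = {
    "Mainstream 9/11 documentary": "news",
    "9/11 'inside job' conspiracy": "conspiracy",
    "Vaccine 'truth' lecture": "conspiracy",
    "Cooking basics": "lifestyle",
}

profile = defaultdict(float)  # viewer's interest weight per topic

def watch(title: str) -> None:
    """Watching a video nudges the profile toward that video's topic."""
    profile[CATALOG[title]] += 1.0

def recommend() -> str:
    """Recommend the video whose topic best matches the viewer's profile."""
    return max(CATALOG, key=lambda title: profile[CATALOG[title]])

watch("Mainstream 9/11 documentary")
watch("9/11 'inside job' conspiracy")   # a couple of clicks on extreme content...
watch("Vaccine 'truth' lecture")        # ...and that topic now dominates the profile
print(recommend())                      # -> another "conspiracy"-topic video
```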

The result is a tailor-made tool for radicalization. After all, once users have started exploring the “truth” about vaccines -- or 9/11, or Jews -- the site will continue feeding them similar content. The videos that auto-played after “The Truth About Vaccines” were, in order: “My Vaxxed child versus my unvaccinated child”; “Worst Nightmare for Mother of 6 Unvaxxed Children” (description: “The mother of 6 unvaccinated children visits the emergency room with her eldest daughter. Her worst nightmare becomes reality when her child is vaccinated without her consent”); and “Fully Recovered From Autism,” each with more than 160,000 views.

“By emphasizing solely watch time, the indirect consequence that YouTube doesn’t want to acknowledge is that it’s promoting extremism,” Chaslot said.

Chaslot emphasized that YouTube’s own hate speech policy in its Community Guidelines was unlikely to meaningfully curb the flourishing of extremist content. The primary issue: The algorithm, which controls recommendations, operates entirely separately from the company’s content-moderation operation. The result is a fundamentally self-contradictory model; engagement alone determines whether a video or channel rises, independent of any concern about substance.

There’s also what Chaslot called “gurus” -- users whose videos keep viewers engaged for hours at a time. Even if their audiences start out relatively small, their videos will rise in the recommendation algorithm as a result. The examples he provided were PragerU, a right-wing propaganda channel whose brief explainer videos have garnered some 1 billion views, and the channel of Canadian pop anti-feminist Jordan Peterson.

But the guru effect has the power to amplify far more troubling content, and, according to new research, far-right extremists have adapted to a world of recommendation algorithms, influencer marketing, and branding with ease and efficiency.

The sociopath network

YouTube isn’t just a sea of mindless entertainment; it’s also a rather ruthless market of individuals selling their skills, ideas, and, above all, themselves as a brand. YouTube’s Partner Program provides financial incentives -- a share of advertising revenue -- to “creators” who have racked up 4,000 hours of watch time and at least 1,000 subscribers. For those who become genuine micro-celebrities on the platform, the viral-marketing possibilities of being a social-media “influencer” allow them to advertise goods and products -- or ideologies.
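For a sense of how mechanical that bar is, here is a hypothetical sketch of the eligibility check -- the function name and inputs are invented, and YouTube’s real review also involves policy screening not captured here:

```python
# Hypothetical sketch of the Partner Program thresholds described above
# (4,000 watch hours and 1,000 subscribers). YouTube's actual review also
# includes policy checks that this toy function does not model.
def monetization_eligible(watch_hours: float, subscribers: int) -> bool:
    """Meets the public thresholds for sharing in advertising revenue."""
    return watch_hours >= 4_000 and subscribers >= 1_000

print(monetization_eligible(watch_hours=5_200, subscribers=1_450))  # True
```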

Becca Lewis’ groundbreaking new study from Data & Society catalogues the ways that ideological extremists have cannily adapted the same techniques that allow makeup vloggers and self-help commentators to flourish on the video site. The study, titled “Alternative Influence: Broadcasting the Reactionary Right on YouTube,” is an unprecedented deep dive into 81 channels that spread right-wing ideas on the site. Crucially, it also maps the intricate interconnections between channels, breaking down how high-profile YouTube figures use their clout to cross-promote other ideologues in the network. (Media Matters’ own study of YouTube extremists found that extremist content -- including openly anti-Semitic, white supremacist, and anti-LGBTQ content -- was thriving on the platform.)

Lewis’ study explores and explains how these extremists rack up hundreds of thousands or even millions of views, with the aid of a strong network of interconnected users and the know-how to stand out within a crowded field of competing would-be influencers.

The study provides a concrete look at the blurring of lines between popular right-wing YouTube content creators who are regularly hosted on conservative media outlets like Fox News -- Dave Rubin, Ben Shapiro, and Candace Owens among them -- and openly white supremacist content creators with smaller platforms. In many cases, Lewis found that these channels hosted the same guests who had appeared on other channels in the network, leading to the creation of “radicalization pathways.” Rubin, whose channel has 750,000 subscribers, was cited as an example for hosting the Canadian racist commentator Stefan Molyneux. “Molyneux openly promotes scientific racism, advocates for the men’s rights movement, critiques initiatives devoted to gender equity, and promotes white supremacist conspiracy theories focused on ‘White Genocide,’” Lewis writes. During Molyneux’s appearance on Rubin’s channel, Rubin failed to meaningfully challenge his guest’s ideas -- lending credibility to Molyneux’s more extreme worldview.

Rubin vehemently denied any association with white supremacy on Twitter, but he did not address the specifics of Lewis’ findings.


Despite Rubin’s assertion, Lewis’ study does not mention the word “evil.” What the study does make clear, however, is how web-savvy networks of association and influence have become crucial to the spread of extremist ideologies on the internet. Racist, sexist, and anti-LGBTQ content is not confined to obscure internet fever swamps like 4chan and Gab; it is also thriving, publicly and lucratively, on the web’s most popular video platform.

Conservative provocateur Ben Shapiro, named as an influencer in the network, also sought to discredit the study.

But Shapiro was separated from Richard Spencer by one degree, not six: He was interviewed by right-wing YouTuber Roaming Millennial, who had invited Spencer to share his views on her channel two months earlier.

“There is an undercurrent to this report that is worth making explicit: in many ways, YouTube is built to incentivize the behavior of these political influencers,” Lewis writes. “The platform, and its parent company, have allowed racist, misogynist, and harassing content to remain online – and in many cases, to generate advertising revenue – as long as it does not explicitly include slurs.”

Just last week, the extremist hate channel Red Ice TV uploaded a screed titled “Forced Diversity Is Not Our Strength,” promoting segregated societies. Hosted by gregarious racist Lana Lokteff, who has become a micro-celebrity in the world of white supremacists, the video asserts that “minorities and trans people” have had a negative impact on “white creativity.”

Red Ice TV has more than 200,000 subscribers. At press time, the “Forced Diversity” video had more than 28,000 views. And when Lokteff’s anti-diversity rant ended, YouTube’s auto-play suggested more Red Ice TV content -- this time a video fearmongering about immigrants -- continuing the automated cycle of hate.