TT algorithm study

Andrea Austria / Media Matters

Research/Study

STUDY: Interacting with these popular right-leaning comedy podcasters can turn your TikTok feed into a swamp of Andrew Tate-style misogyny and conspiracy theories

After we interacted with content of five popular right-leaning comedy podcasters, TikTok’s “For You” page recommendation algorithm catapulted our account down a right-wing rabbit hole of conspiracy theories and toxic masculinity content over the span of just a few hours.

  • Background

  • As part of his tour of nontraditional media prior to the 2024 election, President Donald Trump appeared for interviews with a handful of very popular podcasts that claim to be nonpolitical but discuss news and politics with a right-leaning ideological bent and remain receptive to the MAGA agenda. This media tour included appearances on the Nelk Boys’ Full Send, The Joe Rogan Experience, Impaulsive with Logan Paul, This Past Weekend w/ Theo Von, and Flagrant — all of which self-identify as comedy shows.

    Media Matters followed the five podcasters’ TikTok accounts, then watched and “liked” each account’s 10 most recent videos. (Note: Rogan doesn’t have an official account, so we engaged with a fan account that exclusively posts clips from The Joe Rogan Experience.)

    We then documented, coded, and analyzed the first 425 videos fed to our account’s “For You” page. 

  • Key findings

    • TikTok’s recommendation algorithm quickly and progressively populated our FYP with right-wing conspiracy theories and toxic masculinity content.
       
    • Of the 425 videos coded, 121 (28%) contained conspiracy theories and 70 (16%) contained toxic masculinity content.
       
    • Our FYP also contained content promoting medical misinformation, doomsday prepping, racism, right-wing media, and transphobia.
  • [Chart: TT FYP narratives following engagement with right-leaning comedy podcasters. Citation: Media Matters / John Whitehouse]

  • Context

  • Our study began with the hypothesis that interacting with right-leaning comedy podcasters on TikTok would lead us to more extreme content. This theory was predicated on two prior studies that documented how TikTok’s recommendation algorithm can radicalize a feed after a user engages with right-wing-adjacent content. 

    In a recent study, Media Matters analyzed popular online shows that were active in 2024 and found 320 with a right-leaning or left-leaning ideological bent. Over a third of these shows self-identified as nonpolitical, even though 72% of them were determined to be right-leaning. Instead, these shows described themselves as comedy, entertainment, or sports, or sorted themselves into other supposedly nonpolitical categories.

  • Discussion of research findings

  • Our research findings suggest that interacting with content from right-leaning comedy podcasters can cause a user's feed to populate with conspiracy theories and misogynistic content. TikTok’s recommendation algorithm radicalized our research account within the span of a few hours.

    Of the 425 videos assessed, 121 (28%) contained conspiracy theories. The conspiracy theories ranged from seemingly harmless ideas about aliens to claims about secret elites controlling the world.

    Some of the notable conspiracy theories fed to our FYP included:

    • A claim that recent plane crashes were intentional in order to make “sure people won’t leave.”
    • A claim that Michael Jackson “uncovered secrets about a dark network involving high-ranking politicians and minors,” implying he was killed because of this knowledge.
    • A claim that the most powerful man in America is unknown and “behind the scenes” and that “the center of gravity isn’t in the government at all.”
    • A claim that “time is an illusion — it doesn’t exist.”
  • 70 videos (16%) contained content promoting some variation of toxic masculinity.

    Many of the toxic masculinity videos were framed as motivational, containing clips of expensive watches, boats, cars, and planes with phrases such as “get rich and disappear.” The videos often advocated for a lone wolf mentality, encouraging users to isolate and obtain wealth. We were also fed multiple “inspirational” videos displaying imagery of luxurious lifestyles with what seems to be Andrew Tate’s voice talking in the background.

  • In one video, Tate’s voice seems to coach men about their girlfriends, saying, “Tell her that you’re going to quit your job because you want to chase your dreams and run your own business. Tell her she has to continue to work. Tell her that she’s paying your rent and you’re borrowing her car and you’re not going to have a penny for the next two years. And if she doesn’t agree to that shit, then dump her. Simple. That’s how you know you have a quality woman.” 

  • Other significant findings within the 425 videos included 15 videos containing right-wing media figures, 7 videos containing medical misinformation, 2 videos containing doomsday prepper content, 2 videos containing racist content, and 1 video containing transphobic content. 
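As a quick arithmetic check (ours, not part of the study), the reported percentages follow directly from the category counts above; note that the "general" remainder is derived by subtraction and is not stated in the text:

```python
# Reported category counts out of the 425 coded videos (from the study text).
counts = {
    "conspiracy theory": 121,
    "toxic masculinity": 70,
    "right-wing media": 15,
    "medical misinformation": 7,
    "doomsday prepper": 2,
    "racist/white supremacist": 2,
    "transphobia": 1,
}
TOTAL = 425

# Percentage share of the coded sample, rounded to whole percents.
shares = {cat: round(100 * n / TOTAL) for cat, n in counts.items()}
assert shares["conspiracy theory"] == 28  # matches the reported 28%
assert shares["toxic masculinity"] == 16  # matches the reported 16%

# Videos not flagged in any of the above categories
# (the study's "general" code); derived, not stated in the text.
general = TOTAL - sum(counts.values())  # 207
```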

  • Methodology

  • Media Matters created a new TikTok account on a device used solely for this research and engaged with content from five MAGA-approved podcasters.

    We identified five popular online shows that hosted President Donald Trump ahead of the 2024 election and self-identified as being supposedly nonpolitical, per our March study. These shows were Full Send, The Joe Rogan Experience, Impaulsive with Logan Paul, This Past Weekend w/ Theo Von, and Flagrant, which all self-identified as comedy shows. 

    We then engaged with a TikTok account of the show or the show’s host (or, in the case of Joe Rogan, who does not have an official account, with a popular fan account that exclusively posts clips from his show) — watching and “liking” each account’s 10 most recent videos. 

    We then navigated to our “For You” page and began scrolling, eventually requesting a record of the account’s watch history. From that data, we evaluated the first 447 videos served to the account’s “For You” page after the 50 videos we initially watched (10 from each of the accounts) in order to train the algorithm. 

    Of the 447, 22 videos became unavailable some time after we viewed them on the FYP, resulting in 425 videos that were then independently assessed by three researchers to determine which of nine categories best fit each video: conspiracy theory, toxic masculinity, medical misinformation, racist/white supremacist, transphobia, prepper, right-wing media, video unavailable, and general. Each video was reviewed individually and given a final code if two of the three researchers independently awarded it the same code after a blind review. Videos that did not achieve this level of consensus were reviewed again by two coders, who then reconciled discrepancies.
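The two-of-three blind-coding rule described above amounts to a simple majority vote, which can be sketched as follows (an illustrative sketch only; the function name and structure are ours, not the study's):

```python
from collections import Counter

def consensus_code(codes):
    """Return the agreed category if at least two of the three blind
    reviewers assigned it; otherwise None, flagging the video for a
    second-round reconciliation by two coders."""
    label, count = Counter(codes).most_common(1)[0]
    return label if count >= 2 else None

# Example: three researchers' independent blind codes for one video.
assert consensus_code(["conspiracy theory", "conspiracy theory", "general"]) == "conspiracy theory"
# No two reviewers agree, so the video goes to reconciliation.
assert consensus_code(["toxic masculinity", "general", "prepper"]) is None
```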

    We defined “conspiracy theory” posts as those that claim the existence of a secret manipulation of events, people, or situations by powerful forces or that oppose mainstream agreement among experts qualified to evaluate the claim’s accuracy.

    We defined “toxic masculinity” as misogynistic content meant to attack, degrade, or mock a woman or support anti-feminism and the men’s rights movement; videos positively promoting Andrew Tate or imagery of Tate/audio of his voice; posts promoting traditional notions of masculinity; posts utilizing language such as “alpha” or “beta”; and content promoting dominance over others and a “lone wolf” mentality.

    We defined “medical misinformation” posts as those that spread medical information or advice counter to mainstream agreement among medical professionals. These posts often spread false information about vaccines, the pharmaceutical industry, or modern medicine.

    We defined “racist/white supremacist” posts as those that attack, degrade, or mock a particular race or ethnic group or support white supremacy by using white supremacist talking points such as “white pride” or “white lives matter.”

    We defined “transphobia” as posts that attack or mock trans people.

    We defined “doomsday prepper” as posts suggesting an impending civil war or government collapse and encouraging users to gather materials and make plans to survive a major disaster.

    We defined “right-wing media” posts as those that contain video or audio from prominent far-right media figures such as Ben Shapiro, Alex Jones, Paul Joseph Watson, Steven Crowder, Paul Nicholas Miller, and Nick Fuentes.

    We defined “general” posts as any that did not fall into any of the aforementioned categories.