TikTok is prompting users to follow far-right extremist accounts
Three Percenter, QAnon, Patriot Party, and Oath Keeper accounts are being recommended by TikTok’s algorithm
Written by Olivia Little
TikTok’s account recommendation algorithm appears to be prompting users to follow far-right extremist accounts, including those affiliated with the Three Percenter, QAnon, Patriot Party, and Oath Keeper movements. Some of these movements were deeply involved in the planning and execution of the January 6 insurrection, which resulted in five deaths and nearly 140 injuries to police officers defending the Capitol.
Content from the Three Percenter, QAnon, Patriot Party, and Oath Keeper movements is prohibited by TikTok, yet the company’s algorithm appears to be both circulating that content and helping these movements expand their followings.
A TikTok spokesperson told Vox that the platform’s account recommendations are “based on user behavior”; the company’s other explanation of the algorithm (found via the information button next to “suggested accounts”) states that account recommendations are tailored to an individual user’s “interests” or “connections.”
To analyze how this process appears to operate, Media Matters reviewed and tracked which accounts TikTok recommended under a “suggested accounts” prompt after following a specific account from TikTok’s “For You” page.
Our analysis found that by following TikTok’s “suggested accounts” prompts, users can easily be exposed to, and increasingly served, far-right extremist accounts and content. This is uniquely harmful because it has the potential to further radicalize users interested in these far-right extremist movements without requiring them to seek the content out; TikTok hand-delivers these movements to its users, many of whom are 14 or younger.
Using this process, Media Matters identified six common scenarios demonstrating how following specific accounts from TikTok’s “For You” page shapes the type of extremist content recommended by the platform’s “suggested accounts” algorithm.
For example, after following a QAnon account from the “For You” page, TikTok recommended another QAnon account. After following that second QAnon account, TikTok then recommended a Three Percenter account.
In another example, after following a Three Percenter account from the “For You” page, TikTok suggested a different Three Percenter account. After following that second account, TikTok recommended yet another Three Percenter account. After following several Three Percenter accounts, the web of radicalization expanded: Patriot Party accounts, QAnon accounts, and others were all recommended at various points.
This escalating pattern of recommendations is alarming and has the potential to push TikTok users down a far-right rabbit hole, further populating their feeds with content from the sort of extremist movements behind the Capitol attack.
The following graphic illustrates the accounts that appeared on our “For You” page and were followed, with each of TikTok’s subsequent suggested accounts indicated by an arrow. (Accounts shown with two graphics represent a split extremist ideology, such as a Patriot Party account that also pushed QAnon conspiracy theories.)