TikTok’s massive COVID-19 and vaccine misinformation failure
Written by Olivia Little & Chloe Simon
TikTok’s apparent negligence in moderating COVID-19 and vaccine misinformation has turned the platform into a shoddily regulated marketplace of lies, unsubstantiated medical advice, and far-right conspiracy theories.
Although a transparency report TikTok released in February hyped the company’s efforts to combat COVID-19 misinformation, dangerous content about COVID-19 and the vaccines still plagues the app. Some media outlets wrote up the transparency report without critically evaluating the platform’s COVID-19 misinformation ecosystem, handing TikTok free, undeserved PR. The prominence of COVID-19 misinformation on the platform tells a very different story.
The report noted that TikTok removed 89 million videos for violating its community standards in the second half of 2020, but only 51,000 of those (less than 0.1% of all removals) were taken down for promoting COVID-19 misinformation. Although 51,000 may sound like a substantial number, that volume of removals is not nearly enough to counteract the rapid spread of COVID-19 disinformation on the platform.
TikTok has struggled to control the spread of COVID-19 misinformation since the beginning of the pandemic. And the rollout of the COVID-19 vaccines has spurred anti-vaccination content and far-right conspiracy theories on the platform at a time when clear, factual information about vaccines is critical to gaining public trust.
Media Matters has identified what appear to be major holes in TikTok’s moderation of COVID-19 and vaccine misinformation. Researchers used no special software to find these examples of harmful content; all findings came from simple targeted searches on the platform and remain easily accessible.
How COVID-19 misinformation spreads on TikTok
COVID-19 misinformation circulates on TikTok in a number of ways. Sometimes users repost videos from other platforms that show far-right conspiracy theorists pushing bogus COVID-19 theories. In a recent episode of Infowars, conspiracy theorist Alex Jones even bragged about his COVID-19 theories going viral on TikTok.
In other instances, users shorten or intentionally misspell banned terms in order to circumvent content moderation. For example, “plandemic” (referring to a viral COVID conspiracy theory video) is a banned search term, but “plandemi” is searchable and has over 4.9 million views. “Cronavirus” (1.8 billion views) and “cronaviurs” (9.6 million views) are also searchable.
Sometimes entire accounts seem to be set up to spread COVID-19 and vaccine misinformation. “Virahnews” is a COVID-19 conspiracy theory account with 18,000 followers whose videos regularly receive tens of thousands, and sometimes even hundreds of thousands, of views. Given the account’s modest follower count, the regular virality of its COVID-19 and vaccine misinformation suggests that TikTok’s algorithm is picking up these videos and recommending them to users on their “For You” page.
Although TikTok promised to add COVID-19 information banners to “any TikTok videos that mention vaccines” or COVID-19, many of the videos we identified do not have the promised banners.
TikTok’s algorithm presents a unique challenge: While identifying major COVID-19 misinformation accounts is useful, small accounts can go viral just as easily. TikTok confirmed this in its own explanation of the recommendation system, writing, “Neither follower count nor whether the account has had previous high-performing videos are direct factors in the recommendation system.”
Vaccine misinformation
The COVID-19 pandemic has been catastrophic for Americans, killing upwards of half a million people in the United States. The COVID-19 vaccines, which have been called the key to achieving herd immunity, have become targets of TikTok users spreading dangerous misinformation.
“Saynotovaccines,” “notavaccine,” and “vaccinepoison” are all accessible search terms, yielding misinformation videos with thousands of views and comments. The search term “vaccinefreedom” returns vaccine misinformation, including one viral video with over 236,400 views in which the user employs the hashtags “notgonna” and “brave” to declare their refusal to get vaccinated.
The misinformation within the anti-vaccination TikTok community is wide-ranging. Some users have claimed that the Pfizer and Moderna vaccines cause severe adverse reactions, including the misleading claim that the vaccine caused deaths in Norway. TikToks like these use similar hashtags such as “nothanks” and “covidvaccinesideeffects.” A TikTok user whose video claiming they developed Bell’s palsy after being vaccinated received 1.4 million views has been dueted by a number of anti-vaccination accounts seeking to portray vaccines as dangerous. Although a few cases of Bell’s palsy were reported during the vaccine trials, the FDA states that they “do not represent a frequency above that expected in the general population.”
Anti-vaccination TikToks also claim the vaccine does not work or is not even real. Some users have taken clips of televised vaccinations and framed them to make it seem as though the needle has nothing inside it. Health experts have debunked such claims, showing that the needle is simply retractable. Other anti-vaccination TikToks use suspect health sources to push disinformation about the COVID-19 vaccine. For example, Johan Denis, a holistic medical practitioner whose videos are used in anti-vaccination TikToks, claims that the vaccine is “ineffective” and the pandemic is “fake.”
General COVID-19 misinformation
TikTok claims to work with experts to develop its misinformation policies and “stay ahead of evolving content.” Yet, basic COVID-19 misinformation terms remain highly active on TikTok.
Both “kungflu” and “kungfluvirus” are searchable terms, with a combined view count of over 15 million. Similarly, “batsoup” and “batsoupvirus” are unblocked search terms with nearly 17 million combined views.
Former President Donald Trump popularized the racist term “kung flu” in June 2020. According to Psychology Today, this racist characterization of the virus has the potential to fuel hate crimes and discrimination against Asian Americans. The Center for the Study of Hate & Extremism released a report showing that anti-Asian hate crimes surged 149% in America’s largest cities from 2019 to 2020. “Bat soup” refers to an early COVID-19 origin myth that quickly turned into a racist meme blaming the pandemic on Chinese people eating bats. Bat soup did not cause the pandemic, but the idea became a pervasive racist myth that still circulates today.
Other unblocked general COVID-19 misinformation search terms include:
- “hydroxychloroquine,” 14.1 million views (hydroxychloroquine does not cure COVID-19 or help infected patients)
- “HCQ,” 2.4 million views
- “plandemichoax,” 606,300 views
- “pandemichoax,” 5,337 views
- “virushoax,” 63,300 views
- “covidlie,” 249,900 views
This is not a comprehensive list, but the fact that such obvious terms are still active suggests that TikTok’s COVID-19 misinformation moderation has substantial gaps. Many of these hashtags do not even have a COVID-19 information banner directing users to credible resources.
COVID-19 conspiracy theories
Dangerous conspiracy theories about COVID-19 and the vaccines are constantly emerging, and it is clear TikTok does not have a handle on them.
These unsubstantiated claims are often traced back to far-right conspiracy theorists, who confidently push pseudoscience in an attempt to position themselves as authorities on the subject. While some of the conspiracy theories may seem outlandish, they have real consequences: They foster vaccine skepticism and encourage dangerous COVID-19 behaviors, such as refusing to wear masks.
FEMA camps
One far-right conspiracy theory that has reemerged with the pandemic claims that the United States government will detain citizens in Federal Emergency Management Agency camps in a time of crisis. A simple “FEMA camps” search on TikTok yields misinformation as the top results, and the “FEMACamps” hashtag, which has over 1.2 million views, is saturated with conspiracy theories warning users about supposed FEMA camps and nonexistent vaccination mandates.
Mandatory vaccines
The far-right conspiracy theory claiming that the United States government will make COVID-19 vaccines mandatory remains active on TikTok.
Unblocked searches include:
- “forcedvaccine,” 400,400 views
- “forcedvaccinations,” 52,200 views
- “mandatoryvaccine,” 894,500 views
- “mandatoryvaccination,” 14,600 views
Depopulation
Far-right conspiracy theorists also continue to peddle the baseless claim that “globalist elites” staged the pandemic in order to further their depopulation agenda. While the theories differ, Microsoft founder and philanthropist Bill Gates is often accused of executing this (nonexistent) depopulation plan.
Searching “Bill Gates depopulation” also returns COVID-19 misinformation with no information banner, even though blaming Gates for the pandemic is one of the most popular conspiracy theories. Additionally, a “planned virus” search returns COVID-19 misinformation as the top results.
Agenda 21
Similarly, “Agenda 21” refers to the nonbinding 1992 United Nations resolution focused on “sustainability on an increasingly crowded planet.” Far-right conspiracy theorists have fixated on it since its adoption, and it is now being reinterpreted as a depopulation agenda tied to COVID-19. The hashtag “agenda21” has over 18.5 million views, and “agenda21depopulationplan” has over 50,800 views. Searching “agenda21 virus” returns COVID-19 misinformation as the top results.
Mark of the beast and the “God gene”
In February, The Washington Post reported on extremists mixing faith with vaccine misinformation, referencing the prevalence of the “mark of the beast” conspiracy theory circulating on TikTok and elsewhere. According to the Post, the mark of the beast is “a reference to an apocalyptic passage from the Book of Revelation that suggests that the Antichrist will test Christians by asking them to put a mark on their bodies.”
In short, the conspiracy theory frames COVID-19 vaccines as a test from the “Antichrist,” which followers will fail should they accept the vaccine.
Despite wide reporting on this conspiracy theory, TikTok appears to have done little to nothing to stop the spread of this content. A “mark of the beast” search does not even return clarifying COVID-19 information, but instead is filled with viral vaccine misinformation; the hashtag “markofthebeast” has over 47 million views.
A similar conspiracy theory claims that receiving the COVID-19 vaccine means removing a person’s “God gene,” or potential genetic disposition to spirituality. The search “God gene” returns COVID-19 misinformation claiming that the “vaccine is an attack on Christians to remove the God genes.”
Other conspiracy theories
The Public Readiness and Emergency Preparedness (PREP) Act is invoked during public health emergencies and has been used to counter smallpox, the Zika virus, Ebola, and now COVID-19. Conspiracy theorists are sensationalizing the PREP Act, claiming it will be used as a way for the United States to kill its citizens. In one video about the PREP Act, a user claims that the “US government soon will kill millions of people with stroke I am so glad I am not American do not trust Pfizer or you will be dead last warning.”
Another popular (and unfounded) conspiracy theory claims that the COVID-19 virus was developed as a “bioweapon” in a lab. The first result when searching “bioweapon” on TikTok is an Alex Jones Infowars clip with over 122,000 views promoting this misinformation. Searching “Chinese bioweapon” returns COVID-19 misinformation as the top result.
A baseless conspiracy theory claiming that the COVID-19 vaccine will “alter” a person’s DNA is also circulating on TikTok. The search “alter DNA” does not display a COVID-19 information banner directing users to accurate information, and the top two results are misinformation videos claiming that the COVID-19 vaccine will alter a person’s DNA. The top video result alone has over 20,000 views.
There’s still a lot of work to be done
Given the sheer volume of COVID-19 misinformation running rampant on the platform, it seems clear that TikTok is not effectively moderating COVID-19 and vaccine misinformation. Whether these enforcement gaps stem from a lack of awareness (unlikely, given the platform’s expert partnerships) or from inconsistent enforcement of its community standards, TikTok’s poor moderation has real, material consequences.