On March 17, YouTube announced that it would immediately lift former President Donald Trump’s suspension from the platform and allow him to upload new content. Though the platform claims that it “carefully evaluated the continued risk of real-world violence,” the decision fails to account for Trump’s propensity to push misinformation and the platform’s weak misinformation policies and haphazard enforcement.
YouTube is the latest platform to reinstate Trump after suspending him in the days after the January 6, 2021, insurrection. Twitter CEO Elon Musk reinstated the former president on November 19 after an unreliable and unscientific Twitter poll, and Meta followed suit, allowing him back on Facebook and Instagram on February 9 after mistakenly assessing that the risk to public safety had “sufficiently receded.”
While Trump posted on Facebook for the first time since his reinstatement late in the afternoon on March 17 — with a brief video seemingly making light of his long delay in returning to the platform — he is reportedly preparing a return to other platforms and doesn’t want to re-up his exclusivity agreement with his own social media company, Truth Social. He has, however, already taken advantage of his renewed full access to Meta’s advertising, running ads meant to undermine potential primary opponents and recruit campaign volunteers.
Trump has a well-documented history of using social media platforms, including YouTube, as a megaphone for misinformation and extreme rhetoric. Even after he was suspended from all the major social media platforms in January 2021 after months of pushing misinformation that incited a violent insurrection, Trump continued to push harmful election misinformation on Truth Social, at rallies, and during interviews. In fact, YouTube had to remove a video in March 2022 after he repeated election misinformation in an interview published on the platform.
Prior to his suspension, Trump used the platform to livestream his rallies, amplify right-wing media videos filled with election misinformation — including a video promoting the rally that led to the January 6 insurrection — and post campaign-style videos as YouTube Shorts. (At the time of publication, Trump had 2.65 million subscribers on YouTube.)
Based on Trump’s previous pattern of posting on social media platforms and his continued pushing of election misinformation, he will likely continue to spread false claims when he starts uploading new content on YouTube. And the platform is unprepared to deal with his misinformation, as it relies on weak policies and haphazard enforcement that right-wing actors have previously exploited.
Despite YouTube’s supposed commitment to preventing election misinformation, the platform's policies are limited, prohibiting content that advances “false claims that widespread fraud, errors, or glitches occurred in … past U.S. presidential elections” but not in midterms or future elections. Additionally, the platform didn’t start removing content about widespread voter fraud in 2020 until after December 8, when states certified election results, ultimately allowing users to post such misinformation for a month after outlets called the election for now-President Joe Biden.
These loopholes will allow Trump to spread election misinformation across the platform, including as he campaigns for the 2024 election.