Social media companies don't care about safety
Written by Media Matters Staff
On January 16, Sens. Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN) introduced the Kids Online Safety Act, a bipartisan bill aimed at protecting minors on social media. The bill is the first step in addressing the years of harm social media companies have caused their young users in their relentless pursuit of profit. These companies have repeatedly shown that they are unwilling to mitigate the serious issues on their platforms, even when evidence shows these issues are specifically impacting young people. Media Matters has extensively documented the ways social media platforms have put users — young and old — at risk.
Failing to protect young people
Despite consistently marketing their products to a young user base and making promises that they will provide minors with additional protections, social media companies have failed to provide a safe environment for young people.
- 2/8/22: TikTok is enabling predatory ADHD advertisers to target young users.
- 12/20/21: TikTok and negligent reporting created mass panic about an unfounded national school shooting day hoax.
- 11/3/21: On YouTube, Charlie Kirk grotesquely weaponizes a minor's sexual assault to attack trans people.
- 4/30/21: TikTok influencers are pushing dangerous far-right conspiracy theories to their young audience.
- 2/10/21: TikTok is teaching teens how to build fully automatic rifles and make “hollow point” ammunition.
Allowing promotion of dangerous weight loss content
Social media platforms have become a hotbed of medical misinformation, including content promoting dangerous and stigmatizing weight-loss regimens and medicines.
- 10/20/21: Instagram’s suggestion algorithm is promoting accounts that share misinformation.
- 9/22/21: Instagram is recommending weight loss gimmicks through the “Explore” page.
- 7/29/21: TikTok creators are promoting dangerous eating disorder tactics to young users -- and the company is letting them.
- 6/22/21: TikTok scammers are deceptively editing real users' weight loss videos to sell potentially dangerous products.
- 2/25/21: Pro-eating disorder content remains in heavy circulation on TikTok.
Teens may be particularly vulnerable to harmful social media content, and the platforms’ ability to target content to specific ages presents a unique concern. But social media companies have exposed all users, young and old alike, to various forms of misinformation and hate speech, have profited off such content, and have allowed dangerous organizing to occur online.
Monetizing misinformation and hate speech
Social media companies are not only complicit in the spread of misinformation and hate speech but are also actively profiting from it while enabling bad actors to do the same.
- 2/10/22: On Facebook, ads for Tucker Carlson’s Hungary vs. Soros “documentary” have been viewed millions of times.
- 2/8/22: As YouTube and Google ban Dan Bongino for misinformation, Facebook profits from helping him promote the same false and sensational content.
- 1/24/22: Instagram’s link sticker feature is lining the pockets of some of the platform’s most prolific misinformers.
- 8/13/21: YouTube is profiting off of Russian and Chinese government propaganda on its platform.
- 8/9/21: Instagram Shopping is full of anti-vaccine merchandise.
- 7/30/21: Google is profiting off ads for counterfeit COVID-19 vaccination cards.
- 7/23/21: Google is profiting from QAnon and “plandemic” products on its Play Store that seemingly violate its rules.
- 4/7/21: Despite cracking down on QAnon, YouTube continues to profit from content supporting the conspiracy theory.
- 3/2/21: YouTube is profiting from Infowars’ editor-at-large, despite Infowars content being banned from the platform.
- 1/6/21: YouTube is making money from violence-linked militia group recruitment videos.
Facilitating dangerous organizing
Social media companies have consistently turned a blind eye to users organizing protests against masks and vaccines. They’ve also allowed militia recruitment to occur on their platforms.
- 2/10/22: Organizers of the upcoming U.S. trucker convoy to protest vaccine mandates are using social media to plan and promote it.
- 1/21/22: Various social media platforms are enabling known COVID-19 misinformers to promote an anti-vaccine march in Washington, D.C.
- 9/20/21: How one anti-mask activist leveraged TikTok and Facebook to spur local school protests across the country.
- 8/10/21: As COVID-19 cases surge and schools assess safety measures for the new year, Facebook is allowing anti-mask groups to thrive.
- 4/9/21: Facebook is allowing users to promote and organize maskless marches and mask-burning events.
- 1/22/21: Facebook has allowed private groups to promote “maskless shopping” during the COVID-19 pandemic.
- 1/20/21: Ahead of the inauguration, extremists have been circulating calls for violence on TikTok.
- 1/12/21: Far-right militias are using TikTok to organize and recruit new followers.
- 1/12/21: Dozens of Republican Party groups used Facebook to help organize bus trips to Washington, D.C., for the pro-Trump insurrection.
- 1/12/21: “Stop the Steal” organizers used Facebook and Instagram to promote events, including the rally that led to a mob breaching the Capitol.
- 1/5/21: Users in private Facebook groups are encouraging each other to break Washington, D.C., gun laws ahead of election protests.
Allowing the spread of COVID-19 and anti-vaccine misinformation
COVID-19 misinformation has thrived on social media since the pandemic began, and the minimal, vague policies that social media companies have haphazardly announced in an attempt to save face have done little to rein in predatory misinformation.
- 1/11/22: On YouTube, Steven Crowder says, “You could argue that omicron is responsible for reducing deaths more than the vaccines.”
- 12/30/21: In 2021, social media platforms enabled the deadly spread of COVID-19 lies and anti-vaccine misinformation.
- 10/15/21: YouTube's new policy cracking down on anti-vaccine content is already falling short.
- 10/1/21: YouTube is taking a victory lap for its latest vaccine misinformation policy. Meanwhile, Project Veritas is gaining millions of views on anti-vaccine conspiracy videos on the platform.
- 9/28/21: Facebook is letting COVID-19 vaccine misinformation flourish in its comment sections.
- 8/27/21: Facebook groups around the world are promoting unprescribed livestock medications for COVID-19, while the platform seemingly does nothing to stop them.
- 8/18/21: TikTok’s algorithm is amplifying COVID-19 and vaccine misinformation.
- 8/12/21: An anti-vaccine misinformation video has been viewed at least 30 million times on social media.
- 8/10/21: A new Plandemic-like misinformation video has earned tens of millions of Facebook engagements via streaming platforms.
- 4/27/21: Despite federal complaint, a fake COVID-19 cure thrives on Facebook and Instagram.
- 3/23/21: Vaccine misinformation still runs wild on Instagram.
- 3/12/21: TikTok’s massive COVID-19 and vaccine misinformation failure.
- 2/8/21: YouTube and Facebook allowed another COVID-19 conspiracy theory video to go viral.
- 1/11/21: Prominent anti-vaccine figures pushed egregious misinformation about the COVID-19 vaccine live on Facebook and YouTube.
Allowing the spread of hate speech
Content containing hateful symbols and rhetoric can be easily found on social media platforms, despite company executives’ claims to the contrary.
- 2/9/22: A year ago, Facebook contributed to a military coup in Myanmar. Its response has been an appalling failure.
- 1/10/22: YouTube has a history of rewarding Steven Crowder, even though he continually violates the platform’s hate speech, bullying, and misinformation policies.
- 12/1/21: Four times Facebook ignored its own research showing its platforms spread hate.
- 11/23/21: In Ethiopia, Facebook allowed posts inciting violence to go viral for years. The company’s response is both dismissive and ineffective.
- 11/9/21: Right-leaning Facebook pages earned nearly two-thirds of interactions on posts about trans issues.
- 10/29/21: As Facebook claims it's cleaning up hate speech, hateful content is rampant on the platform.
- 10/5/21: TikTok's algorithm leads users from transphobic videos to far-right rabbit holes.
- 9/1/21: Amid the humanitarian crisis in Afghanistan, racist anti-refugee narratives are circulating on Facebook.
- 8/12/21: YouTube fails to protect trans people from misgendering or deadnaming.
- 7/6/21: TikTok’s recommendation algorithm promoted homophobic and transphobic content during Pride Month.
- 6/8/21: Steven Crowder’s bigotry has found a home on TikTok.
- 5/18/21: TikTok’s recommendation algorithm is promoting homophobia and anti-trans violence.
- 4/29/21: The right-wing media ecosystem has dominated the national immigration narrative.
- 4/6/21: On YouTube, Steven Crowder spreads debunked anti-Semitic conspiracy theory about George Soros.
- 4/2/21: Seemingly harmless conspiracy theory accounts on TikTok are pushing far-right propaganda -- and TikTok is prompting users to follow them.
- 3/22/21: Steven Crowder returns to YouTube after a six-day absence by using a racial slur.
- 3/16/21: On YouTube, Steven Crowder uses racist stereotypes to attack Black farmers.
- 3/9/21: Also on YouTube, Steven Crowder uses a racial slur in reference to Meghan Markle.
- 3/4/21: Right-wing Facebook pages dominated the conversation around the Equality Act and Dr. Rachel Levine -- the first openly transgender person to be confirmed by the Senate.