On May 25, Meta will hold its annual shareholder meeting facing a less active user base, its slowest revenue growth since its initial public offering, reduced earnings, and, as our latest study suggests, heightened risk for shareholders as its platforms continue to be plagued by misinformation, hate speech, and dangerous users that Meta refuses to adequately address.
Over the last year, Media Matters has regularly reported on Meta’s failure to enforce its community standards on Facebook and Instagram, as well as its very narrow interpretations of those standards, which in some cases were not adequate to begin with.
As part of the #MakeMarkListen campaign, Media Matters' associate research director Kayla Gogarty issued the following statement:
Meta claims it protects Facebook and Instagram users under the platforms’ community standards. However, in just the last 12 months, Media Matters found more than 13,500 instances in which Meta failed to consistently enforce its policies — interpreting them so narrowly, or missing apparent violations entirely, that its users weren’t safe from the misinformation and hate speech it claims to ban. But rather than address these issues, the company has prioritized new products and features like the metaverse, short-form video, and AI-driven video recommendations.
Meta's lack of policy enforcement has led to real-world violence and other harms, with bad-faith actors using the platform to plan and defend the insurrection at the U.S. Capitol, spread election fraud disinformation, put the public’s health at risk through COVID-19 misinformation, and target trans and nonbinary people, particularly youth.
This pattern of failure is a risk not only to society at large but also to Meta’s shareholders. In advance of Meta’s annual general meeting, it is clear that Mark Zuckerberg is not willing to do what it takes to keep his platforms safe, repeatedly prioritizing profits over people and damaging our society in the process.
It’s time to #MakeMarkListen and demand oversight and accountability at Meta.