Update (10/2/20): Facebook removed 15 of the 19 QAnon groups following Media Matters' reporting.
Facebook has an ongoing problem managing the spread of the QAnon conspiracy theory on the platform. Though the company promised in August to remove QAnon groups that “discuss potential violence,” many Q-affiliated Facebook groups explicitly violating this policy are now just as active as ever -- and they have continued to gain members in recent months.
As first reported by The New York Times, Facebook’s latest explanation for the slow removal of QAnon groups with violent content is that the groups are strategically designing ways to “evade” its enforcement mechanisms and that the site is working with “external experts on ways to disrupt” this activity. Facebook is implying that it is difficult to hold QAnon groups accountable for members’ violent rhetoric because the groups are outsmarting its community standards enforcement team. While the Times noted that some groups have made slight adaptations to their names, such as “changing ‘Q’ to ‘Cue,’” the substance of their public posts remains easily identifiable and obviously violent -- with seemingly little response from Facebook.
Facebook has framed holding QAnon groups accountable as a difficult task because they use tactics to “evade” detection, but it is evident that, as in other cases of policy enforcement on the platform, the company is still struggling to implement its own community standards. The New York Times reported on Facebook’s QAnon accountability problem, citing internal research showing that despite Facebook’s promise to remove QAnon content from its recommendation engine, the algorithm is still suggesting QAnon groups to users.
QAnon content continues to proliferate widely on the platform, indicating that Facebook was either unprepared to follow through on its promise to remove groups that discuss violence or had no intention of doing so after receiving initial praise. Either way, Facebook’s public promise to remove QAnon groups promoting violence garnered positive PR even as the online community continued to expand.
Using CrowdTangle, an analytics tool owned by Facebook, Media Matters identified 19 large QAnon groups using violent rhetoric that seemingly violates Facebook’s new policy. This is just a small fraction of the public QAnon community at large, but it provides insight into Facebook’s inability (or unwillingness) to follow through on its own enforcement promises. No special coded language was needed to identify these groups or their posts, which engage in a clear pattern of discussing violence. These discussions of violence should, in theory, be more than sufficient grounds for the groups’ immediate removal from the platform.
Not only do these 19 public QAnon groups contain posts with violent content, but they have also been growing during the time period in which Facebook was allegedly limiting the reach of QAnon groups. According to data from CrowdTangle, the groups grew by 33,500 members between June 21, 2020, and September 21, 2020, yielding a total of 95,800 members in these 19 groups alone.
The Facebook groups listed below have been identified as QAnon-affiliated by their titles, descriptions, or posts. While each group differs, basic QAnon identifiers include “Q” and “WWG1WGA,” a common QAnon slogan meaning “where we go one, we go all.”
Q NEWS - HILLARY CLINTON TREASON LIST THE RALPH COONS - PAGE
Republic for the United States of America
patriots white hats worldwide WWG1WGA
17 1141514 5:5 Memes Cache
Revere’s Midnight Riders
(note: formerly known as ~*Q*~SENT ME[ME])
17- TRUTHERS UNITE !
QAnon 8ch Uncensored Research
UK UNITED PATRIOTS W-W-G-ONE-W-G-A
The GREAT awakening 2020 WE ARE THE CHILDREN
United States Patriots Project ( WWG1WGA) A Plan To Save The World!
The Alliance/Trump/Q Fan Club
Q Anon Patriots
Qanon Forum Group Europe #WWG1WGA
Down the Rabbit Hole