Listen to Kayla Gogarty explain Facebook's role in spreading harmful ivermectin misinformation

Gogarty: “Facebook never really acts proactively and it's always trying to catch up and, you know, after harmful misinformation is already circulated”



Citation From the September 2, 2021, edition of SiriusXM's The Dean Obeidallah Show

JOE SUDBAY (HOST): It's — I think, an important point, and it's something that you've made clear, is Facebook — their community standards — they claim that they will remove misinformation when public health authorities conclude that the information is false and likely to contribute — contribute to imminent violence or physical harm, OK? We know failure to wear masks contributes to physical harm. We know that, you know, public health authorities say, don't take ivermectin. We know that all the evidence shows that the vaccines work and that the people who aren't vaccinated are the ones filling up our hospitals right now. What does it take for — and you've watched this company for a while — what does it take for Facebook to actually, you know, kind of live by its own alleged standards?

KAYLA GOGARTY (ASSOCIATE RESEARCH DIRECTOR, MEDIA MATTERS): In addition to the policies that you just said, there's also specific policies against vaccine misinformation. And Facebook has acknowledged that ivermectin misinformation is against its policies. It has said that promoting the purchase or sale of ivermectin is against its policies. Yet for some reason, they are not, clearly not enforcing these policies by letting a lot of these groups stay on the platform.

SUDBAY: And what happens, like, when they're informed? When, like, you inform them, you've done these reports, do you ever get responses from Facebook or do they — how do they handle it? I'm so intrigued by this because we've seen — I mean, obviously, anyone who's paid attention to, you know, the way Facebook has, what they've done politically over the years. And we know, you know, there was this — they were going to do a report on, you know, how they handled misinformation and they couldn't send out the first quarterly report because it was so bad for them. How do they respond? I'm just — it's so maddening to watch this.

GOGARTY: I think that report you just mentioned is a good example of how they always prioritize PR over actually combating misinformation or some other problems on the platform, specifically with these Facebook groups and COVID-19 misinformation. A lot of times, they'll tell reporters, yes, we have these policies. I know specifically for the ivermectin groups, they've responded to other reporters who have also been reporting on these groups, and have told them, you know, we'll look at the content and we remove anything that's against our policies. Yet a lot — all of this content that we've also reported on looks — appears to be against these policies, but they're not taking any action. I know NBC reported on a very large ivermectin group and that — that group that they reported on, I'd say a week ago, is still on Facebook. There was no action taken whatsoever. And it is very clearly a group that violates its policies.

SUDBAY: God. It's so maddening. And it's literally, you know, obviously, I think a lot of what they did with politics has put our democracy in danger. But this is literally putting lives in danger. The anti-vax — vaccine groups — the anti-mask groups, and the pro-ivermectin groups. It's literally like life and death for many people. And that doesn't seem to move Facebook. It's — it's really quite — well, we should be astounded by it, but it's become kind of the norm.

GOGARTY: Yeah. Unfortunately, Facebook never really acts proactively and it's always trying to catch up and, you know, after harmful misinformation's already circulated. So we've seen a lot of these instances, like with masks, vaccines, ivermectin, they — the misinformation about them are so widespread now, it's hard for Facebook to even get a handle on it at this point.