New Zealand mass shooting illustrates failure of tech companies to prevent radicalization on their platforms

NBC’s Ben Collins: Because of algorithm suggestions on tech platforms, people “go down rabbit holes” and “maybe they start as racists and they end up as extremists”

Discussing the shooting at two New Zealand mosques that left at least 49 dead at the hands of an anti-Muslim extremist, NBC reporter Ben Collins noted that the alleged shooter showed signs of “getting radicalized by these same algorithms” on tech platforms such as YouTube, the same platforms that previously influenced other mass shooters.

YouTube has a serious radicalization problem: its recommendation algorithm sometimes pushes far-right content, which Collins noted can cause people to “go down rabbit holes” so that “maybe they start as racists and they end up as extremists,” leading to harassment, racist violence, and terrorism. Collins also compared the alleged Christchurch shooter’s reported announcement of the attack on the far-right message board 8chan to the alleged Pittsburgh, PA, shooter’s announcement of his attack on Gab, a social media platform that is a hotbed for white nationalists.

The alleged New Zealand mosque shooter livestreamed the attack on Facebook, and copies of the video spread to YouTube and Twitter; the platforms have struggled to remove it. Tech companies have been ill-equipped to proactively deal with this kind of content, in some ways leaving the task of combating it to the most affected communities.

Collins said that the tech platforms “can stop this” and prevent such radicalization in advance, noting that they have effectively fought ISIS content but have treated far-right extremist content as “a political issue that it really isn't,” presenting it as political speech rather than a gateway to terrorism.

From the March 15 edition of MSNBC Live with Stephanie Ruhle:


STEPHANIE RUHLE (HOST): Ben, the white nationalist manifesto that was posted online before the attacks, you say there are a couple of things that ring obvious bells. And we need to remind our audience, whenever people say, “A nationalist is the same thing as a patriot,” no, it's not.

BEN COLLINS: Right, so he was a white nationalist. That’s what he was. And we mean that in the sense that he only wanted white people in his nation. That’s what white nationalism means.

RUHLE: Yeah.

COLLINS: For a lot of these people -- I think a lot of people try to defang that word and say, “It’s not as bad as white supremacist,” right? Well how are you going to get only white people in a nation? This is how you do it, to these people. The entire manifesto is about this. And of course, like, there is trolling in there, right? There is stuff in there -- because he was so reared online.

RUHLE: How -- tell us, because not everyone has seen this. What do you mean there was trolling in there? How? Where?

COLLINS: Right, so he posted this manifesto on this thing called 8chan, which is -- you may have heard of 4chan. It is double 4chan in every way. It’s an extremist website, effectively, where anonymous people are allowed to post whatever they want. He was actually cheered on after he posted this thing. And it’s -- this is the blind spot that you're talking about, right, where this stuff isn’t monitored with the same sort of rigor and it’s because it's a political issue, it's become a political issue for really no reason, but these people are committing terror attacks over and over again, you know, using the same platforms, getting radicalized by these same algorithms, over and over again. And at some point these tech companies, which, like you said, when was the last time you got recommended an ISIS video on YouTube? Probably never, right?

RUHLE: Never.

COLLINS: Tech companies can do this, they can stop this, but they have made this a political issue that it really isn't.

...

RUHLE: Ben, I want to talk about this because you put a tweet out this morning where you write, “Extremism researchers and journalists (including me) warned the company YouTube in emails, on the phone, and to employees' faces after the last terror attack that the next one would show signs of YouTube radicalization again, but the outcome would be worse. I was literally scoffed at.” Please tell me about this.

COLLINS: Yeah, I was scoffed at by an executive at YouTube.

RUHLE: Don’t laugh. Tell me what happened.

COLLINS: Sure, yeah. Look, we have over and over again talked to these companies about -- the last time this happened, right? We were -- it was a couple of days before midterms. There was a guy who shot up a synagogue in Pittsburgh. He used the same verbiage. He said, “Invaders are coming, they’re coming through the caravan, and they’re backed by this Jewish cabal that’s taking over the world, trying to take out the white race,” right? So we talked to them, we talked to these tech companies, we talked to YouTube.

RUHLE: And what do they say to you?

COLLINS: They’re like, “We’re working on it.” I’m like, “Well, are you working on it?”

RUHLE: What does that mean? So in the last 12 hours when I’m watching the news, I keep hearing reporters say, “And they’re working on taking these down.” I don’t understand the “working on” part.

COLLINS: Yeah. And that’s the thing. YouTube put out a statement today, they’re saying, “Oh, we’re really working to take down the video of the shooting.” And that’s great. That’s wonderful. But it's the months before the shooting, where these people go down rabbit holes and maybe they start as racists and they end up as extremists. That’s the stuff they’re really not working on at the level they can. They do it with ISIS. Why can't we do it with this?