Man who was pulled into the far-right through YouTube explains how the platform is used for radicalization

Caleb Cain: Far-right figures on YouTube are “taking advantage of this algorithm” and “this radicalization is a huge public health crisis”

From the June 12 edition of CNN's New Day:

Video file

ALISYN CAMEROTA (CO-HOST): How does a liberal young man become a follower of the “alt-right”? You're about to meet a man who says he was radicalized by “alt-right” figures via their persuasive YouTube videos. He believed their extremist conspiracies for years. But then he somehow deradicalized himself, and he's now working to help others get out. Joining us now is Caleb Cain. Caleb, thank you so much for being here to tell your personal story. I think it is so fascinating. So just to bring people up to speed, somewhere around, as I understand it, 2015, you were sort of at loose ends. You had dropped out of college, you were looking for direction. I think -- correct me if I'm wrong -- at the time you would describe yourself as liberal. And somehow online you found your way to these “alt-right” videos and websites. And then describe what happened. 

CALEB CAIN: Yes, thank you for having me on. Basically, I was depressed during that period of time and looking for an outlet for that depression. And I turned to YouTube. And I found a man by the name of Stefan Molyneux, and his content really helped me in a lot of ways, but what I didn't realize was I was being given a political philosophy, and over time my political ideas started to change quite drastically, which was pretty clear to anybody in my life at the time. And really it just took me to a place where I became further isolated from social groups and just started to ostracize a lot of people in my life very unfairly. 

CAMEROTA: Let's talk about some of the things that you came to believe because of these videos. You came to believe -- I mean, again, jump in if this is wrong -- that other races were inferior to whites, that women -- I think, or at least feminists -- were overly aggressive, and women, I think, were inferior to men. You believed that, I guess, Muslims were trying to take over Western civilization, or immigrants were. And how do you explain how you came to absorb those extreme views so much? 

CAIN: Well, it was mostly due to the people that I was listening to. The people I was listening to were selling me a narrative that, you know, cultural Marxists and, you know, immigrants and Muslims and basically liberals were trying to destroy Western civilization and install some sort of socialist regime. And it's the type of rhetoric you hear from a lot of these people online, the whole way across the political spectrum. And really what it boils down to is it's digital hate politics. And it leads people to radicalization. And, you know, for me, in my opinion, radicalization, it's a public health crisis. And we really need to fix this public health crisis through education, through fixing people's communities, and through providing mental health support, support I didn't have growing up, and is what led me down this path. 

CAMEROTA: And I'm going to get to exactly how you pulled yourself out in one second, but also when you say it's a public health crisis, obviously we've seen hate crimes spike in the past years. We've also seen the rise of neo-Nazism. We've seen more violence. Were you ever tempted? Did you feel you were veering in that direction towards violence? 

CAIN: I don't know if I was ever veering towards violence. I never felt that way. But from what I see, the people that turn to violence are people that feel at the end of their line, they feel that their back is completely against the wall. And these were the people that we saw in the New Zealand shooting and in the Poway shooting. And these people were deeply ingrained within social groups online. And I think that giving people -- relieving those anxieties that people have, their systemic anxieties that they have through, you know, their economic situations or their personal lives, it's fixing those situations that's going to keep us from seeing more violence. 

CAMEROTA: By the way, we also need to talk about the role that YouTube plays. They directed you to more and more extreme videos. Once you found one and YouTube could tell through its algorithms that you liked it, that you were watching it, that you were engaged, they directed you to more and more extreme videos, and they need to take some responsibility for that as well. 

CAIN: I think what YouTube needs to do is they need to have clear terms of service on their websites of what's acceptable and what isn't. You know, I am a free speech advocate, but what I saw on the platform was people taking advantage of this algorithm. And the algorithm does not care about what your politics are. It cares about watch time and keeping you on the platform. The A.I. that they use for this is called Reinforce, and the whole idea is to keep you watching more and more. And extremists come online and take advantage of that. 
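[Cain's point -- that a recommender optimizing only for watch time is indifferent to content -- can be illustrated with a toy sketch. The titles and predicted-watch-time numbers below are hypothetical, and this is not YouTube's actual Reinforce system, which is a far more complex reinforcement-learning model.]

```python
# Toy illustration of an engagement-only ranking objective: when the sort
# key is predicted watch time alone, whatever content maximizes engagement
# is surfaced first, regardless of what that content is.
# All titles and numbers below are hypothetical.

def rank_by_watch_time(candidates):
    """Return candidate videos sorted by predicted watch time, descending."""
    return sorted(candidates,
                  key=lambda v: v["predicted_watch_minutes"],
                  reverse=True)

candidates = [
    {"title": "Calm explainer",       "predicted_watch_minutes": 4.0},
    {"title": "Outrage compilation",  "predicted_watch_minutes": 11.5},
    {"title": "Mainstream news clip", "predicted_watch_minutes": 6.2},
]

# The objective function never inspects the content itself -- only the
# engagement prediction -- so the most provocative video ranks first.
for video in rank_by_watch_time(candidates):
    print(video["title"])
```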

CAMEROTA: So Caleb, we're almost out of time. What's your message to other people who, as you say, have fallen down the “alt-right” rabbit hole? How did you pull yourself out? 

CAIN: I pulled myself out because I started getting exposed to other ideas. I basically started to educate myself on the problems and the issues, and I also started to reach out and get emotional support from others. And that is what my team and I are trying to do: I have a bunch of volunteers, and we basically set up digital platforms to try to deradicalize young people online. And we're doing that through social intervention and compassion-based conversations. And it's really to combat, once again, this whole thing of digital hate politics, which leads to radicalization. And this radicalization is a huge public health crisis that we really need to solve.

Related:

New York Times: The Making of a YouTube Radical

Previously:

How YouTube facilitates right-wing radicalization

Stefan Molyneux is MAGA Twitter’s favorite white nationalist

New Zealand mass shooting illustrates failure of tech companies to prevent radicalization on their platforms