Study co-author blasts this conspiracy theory as “absolutely ridiculous”
Melissa Joskow / Media Matters
An emerging smear of Christine Blasey Ford, who says that Brett Kavanaugh sexually assaulted her when they were in high school, suggests that Ford created a false memory of the assault while in a hypnotic state.
Margot Cleveland, a senior contributor to The Federalist, launched the conspiracy theory on Twitter, seizing on an academic article co-authored by Ford, who is a psychology professor at Palo Alto University, and 10 others that is titled, “Meditation With Yoga, Group Therapy With Hypnosis, and Psychoeducation for Long-Term Depressed Mood: A Randomized Pilot Trial.” The conspiracy theory was later posted at The Federalist.
Cleveland wrote that the article was about “a study in which participants were TAUGHT SELF-HYPNOSIS & noted hypnosis is used to retrieve important memories ‘AND CREATE ARTIFICAL (sic) SITUATIONS’”:
BREAKING: This is HUGE (waiting for permission to h/t): One of Christine Ford Blasey's research articles in 2008 included a study in which participants were TAUGHT SELF-HYPNOSIS & noted hypnosis is used to retrieve important memories "AND CREATE ARTIFICAL SITUATIONS." pic.twitter.com/11n1JVnArM
— Margot Cleveland (@ProfMJCleveland) October 1, 2018
The implication is that Ford may have hypnotized herself and created a false memory of her account of Kavanaugh sexually assaulting her at a party when she was 15 and he was 17. This is a misreading of the article, which cites research published in 1964 by Stanley Abrams that “suggested that hypnosis could be used to improve rapport in the therapeutic relationship, assist in the retrieval of important memories, and create artificial situations that would permit the client to express ego-dystonic emotions in a safe manner.”
In terms of self-hypnosis, the article says that “participants also were taught self-hypnosis to use outside the group for relaxation and affect regulation” -- not to create false memories.
Reached for comment, one of the study’s co-authors, who is being granted anonymity because of harassment and threats surrounding Ford’s decision to speak out, told Media Matters that the claims being spread about Ford and the study are “absolutely ridiculous” and “the study had absolutely nothing to do with the creation of false memories, or the creation of memories of any kind.” The co-author added that Ford was a statistical consultant on the report, not a participant in the study, and that she worked on the data after it was collected.
Conservative radio host and conspiracy theorist Michael Savage is promoting a rapidly spreading conspiracy theory that professor Christine Blasey Ford, who says Supreme Court nominee Brett Kavanaugh sexually assaulted her when they were in high school, has “deep” connections to the Central Intelligence Agency.
Savage has pushed incredibly bizarre conspiracy theories and hateful rhetoric, and he has been closely connected to President Donald Trump and the White House. He pushed the latest conspiracy theory on Twitter and his website:
IS DR. FORD DEEPLY TIED TO THE CIA? pic.twitter.com/QcRLchqGGn
— Michael Savage (@ASavageNation) September 27, 2018
Savage’s conspiracy theory makes three claims about Ford’s connections to the CIA, all of which are false or baseless:
The post claims that Ford “happens to head up the CIA undergraduate internship program at Stanford University.” This claim seems to originate from a conspiracy theory website, brassballs.blog, that drew this conclusion because Stanford does have an undergraduate CIA internship program, and Ford, who is a psychology professor at nearby Palo Alto University, is also listed as an “affiliate” in the “psychiatry and behavioral sciences” department at Stanford. The blog post argues that it is suspicious that Ford’s contact information has been deleted from her Stanford profile page, although the more likely explanation is that it has been removed due to the threats and harassment that Ford has received since coming forward.
The theory draws another connection between Ford and the CIA via her brother’s previous work for law firm BakerHostetler. A previous Ford-related conspiracy theory connected her brother’s work at BakerHostetler to Fusion GPS, a research firm involved in the ongoing Russian collusion investigation. However, Ford’s brother left BakerHostetler six years before Fusion GPS was ever founded. Savage’s conspiracy theory repeats this false claim and goes even further, claiming that three CIA-controlled businesses are located in the same building as BakerHostetler. There is no evidence these businesses are connected to the CIA -- in fact, one, Red Coats, Inc., is a janitorial company that does not even share office space with BakerHostetler.
Savage’s post also claims that Ford is the granddaughter of Nicholas Deak, who worked with the CIA during the Cold War. According to his 1985 Washington Post obituary, Deak only had one child, a son named R. Leslie Deak. But as the conspiracy theory’s second claim also notes, Ford’s father is actually Ralph Blasey Jr.
Savage’s false claim is rapidly spreading, and was promoted during Alex Jones’ September 28 broadcast. The conspiracy theory is also indicative of how search platforms like Google amplify such clear falsehoods. A Google search for “Christine Ford CIA” done in a private browsing window aggregated YouTube videos pushing the conspiracy theory and Savage’s website as the top results:
The CIA conspiracy theory is just one of several false narratives related to Ford’s brother. A claim that he also worked with former FBI agent Peter Strzok’s sister-in-law has been circulating on Voat and 4chan, and has turned into a meme spreading on Twitter and Facebook.
From "gurus" to extremist "influencers," the video site is a potent tool for ideologues
For the casual YouTube viewer -- someone who logs on once in a while to access cute kitten videos or recipe demonstrations -- it can be difficult to imagine that the video site is also a teeming cesspit of hate speech and a prime means of its transmission.
But a new study from think tank Data & Society and the earlier work of ex-YouTube engineer Guillaume Chaslot reveal the technical and social mechanisms underlying an inescapable truth: Thanks to an algorithm that prioritizes engagement -- as measured by the interactions users have with content on the platform -- and “influencer” marketing, YouTube has become a source of right-wing radicalization for young viewers.
An algorithm that incentivizes extreme content
YouTube’s recommendation algorithm dictates which videos rise to the top in response to search queries, and, after a video finishes playing, it populates the video player window with thumbnails recommending further content. According to a Wall Street Journal analysis, YouTube’s algorithm “recommends more than 200 million different videos in 80 languages each day.” These recommendations take into account what the viewer has already watched, but it’s all in the service of engagement, or, as the Journal’s Jack Nicas put it, “stickiness” -- what keeps the viewer on the site, watching. The longer viewers watch, the more ads they see.
But this has unintended consequences.
“They assume if you maximize the watch time, the results are neutral,” Guillaume Chaslot, a former Google engineer and creator of the YouTube algorithm analysis tool Algo Transparency, told Media Matters. “But it’s not neutral ... because it’s better for extremists. Extremists are better for watch time, because more extreme content is more engaging.”
In a way, it’s common sense -- videos that make inflammatory claims or show explosive images tend to grab viewers’ attention. And attention-grabbing videos -- those that cause viewers to watch more and longer -- rise up in the recommendation algorithm, leading more new viewers to see them in their list of recommended videos.
As the Journal’s analysis showed, viewers who began by viewing content from mainstream news sources were frequently directed to conspiracy theory-oriented content that expressed politically extreme views. A search for “9/11” quickly led Journal reporters to conspiracy theories alleging the U.S. government carried out the attacks. When I searched the word “vaccine” on YouTube using incognito mode on Google Chrome, three of the top five results were anti-vaccine conspiracy videos, including a video titled “The Irrefutable Argument Against Vaccine Safety,” a series titled “The Truth About Vaccines” with more than 1 million views, and a lecture pushing the debunked pseudo-scientific claim that vaccines are linked to autism.
Because YouTube’s algorithm is heavily guided by what has already been watched, “once you see extremist content, the algorithm will recommend it to you again,” Chaslot said.
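The feedback loop Chaslot describes can be illustrated with a toy model. This is a hypothetical sketch, not YouTube’s actual system; the video names, watch-time numbers, and ranking rule are all illustrative assumptions. It shows how ranking purely by accumulated watch time lets a “stickier” video compound its advantage every round, while less engaging content gets locked out entirely.

```python
# Toy, hypothetical model of engagement-only ranking -- NOT YouTube's
# actual algorithm. All names and numbers are illustrative assumptions.

def recommend(videos, k=2):
    """Return the k videos with the most accumulated watch time."""
    return sorted(videos, key=lambda v: v["watch_time"], reverse=True)[:k]

def simulate(videos, rounds=50):
    """Each round, only recommended videos get watched, so only they
    accrue watch time; stickier videos (higher avg_minutes) compound."""
    for _ in range(rounds):
        for v in recommend(videos):
            v["watch_time"] += v["avg_minutes"]  # exposure begets engagement
    return recommend(videos)

videos = [
    {"title": "cute kitten clip",    "avg_minutes": 2,  "watch_time": 120},
    {"title": "inflammatory rant",   "avg_minutes": 12, "watch_time": 110},
    {"title": "measured news recap", "avg_minutes": 4,  "watch_time": 100},
]

top = simulate(videos)
print([v["title"] for v in top])
# The "rant," being more engaging per view, overtakes the kitten clip and
# stays on top; the news recap is never recommended and never recovers.
```

In this sketch, nothing about the ranking rule is ideological; it simply rewards whatever holds attention longest, which is the dynamic Chaslot argues favors extreme content.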
The result is a tailor-made tool for radicalization. After all, once users have started exploring the “truth” about vaccines -- or 9/11, or Jews -- the site will continue feeding them similar content. The videos that auto-played after “The Truth About Vaccines” were, in order: “My Vaxxed child versus my unvaccinated child”; “Worst Nightmare for Mother of 6 Unvaxxed Children” (description: “The mother of 6 unvaccinated children visits the emergency room with her eldest daughter. Her worst nightmare becomes reality when her child is vaccinated without her consent”); and “Fully Recovered From Autism,” each with more than 160,000 views.
“By emphasizing solely watch time, the indirect consequence that YouTube doesn’t want to acknowledge is that it’s promoting extremism,” Chaslot said.
Chaslot emphasized that YouTube’s own hate speech policy in its Community Guidelines was unlikely to meaningfully curb the flourishing of extremist content. The primary issue: The algorithm, which controls recommendations, is utterly separate from the company’s content-moderation operation. The result is a fundamentally self-contradictory model; engagement alone controls the rise of a video or channel, independent from concerns about substance.
There’s also what Chaslot called “gurus” -- users who post videos that cause viewers to engage for hours at a time. As a result, even if their audiences begin as relatively small, the videos will rise up in the recommendation algorithm. The examples he provided were PragerU, a right-wing propaganda channel whose brief explainer videos have garnered some 1 billion views, and Canadian pop-antifeminist Jordan Peterson’s channel.
But the guru effect has the power to amplify far more troubling content, and, according to new research, far-right extremists have adapted to a world of recommendation algorithms, influencer marketing, and branding with ease and efficiency.
The sociopath network
YouTube isn’t just a sea of mindless entertainment; it’s also a rather ruthless market of individuals selling their skills, ideas, and, above all, themselves as a brand. YouTube’s Partner Program provides financial incentives in the form of shares of advertising revenue to “creators” who have racked up 4,000 hours of audience attention and at least 1,000 subscribers. For those who become authentic micro-celebrities on the platform, the viral-marketing possibilities of becoming a social-media “influencer” allow them to advertise goods and products -- or ideologies.
Becca Lewis’ groundbreaking new study from Data & Society catalogues the ways that ideological extremists have cannily adapted the same techniques that allow makeup vloggers and self-help commentators to flourish on the video site. The study, titled “Alternative Influence: Broadcasting the Reactionary Right on YouTube,” is an unprecedented deep dive into 81 channels that spread right-wing ideas on the site. Crucially, it also maps the intricate interconnections between channels, breaking down how high-profile YouTube figures use their clout to cross-promote other ideologues in the network. (Media Matters’ own study of YouTube extremists found that extremist content -- including openly anti-Semitic, white supremacist, and anti-LGBTQ content -- was thriving on the platform.)
Lewis’ study explores and explains how these extremists rack up hundreds of thousands or even millions of views, with the aid of a strong network of interconnected users and the know-how to stand out within a crowded field of competing would-be influencers.
The study provides a concrete look at the blurring of lines between popular right-wing YouTube content creators who are often hosted on conservative media outlets like Fox News -- such as Dave Rubin, Ben Shapiro, and Candace Owens -- and openly white supremacist content creators with smaller platforms. In many cases, Lewis found that these channels had invited the same guests to speak from other channels in the network, leading to the creation of “radicalization pathways.” Rubin, whose channel has 750,000 subscribers, was cited as an example for hosting the Canadian racist commentator Stefan Molyneux. “Molyneux openly promotes scientific racism, advocates for the men’s rights movement, critiques initiatives devoted to gender equity, and promotes white supremacist conspiracy theories focused on ‘White Genocide,’” Lewis writes. During Molyneux’s appearance on Rubin’s channel, the host failed to meaningfully challenge his guest’s ideas -- lending credibility to Molyneux’s more extreme worldview.
Rubin vehemently denied charges of his association with white supremacy on Twitter, but failed to refute the specifics of Lewis’ findings:
Despite Rubin’s assertion, Lewis’ study does not mention the word “evil.” What the study does make clear, however, are the ways in which web-savvy networks of association and influence have become crucial to the spread of extremist ideologies on the internet. The issue of racist, sexist, and anti-LGBTQ content is not limited to obscure internet fever swamps like 4chan and Gab; it is also playing out in a public and highly lucrative way on the web’s most popular video platform.
Conservative provocateur Ben Shapiro, named as an influencer in the network, also sought to discredit the study.
But Shapiro was separated from Richard Spencer by one degree, not six: He has been interviewed by a right-wing YouTuber, Roaming Millennial, who had invited Spencer to share his views on her channel two months earlier.
“There is an undercurrent to this report that is worth making explicit: in many ways, YouTube is built to incentivize the behavior of these political influencers,” Lewis writes. “The platform, and its parent company, have allowed racist, misogynist, and harassing content to remain online – and in many cases, to generate advertising revenue – as long as it does not explicitly include slurs.”
Just last week, extremist hate channel Red Ice TV uploaded a screed titled “Forced Diversity Is Not Our Strength,” promoting segregated societies. Hosted by gregarious racist Lana Lokteff, who has become a micro-celebrity in the world of white supremacists, the video asserts that “minorities and trans people” have had a negative impact on “white creativity.”
Red Ice TV has more than 200,000 subscribers. At press time, the “Forced Diversity” video had more than 28,000 views. Upon completion of Lokteff’s anti-diversity rant, YouTube’s auto-play suggested more Red Ice TV content -- this time a video fearmongering about immigrants -- thus continuing the automated cycle of hate.
After Facebook, YouTube, Spotify, and iTunes all removed conspiracy theorist Alex Jones and Infowars pages from their platforms, several right-wing media figures leapt to the extremist’s defense. Jones’ defenders responded by criticizing and threatening “the entire rotten tech machine” and invoking a wide range of comparisons to support him, including Star Wars, George Orwell’s Nineteen Eighty-Four, reality TV star Kylie Jenner, and the Holocaust.
In a bizarre exchange, Isaac Kappy and Alex Jones sparred over whether “chicken” is slang for pedophilia
In a more-than-usually bizarre segment on Tuesday, Infowars’ Alex Jones hosted Isaac Kappy, a minor actor whose recent spate of Periscope and YouTube videos accusing prominent Hollywood figures of pedophilia have made waves in the conspiracy-minded community.
Liberally utilizing the hashtag #QAnon, which is affiliated with a sprawling pro-Trump conspiracy theory, Kappy has spread baseless accusations that actors including Tom Hanks, Steven Spielberg, and Seth Green are pedophiles. This slate of denunciations proved so popular that for a brief time this week, Kappy’s videos and other QAnon-affiliated broadcasts dominated the YouTube search results for the celebrities. During a segment on the July 31 edition of The Alex Jones Show, Jones set the stage for Kappy to spread his baseless recrimination of Hollywood figures, repeatedly asking leading questions about “Aleister Crowley” rituals and “Hollywood parties.”
Jones -- who has devoted airtime to amplifying QAnon theories on multiple shows -- sparred with Kappy in a series of bizarre segments. Kappy claimed that actor Seth Green is sexually interested in children, based in part on an alleged dinner in which Green, the creator of the show Robot Chicken, told him, “We need to have a talk about chicken.”
Kappy claimed “chicken” is “a pedophile code word for very young child”; Jones responded incredulously, repeatedly asking whether Green and other Hollywood figures had subjected Kappy to practical-joke “Sacha Baron Cohen”-style tactics used to dupe celebrities and politicians. Kappy insisted that he had seen evidence of a broad child-sex ring that pervaded Hollywood, but he was unable to provide substantiating evidence, despite naming Green and his wife directly.
However, Jones, who is being sued for defamation by parents of two children killed in the Sandy Hook Elementary School shooting, asked Kappy to restrain himself and avoid “getting into names.” At one point, Kappy insisted Jones was “gaslighting” him by asking him to substantiate his claims.
The grim sparring was a strange sideshow in the business of broadcasting conspiracy theories to a huge audience, one that Kappy has just entered via unhinged Periscope streams. The notion that broadly liberal segments of society, such as Hollywood and the media, are engaged in baroque cover-ups of pedophilia is a cornerstone of the QAnon conspiracy theory -- which holds that President Donald Trump is working behind the scenes to kneecap members of the “deep state” and crack down on pedophilia rings connected to powerful politicians and liberal celebrities. The claim has flourished for months in online message boards, despite just recently coming to mainstream attention. The recklessness of Kappy’s claims is a powerful illustration of just how far some conspiracy theorists are willing to go in pursuit of infamy -- and a chilling portent of the lengths to which conspiracy theory adherents might be willing to go to stop the horrors they imagine.
For a brief time, searching for Tom Hanks or Steven Spielberg on the video site brought up baseless accusations
On the morning of July 30, if you were searching YouTube for Tom Hanks or Steven Spielberg -- wanting to learn a little about Hollywood royalty, or just to find that funny clip from Big you loved years ago -- you would have been in for an unpleasant surprise.
As NBC’s Ben Collins first pointed out on Twitter, the search results for Hanks and Spielberg were dominated by conspiracy theories alleging that both men -- along with other celebrities including Seth Green and Macaulay Culkin -- were pedophiles and part of a nefarious ring of Hollywood child predators that online conspiracy theorists had dubbed #Pedowood.
The videos that popped up upon searching for Spielberg and Hanks were low-quality, rambling, close-up shots, several made by a man named Isaac Kappy, a minor actor who has spent the last week posting video-recorded rants on YouTube with titles like “Famous Actor Exposes Hollywood Pedophiles! Steven Spielberg, Tom Hanks And More! #Pizzagate.” Thanks to rapid dissemination on the message boards Reddit and 4chan, the videos garnered hundreds of thousands of views and shot up in the YouTube rankings, eclipsing interviews and movie clips featuring the stars.
The hashtag #Pizzagate included in the title of Kappy’s video is a reference to the Pizzagate conspiracy theory, which posits that prominent Democrats are running a child sex-slave ring out of a Washington, D.C., pizza restaurant. The conspiracy theory culminated in one adherent firing a rifle inside the pizzeria. According to BuzzFeed, the newfound allegations of pedophilia against Hanks can be traced back to Twitter user Sarah Ruth Ashcraft, a prominent member of the QAnon conspiracy theory community, which grew out of Pizzagate and has mushroomed into baroque complexity. The ever-growing QAnon conspiracy theory, which is flexible enough to accommodate a wide range of events, asserts that a broad array of prominent figures with liberal leanings are part of an international child sex-slavery operation. The theory has hundreds of thousands of devotees on Reddit, YouTube, Facebook, and Twitter and countless dedicated blogs. (Roseanne Barr is a prominent believer in QAnon.) People are even showing up to Trump rallies dressed in "Q" apparel.
People lining up for the Trump rally in Tampa today. A lot of the chan anons might treat Q-Anon like a LARP, but by all appearances there are plenty of people who take it seriously irl. pic.twitter.com/uys7kmnAs1
— Travis View (@travis_view) July 31, 2018
Ashcraft, who frequently uses the hashtag #QAnon, has over 45,000 Twitter followers and uses her page to decry “Ritual Abuse, Mind Control, Child Porn, and Sex Trafficking,” focusing her ire on the alleged wrongdoings of celebrities like Hanks. (Since Ashcraft’s accusations against Hanks made headlines, and after BuzzFeed pointedly reached out to Twitter, her page has been restricted.)
After NBC’s Collins reached out to YouTube for comment, some of the conspiracy-theory videos dropped in the search rankings for the celebrities. A spokesperson for YouTube told BuzzFeed, “We’re continuously working to better surface and promote news and authoritative sources to make the best possible information available to YouTube viewers.”
The hyperconnectivity of social media can make constructive messages spread fast -- and destructive falsehoods spread even faster. This latest incident is another powerful illustration of the ways in which social media can be gamed by conspiracy theorists. It’s an issue social networks have struggled to fully grasp; any suppression of conspiracy theorists’ pages, after all, lends credence to the notion that they are oppressed keepers of vital truths. Infowars’ Alex Jones was recently personally banned from Facebook for 30 days after the platform determined that several videos he shared violated community standards; Jones and his fanbase reacted with predictable opprobrium and claims of censorship. But Facebook did not assert that Jones’ penchant for spreading baseless conspiracy theories was part of the rationale for the ban; instead, it focused on policies regarding hate speech and bullying. That, in turn, raised questions of why Infowars as a whole did not receive a ban.
Social media platforms that purport to be concerned with the spread of "fake news" must consider -- and contain -- conspiracy theories proactively, not just when journalists point them out. Left unchecked, those conspiracy theories have a direct connection to subsequent harassment and worse.