Author Page | Media Matters for America

Melissa Ryan

  • Republicans can't quit fake news

    The Republican Party has increasingly created and used political microsites designed to look like local news sites. Here’s why that’s bad for democracy.

    Illustration: Sarah Wasko / Media Matters

    Last fall, Arizona Senate candidate Kelli Ward touted an endorsement from the Arizona Monitor on her Facebook page. Ward’s campaign must have really liked the endorsement because it reprinted it in full on her campaign website. But what is the Arizona Monitor? Is it a local news site? A blog covering local politics in Arizona? Or is it something else entirely?

    A Politico investigation found that the Arizona Monitor “launched just a few weeks before publishing the endorsement, and its domain registration is hidden, masking the identity of its owner. On its Facebook page, it is classified as a news site, but scant other information is offered.” Inquiries to Arizona politicos didn’t turn up anything either, with some telling the outlet that “they could only scratch their heads” and were befuddled by the site’s background.

    There’s nothing wrong with a local political blog supporting Ward’s campaign, or with Ward’s team touting a friendly endorsement on her campaign website and social media. But political campaigns are notoriously overcautious about what they post on social media. Campaigns don’t normally highlight an endorsement from an entity no one has heard of, especially one that launched just a few weeks prior. Politico noted that Ward denied any knowledge of the site on Facebook. Given that, there are two obvious questions: Is the Arizona Monitor a phony news site meant to fool voters on Kelli Ward’s behalf? And if so, who exactly is paying for it?

    We may never know who was behind the Arizona Monitor; the site crumbled quickly after coming under scrutiny. It initially posted an article defending itself, but as I was writing this, the website was deleted, along with its Twitter and Facebook pages. Local political blogs don’t generally operate this way; they relish being attacked by larger media outlets (the posture the Arizona Monitor initially took) and don’t suddenly vanish under scrutiny. Given its hasty exit from the internet, it’s not unreasonable to speculate that the Arizona Monitor was some kind of front.

    There is nothing new about Republican campaigns and committees creating microsites designed to look like local news sites to support their candidates. In 2014, the National Republican Congressional Committee (NRCC) created a series of phony sites meant to mimic local news outlets. The sites included a disclaimer at the bottom but otherwise gave no indication that they were the product of a Republican campaign committee. An NRCC spokesperson at the time called the tactic a “new and effective way to disseminate information to voters.” And last year, the Republican Governors Association (RGA) tried its hand at running its own microsite disguised as a news site. As Media Matters senior fellow Matt Gertz noted at the time:

    FreeTelegraph.com resembles any of a host of hyperpartisan conservative websites that purport to share news. The website’s home page and articles emphasize social media sharing buttons and large photos; the pieces are brief and feature block quotes from other sources instead of original reporting or commentary. But while most right-wing hyperpartisan sites feature pieces supporting President Donald Trump and savaging his foes, FreeTelegraph.com employs a single-minded focus, with every article aiming to praise a Republican governor or gubernatorial candidate or criticize a Democratic one, with a particular focus on GOP targets in Virginia (24 articles), Connecticut (13), and Rhode Island (11).

    The website is still active.

    In Maine, the state Democratic Party recently filed a complaint with the state’s ethics agency alleging that the Maine Examiner, an anonymously owned news site covering Maine politics, made illegal expenditures in a local mayoral race and might have coordinated with the Maine Republican Party.

    More recently, Politico reported that Rep. Devin Nunes (R-CA), apparently not content to let the NRCC handle his fake news needs, has a phony news site entirely paid for by his campaign committee. The website CARepublican.com, which Nunes refused to discuss with Politico, has a proper, if tiny, disclaimer but no other indication that it is a campaign website rather than an actual local news site or blog.

    But my personal favorite phony political news proprietor is GOP political consultant Dan Backer, who has turned fake news into a moneymaker for his pro-Trump super PACs, using phony news sites to drive email sign-ups and donations. A BuzzFeed investigation last summer found:

    Along with AAN [American Action News], Backer or his company, DB Capitol Strategies, is listed as the owner of conservative news domains AmericanUpdate.com, TrumpTrainNews.com, and GOPPresidential.com. Two other news sites — Truedaily.news and ICYMInews.com — link out heavily to the Backer-connected web properties, and use the same Google AdSense and Analytics codes as AAN and the three other sites. Truedaily.news and ICYMInews.com are also hosted on the same server as GOPPresidential.com — yet another piece of evidence to suggest they too are part of the network of sites connected to Backer. (The server in question hosts only those three websites.)

    Backer’s operation takes political fake news to a whole new level, combining grassroots digital engagement with clickbait to build lists of supporters his super PACs can message and activate.
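    The cross-referencing BuzzFeed describes -- matching shared Google Analytics and AdSense IDs across domains -- is a technique anyone can reproduce. Below is a minimal, illustrative Python sketch of that general approach; it is not BuzzFeed’s actual methodology, and the domain list is simply the set of sites named in the excerpt above.

        import re
        from collections import defaultdict

        import requests

        # Domains named in the BuzzFeed excerpt above (illustrative list only).
        SITES = [
            "http://americanupdate.com",
            "http://trumptrainnews.com",
            "http://goppresidential.com",
        ]

        # Rough patterns for Google Analytics ("UA-1234567-1") and AdSense ("ca-pub-...") IDs.
        TRACKER_PATTERNS = [
            re.compile(r"UA-\d{4,10}-\d{1,4}"),
            re.compile(r"ca-pub-\d{10,20}"),
        ]

        def extract_tracker_ids(html):
            """Return every Analytics/AdSense ID found in a page's HTML."""
            found = set()
            for pattern in TRACKER_PATTERNS:
                found.update(pattern.findall(html))
            return found

        def group_sites_by_tracker(sites):
            """Map each tracker ID to the set of sites whose front pages embed it."""
            groups = defaultdict(set)
            for url in sites:
                try:
                    html = requests.get(url, timeout=10).text
                except requests.RequestException:
                    continue  # skip sites that are offline or unreachable
                for tracker_id in extract_tracker_ids(html):
                    groups[tracker_id].add(url)
            return groups

        if __name__ == "__main__":
            for tracker_id, urls in group_sites_by_tracker(SITES).items():
                if len(urls) > 1:
                    print(tracker_id, "is shared by:", sorted(urls))

    Shared hosting, the other signal BuzzFeed cites, can be checked in a similar spirit by comparing the IP addresses each domain resolves to.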

    Last week, I wrote about how Trump supporters share the most “junk news” online. Given that, it would seem predictable that Republicans would skip the middleman and just create the content themselves. Even better if they can use said content to raise funds for their political activities.

    But what might work for the Republican Party in the short term is terrible for democracy. A recent Knight Foundation/Gallup survey found that trust in media and judgments about what is or isn’t fake news are increasingly filtered through a partisan lens. Whereas liberals and Democrats get their news from more mainstream media outlets, conservatives increasingly rely only on right-wing and far-right sources. News sites -- run by the GOP, about the GOP -- risk shrinking that filter bubble even further. If this trend continues and phony GOP news sites grow in popularity, conservatives could reach a point where much of the political news they consume comes directly from the Republican Party and its campaign committees.

  • New research shows Trump’s army spreads the most “junk news.” Here’s why it matters

    Illustration: Sarah Wasko / Media Matters

    Our media ecosystem is broken. Americans are continually pummeled online with computational propaganda campaigns, including fake news and manipulated trending topics on Facebook and Twitter. These campaigns drive political conversation from social media feeds to cable news to the White House, but there’s been little acknowledgment of this reality in mainstream political coverage.

    Two academic studies, one recent and one from last year, give us a good sense of how social media manipulation plays out online. This week, the Oxford Internet Institute’s Computational Propaganda Project released a study that illustrates the disconnect in American political discourse. The study analyzed “junk news” (the term researchers use for fake news and other kinds of misinformation) shared on Twitter and Facebook in the three months leading up to President Donald Trump’s first State of the Union address. It found that on Twitter, Trump supporters shared links from 95 percent of the “junk news” websites the researchers had identified for their sample, accounting for 55 percent of junk news traffic in the sample. Other audiences also shared links from these sites, but at a much lower rate. On Facebook, the far-right pages the researchers collectively called the “Hard Conservative Group” shared links from 91 percent of the “junk news” sites, accounting for 58 percent of the sample’s total junk news traffic.

    The study’s conclusion about the overall American political conversation online is worth highlighting: “The two main political parties, Democrats, and Republicans prefer different sources of political news, with limited overlap. For instance, the Democratic Party shows high levels of engagement with mainstream media sources and the Republican Party with Conservative Media Groups.” This echoes last year’s Harvard Berkman Klein Center study of traditional and social media coverage leading up to the 2016 election, whose authors found that whereas liberals and Democrats get their news from mainstream media outlets ideologically structured from the center to the left, conservatives increasingly rely only on right-wing and far-right sources.

    Social media filter bubbles have received a lot of media coverage, but they’re only part of the problem. American political conversation doesn’t just exist in filter bubbles; the influence is lopsided. Right-wing media and social media influence both mainstream media and, by extension, the liberal filter bubble (because liberals consume more mainstream news). But the reverse isn’t true.

    Media coverage of #ReleaseTheMemo is a prime example of how this manipulation plays out. Information warfare expert Molly McKew wrote a detailed analysis of the computational propaganda campaign that pushed the hashtag to go viral on social media, detailing how #ReleaseTheMemo was a “targeted, 11-day information operation” amplified by both Russian trolls and American Trump supporters to “change both public perceptions and the behavior of American lawmakers.” McKew noted that this kind of campaign, part of a far-right echo chamber, is “not just about information, but about changing behavior,” and that it can be “surprisingly effective.” But Playbook, Politico’s premier political news product, mentioned the article almost in passing the day after its release, in some ways proving McKew’s point. Despite the fact that Playbook had covered the #ReleaseTheMemo campaign often in the previous week, McKew’s article was mentioned far down in Sunday’s edition of the newsletter, below a recap of Saturday Night Live’s political sketches.

    Screenshot: Politico Playbook newsletter

    Computational propaganda is now a standard practice in political communications. Despite the growing body of research studying the phenomenon, media coverage rarely acknowledges the role computational propaganda plays in shaping American political conversation. This disconnect is troubling when you consider how often trending topics on social media drive political media coverage.

    As the Oxford study shows, Trump and his army of supporters online are in the driver’s seat. What we see as trending on social media often isn’t organic but the result of sophisticated amplification campaigns, which are part of a far-right echo chamber. The goal of computational propaganda is to manipulate public opinion and behavior. Covering politics in this environment requires both a working knowledge of computational propaganda and a duty to explain to readers when political interest is driven by social media manipulation.

  • Russian trolls moved 340,000 Americans up the ladder of engagement

    Illustration: Sarah Wasko / Media Matters

    Last night, The Washington Post revealed that Russian trolls “got tens of thousands of Americans to RSVP” to local political events on Facebook. We’ve known since last September that Russian trolls employed this tactic and often created dueling events at the same location and time, probably to incite violence or increase tension within local communities. But only now are we learning the scale of that engagement. Per the Post, “Russian operatives used Facebook to publicize 129 phony event announcements during the 2016 presidential campaign, drawing the attention of nearly 340,000 users -- many of whom said they were planning to attend.”

    The new information comes via the Senate intelligence committee, which has been investigating potential Russian collusion in the 2016 U.S. elections and pressuring tech companies, especially Facebook, Twitter, and Google, to disclose more of what they know about just how much propaganda Americans saw on their platforms. Both Twitter and Facebook have agreed to let users know if they were exposed, but given that we’re still learning the scale of the operation, I’m skeptical that anyone knows how many Americans were exposed to Russian propaganda, or how often. (If you’d like to check for yourself, I helped create a site that lets anyone gauge the likelihood that they were exposed on Facebook.)

    By now most Americans accept that Russian propaganda appeared on their social media feeds in 2016. What concerns me is whether they believe that they themselves were susceptible to it. The fact that nearly 340,000 people RSVP’d to events created by Russian trolls -- that they moved up the ladder of engagement from consuming content to RSVPing to an event -- should make us all reconsider our own vulnerability, especially given that many of these events were created to sow discord. Russia’s goal is to destabilize U.S. democracy. Stoking racial, cultural, and political tensions in local communities across the U.S. by creating events on Facebook is a cheap and effective way for Russian trolls to do this.

    Russia’s use of social media to disseminate propaganda and stoke political tension is an ongoing problem. Last fall, Sens. Richard Burr (R-NC) and Mark Warner (D-VA), leaders of the Senate intelligence committee, issued a bipartisan warning that Russian trolls would continue their efforts to sow chaos into the 2018 midterm and 2020 presidential elections. A ThinkProgress article on the now-defunct website BlackMattersUS.com illustrates how sophisticated propaganda operations can use content, online campaigns, offline events, and relationships with local activists to develop trust and credibility online. And as the successful dueling events demonstrate, all Americans, no matter their political persuasion, are susceptible to these influence operations.

    As Recode Executive Editor Kara Swisher pointed out on MSNBC today, we’re in an “ongoing war.” There’s no easy way to tell whether the content we see on our social media feeds comes from Russian trolls or other hostile actors, and there’s no media literacy course or easily available resource that can teach individuals how to identify propaganda. That’s why regulation that protects consumers -- such as stricter disclosure of political ads and safeguards against fraud -- is so vital to solving this problem, especially as tech companies have proven reluctant to make any real changes beyond what public pressure demands of them.

  • For Zuck's sake

    Mark Zuckerberg has been sharing a lot this month. First, he posted that his “personal challenge” for 2018 is to fix the glaring problems for which he’s taken so much heat. Last week, he announced that he had directed Facebook’s product teams to change their focus from “helping you find relevant content to helping you have more meaningful social interactions.” Zuckerberg promised users that they’d see less content from “businesses, brands and media” and more content from “your friends, family and groups.” On Friday, Zuckerberg shared another major change: Facebook would improve the news that does get shared by crowdsourcing which news sources are and aren’t trustworthy via user surveys.

    The first change, a return to “meaningful interaction,” is one I can get behind. I’m all for anything that discourages fake news sites from monetizing on Facebook. I’ve long suspected that part of why these sites took hold in the first place was a lack of meaningful content available on our feeds. Less sponsored content and more pictures and videos from family and friends will greatly improve my Facebook experience. I suspect I’m not the only one.

    I’m also hopeful this change will move digital advocacy away from broadcasting and back to organizing. Given how crucial Facebook groups have become to #TheResistance, I’m glad to hear they’ll be emphasized. I want to see more groups like Pantsuit Nation and the many local Indivisible groups that have formed in the last year. (Media outlets, fear not: Vox has also been building Facebook groups in addition to its pages.) Digital ads and acquisition shouldn’t be the only tools digital organizers use. Increased engagement should mean actually engaging folks rather than simply broadcasting to them.

    The second change, user surveys to determine what news people trust, is maddening. If you were going to design a system that could be easily gamed, this is how you’d do it. “Freeping” online polls and surveys is a longstanding tactic of the far right, going back nearly 20 years. It’s in their online DNA, and they have groups of activists at the ready who live for this activity. Facebook isn’t handing authority over to its broader community but to an engaged group of users with an agenda. And even if the freeping weren’t inevitable, it’s well established that there’s no common ground on which news sources people with different political viewpoints trust.

    The crux of the problem is that Facebook desperately wants to be seen as a neutral platform, while its users want it to keep inaccurate information off the site. In his New Year’s post, Zuckerberg emphasized his belief that technology “can be a decentralizing force that puts more power in people’s hands” while acknowledging that the reality might be the opposite. There’s a tension between his core beliefs and what Facebook users currently expect from the company. My sense is that this tension is a driving force behind the attempt to pass the buck back to us.

    Facebook will only go as far as its users push it, especially in the U.S., where government regulation will be minimal. If we want Facebook to take responsibility, we have to continually hold the company accountable when things go wrong or when proposed solutions don’t go far enough. Mark Zuckerberg’s personal challenge is to fix what’s broken. Ours is to keep pressing him in the right direction.

    This piece was originally published as part of Melissa Ryan's Ctrl Alt Right Delete newsletter -- subscribe here