Facebook failed to protect consumers from Cambridge Analytica. Only systemic changes can prevent that from happening again.

50 million reasons to be mad at Facebook

Blog ››› MELISSA RYAN


Sarah Wasko / Media Matters

Tech companies have repeatedly failed to protect the consumers who use their platforms, and despite the outrage that erupts each time news of another failure breaks, remarkably little has been done to fix the problem. Consumers have been left largely on their own to deal with fake news, predatory political ads, and data breaches, without assistance from companies, governments, or other institutions. We’re dealing with systemic failures of the social media ecosystem, but the solutions on offer largely call on individuals to sort out their online experience for themselves.

This past weekend, a series of stories broke that illustrate just how colossal those failures are. On Friday, Facebook abruptly announced that it had banned Cambridge Analytica, the firm that did data targeting for Donald Trump’s presidential campaign, from using the platform for “violating its policies around data collection and retention,” as The Verge described it. On Saturday, The New York Times and The Observer broke the story Facebook was clearly trying to get ahead of: Cambridge Analytica had illegally obtained and exploited the Facebook data of 50 million users in multiple countries.

Via The New York Times:

The firm had secured a $15 million investment from Robert Mercer, the wealthy Republican donor, and wooed his political adviser, Stephen K. Bannon, with the promise of tools that could identify the personalities of American voters and influence their behavior. But it did not have the data to make its new products work.

So the firm harvested private information from the Facebook profiles of more than 50 million users without their permission, according to former Cambridge employees, associates and documents, making it one of the largest data leaks in the social network’s history. The breach allowed the company to exploit the private social media activity of a huge swath of the American electorate, developing techniques that underpinned its work on President Trump’s campaign in 2016.

Carole Cadwalladr of The Observer worked with whistleblower Christopher Wylie for over a year to expose Cambridge Analytica’s practices and Facebook’s complicity in allowing them:

Wylie oversaw what may have been the first critical breach. Aged 24, while studying for a PhD in fashion trend forecasting, he came up with a plan to harvest the Facebook profiles of millions of people in the US, and to use their private and personal information to create sophisticated psychological and political profiles. And then target them with political ads designed to work on their particular psychological makeup.

“We ‘broke’ Facebook,” he says.

And he did it on behalf of his new boss, Steve Bannon.

“Is it fair to say you ‘hacked’ Facebook?” I ask him one night. He hesitates. “I’ll point out that I assumed it was entirely legal and above board.”

It’s particularly troubling that this stolen data was used in a political campaign. Cambridge Analytica has long had a reputation for being “shady”; during the 2016 Republican primaries, many GOP consultants complained about the company’s practices and methodology. Even before this week’s revelations, Democratic data consultants had speculated that Cambridge Analytica would have had to steal data in order to do the work its team bragged about. Even the Trump campaign, despite having Cambridge Analytica staff embedded in its headquarters, attempted to deny that the company had done what it claimed: used psychographic profiling to help Trump win.

More troubling is the connection to Russia. In 2014, Chris Wylie was asked to help Cambridge Analytica prepare a pitch to Vagit Alekperov, a Russian oligarch and the CEO of Lukoil. “It didn’t make any sense to me,” he told The Guardian. “I didn’t understand either the email or the pitch presentation we did. Why would a Russian oil company want to target information on American voters?” The eventual presentation “focused on election disruption techniques,” The Guardian reported. “The first slide illustrates how a ‘rumour campaign’ spread fear in the 2007 Nigerian election – in which the company worked – by spreading the idea that the ‘election would be rigged’. The final slide, branded with Lukoil’s logo and that of SCL Group and SCL Elections, headlines its ‘deliverables’: ‘psychographic messaging.’”

An illegal data breach. Russian oligarchs. Psychographic profiling to manipulate voters. Social media is breaking democracy, aided by companies with shady practices and politicians who have turned a blind eye. By not disclosing the leak and by allowing Cambridge Analytica to continue using its platform, Facebook failed us. By not asking more questions and not considering regulations much earlier, political leaders on two continents failed us as well. What’s a social media user supposed to do? And remember, this is to say nothing of similar commercial practices on Facebook.

The only recourse we consumers have is to demand systemic changes. Tech companies must feel more pressure from us. Governments and regulatory bodies must be similarly pressured to force tech companies, through regulation and legislation, to protect consumers. We need more citizens like Parsons professor David Carroll, who is mounting a legal effort against Cambridge Analytica, to explore the potential of lawsuits.

We have 50 million reasons to be mad at Facebook. If that anger can be turned into action, the potential exists to create a global consumer movement on a scale never seen before. Social media is broken, but with the right amount of pressure we can force the tech giants, starting with Facebook, to fix themselves.
