After months of advocacy from civil rights groups, this morning Facebook announced that it will conduct a civil rights audit of its platform. Via Axios:
The civil rights audit will be guided by Laura Murphy, a national civil liberties and civil rights leader. Murphy will take feedback from civil rights groups, like The Leadership Conference on Civil and Human Rights, and advise Facebook on the best path forward.
Calls for a civil rights audit, led by Muslim Advocates and Color of Change, along with the Center for Media Justice, the NAACP Legal Defense and Educational Fund, the Southern Poverty Law Center, and the Leadership Conference on Civil and Human Rights, have been ongoing. As the Color of Change petition notes, “Through their data malpractice, opaque interactions with law enforcement, erasure of Black activist voices, and inability and unwillingness to tackle the rise of white supremacist organizing and hate speech on the platform, Facebook's failures have put our communities at risk.”
The civil rights audit is a win. After months of pressure, Facebook has agreed to examine how its platform has been weaponized to spread hate and harm underrepresented communities and people of color. If Facebook takes the audit seriously and implements its recommendations, the platform could change for the better, making Facebook a safer space and creating a better overall user experience for all communities.
Facebook’s announcement of a legal civil rights audit is the direct result of extraordinary advocacy by @mediajustice, @ColorOfChange, @MuslimAdvocates, @NAACP_LDF, @civilrightsorg and so many others.
— malkia a. cyril (@culturejedi) May 2, 2018
But alongside this welcome announcement from Facebook came a not-so-welcome one. Also from Axios:
To address allegations of bias, Facebook is bringing in two outside advisors — one to conduct a legal audit of its impact on underrepresented communities and communities of color, and another to advise the company on potential bias against conservative voices.
The conservative bias advising partnership will be led by former Arizona Republican Sen. Jon Kyl, along with his team at Covington and Burling, a Washington law firm.
The civil rights audit isn’t partisan. Hate speech, safety, and privacy aren’t political issues, but moral issues. Yet by announcing the audit at the same time as a “conservative bias advisory partnership” (presumably to address a claim that has already been debunked), Facebook is conflating the two. It suggests that the audit is meant to address criticism from the political left, not the problems underrepresented communities face on the platform every single day.
“However, this is just a first step. We are concerned that with their appointment of the Heritage Foundation to investigate issues of liberal bias, @Facebook is playing into party politics and detracting from the real issue" @rashadrobinson
— ColorOfChange.org (@ColorOfChange) May 2, 2018
Muslim Advocates addresses this concern in its response:
We are concerned, however, about the pairing of this announcement with another that Facebook will be bringing on advisors responsible for determining bias against conservatives. We strongly reject the message this sends regarding the moral equivalency of hate group activities and conservative viewpoints. Despite this concern, we hope this first step is a sign that Facebook will begin to take responsibility for the hate and bigotry that has flourished on its platform.
Safety online isn’t partisan. Facebook’s users should be able to expect that their civil rights won’t be violated while using the platform. Pairing the civil rights audit with a partisan panel on supposed conservative bias suggests that Facebook doesn’t take civil rights seriously, and instead views them as a partisan grievance to be appeased.
This is not acceptable. Facebook must fully commit to ensuring its users are safe online and that their rights aren’t violated. Today’s announcement was a big misstep. It creates the impression that Facebook sees the audit as a political issue and not a moral one. The ball is in Facebook’s court to correct this.