Facebook censorship bias detailed in internal documents

Facebook’s censorship policies are skewed to protect “groups” such as white men from hate speech, but not “subsets” such as black children, according to ProPublica, which published a report Wednesday based on a review of Facebook’s internal documents.

Say what?

Under Facebook’s rules, these categories are protected: sex, religious affiliation, national origin, gender identity, race, ethnicity, sexual orientation and serious disability or disease. These are not: social class, continental origin, appearance, age, occupation, political ideology, religions and countries. A group defined by two protected categories, such as race and sex (white men), is itself protected. “Black children,” though, combines a protected category (race) with an unprotected one (age), so it is a subset that isn’t protected.
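To make that intersection rule concrete, here is a minimal Python sketch of the logic as the report describes it. The category lists come from the paragraph above; the function and variable names are illustrative assumptions, not Facebook’s actual code.

```python
# A minimal sketch (hypothetical, not Facebook's implementation) of the
# intersection rule ProPublica describes: a group is protected only if
# every trait that defines it belongs to a protected category.

PROTECTED = {"sex", "religious affiliation", "national origin",
             "gender identity", "race", "ethnicity",
             "sexual orientation", "serious disability or disease"}

UNPROTECTED = {"social class", "continental origin", "appearance", "age",
               "occupation", "political ideology", "religions", "countries"}

def is_protected(traits: set) -> bool:
    """A group is protected only if all of its defining traits are."""
    return all(t in PROTECTED for t in traits)

# "White men" = race + sex: both protected, so the group is protected.
print(is_protected({"race", "sex"}))  # True

# "Black children" = race + age: age is unprotected, so the subset is not.
print(is_protected({"race", "age"}))  # False
```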

“The policies do not always lead to perfect outcomes,” Monika Bickert, head of global policy management at Facebook, told ProPublica.

Then there are the quasi-protected categories, such as migrants. Facebook’s rules, which its content reviewers must follow, call for migrants to be protected from calls for violence against them, but not from calls for their exclusion.

From ProPublica: “According to one document, migrants can be referred to as ‘filthy’ but not called ‘filth.’ They cannot be likened to filth or disease ‘when the comparison is in the noun form,’ the document explains.”
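Continuing the hypothetical sketch above, the quasi-protected tier might be modeled like this. Again, the names and attack labels are assumptions for illustration, not terms from Facebook’s documents.

```python
# Hypothetical sketch of the quasi-protected tier the report describes:
# only the severest category of attack is removed for these groups.
QUASI_PROTECTED = {"migrants"}
REMOVED_ATTACKS = {"call for violence"}  # calls for exclusion are tolerated

def remove_post(group: str, attack: str) -> bool:
    """True if a post attacking a quasi-protected group should come down."""
    return group in QUASI_PROTECTED and attack in REMOVED_ATTACKS

print(remove_post("migrants", "call for violence"))   # True: taken down
print(remove_post("migrants", "call for exclusion"))  # False: left up
```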

The rules have led to dramatically different standards for censorship, as cited in the report: A Republican congressman’s call to hunt and kill “radicalized” Muslims last month was allowed to remain on Facebook. Meanwhile, a Black Lives Matter activist’s statement that “all white people are racist” was removed and her account was disabled for seven days, ProPublica reported.

When reached for comment Thursday, a Facebook spokeswoman referred SiliconBeat to a blog post by Richard Allan, Facebook’s vice president of policy for Europe, the Middle East and Africa, which was published the day before the ProPublica report.

The post, titled “Hard Questions: Hate Speech,” says “there is no universally accepted answer for when something crosses the line.”

Allan also repeated a point Facebook has made often lately as it grapples with issues beyond hate speech, such as suicide videos and other violent content on its platform: the company has 4,500 people on its community operations team worldwide and is adding 3,000 more.

Facebook’s censorship practices have sparked questions about bias before. Earlier this year, nearly 80 civil rights groups and others accused the company of racial bias over its censorship of “users of color or Facebook’s interactions with law enforcement.” The same groups had asked Facebook last year to be more transparent about what it chooses to censor.

Facebook’s “noncommittal” response to the groups’ request last year prompted them to ask to meet with company representatives, but the company did not respond, Malkia Cyril, executive director of the Center for Media Justice, told SiliconBeat in an email Thursday. Facebook similarly did not respond to a petition urging it to change its censorship policies, Cyril said.

“Now we know why,” Cyril said. “Racial discrimination is built into the structure of how they manage content, and it harms communities of color using the platform. It’s time for a change.”

But how would such a change come about? The ProPublica report shows the policies Facebook relies on were first put into place in the mid-to-late 2000s by Chris Kelly, the company’s former general counsel, then expanded in 2013 by Dave Willner, who is now head of community policy at Airbnb.

“There is no path that makes people happy,” Willner told ProPublica. “All the rules are mildly upsetting.”

But they’re more upsetting to some than others, obviously. (Might Facebook’s policy team need some diverse voices?)

Facebook needs “thoughtful collaboration from racial justice partners in a number of countries that can ensure human rights with regard to race, national origin, and religion among others, are respected and rooted in reality,” said Cyril, who added that the Center for Media Justice will contact Facebook to follow up on the ProPublica report.

With Facebook’s huge role and reach, some people feel they have no choice but to use it regardless of its skewed policies: They’re at its mercy.

Wired reports that the activists mentioned in the ProPublica piece — the ones who have been suspended or seen their posts taken down — have not left Facebook.

“Leaving Facebook is always an option — if you don’t mind leaving the biggest audience on Earth behind, not to mention the platform where most of your friends are,” J.M. Berger, a fellow at the International Centre for Counter-Terrorism-The Hague, told Wired. “It’s a very tough tradeoff, particularly for activists like those discussed in the story, who need access to the biggest possible audience to do their work.”

Photo by Karen Bleier/AFP/Getty Images