Leaked Documents Show Facebook’s Post-Charlottesville Reckoning with American Nazis

“James Fields did nothing wrong,” the post on Facebook read, referring to the man who drove a car through a crowd protesting against white supremacy in Charlottesville in August 2017, killing one. The post accompanied an article from Squawker.org, a conservative website. In training materials given to its army of moderators, Facebook says the post is an example of content “praising hate crime,” and it and others like it should be removed.

But after Charlottesville, Facebook {snip} pushed to re-educate its moderators about American white supremacists in particular, according to a cache of Facebook documents obtained by Motherboard.

The documents provide more specific insights into how Facebook views and classifies white supremacy and neo-Nazis, and how those views have evolved {snip}.

“Recent incidents in the United States (i.e. Charlottesville) have shown that there is potentially confusion about our hate org policies and the specific hate orgs in specific markets,” a training document for moderators created shortly after the protest, and obtained by Motherboard, reads.

One of the training documents includes a log of when Facebook has modified the material, including adding new examples of hate speech as the network identifies them. {snip}

In January, five months after Charlottesville, Facebook added slides discussing the company’s position on white nationalism, supremacy, and separatism. While Facebook does not allow praise, support, or representation of white supremacy, it does allow those same sorts of positions for white nationalism and separatism, according to one of the slides obtained by Motherboard.

Explaining its motivation, another section of the document reads that nationalism is an “extreme right movement and ideology, but it doesn’t seem to be always associated with racism (at least not explicitly).” Facebook then acknowledges that “In fact, some white nationalists carefully avoid the term supremacy because it has negative connotations.”

{snip}

Another slide asks, “Can you say you’re a racist on Facebook?”

“No,” is the response. “By definition, as a racist, you hate on at least one of our characteristics that are protected.”

Facebook classifies hate groups, individuals, and high profile figures based on “strong, medium, and weak signals,” according to one of the documents focused on hate speech in America. A strong signal would be an individual being a founder or prominent member of a hate organization (or “h8 org,” in Facebook parlance); a medium signal would include using the name or symbol of a banned hate group, or using dehumanizing language against certain groups of people. Facebook sees partnership or some form of alliance with a banned hate organization — including participating in rallies together, of particular relevance to events like Charlottesville — as a weak signal, as it does an individual receiving a guilty verdict for distributing forbidden propaganda material.

{snip}

In its policy clarification document around hate groups in America, Facebook specifically points to the Ku Klux Klan (KKK), United Klans of America, Aryan Nations, and several other groups that are either based in or are popular in the US. Another document, dated April of this year, includes many other white supremacist organizations from around the world, including Atomwaffen Division, a neo-Nazi group linked to several murders in the US. Another document explicitly says that Facebook does not consider every organization the Anti-Defamation League (ADL) flags as a hate group to be one. (In its statement Facebook said “Online extremism can only be tackled with strong partnerships which is why we continue to work closely with academics and organisations, including the Anti-Defamation League, to further develop and refine this process.”)

{snip}

In April, Facebook released a selection of rules for when it takes down content, including hate speech. {snip}

“Our policies against organised hate groups and individuals are longstanding and explicit — we don’t allow these groups to maintain a presence on Facebook because we don’t want to be a platform for hate. Using a combination of technology and people we work aggressively to root out extremist content and hate organisations from our platform,” Facebook added in its statement.