Facebook said Monday it would take new steps to eliminate content promoting white nationalism and white separatism after an external audit said its efforts were "too narrow."
The audit, led by civil rights attorney Laura Murphy, found that Facebook's policy did not go far enough because it barred only "explicit praise, support, or representation" of these ideologies.
"As a result, content that would cause the same harm is permitted to remain on the platform," said the audit team led by Murphy, formerly of the American Civil Liberties Union.
Responding to the civil rights audit begun last year, Facebook chief operating officer Sheryl Sandberg said the company would seek to tighten its policy on this kind of hate speech.
The social network has been battered by criticism that it was more focused on growth than protecting users or thwarting deception, bullying and harassment.
"Today's report recommends we go further to include content that supports white nationalist ideology even if the terms 'white nationalism' and 'white separatism' aren't explicitly used," Sandberg said in a statement.
"We're addressing this by identifying hate slogans and symbols connected to white nationalism and white separatism to better enforce our policy."
The report was the second by Murphy, who began a civil rights audit in May 2018 at the request of the leading social network, which has more than two billion users.
"While this audit report shows Facebook is making progress, there is still much more work to do -- and Facebook acknowledges this," Murphy said.
"As the company's work in this area continues, I want this effort to be even more robust and deeply embedded in the company's DNA."
The auditors pointed out that Facebook has been plagued by "persistent enforcement errors" in its efforts to remove hate speech.
"For example, sometimes users post photos that would otherwise violate Facebook's hate speech policies, but accompany those photos with captions to indicate they are not embracing the hateful content but instead are calling attention to racism or hate," the auditors said.
"Facebook's investigation revealed that its content review system does not always place sufficient emphasis on captions and context... More explicitly prompting reviewers to consider whether the user was condemning or discussing hate speech, rather than espousing it, may reduce errors."
On a related matter, Facebook said it would ramp up efforts to thwart any attempt to manipulate the 2020 US census, treating the population survey as if it were an election.
Sandberg said Facebook has made it a priority to prevent manipulation in elections and census counts, as noted in the audit.
"We're building a team dedicated to these census efforts and introducing a new policy in the fall that protects against misinformation related to the census," she said.
"We'll enforce it using artificial intelligence. We'll also partner with non-partisan groups to help promote proactive participation in the census."