Facebook has announced it is updating its policies on “harmful and hateful” content after a campaign complained that the social network was allowing jokes and other offensive comments about rape and domestic abuse.
Facebook vice president of global public policy Marne Levine said the social network would “complete our review and update the guidelines” around hate speech and seek comments from legal experts, women’s organizations and other groups “that have historically faced discrimination.”
“In recent days, it has become clear that our systems to identify and remove hate speech have failed to work as effectively as we would like, particularly around issues of gender-based hate,” Levine’s statement said Tuesday.
“In some cases, content is not being removed as quickly as we want. In other cases, content that should be removed has not been or has been evaluated using outdated criteria… We need to do better — and we will.”
Facebook announced the change a week after the launch of a campaign organized by the activist group Women, Action & The Media, which claimed the social network “has long allowed content endorsing violence against women.”
“They claim that these pages fall under the ‘humor’ part of their guidelines, or are expressions of ‘free speech,’” the group said in launching the campaign.
After the Facebook announcement, the organization hailed the action.
“We are reaching an international tipping point in attitudes towards rape and violence against women,” said Jaclyn Friedman, executive director of the group.
“We hope that this effort stands as a testament to the power of collaborative action.”