US lawmaker says tech companies must quickly remove violent content after New Zealand shooting
Relatives of a member of the Bangladeshi community wait for news at a community centre in Christchurch, New Zealand, March 17, 2019. REUTERS/Jorge Silva

Following the live-streaming on social media of the mass shooting in New Zealand, the chair of the U.S. House Committee on Homeland Security wrote a letter to top executives of four major technology companies urging them to do a better job of removing violent political content.

In a letter dated Monday and released on Tuesday, Representative Bennie Thompson urged the chief executives of Facebook, Twitter, Microsoft and Alphabet’s Google, which owns YouTube, to more swiftly remove content that could spawn political extremism.

The letter follows the fatal shootings of 50 worshippers in two mosques in Christchurch last week. The shooter, a suspected white supremacist, live-streamed the killings on social media, where the footage was widely shared.

“Your companies must prioritize responding to these toxic and violent ideologies with resources and attention,” Thompson wrote. “If you are unwilling to do so, Congress must consider policies to ensure that terrorist content is not distributed on your platforms, including by studying the examples being set by other countries.

“The video was widely available on your platforms well after the attack, despite calls from New Zealand authorities to take these videos down,” he wrote.

Facebook said it removed 1.5 million videos showing the attack in the first 24 hours after it occurred.

Thompson also asked the companies for a briefing on the matter.

A Facebook spokesman said the company “will brief the committee soon.” Google, Twitter and Microsoft did not immediately respond to requests for comment.

Senator Ron Wyden, an Oregon Democrat who has been critical of Facebook for privacy lapses, said on Tuesday that the government should tread carefully in reining in tech companies for fear of aiding dictators and other bad actors.

Wyden warned against revoking the protections in Section 230 of the Communications Decency Act, which specifies that tech companies are not responsible for what users say on their platforms.

“If politicians want to restrict the First Amendment or eliminate the tools with which much of the world communicates in real time, they should understand they are also taking away the tools that bear witness to government brutality, war crimes, corporate lawlessness and incidents of racial bias,” Wyden said in a statement.

The Electronic Frontier Foundation (EFF), a nonprofit that advocates for civil liberties in the digital world, cautioned policymakers last week not to rush to regulate speech on online platforms, warning that hasty rules could “disproportionately silence” the most vulnerable users, such as Egyptian journalist Wael Abbas, who was kicked off YouTube for posting videos on police brutality.

EFF also called for guidelines that urge social platforms to be transparent about how many posts and accounts they remove, and give users notice and a chance to appeal if one of their posts is taken down.

(Reporting by Diane Bartz; Additional reporting by David Shepardson and Sarah Lynch; editing by Bill Berkrot)