Efforts to take down the 8chan website where a racist "manifesto" was posted shortly before the El Paso shooting highlight the legal and ethical difficulties in curbing online hate speech that foments violence.
The digital security firm Cloudflare said Sunday it was terminating its services to 8chan, making it more difficult for the message board to remain online.
But hours later, an 8chan administrator said the service was migrating to BitMitigate, which calls itself "a non discriminatory provider" of security "that operates in the fullest consistency to free speech."
8chan is the latest forum to raise questions about how to police the internet without curbing digital rights or free speech.
Mark Potok, a senior fellow at the Centre for Analysis of the Radical Right, said it is entirely appropriate for private web hosting and security providers to shut down sites like 8chan.
The site "is a cesspool of people egging each other on to all kinds of violence, not only violence against non-whites, but violence against women and more," Potok said.
"The private companies hosting these websites have every moral obligation to shut them down."
Potok added that a more proactive effort could be made to monitor sites like 8chan promoting violence.
"Law enforcement should be allowed to look at 8chan and other venues like that without violating people's rights," he said.
8chan, which promotes itself as a site devoted to the "darkest reaches of the internet," appeared to be offline Monday, but posted a message on Twitter saying "there might be some downtime in the next 24-48 hours while we find a solution."
- Responsibility to filter? -
The deadly El Paso shooting has prompted fresh calls for online firms to step up efforts to weed out calls to violence.
"Technology platforms have a responsibility to filter out extremist groups that are inciting violence," said Darrell West, director of the Center for Technology Innovation at the Brookings Institution.
"That is not a freedom of speech issue because people do not have the right to encourage others to use violence. It is very damaging for society to allow people to engage in violence, hate speech, and actions that endanger other individuals," West said.
President Donald Trump said Monday that the internet "has provided a dangerous avenue to radicalize disturbed minds and perform demented acts" and added that "we must shine light on the dark recesses of the internet and stop mass murders before they start."
But Karen Kornbluh, head of the German Marshall Fund's Digital Innovation and Democracy Initiative, said it is often difficult for firms to determine when to remove content.
"These decisions are very difficult for companies. They are rightly reluctant to take down speech," Kornbluh said.
Kornbluh said one way for companies to deal with incitement to violence would be to report any likely criminal activity to law enforcement authorities.
"Ironically the companies would remove some of the controversy if they would clarify up front that they have zero tolerance policies toward this illegal activity and will report it to the FBI where they see it, just as they report child pornography," she said.
- 'Lawless' platform -
Cloudflare chief executive Matthew Prince defended the decision to cut off 8chan, describing the site as "lawless" and responsible for "tragic deaths."
"Even if 8chan may not have violated the letter of the law in refusing to moderate their hate-filled community, they have created an environment that revels in violating its spirit," Prince said in a blog post.
Shortly before the El Paso mass shooting on Saturday, the suspect named by the media as Patrick Crusius, who is white, was believed to have posted a racist "manifesto" on 8chan that includes passages railing against the "Hispanic invasion" of Texas.
The author praised the Christchurch mosque attacks in New Zealand, which were also announced on 8chan in a racist manifesto allegedly posted by the perpetrator of that massacre.
Kate Klonick, a St. John's University professor specializing in internet law and online speech, said services such as Cloudflare may not be best placed to make the "hard content moderation decisions" that social networks are struggling with.
Klonick tweeted a link to her 2017 opinion article pointing out the problems with certain "pipeline" firms making content decisions.
"What if Cloudflare started suspending service for a political candidate that its chief executive didn't like?" she wrote.
"The people who run these companies are not elected officials, yet we still expect them to safeguard our basic liberties while also meeting our cultural expectations."