Last week YouTube removed videos posted by Elliot Rodger, who killed six students in California. Was this the right course of action?
Elliot Rodger's California shooting spree last weekend was at first, for many, a seemingly familiar episode. Disaffected American youth, with access to almost unlimited firepower, goes on indiscriminate rampage, lashing out against the world in general. It's a script we think we know. What changed my view was learning about Rodger's YouTube videos and message board postings, in which he expressed, at length, his anger at women, his frustration at life as an "incel" (involuntary celibate) and feelings of being oppressed, as a man, by feminism.
Rodger was apparently a men's rights activist (MRA, in web parlance). This was something I'd been dimly aware of, as a sort of joke, but it thrives online, as do many other gender and sexual politics movements that are likely to grow in the coming years.
His martyrdom video was certainly hard to watch, and there is a very strong case to be made that it breached YouTube's community guidelines, but I fear that by hiding it from public view, we may end up avoiding learning about the ideology of Rodger and his ilk, leaving ourselves ill-equipped to argue against it effectively. Rodger, like Anders Behring Breivik before him, may have acted alone, but he was part of a nebulous, dangerous movement, connected by the web. We need to keep our eyes firmly on extreme ideologies such as men's rights activism, and their adherents, rather than avert our gaze and hope they'll go away.
Joshua Rozenberg, legal commentator and broadcaster
Elliot Rodger was not a martyr. He was a murderer. He did not have an ideology. He had an illness of the mind – or so it seems. To describe him as a misogynist, as some have done, is offensive to misogyny.
If he was, as you say, part of a "nebulous, dangerous movement, connected by the web", we should try to close down that movement by removing the links that connect its members. Among those links are the videos and postings to which you refer. They may be of use to psychiatrists and law enforcement officers whose job it is to identify potential killers and thwart their ambitions. But a respectable publisher such as YouTube is under no obligation to publish material which, as you acknowledge, is in breach of its guidelines.
Although I have not searched for the material myself, I understand it has been reposted elsewhere. So there is no question of hiding it from public view.
A responsible broadcaster does not show gratuitous images of dead bodies after an explosion. A responsible newspaper does not report graphic details of the sexual assaults alleged against a defendant on trial. This is not censorship. This is an attempt to apply editorial standards of taste, decency and respect for the audience as a whole.
PR Please don't for a moment imagine that by suggesting Rodger saw himself as a martyr for a cause I am somehow legitimising his ideas or actions. I'm not. Martyrdom is one of the worst ideas humanity has ever come up with, and acting for the sake of an ideology doesn't justify anything. But while he was clearly an unwell man, with a sad history of mental illness from childhood, it does not follow that his actions happened in a void, with no ideological motivation.
You say responsible media outlets do not show gratuitous images of dead bodies or run lurid details of sexual assaults. But that is a question of the details and consequences of violence, rather than the motivation, which is what the Rodger videos are about. And besides, the key word there is "gratuitous". This week, at least one newspaper I can think of printed a close-up picture of the corpse of Farzana Iqbal, stoned to death in Lahore. A shocking picture, certainly, but not, I think, a gratuitous one. That's a matter of editorial judgment. Which brings us to the interesting question of whether YouTube can be judged in the same way as traditional "responsible broadcasters". It is certainly not the free-for-all video-sharing arena of some of its rivals, but neither is it making editorial judgments in the traditional sense – that is to say, before broadcast.
JR You are right to say that YouTube does not assess material before it is published. But it does make editorial decisions after publication. Questionable material is flagged by users, assessed by moderators and removed if it breaks the site's rules. YouTube does not permit hate speech. It is not a shock site. "There is zero tolerance of predatory behaviour."
And what's wrong with that? It's how many newspaper websites operate. It's what their users expect. It's sometimes what the law demands.
You have not persuaded me that YouTube should be required to publish material that most people are likely to find deeply offensive. You have certainly not persuaded me that YouTube should publish material that might inspire others to commit appalling crimes.
On the contrary, there is every reason for limiting the exposure we give to these warped manifestos. The more attention we pay to the rantings of people such as Breivik and Rodger, the more likely it is that others suffering from similar mental illness will think that killing people is what you have to do to get your message circulated by responsible publishers. The oxygen of publicity is the last thing we should be giving these people.
PR There is no "requirement" for YouTube to publish anything, and I would not wish there to be one. It is ultimately a privately owned website that has its own standards. That is not the matter of concern here.
What does concern me is a growing belief that with the click of a mouse, we can make the bad things go away. We've seen this with everything from Twitterstorms over controversial newspaper columns to the ongoing debate about opt-in/opt-out pornography filters to the recent European court of justice ruling on the "right to be forgotten".
It is my belief that the apparent ease with which things can be removed, filtered or blocked can sometimes mean that broader discussion is left behind.
On the issue of the "oxygen of publicity", I still believe that the best way to deal with malign or wrongheaded ideas is in open debate. We have to run our society on the foundation that adults can deal with argument and counterargument, and rationally come to their own conclusion. Exposing bad ideas as dangerous nonsense is a far better path for society than keeping them in the dark and allowing them to putrefy and turn poisonous.
JR There is much on which we can agree. As you say, hiding bad things does not make them go away. Public debate between rational adults, such as this exchange, is a good way to explore and perhaps resolve contentious issues. But that does not mean we should give a public platform to hate speech, to individuals who incite others to violence or to those who have simply put themselves beyond the pale of civilised discourse.
I might agree with your conclusion if I believed that deranged individuals could be won over by rational argument. I fear that such people would regard any publication of their views by respectable websites as endorsement rather than condemnation. Sometimes, it's best not to encourage them.
Like you, I am against censorship. But the alternative is not a free-for-all. Not everything that an individual may wish to say will add to the sum of human knowledge or contribute to a debate of general public importance. Not everything an individual may film deserves to be seen.
While I understand your wish to broaden the argument, all we are meant to be discussing is whether YouTube was right to take down a video apparently posted by Elliot Rodger. It has been an enjoyable debate but I, for one, am sure that YouTube was right.
guardian.co.uk © Guardian News and Media 2014