The alt-right’s bullying troll culture has made much of the Internet a dangerous place
“Don’t feed the trolls,” the saying goes, as if it were really that easy. As a prescription for navigating the harassment, hatred and bile that now fester on and darken the Internet and social media, it’s both woefully inadequate and unrealistic advice. Like its closely related partner, “Don’t read the comments,” the suggestion that we all just ignore the toxic venom spewed online by actors who often travel in packs and attack in hordes underestimates the unignorable provocation, emotional trauma and bona fide fear they purposely create and instill. The bunk idea that we can all just look away—or more annoyingly, log off, shut down or shut up—is the quaint, ineffective (and in our current troll-glutted climate, offensive) relic of a bygone era. It’s a holdover from a time when the internet was a kinder, gentler digital space and the trolls who roamed it less malicious monsters than playful pranksters.
The evolution of trolling, like that of the internet itself, has occurred with surprising and unpredictable speed. In the early days of the World Wide Web, trolling took the form of a relatively innocuous—though intrusive and annoying—type of merry pranksterism. Fusion contributor Kristen V. Brown describes 1990s Usenet forums as sites where trolling was considered “a little like a prank phone call”; one 2002 Urban Dictionary entry defines trolls as people who post “deliberately provocative message[s]…with the intention of causing maximum disruption and argument,” while another states that “trolling does not mean just making rude remarks: shouting swear words at someone doesn’t count as trolling…and isn’t funny.”
Examples of this old-school trolling might include posting wantonly obtuse, logically circuitous, mind-blowingly stupid, off-topic or antagonistic messages (often dubbed “flaming”), crafted solely for the purpose of frustrating or otherwise irritating more sincere members of an online community. The idea was that trolls didn’t mean the dumb things they said, though successful trolling—and this is key—required that those they aimed to piss off believe that they did. “Troll” was a label the angry members of an online community imposed on troublemakers, a way to identify and ferret out those hellbent on ruining an otherwise good conversation.
“There would be guys who would go onto the Star Trek newsgroup and say, ‘You know, I think Spock was actually human,’” says Jon Hendren, who has become one of the internet’s most revered trolls based on a series of outrageous stunts, including appearing on a TV news segment about Edward Snowden and instead discussing the plight of Edward Scissorhands. “It would garner these huge lengthy responses from guys who would list every time Spock said something about being Vulcan. You knew the guy who said that [Spock was human] was not being serious, but he also wasn’t breaking any rules. He’s getting his jollies from that—and I think that’s what trolling is to me. It’s where you play within the bounds of an established system…to highlight the absurdity within those systems, hopefully in a funny way.”
4chan and Reddit went live in 2003 and 2005 respectively, and the two sites became epicenters for trollish behavior. Whereas trolling had previously been a thing you were accused of by those who disapproved of your behavior, now “trolls” began to claim the title for themselves, considering it a source of pride.
“4chan was when people really started to take that label on as something that they wore almost as a badge of honor,” says Whitney Phillips, a media folklorist and actual scholar on internet trolls, whose doctoral thesis served as the foundation for 2015’s This Is Why We Can’t Have Nice Things: Mapping the Relationship between Online Trolling and Mainstream Culture. “This was a point of subcultural identification. It was a way to connect with the people who were around you who were engaging in similar behaviors. It was a way of marking an in group, essentially.”
Phillips, who has spent nearly a decade e-staking out the virtual spaces trolls frequent, such as 4chan, Reddit, Facebook and Twitter, notes in her book that there is “every indication that the vast majority of subcultural trolls—certainly the ones I interacted with—are relatively privileged white males for whom English is either a first or second language.” In the mid-aughts, as the particularly white-male-dominated spaces of 4chan and Reddit grew exponentially, trolls began taking up more and more digital space, not just on those sites but across the internet in general. Just as in real life (or IRL), when groupthink and mob mentality are added to the churn of toxic masculinity, sexism, racism, white supremacy and male fragility, the results were predictably ugly. Trolls had long claimed they were in it “for the lulz,” which is sort of like lolz, but at other people’s expense. As the tone of trolling decidedly shifted, the incredibly unfunny invective was often directed at women, African Americans and other people of color, as well as other historically marginalized groups.
In many ways, internet forums in the U.S. offer an unvarnished look at America’s essential character; a transparent account of what this place is really about when you strip away accountability and offer anonymity in its place. On Reddit and 4chan, particularly the latter’s anything-goes /b/ board, extreme racism and vicious misogyny flourished, with cruelty and abuse becoming key traits of the hivemind personality. Administrators at 4chan and Reddit responded to the explosion of hate speech and vitriol by doing absolutely nothing, then followed that up by turning their inaction into a policy of sorts, framing it as vague, overly simplistic advocacy for free speech. That approach quickly succeeded in creating an environment less a site for the unimpeded exchange of ideas than a welcoming haven for white nationalists, anti-Semites and women haters.
Along with the teeming masses of neo-Nazis and misogynists, this attitude attracted creeps whose interests included things like incest, rape and graphic pictures of dead children. One of the most prolific Redditors, a guy who went by Violentacrez until he was outed in 2012 as Texas military dad Michael Brutsch, created and/or moderated subreddits including r/chokeabitch, r/niggerjailbait, r/Hitler, r/jailbait and r/Jewmerica. (Not to wade too deep into the weeds here, but if you can conceive of it, or it’s so horrible you’d desperately like to forget it exists, there’s probably a 4chan board or subreddit dedicated to it.)
Dan McComas, a former higher-up at Reddit until the site’s highly public reshuffling last year, reflected on the direction of trolling since the internet’s nascent days—when the worst thing you saw was “maybe sending somebody a disgusting picture and disguising it”—and what we’re seeing now. “I guess one thing leads to another, but I don’t know,” said McComas, who recently founded Imzy, a social media site focused on creating online communities where abuse and harassment don’t get in the way. “I have a hard time really buying into the slippery slope thing. I think that terrible people exist, and terrible people were given platforms to thrive. Not only to thrive, but to teach people and to influence people. For those things not only to become accepted, but to become encouraged, and that it turned into something huge.”
There are harmless, beloved memes and other internet ephemera that have come from 4chan and Reddit. We have those troll bastions to thank for LOLcats and Rickrolling, and the “hacktivist” collective Anonymous emerged from the masses of 4chan’s anonymous contributors. But in the same way that Reddit (the 11th most visited site in the U.S., with 542 million monthly visitors) can take cat memes and inadvertently turn them into widely shared, culturally embedded viral touchstones, the site is also very good at reflecting and amplifying the kind of hate that’s already pervasive in American society.
As those voices have become ever louder on Reddit, they’ve often drowned out, in volume and by sheer force of will, the voices of others attempting to build communities for themselves. (A kind of “free speech for me, but not for thee,” I guess.) What’s more, many of those trolls have played up their actions as a sort of heroic pushback against “political correctness” (i.e., the idea that the people who have always been able to say whatever abhorrent things they want might now be held accountable for saying those things), which they suggest somehow violates their First Amendment rights. Never mind that the sheer amount of hateful language they freely espouse all over the internet is proof that they’re wrong.
“When people talk about free speech online, they’re often referring to the very cartoon, weak version of what free speech is actually about,” Whitney Phillips told me. “The spirit of free speech is preserving the greatest amount of speech for the greatest amount of people. That’s the ideal. The problem with notions of free speech that privilege the voices of the antagonist, [is that they] create an environment in which there is actually less speech….The fact of the matter is that they’re not championing free speech on the whole. They’re championing their own speech and their own ability to do whatever they want without having to answer to anyone. They feel threatened because it could potentially take away their ability to do whatever they wanted. In defending that, they get the spirit of the thing totally false.”
“I believe in plurality of conversation,” Phillips adds. “I believe in debate. I believe in disagreement. You can’t have that if the biggest assholes have the floor. Rolling your eyes at the free speech defense online, that’s not rolling your eyes at free speech. That’s rolling your eyes at a very myopic, deeply privileged misunderstanding of what free speech even is or is supposed to be.”
A 2014 Daily Dot article recounts how members of the subreddit r/Blackladies were inundated with horrifying messages from racist trolls. Volunteer moderators of the group had to delete those messages one by one, even as new threats and insults flooded in, an experience that left them “exhausted and demoralized.” Likewise, one of the moderators of the subreddit r/rape, a group for survivors of sexual violence, told writer Aaron Sankin, “[W]e regularly get visitors who are sexually aroused by the stories that some of our users tell and often feel the need to inform them of that fact in the most graphic of ways possible. You could imagine how that makes a rape survivor feel, especially those with fresh trauma.” Another r/rape moderator abandoned the task after being endlessly harassed and “doxxed,” meaning her personal information and identity were published online, ensuring the abuse could follow her offline, too. Stories like these have become disturbingly typical.
“Reddit, like any other site, is a culture,” Dan McComas told me. “You can kind of guess what you’re going to see in every Reddit group. You click on a Reddit thread, and if it’s popular you know that the first highly voted comment is going to be a joke playing on the [subreddit name], and the second highest vote is probably going to be something terribly misogynistic. Then the fifth is super racist. That’s just the culture, right? I think this is the same problem you have with Twitter. It’s just a culture. And once something becomes the culture, I don’t think you can change it.”
That culture has long left the confines of places like Reddit and 4chan and infected the entire internet. Read any YouTube comment thread—regardless of the video subject—and you’ll encounter some random bit of misogyny or a racist slur. Anonymous eggs and alt-righties send tweets about the Jewish menace and fake stats about rampaging black criminals, which Donald Trump then retweets. Scroll down news articles written by or about women and people of color and you’re guaranteed commentary from readers with nothing to add but racism and sexist name-calling or worse, threats of violence.
In 2014, women-focused website Jezebel had to publicly call out parent company Gawker (R.I.P.) for refusing to do anything about “violent pornography” being strewn across its comment sections. Earlier this year, the Guardian conducted a study that found “eight of [its] 10 regular writers who got the most abuse from commenters online were women (four white and four non-white) and two were black men. (The 10 regular writers who got the least abuse were all men.)” Over the last couple of years, the race hate and other garbage trolls have left behind have led to shutdowns of comment sections on the Daily Beast, Reuters, Mic, The Verge, Re/code, NPR, The Week, National Journal, the Chicago Sun-Times and Popular Science.
A 2014 Pew Research Center study found that “73 percent of [U.S.-based] adult Internet users have seen someone be harassed in some way online and 40 percent have personally experienced it.” While that indicates the internet is pretty gross all over, women of all races and people of color in particular find themselves in the crosshairs of the worst and most relentless bad actors online. An Australian study this year suggests that online harassment against women may be on its way to becoming “an established norm in our digital society.” The Pew study concluded that while “men are more likely to experience name-calling and embarrassment…young women are particularly vulnerable to sexual harassment and stalking.” That’s the difference between a man online being called a jerk or an asshole, and a woman being told by an anonymous man on Twitter that he is going to “look you up, and when I find you, im going to rape you and remove your head,” as journalist Amanda Hess was. Writer Joel Stein, in a recent piece on trolls for Time Magazine, noted that “nearly half of the women on staff have considered quitting journalism because of hatred they’ve faced online, although none of the men [have].”
Women of color, especially those expressing feminist or womanist ideas online, often find themselves the targets of sickeningly violent threats and harassment. Broadway actress Pia Glenn, who has a healthy Twitter following, told writer Terrell Jermaine Starr how it too often goes for black women whose very existence online trolls resent. “It takes fewer back-and-forth volleys to get to ‘nigger bitch,’ ‘nigger cunt,’” Glenn said. “I’ve had lynching threats. People send me terrible historical pictures of our ancestors being lynched. So proportionately speaking, if you’re not a person of color, you will not get that. Let’s say there are 100 insults in the world, there are more of them that apply to us. When a white woman gets terrible harassment about being raped, attacked or killed, that’s very serious as well. But there’s no way she can get the lynching threats with historical pictures of black people. So there’s a whole other section of ugly, hideous things people feel they can say to us.”
African-American writer Jamie Nesbitt Golden changed her Twitter avatar (though not her bio) to a picture of a random white guy and found that “the number of snarky, condescending tweets dropped off considerably, and discussions on race and gender were less volatile. I had suddenly become reasonable and level-headed. My racial identity no longer clouded my ability to speak thoughtfully, and in good faith. It was like I was a new person. Once I went back to black, it was back to business as usual.”
Similarly, as part of the 2014 Twitter #RaceSwapExp experiment, white blogger Christopher Carbone replaced his avatar with a picture of black feminist writer Feminista Jones, who has said, “There’s not a day that goes by when someone isn’t trolling or harassing me.” Despite the fact that the content of Carbone’s messages remained the same, he wrote that the “level of hateful tweets [he received] went from zero to off the charts,” and noted the experiment allowed him to “experience a little bit of what black women and women of color deal with 24/7/365 in all online spaces: endless trolling, racist and misogynistic hate, tactics that silence and derail, demeaning assaults on their humanity.”
Several recent mass troll attacks have helped drive home the seriousness of what we’re dealing with. In 2014, there was Gamergate, in which an onslaught of trolls began hurling threats so intense at women in gaming that some women were forced to leave their homes and go into hiding; others who publicly opposed Gamergate were targets of swatting, or prank calls that dispatched SWAT police teams to their houses. In that case, the trolls painted themselves as valiant resisters to the tyranny of feminism and SJWs, or social justice warriors. That same year, a 4chan user leaked private, naked photos of famous women including Jill Scott, Jennifer Lawrence and Kate Upton. Alongside the outrage about privacy violations and misogyny, some unhelpfully suggested the women shouldn’t have been so bold as to take nude photos in the first place.
More recently, Leslie Jones, an African-American comedian and cast member on Saturday Night Live, was deluged with racist messages by trolls outraged about this summer’s remake of Ghostbusters, which replaced the original quartet with women, including Jones. Sexist tantrums were on full display across social media from the moment the film was announced, but Jones, the lone black woman among the film’s co-stars, bore the brunt of the ire from trolls who imagined the movie as part of some larger conspiracy to promote #WhiteGenocide and a bunch of other nonsense they envisioned in paranoid fever dreams. Jones was so distraught by the attack—which continued to escalate until her site was hacked, its content replaced with nude photos and racist images—she abandoned Twitter for a period. In a rare move, the platform actually took action, banning the goader-in-chief of Jones’ attackers, alt-right mascot Milo Yiannopoulos.
This is where trollism—or what we now refer to as trolling—has led. On 4chan, Reddit, and other forums where hate was encouraged and allowed to grow unchecked, the alt-right has developed into a loose coalition of (mostly) angry white men who believe and propagate the idea that they are this country’s oppressed class. In this alt-landscape, Latinos and blacks are stealing their rightful spoils; feminism is upsetting the natural men-on-top order of things; Muslim invaders are destroying the country; and an international Jewish conspiracy is trying to wipe out the white race with tools that include miscegenation and homosexuality. Men who aren’t on board with the alt-right’s retrograde cause are labeled “cucks,” a corruption of cuckold (or “cuckservatives” in the case of Republican opponents). (The SPLC notes “the phrase has a racist undertone…implying that establishment conservatives are like white men who allow black men to sleep with their wives.”) They describe revelatory moments—in this warped context, going all in on racism and misogyny—as swallowing the “red pill,” as in the film The Matrix.
“Essentially, you choose the red pill of truth as opposed to the blue pill of delusion,” Richard B. Spencer, head of the National Policy Institute (a very neutral sounding name for a white nationalist think tank), told Vice recently. “That is, the truth about race, the truth about America, the truth about the Jewish influence, the truth about women…These are hard truths, and these are truths that go against the grain of liberal ideology and wishful thinking.”
Jared Taylor, an outspoken white nationalist, has stated that while there are “areas of disagreement” among alt-right adherents, “the central element of the alt-right is the position it takes on race.” If those sentiments seem a lot like the thinking espoused by Donald Trump, then you’ve stumbled onto the reasons why the movement has thrown its vast virtual weight behind the candidate. Subscribers to Reddit boards /r/WhiteRights and /r/The_Donald have increased by leaps and bounds, in some cases rising by tens of thousands month over month. It is not a coincidence this growth has happened as “the number of white nationalists and self-identified Nazi sympathizers on Twitter have multiplied more than 600 percent in the last four years, outperforming the so-called [ISIL] in everything from follower counts to number of daily tweets,” according to a George Washington University study.
Stephen Bannon, Trump’s campaign head and the executive chairman of Breitbart News, has proudly dubbed his publication “the platform for the alt-right.” There’s strength in numbers, and the spread of Trumpism and alt-righties has helped make the internet a difficult place to exist for many.
In recent months, targets of online hate, including New York Times writer Jonathan Weisman (who devoted a column to the anti-Semitism alt-righters and other Trumpites have tweeted at him) and feminist writer Jessica Valenti (whose 5-year-old daughter was threatened with rape by an Instagram troll) have abandoned parts of social media. Journalist Julia Ioffe told the Guardian that she has received calls from people who’ve serenaded her with Hitler speeches, and was the subject of a neo-Nazi website post titled “Empress Melania Attacked by Filthy Russian Kike Julia Ioffe in GQ!” Bethany Mandel wrote about being subject to so much anti-Semitism—she was called a “slimy Jewess” and told she “deserve[s] the oven”—that fear prompted her to buy a gun. Cleveland.com columnist Henry J. Gomez writes that he’s seen a rise in hate mail following Trump’s ascension, with messages suggesting he should be “on the other side of the wall” and that his heritage should “disqualify” him from covering the presidential campaign. Much like the possibility of a Trump presidency, the alt-right’s “trolling” isn’t a joke, but a dangerous reality that taints online and offline lives, which are one and the same.
“The thing that’s so bizarre is this demarcation, IRL, in real life, versus some otherwise place known as the internet,” Phillips told me. “The thing about real life is that it pretty much subsumes everything. It’s not that the line is fuzzy. There is no line, and it makes no sense for there to be a line other than the fact that it’s often used as a post hoc justification for certain people’s terrible behavior. It becomes part of a sort of apology: ‘I didn’t actually hurt your feelings because I said it to you online.’ What the hell does that even mean? It’s just a way for perpetrators to hide behind technologies and language to justify them doing whatever it is they feel like doing that the rest of us apparently have to deal with.”
“Just because something happens in an online space doesn’t mean that it isn’t fundamentally connected to that person’s embodied identity and experience,” Phillips added. “Of course it is. You can’t go online if you don’t have a body.”
When trolls patronizingly suggest their targets become thicker skinned, avert their eyes from the torrents of abuse or simply step away from the computer, they’re attempting to diminish the very real consequences their bad—in some cases and states, criminal—behavior has on real people.
“It doesn’t make any sense,” Phillips told me. “That you can just choose not to react emotionally and maybe if you weren’t so emotional then you wouldn’t be having these problems, so stop complaining. This is ultimately about you being too emotional. Think about the preponderance of this abuse that’s targeted specifically toward women and queer people and people of color. It’s very easy or comparatively easy for a white dude to be like, ‘Well then, just don’t take offense to racism.’ It becomes a mechanism of controlling, trying to police those sort of emotive boundaries of groups that have very real and embodied reasons for getting pissed off when they have to deal with certain kinds of content.”
In light of all this, it’s hard to make the connection between the trolling of yore and today’s online harassment and bullying. The remnants of what trolling once was—its more lighthearted past—raise the question of whether we’re doing more damage by using that word to describe behaviors that are far from playful.
“That’s the reason I don’t like the words ‘troll’ and ‘bully,’” Anita Sarkeesian, a media critic who was targeted during Gamergate, told the Guardian. “It’s just a scary, violent, abusive temper tantrum.”
In a piece titled “Let’s Call ‘Trolling’ What It Really Is,” Whitney Phillips notes the importance of language and emphasizes the danger of misnaming terrible behaviors in a way that might minimize their threat. “If an online space is overrun by violently misogynist expression,” she writes, “then I call the behavior violent misogyny, regardless of how the aggressors might describe it.”
“It’s framed exclusively in terms of a kind of game,” she told me. “It’s a game that the troll always wins, and if you play it, by definition then you lose. That means that if you go ahead and ‘take the bait,’ you essentially have brought that upon yourself. You should know better that nothing on the internet should be taken seriously. You should know better that people are behaving playfully and that when they do they are not responsible for their own actions. They are teaching you a lesson. You should be thanking them.”
By now, the word trolling itself is almost certainly lost to those who might do their absolute worst under its auspices. Perhaps the more pressing question is how to talk about trolling—meaning harassment, stalking and abuse—without serving as a recruitment tool. The alt-right relishes every scrap of media attention it receives; like Trump, the philosophy seems to be that there’s no such thing as bad press. It’s critical to call attention to the dangerous rhetoric being put out by Trump and his followers. But it’s a Catch-22, with each story—including this one—helping to draw eyes and raise the profile of a steadily growing movement.
“The alt-right wouldn’t exist in the way that it does and it wouldn’t have the visibility that it does now if outlets were not engaging—in most cases to explicitly condemn,” Phillips said. “Very few people other than participants are like, ‘Go, alt-right.’ Most of the coverage has been explicitly negative, but it doesn’t matter, they are harnessing [the media’s] labor…It’s the same thing with the Leslie Jones case. I can guarantee you that part of the game was to try to get journalists to repeat and retweet that imagery. That’s the end goal, is that you harness journalists in the service of your own nefarious ends and then you step back and laugh because now you’ve reached millions more people than you ever could have on your own.”
That problem, of elevation and amplification, is an issue not just for the media, but for any public figure who inadvertently offers exposure to a wider audience. In August, after Hillary Clinton delivered a speech attacking the alt-right, the online response from those who count themselves among the movement’s members, captured in a piece by Wired’s Issie Lapowsky, was downright celebratory. “Hillary just seeded #altright and #whitegenocide in the same speech. Everything is going according to plan….everything,” one tweet read. “I’d like to thank you @HillaryClinton, for all the promotion you are doing for us…A little goes a long way… #AltRightMeans #WhiteGenocide.”
“In a sense, we’ve managed to push white nationalism into a very mainstream position,” one anonymous white nationalist and Trump supporter told Olivia Nuzzi, writing at the Daily Beast. “Trump’s online support has been crucial to his success, I believe, and the fact is that his biggest and most devoted online supporters are white nationalists. Now, we’ve pushed the Overton window. People have adopted our rhetoric, sometimes without even realizing it. We’re setting up for a massive cultural shift.”
He’s right. The question now is how to grapple with the fact that every mention, no matter how negative, helps the movement gain visibility and an odd dimension of legitimacy.
“That’s where the conversation really needs to be,” Phillips concludes, “and [it] raises all kinds of uncomfortable questions for people like you and people like me who engage with this content all day long, and also by engaging with it, we perpetuate it. Even if our intentions are really good, it still means that the narrative seeds continue to be cast. That’s what they’re banking on, and so far we have lived up to their expectations. I don’t see how we could not.”