Calls are growing for heavier restrictions on social media platforms after a white supremacist live-streamed his shooting spree in Buffalo, New York, on Saturday, killing 10 people and wounding three others. While the video was removed from Twitch within minutes, platforms such as Twitter and Facebook allowed it to circulate for days and gain over a million views. The 18-year-old shooter was radicalized through online forums such as 4chan, according to a racist screed he authored. “What we are dealing with is the backend business models that are creating a structure where certain things are being able to be profited from, certain things travel differently, and hate-filled content has more of a space to be engaged with,” says Rashad Robinson, president of Color of Change. Color of Change has called for social media platforms to institute changes to their terms of service and urged Twitch to conduct a racial equity audit.
Buffalo Massacre Fuels Push to Regulate Social Media Platforms Where Hate Flourishes
Transcript
This is a rush transcript. Copy may not be in its final form.
AMY GOODMAN: President Biden is visiting Buffalo today to meet with families mourning the victims of Saturday’s massacre, when a white 18-year-old suspect killed 10 people at a supermarket in the heart of Buffalo’s Black community. As families plan funerals, calls for justice are growing. Civil rights lawyer Ben Crump spoke Monday alongside the son and daughter of the oldest victim, 86-year-old Ruth Whitfield.
GARNELL WHITFIELD JR.: There’s nothing we can do that’s going to take away the hurt, take away these tears, take away the pain, take away the hole in our hearts, because part of us is gone, senselessly taken from us by hate.
ROBIN HARRIS: He took away my mother and my best friend. How dare you! How dare you! This needs to be fixed ASAP.
BENJAMIN CRUMP: Amen.
ROBIN HARRIS: Mom, we love you. Thank you.
BENJAMIN CRUMP: You think about this race replacement theory that he talked about in this manifesto. There are people who are pushing this hatred on these young people, indoctrinating their minds to go out and commit violence. I mean, these politicians who are trying to, you know, use fear to inspire their base, to help get cable news ratings, it’s these people who are accomplices to this mass murder. And, Cliff, even though they may not have pulled the trigger, they did load the gun for this young white supremacist. They loaded the gun. And we have to hold them accountable, too.
AMY GOODMAN: So, that’s Ben Crump and, before that, Robin and Garnell Whitfield. They lost Ruth Whitfield, the oldest, at 86, of the victims in the massacre on Saturday. Garnell Whitfield is also the former fire commissioner of Buffalo. The Whitfield family and others may sue Bushmaster, the company that makes the assault-style weapon used in the Buffalo attack, which Crump, Biden and others are calling an act of domestic terrorism.
The alleged gunman told investigators he was filled with hate toward Black people. His online record shows he had been previously studying hate attacks. He posted a 180-page manifesto citing racist replacement theories and had the number 14 stenciled on the barrel of his gun, which is reportedly a reference to a 14-word white supremacist phrase.
He live-streamed the attack on Twitch, which our next guest tweeted about. Color of Change President Rashad Robinson tweeted, “Twitch took down the livestream of [the] white supremacist shooting in Buffalo in under 2 minutes. Meanwhile, the video is now on Facebook with 1.8 million views — and they’re not removing it because it ’doesn’t violate their terms of service,’” Rashad says. Color of Change has called for social media platforms to institute changes to their terms of service in order to keep us safe, saying, quote, “There is a direct through line between the tech industry’s lack of regulation and the anti-Black violence we saw in Buffalo.”
Rashad Robinson, welcome back to Democracy Now!
RASHAD ROBINSON: Thank you for having me.
AMY GOODMAN: If you could start off by responding to what happened? And then, most specifically, because we are seeing massacre after massacre — and, of course, this one is so reminiscent of what happened at Mother Emanuel in South Carolina a few years ago, the targeting of the Black community. We understand that the white supremacist teenager who did this had also considered schools and churches but was afraid of security.
RASHAD ROBINSON: Well, I mean, for all of us, this is both incredibly sad, but it also makes us deeply angry, because of all the things that could be and should have been done to deal with the climate that fuels this type of violence. At the end of the day, a whole incentive structure, a whole profit incentive structure, which has both incentivized the type of content and disinformation and hate-filled rhetoric that we see online — we have watched social media platforms refuse to deal with this, because self-regulated companies are unregulated companies. I have gone before Congress to try to push members of Congress to actually deal with the immunity that these platforms have over this type of content. And it’s not simply about freedom of speech, Amy. This is about what they amplify. This is about the content that they serve up to users as they sign on. This is about all the ways in which their product is designed to get people to spend more time on these platforms, more engagement on these platforms, to be engaged in more hate-filled rhetoric. And at the end of the day, we are watching, we are seeing firsthand the impact of it. We should not have to go to billionaires to beg them to protect our civil rights. Our government should be doing the work to hold corporations accountable. And right now they are not doing it.
And at the end of the day, we live, exist in a climate. It’s not just the social media platforms. It’s also Fox News. It’s also the ways in which the big carriers, cable carriers, the Verizons and Comcasts of the world, pay more money for Fox News than they do for other cable programs. And so, people should have a choice. They should be able to not actually have Fox News. We should be able to drive down the profits of Fox News. But we can’t, because 90% of their money comes actually from carrier services, carrier fees, not from advertisers.
So there’s so much at stake here. But at the end of the day, what folks should recognize is that there is an incentive structure. There is a team. There are players. There are coaches. There are owners. And this killer did not act alone. He is part of a larger network. And until we deal with the incentive structure, we will see more of this.
JUAN GONZÁLEZ: And, Rashad, could you talk about Twitch, the platform that many people are unfamiliar with that enabled the shooter to live-stream this attack? Explain what the platform is and how he used it.
RASHAD ROBINSON: Well, Twitch is a platform that allows creators to post and engage platforms. It allows people to monetize in different ways their creativity. We’ve been actually running a campaign with Black Twitch creators because of all of the sort of hate attacks and the raids that have happened on their chats and on their channels. And what we’ve seen is the inability of Twitch to do anything about it. We’ve been back and forth with Twitch. We’ve demanded racial equity audits of Twitch. We have been engaged to try to push this platform to actually do better. And so, for us, the engagement at Color of Change with Twitch did not start recently. We have been engaged. We have been warning Twitch. We have recognized how their platform was used and how it’s become a home, a safe haven, for the type of hate, for folks that actually know that they can engage in this type of behavior and not be held accountable.
But at the end of the day, these platforms get to do it because they believe that they are protected. And, in essence, they have been protected by a set of laws that, in some ways, give them immunity from legal liability in all sorts of ways. And so, until we actually deal with the incentive structures, until we create more accountability over the algorithms — and I know folks are wondering, like, “This technology is so complicated.” Well, we’ve had complicated things over time. And what folks should recognize, as we think about sort of what does it mean to regulate these companies, what does it mean to hold these companies accountable — our cars are not safe, our seatbelts don’t work, because of the benevolence of the auto industry. They work because there is government infrastructure and regulation. And there are consequences when those things do not work, when standards are not met.
Right now folks can go out to Silicon Valley, call themselves engineers and build all sorts of things without any rules or regulations or accountability for what they build. And so, right now the technology that should be bringing us into the future and bringing us together is dragging us into the past. And it’s doing that not as an accident, but because we have a set of rules, or a lack of rules, that are manufacturing all the things that we’re seeing.
JUAN GONZÁLEZ: But, Rashad, I wanted to ask you: How do you draw the line between being able to regulate these companies and at the same time allow for legitimate dissent and for dissent not to be closed down by these companies at the same time? I’m thinking, for instance, even now with the situation with Elon Musk taking over Twitter, and he’s going in the opposite direction, insisting on more freedom on these platforms.
RASHAD ROBINSON: Well, people use the word “freedom,” and it only means freedom for some, right? If the ability of certain people in our communities to be able to go to a grocery store or to go to a church is diminished because of a climate for profit that is created, that actually doesn’t make us more free.
But, you know, to the extent that — we’re not talking about here about whether or not someone should be able to post things that we may not like or we might find distasteful. What we are talking about is algorithmic amplification. We are talking about how these platforms put energy behind making some type of content travel because that type of content creates more energy. So, an example is, on Facebook, if Facebook tomorrow decided that they were going to just make it so you spend more time looking at your friends and family, even if your friends and family are sometimes sharing things that are distasteful, that would diminish greatly the type of content that is hate-filled. But, in fact, what they are incentivizing is us all being part of groups where people are arguing, because — or if we see a piece of content and maybe engage in it, even if we don’t join that group, it gets served up to us. And what we know is that this is all part of the larger scheme at these platforms to keep us on the platforms longer, to keep us engaged there longer, because it drives up their profits.
And they can do that because, at the end of the day, they don’t actually have accountability. So, now imagine a company that produces meat saying, “You know what? We just want to give consumers more choice, so we’re going to put some meat on the supermarket counters that’s actually safe, and we’re going to put some that, you know, is a couple of months old. And, you know, you pick, and we just want to give people choice.” Those companies would be held accountable. What we are dealing with is not simply, you know, these ideas of what people are posting. What we are dealing with is the backend business models that are creating a structure where certain things are being able to be profited from, certain things travel differently, and hate-filled content has more of a space to be engaged with. And that’s what we are talking about.
If these companies had a level of accountability, you better believe that the innovators in Silicon Valley would figure out how to engage. I think that there is this idea that regulation and accountability stifles innovation. But what we know is that climate innovation has helped in so many ways give us new types of vehicles and new types of ways to actually move from place to place. Innovation can both make us safe, and it can actually spark — or, regulation can make us safe and help us be more innovative. And that’s what we have to focus on. But at the end of the day, none of us should have to rely on Mark Zuckerberg or Twitch, which is owned by Amazon, or all of these companies to actually decide whether or not they’re going to keep us safe and to balance that up against their growth and their profit, because they will always choose their growth and profit over our lives, because they’ve proven it time and time again.