Here's how YouTube's algorithm promotes extremist content -- and Google could easily fix it

Google could easily fix one of the problems that quickly lead some users to extremist content online -- but the tech giant has so far shown little interest.


Right-wing extremists have learned to exploit YouTube's algorithm to get their videos recommended alongside less extreme content, and some of the engineers who designed it worry that the system reinforces fringe views, The Daily Beast reported.

“People think it’s suggesting the most relevant, this thing that’s very specialized for you. That’s not the case,” said Guillaume Chaslot, who worked on the team that developed the algorithm. “The goal of the algorithm is really to keep you online the longest.”

Some former employees say that emphasis on watch time filters out opposing viewpoints, which then helps to reinforce fringe ideas.

“I realized really fast that YouTube’s recommendation was putting people into filter bubbles,” Chaslot said. “There was no way out. If a person was into flat Earth conspiracies, it was bad for watch-time to recommend anti-flat Earth videos, so it won’t even recommend them.”
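Chaslot's flat Earth example can be made concrete with a toy sketch. The code below is not YouTube's actual system; the video titles and predicted watch times are invented assumptions, and real recommenders use far richer signals. It only illustrates the dynamic he describes: when candidates are ranked purely by predicted watch time, videos that challenge a viewer's current interest, and are therefore predicted to hold them for less time, never make the cut.

```python
# Toy illustration (not YouTube's code) of a watch-time-only ranker producing a
# filter bubble: the debunking video is predicted to be clicked away quickly, so
# it never surfaces, no matter how relevant or accurate it is.
# All titles and predicted-watch-time numbers are invented for this example.

from dataclasses import dataclass

@dataclass
class Candidate:
    title: str
    predicted_watch_minutes: float  # hypothetical output of a watch-time model

def recommend(candidates: list[Candidate], k: int = 3) -> list[Candidate]:
    """Rank purely by predicted watch time and keep the top k."""
    return sorted(candidates, key=lambda c: c.predicted_watch_minutes, reverse=True)[:k]

candidates = [
    Candidate("Flat Earth 'proof' compilation", 14.0),
    Candidate("More flat Earth clips", 11.5),
    Candidate("Why the Earth is round (debunk)", 2.0),  # predicted to be abandoned fast
    Candidate("Unrelated cooking video", 3.5),
]

for video in recommend(candidates):
    print(video.title)
# Only the watch-time-friendly videos survive; the counter-viewpoint video is never shown.
```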

Matt, a former right-winger who asked that his last name be withheld, told The Daily Beast that his indoctrination began with a video of Bill Maher and Ben Affleck discussing Islam, which led him to an extreme anti-Islam video by Infowars conspiracy theorist Paul Joseph Watson -- and from there even further down the rabbit hole.

“Delve into [Watson’s] channel and start finding his anti-immigration stuff which often in turn leads people to become more sympathetic to ethno-nationalist politics,” Matt said.

The same scenario plays out in the gaming and atheism communities, where opposition to feminism leads young men toward extreme hostility to so-called “social justice warriors.”

“I think the anti-SJW stuff appeals to young white guys who feel like they're losing their status for lack of a better term,” Matt said. “They see that minorities are advocating for their own rights, and this makes them uncomfortable so they try and fight against it.”

Andrew, another former right-winger, told The Daily Beast that YouTube videos persuaded him that social justice causes were a plot against him -- and that helped him embrace extremism.

“Once you've gotten someone to believe that, you can actually go all the way to white supremacy fairly quickly,” he said.

Chaslot, the former YouTube engineer, said he suggested to the company that users should be allowed to opt out of the recommendation algorithm -- but he said Google was not interested.

Google’s chief executive officer, Sundar Pichai, was questioned last week during a congressional hearing about problematic recommendations, but he made no promise to do anything about the algorithm.

“It’s an area we acknowledge there’s more work to be done, and we’ll definitely continue doing that,” Pichai told lawmakers. “But I want to acknowledge there is more work to be done. With our growth comes more responsibility. And we are committed to doing better as we invest more in this area.”