Social media algorithms ‘introduce terrorists to like-minded terrorists’: Ex-FBI official on the rise of white nationalism
Southerners rally for secession, photo via the League of the South Facebook page.

The former assistant director for counterintelligence at the Federal Bureau of Investigation explained the role of tech companies in spreading white nationalism during the Trump era in an appearance on MSNBC with the Rev. Al Sharpton.

"If there's been one constructive thing to come out of Trump era, it's the current focus on white supremacy as a national and global security threat as racist violence from Kentucky to Christchurch, New Zealand, seems to have galvanized everyone but the Republican party," Sharpton explained.

"And our commander-in-chief, who continues to deny the threat despite warnings from the highest levels of law enforcement, recently downgrading the federal response to domestic terror while continuing to push for a racist border wall that would not have stopped a young woman from being run over and killed by neo-nazis in Virginia in 2017, those 12 Jewish worshippers from being shot to death by a racist in Pittsburgh last year, or most recently, those three black churches in Louisiana that authorities suspect were burned down by the white son of a local sheriff's deputy," he continued.

"In other words, folks, the enemy is already here and getting help from the top," Sharpton concluded.

For analysis, the host was joined by former FBI Assistant Director Frank Figliuzzi.

"Frank, what can government do and what should they do around making sure that the tech companies do not continue to be an avenue for this kind of hate that is turning into actual violence with an increase in the numbers of victims?" Sharpton asked.

"Yeah, it's time for Congress to not just cross their fingers and hope for the best out of Silicon Valley, because look, they've proven they're unable to handle this without some adult supervision," Figliuzzi replied. "They're driven by profit and quite frankly they're driven by their own algorithms."

"You and I on social media, we like the idea they're suggesting people we should friend or we might want to think about joining this group," he explained. "Well, guess what, those same algorithms apply for people who are involved in hate and hate speech and we're finding on the international terrorism side that the algorithms actually introduce terrorists to like-minded terrorists."

"And so Congress needs to get everybody together and say the algorithms need to be tweaked so that there's a filter for hate, violence and hate speech and then Congress needs ask law enforcement, do you have what you need to get the job done? Right now, the answer is no," Figliuzzi concluded.