The mystery is deepening over how much Facebook knows about Russian use of the social media platform to inject content and messaging intended to influence voters during the 2016 presidential election.
Earlier this week, Jonathan Albright, the research director at the Tow Center for Digital Journalism at Columbia University Journalism School, discovered that the online investigative pathway he had used to examine six of the 470 pages Facebook said were created by Russian agents had been shut down. As the Washington Post noted, “They also had scrubbed from the Internet nearly everything—thousands of Facebook pages and the related data—that had made the [investigative reporting] work possible.”
“Facebook is cooperating fully with federal investigations and are providing info to the relevant authorities,” a Facebook spokesperson said in a statement, noting that Facebook had modified software to prevent users from accessing older posts.
“We identified and fixed a bug in CrowdTangle [software] that allowed users to see cached information from all inactive Facebook pages. Across all our platforms we have privacy commitments to make all inactive content, that is no longer available, inaccessible.”
Facebook’s quick creation of a firewall blocking reporters like Albright raises more questions than it settles—and is the latest of many responses in which it has sought to downplay and discount its role in facilitating the spread of propaganda in 2016. Immediately after the November election, CEO Mark Zuckerberg said it was “crazy” to suggest that so-called fake news on his platform had influenced the election. In the months since, Facebook has acknowledged that partisans of all stripes used its platform to spread every variety of political messaging, but said only 10 million people had seen the Russian-created content and ad placements.
Albright turned his attention to the few known Russian-created Facebook pages that fell under his larger research project: tracking the most influential political websites and patterns of online interaction in 2016. Six of them were named in various news accounts: Blacktivists, United Muslims of America, Being Patriotic, Heart of Texas, Secured Borders, and LGBT United.
Using Facebook’s tool for advertisers, CrowdTangle, he downloaded the data for these six pages and found a total of 19.1 million “interactions,” which could range from liking a post to adding an emoji to commenting or sharing it. CrowdTangle also said the content on these pages had been “shared” 340 million times, a figure the Washington Post and other national newspapers reported. While some tech writers doubted that figure was accurate—malware can be used to artificially amplify content and inflate its online profile—it cast serious doubt on Facebook’s initial statement that all of the content from the 470 Russian accounts had been shared 10 million times. Needless to say, Albright was dismayed by Facebook’s move.
“This is public interest data,” he told the Post. “This data allowed us to at least reconstruct some of the pieces of the puzzle. Not everything, but it allowed us to make sense of some of this thing.”
This “thing,” as Albright put it, is the way American-based online platforms—not just Facebook, but also Google, YouTube, Twitter, Instagram and others—have been turned into glittering tools to target voters and influence elections in ways political consultants and intelligence agencies couldn’t have imagined 20 years ago. Today’s largest online platforms are computer-driven ecosystems that track every word and post from users to compile detailed dossiers on target audiences, which are sold to anyone willing to pay for them. The buyers don’t just know which political party users belong to; they know their pet peeves. That enabled Russian agents to create and place ads preying on beliefs and prejudices held by Americans of all political persuasions.
Albright and the congressional committees investigating Russia’s role are seeking to understand how the internet is shaping election outcomes and political discourse—or the lack of it. Facebook’s claim that it merely fixed the CrowdTangle “bug” out of privacy concerns—coming from a company that tracks its users so closely—was not very persuasive.
As Rob LeFebvre wrote for the tech website Engadget, “It’s hard not to see this as a convenient excuse to hide tens of millions of potentially damning data, of course, especially as [Facebook] COO Sheryl Sandberg has committed the company to transparency around the fake Russian ads. Social media analysis has become a large part of figuring out what happens in our society, and not allowing access to even ‘taken down’ posts can seem alarming.”
Facebook has been taken to task for its “blind eye to Russia’s manipulation of the social media network,” as a recent New York Times editorial put it. While some tech writers have described the use of its platform in 2016 as its “Frankenstein moment,” creating a monster it could not control, other investigative reporters have noted that overseas governments have used Facebook for their political messaging and propaganda for years.
It may be that investigative reporters like Albright have temporarily lost their access to data that clarifies what happened on Facebook in 2016. But if Facebook’s comment that it is “cooperating fully with federal investigations” is accurate, then the truth will eventually come out. Whether that’s soon enough to understand the landscape before 2018’s campaign season begins is another story. Those contests, starting with primaries, are only months away.