Joel Finkelstein offers an instructive case study in how ideas form, take shape and spread — notions both good and bad.
It starts in 2016 at Princeton University, where Finkelstein was pursuing a doctorate in neuroscience and psychology. The son of a rabbi, he grew up in Tyler, Texas, and was seeking a contemplative, respected career as a university professor immersed in the theoretical research of the human mind.
Then the world intervened.
Finkelstein started noticing that a growing number of his friends, from both ends of the political spectrum, “were going crazy,” as he put it. “Intelligent, reasonable people who I respected … were suddenly embracing hateful identity politics. They were becoming really exclusionary.”
It was not an isolated fluke, he thought; it was happening at a time when the sociopolitical climate in America and beyond was in dramatic upheaval. “I started thinking about it: Trump had just won the presidency. Brexit had just passed. Where did that come from?”
There was one common thread: “I had this hunch that social media had something to do with it,” he said.
The question became consuming and began dominating his conversations with a handful of friends in neuroscience and engineering. They decided to explore their premise, diving into “some very nasty web communities that weren’t understood,” Finkelstein said.
Cryptic code words and memes
By 2018, they had created their own computer platform, and had compiled massive amounts of online data generated by extremist groups. They created an artificial intelligence model to dissect and analyze that data. Along the way, they discovered that cryptic code words and memes were an essential tool in the incendiary propagation of spurious conspiracy theories. Their program could identify virtual outbreaks of disinformation in nearly real time, like an epidemiologist tracking a deadly virus.
“Most researchers only have data on one platform, or one niche topic,” according to Chris Meserole, director of research and policy at the Artificial Intelligence and Emerging Technology Initiative. “By contrast, NCRI has a dataset that spans multiple platforms and periods and can track memes as they evolve over time and spread across different social media. At a technical level, the platform they’ve built is genuinely impressive.”
Their research started painting an alarming picture.
“We saw the writing on the wall,” he said. “We saw things leap from being individuals becoming radicalized to entire militias becoming radicalized … driven by an apocalyptic myth.” His original suspicion was confirmed: “Social media was creating our reality. …
“So I changed careers,” he said. “We don’t live in the world I was planning any more.”
In May 2018, after “just barely” earning his PhD in psychology, he established the Network Contagion Research Institute (NCRI) as a nonprofit organization.
“There is no glory in this job,” he said. “But I believe it’s important work.”
His new path would soon lead him to several high-profile collaborators including John Farmer Jr., a former New Jersey attorney general who is now director of Rutgers University’s Eagleton Institute of Politics, as well as of its Miller Center for Community Protection and Resilience. Farmer, the former lead counsel for the 9/11 Commission, had been studying extremist violence for years. Last year, Farmer joined the NCRI leadership team and brought Finkelstein to the Miller Center on fellowship. The partnership set NCRI on a new, higher trajectory. Since then, NCRI has become prolific in its research, publishing seven reports. Some of them broke new ground by identifying viral outbreaks of online hate that were leading to violence. They repeatedly warned that the vitriol was reaching critical mass.
Then Jan. 6 happened.
Pittsburgh synagogue shooting
NCRI’s first study in 2018 attracted some national attention. It “warned of a near doubling of antisemitism, white supremacy, and other forms of ethnic hate boiling to the surface.” Collecting and analyzing hundreds of millions of web comments and tens of millions of images from popular white supremacist web communities, it revealed how fringe online hate platforms — particularly Gab and 4chan — were surging and spilling over into mainstream media.
One recurrent Gab user, using the handle onedingo, was writing and sharing an incessant stream of white supremacy theories and apocalyptic posts. His name was Robert Bowers.
Just two weeks after the NCRI report, Bowers stormed into the Tree of Life synagogue in Pittsburgh, armed with an assault rifle and three handguns. Shouting “All Jews must die,” Bowers killed 11 people in the deadliest attack against Jews in American history.
A week earlier, he had reposted a message that Western civilization is “headed towards certain extinction within the next 200 years and we’re not even aware of it.”
Finally, just before the attack, Bowers posted: “I can’t sit by and watch my people get slaughtered. … I’m going in.”
NCRI’s warning had come unmistakably true: the online fantasies of white-supremacist bigotry and hate turned into real-life murder.
The Pittsburgh massacre was part of a spreading “epidemic,” Finkelstein said in an interview the next day. It was being spread by a loose online “organization that no one understands, no one knows how to police, and there’s not a civil mechanism to deal with it.”
Analyzing huge amounts of data
As the fledgling NCRI team continued amassing and analyzing huge amounts of data in 2019, Finkelstein was recruiting a notable, and eclectic, team of collaborators and investors. The group is nonpartisan and its investors include the Anti-Defamation League, George Soros’ Open Society Foundations and the Charles Koch Foundation.
A dinner meeting in October 2019 at Princeton University would change NCRI’s orbit.
There, Finkelstein found himself sitting across from Gen. John R. Allen, president of the Brookings Institution and a retired four-star general and former commander of the NATO and U.S. forces in Afghanistan. Finkelstein told Allen what he was doing, and Allen was instantly intrigued, Finkelstein said. It would turn out that it was Allen who “opened all the doors for me” going forward.
In turn, Allen suggested to Paul Goldenberg, a colleague on the U.S. Homeland Security Advisory Council, that he should hear Finkelstein out. Goldenberg, also a fellow at Rutgers’ Miller Center, did just that and introduced him to its director, Farmer, who almost immediately formed a partnership with NCRI.
In July, NCRI published a report that revealed the dangers embedded in the codes of the Boogaloo Bois, who were now conducting kidnapping exercises. In September, they showed how law enforcement was becoming the target of hate groups. In December, they showed how the QAnon conspiracy movement had doubled in size and become increasingly militarized.
Detecting emerging threats
“We’ve developed a means of detecting emerging threats that we believe is very reliable,” Finkelstein said.
Last year, Finkelstein also landed former Republican Rep. Denver Riggleman of Virginia, a former Air Force intelligence officer and longtime data analytics expert. Riggleman caught Finkelstein’s eye when he teamed up with New Jersey Rep. Tom Malinowski (D-7th) to sponsor and pass a resolution condemning QAnon.
“This is the guy we were waiting for,” Finkelstein said. “A data intelligence expert who gets it.”
In a tweet on Aug. 25, Riggleman announced he had introduced the resolution. He immediately received 10,500 comments in response, most of them negative and from QAnon conspiracists. “They were calling me a pedophile, a deep-state actor; they were sending me pictures of nooses — stuff like that. … We know it was a coordinated activity.”
Finkelstein and a handful of colleagues met Riggleman at a Washington restaurant in September and gave him a full briefing. Riggleman had already read their reports and was impressed by their analytical acumen. “They spoke the right language,” Riggleman said.
Finkelstein asked him to be NCRI’s chief strategist. Riggleman accepted the offer to be an unpaid adviser on the spot.
Finkelstein also landed another “significant” investor last year in David Magerman.
Magerman is famous not just for the millions of dollars he made developing computer-driven algorithms at Renaissance Technologies, the hedge fund whose machine learning programs proved better than humans at trading stocks. Magerman’s life also changed dramatically in 2016, just as Finkelstein’s had.
That is when he had a very public falling-out with his boss and mentor, Robert Mercer. It was then Magerman learned that Mercer was one of Donald Trump’s biggest donors and had also invested in Breitbart News and the notorious Cambridge Analytica, the political consulting firm that played a major role in both the Trump and Brexit campaigns in 2016.
Magerman was repulsed, said as much to The Wall Street Journal, and was soon fired.
Today, Magerman is still making hefty sums of money as a venture capitalist, but he has also become a steadfast anti-surveillance advocate and philanthropist. Magerman says he wants to kill the cash cow of the internet economy — data collection — and replace it with something less profitable and better for society. He seeks a very different internet that would provide more privacy, data protection and transparency.
Magerman wants to create the “inverse of Cambridge Analytica,” Finkelstein said, and therein lies the core of the conflict between fact and fiction.
Facebook, Google and surveillance capitalism
A handful of social media companies have made trillions of dollars in ad revenues on what has become known as surveillance capitalism. It is a business model in which digital companies including Facebook and Google monitor their users’ every virtual move, mining that data so that their artificial intelligence algorithms can refine it into lucrative profiles of personality traits and predilections that are then sold to advertisers.
“In terms of selling goods, these algorithms are very effective but the consequences have been really devastating,” Farmer says. Facebook CEO “Mark Zuckerberg and others have made a lot of money at a great social cost to the country, and to the world for that matter.”
While social media companies are selling products, fanatics on the far right and the far left are usurping their tools to sell us something much more sinister.
Alex Goldenberg (no relation to Paul), NCRI’s lead intelligence analyst, saw this first-hand in 2019 when he and his colleagues were tracking an emerging trend in the online world of extremism, “the boogaloo” — an anti-government, pro-gun crusade that was moving from fringe to mainstream. The boogaloo meme, they discovered, was actually white supremacist code for a coming second Civil War. Its adherents were now openly preaching violence.
Goldenberg was following some of its members on Facebook. Soon, a Facebook computer algorithm started sending him recommendations for new boogaloo pages to visit — increasingly more violent than the ones he had chosen to follow.
“People say boogaloo was organized on Facebook, but actually it was Facebook that organized boogaloo,” Goldenberg said.
Lies more likely to be shared than truth
A 2018 MIT study also showed how conspiracy theories make their way from fringe internet chat groups to mainstream America. False news, the study found, travels six times faster than real news on Twitter. And those lies were 70% more likely than the truth to be shared.
Here is one example of this phenomenon, as reported by The New York Times:
At 1:51 p.m. on Jan. 6, a relatively obscure right-wing radio host tweeted his own speculation about rioters who had just breached the U.S. Capitol: “Antifa or BLM or other insurgents could be doing it disguised as Trump supporters … Come on, man, have you never heard of psy-ops?”
His tweet caught the attention of another conservative radio pundit, who was sitting in on Rush Limbaugh’s national radio program. He repeated the theory on air. Then it exploded on Twitter, and then Fox News. Florida Rep. Matt Gaetz, standing in the ruins of the Capitol insurrection, announced many rioters “were members of the violent terrorist group Antifa.”
The assertion has been repeatedly debunked by federal authorities, most recently by FBI Director Christopher Wray. But it is still embraced by many Americans. More than half of Trump voters said that the riot was “mostly an Antifa-inspired attack,” according to a recent USA Today poll.
Finkelstein is well aware that today’s social media platforms are merely the latest repository for mass-produced conspiracy theories steeped in racist accusations. He points out that more than 500 years ago, The Witch Hammer (the Malleus Maleficarum) was produced on the new, transformational Gutenberg printing press. The inflammatory book, which espoused the evils of satanic sorcery and how to eradicate it, was reprinted 20 times. At one point it became second only to the Bible in popularity.
Today, Big Tech is not about to give up its lucrative business model, Finkelstein added.
“There is no one to hold the platforms accountable. … they’re allowed to harbor seditious extremists,” he said. “The problem is our government can’t control that without becoming places like China or Russia. … Democracies that can’t control platforms end up being controlled by them.”
Tomorrow, Part 3: ‘They were inevitable’
Major funding for Exploring Hate has been provided by the Sylvia A. and Simon B. Poyta Programming Endowment to Fight Antisemitism, The Peter G. Peterson and Joan Ganz Cooney Fund and Patti Askwith Kenner.