
How Hate-Filled Groups Incite Violence From the Extreme Corners of the Internet


People pay their respects on Aug. 6, 2019, at the makeshift memorial for victims of the Aug. 3 shooting that left a total of 22 people dead at the Cielo Vista Mall Walmart in El Paso, Texas.  (Mark Ralston/AFP/Getty Images)

Note: This article’s visual assets contain offensive language.

Mass shootings have become all too common in America, with attacks now occurring weekly, if not daily. In the wake of Gilroy, Dayton and El Paso, it’s worth noting that the frequency of these attacks was predicted a year ago by a man who studies hate speech on social media.

Joel Finkelstein, director of the Network Contagion Research Institute, says the attacks may seem random, but they’re not. “They’re not coming out of nowhere,” he said.

Finkelstein’s institute uses machine learning tools to identify, track and expose hate speech online, drawing from mainstream as well as extremist communities. More often than not, the typically young, white, male gunmen reference language seen on platforms like 4chan, 8chan and Gab.
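The institute has not published its code, but at its simplest, this kind of pipeline amounts to a text classifier trained on labeled posts. The sketch below is a minimal, hypothetical illustration using scikit-learn; the sample posts and labels are invented, and the real system is far more sophisticated.

```python
# Minimal sketch of a hate-speech classifier, assuming labeled training
# posts are available. The data here is invented for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled examples: 1 = flagged as hateful, 0 = benign.
posts = [
    "example of a slur-laden extremist post",
    "ordinary discussion of the news",
    "another post using extremist in-group language",
    "benign comment about sports",
]
labels = [1, 0, 1, 0]

# TF-IDF features over word unigrams and bigrams feed a linear classifier.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), min_df=1),
    LogisticRegression(),
)
model.fit(posts, labels)

# Score new posts; a higher probability means more likely hateful.
print(model.predict_proba(["a new post to screen"])[0][1])
```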

Detail of a word cloud based on more than 50 million comments on 4chan through October 2018. The comments were pulled from Politically Incorrect, 4chan’s board for discussing and debating politics and current events. (Courtesy of Network Contagion Research Institute)
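A word cloud like the one above can be produced with off-the-shelf tools. The sketch below is a minimal example using the Python wordcloud library, with a stand-in list of comments rather than the institute’s actual 4chan corpus.

```python
# Minimal sketch of word-cloud generation from a corpus of comments.
# The comment list is a stand-in; the real corpus had 50 million+ posts.
from wordcloud import WordCloud, STOPWORDS

comments = [
    "first example comment from the board",
    "second example comment about current events",
]

# Concatenate the corpus and let WordCloud size words by frequency.
text = " ".join(comments)
cloud = WordCloud(width=1200, height=600, stopwords=STOPWORDS).generate(text)
cloud.to_file("wordcloud.png")  # writes the rendered image to disk
```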

Finkelstein said these platforms need to be held legally accountable, perhaps by the families of those killed in mass shootings, who could sue platforms over their willingness to host conversations inciting readers to violence.

Recall what Matthew Prince, CEO of Cloudflare, wrote as he announced that the San Francisco-based web infrastructure company terminated its support services for 8chan earlier this week: “The rationale is simple: they have proven themselves to be lawless and that lawlessness has caused multiple tragic deaths.”

8chan has been unreachable since Cloudflare cut its ties. But 4chan, home of the Politically Incorrect message board from which the words in the graphic above were pulled, is still up and running. Should that platform be taken down, it’s not difficult to imagine another entity, perhaps overseas, hosting something like 16chan.

“This isn’t state-sponsored terror. It’s platform-sponsored terror,” said Finkelstein. “Instead of going deep, indoctrinating people, investing the energy, they’re doing like Twitter. They’re going wide. People that fall into it, they don’t have to come in with a lot of convictions. They just have to be lost, and they have to find meaning in these terrible ideologies.”

As he said in conversation with the Anti-Defamation League earlier this year, “This is what AI (artificial intelligence) learned, not from studying the murders, but from studying their communities. [The shooters are] downloading the language of their stated motives for murder. What does that sound like, guys? That sounds like ISIS.”

In the case of North American white supremacists, the short list of target groups includes Jews, Muslims, African-Americans and Latinos.

Take the relatively modern conspiracy theory that Jews are somehow organizing or promoting nonwhite immigration to Western Europe and the United States, thus threatening to effect “white genocide.” The corollary to that conversation in the U.S. is hatred toward Latinos, the demographic most represented in the debate over illegal immigration.


Finkelstein argues that impressionable, anti-social readers of hate speech quickly learn there’s a clear path to action that will give them a sense of self-importance in front of peers they may never know by their real names.

“They begin to train one another as to how to become more expertly anti-social. Now you have a race to the bottom. Who can say the edgiest, craziest thing?” he said. “Now, someone goes out and actually commits something. That then causes the entire community to rally, to celebrate. They can’t stop thinking about these horrible things to do. Eventually, that feeds an impulse to actually do the thing.”

Finkelstein is building an argument that a platform that makes psychopathy its focus is really psychopathic itself. “A platform can be a psychopath. Their business plan is murder,” he said.

So what can be legally done? It depends, said Craig Fair, assistant special agent in charge of the San Francisco division of the FBI.

“The internet is a vast, vast wilderness of opinions, dogma, data points, billboards where people can post virtually anything they want. The U.S. government has to be very selective about not only where we go, but lawfully, where we can go, and what we can watch,” Fair said. “Anything that is considered free speech that does not contain threats of violence is simply something we are not going to monitor on an ongoing basis.”


Fair said Silicon Valley companies can help the FBI identify where they are legally able to tread. “For example, social media companies, when they see communications that go beyond what they assess would be First Amendment-protected activities, report that to law enforcement,” he said. “It gives law enforcement the opportunity to step in and actually neutralize that threat before anything bad can happen.”

Despite hiring tens of thousands of contract workers to screen out hate speech, mainstream social media companies like Facebook and Twitter still struggle to keep offensive material off their platforms.

But the more that mainstream platforms screen, the more people who want to read hate speech move to platforms that will allow it, like 8chan. For that matter, the more that U.S.-based support services like Cloudflare shut down platforms like 8chan, the more likely it is that purveyors of hate speech will pop up where U.S. law enforcement and others cannot reach them.

And that’s why, Finkelstein predicts, the U.S. will increasingly look like a war zone. “Where is the corporate responsibility, if it’s not in the platform? If it’s not in the platform, then nobody has responsibility,” Finkelstein said. “That’s the problem. That’s why we can’t police this.”

KQED’s Polly Stryker contributed to this report.
