Even after police identified the shooter, the wrong man's name appeared for hours in tweets. On Facebook, it appeared on an official "safety check" page for the Las Vegas shooting, which displayed a post from a site called Alt-Right News. And on Google, top searches linked to sites that said he was the shooter. When you searched his name, a 4chan thread about him was promoted as a top story.
So, why did parts of these hugely powerful companies continue to point to an innocent man?
Bill Hartzer, an expert on search, says Google is constantly crawling the web and picking up new information as it appears. The innocent man went from having hardly any presence online to suddenly having a whole bunch of pages about him.
"Google has not had the time to really vet the search results yet," Hartzer says. "So what they'll do is they will show what they know about this particular name or this particular keyword."
In a statement, Google said the results should not have appeared, but the company will "continue to make algorithmic improvements to prevent this from happening in the future."
One improvement that Greg Sterling thinks Google should make is putting less weight on certain websites, like 4chan. "In this particular context, had they weighted sites that were deemed credible more heavily, you might not have seen that," says Sterling, a contributing editor at Search Engine Land. Put another way, he says, "if news sites ... were given some sort of preference in this context, you might not have seen that."
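To make Sterling's suggestion concrete, here is a minimal, hypothetical sketch of what weighting credible sources more heavily could look like in a ranking step. This is not Google's actual algorithm; the domains, relevance scores and credibility table below are invented purely for illustration.

```python
# Illustrative sketch only -- not Google's real ranking system.
# Each result has a raw relevance score and a source domain; the
# sketch re-ranks results by multiplying relevance by a hand-assigned
# credibility weight, so low-credibility sources (e.g. anonymous
# forums) sink even when their raw relevance is high.

# Hypothetical credibility weights; a real system would derive these
# from many signals rather than a hard-coded table.
CREDIBILITY = {
    "npr.org": 1.0,
    "reuters.com": 1.0,
    "4chan.org": 0.1,
}
DEFAULT_CREDIBILITY = 0.5  # unknown or unvetted sources


def rerank(results):
    """Sort results by relevance scaled by source credibility."""
    def weighted_score(result):
        weight = CREDIBILITY.get(result["domain"], DEFAULT_CREDIBILITY)
        return result["relevance"] * weight
    return sorted(results, key=weighted_score, reverse=True)


if __name__ == "__main__":
    results = [
        {"domain": "4chan.org", "relevance": 0.9},    # fresh but unvetted
        {"domain": "reuters.com", "relevance": 0.7},  # vetted news source
    ]
    for r in rerank(results):
        print(r["domain"])
    # With these weights, reuters.com outranks the 4chan thread
    # despite the thread's higher raw relevance score.
```

In this toy example, a breaking 4chan thread that scores high on raw relevance still falls below an established news source once credibility is factored in, which is roughly the outcome Sterling describes.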
Unfortunately, it seemed like Facebook was giving those same sites credibility. In a statement, Facebook said it was working on a way to fix the issue that caused the fake news to appear. (Disclosure: Facebook pays NPR and other leading news organizations to produce live video streams that run on the site.)
But Sterling says part of the issue with having these companies determine what's news is that they're run by engineers. "For the most part the engineers and the people who are running Google search don't think like journalists," he says. "They think like engineers running a product that's very important."
And then there is the scale of what Google and Facebook do. They are huge. And that's only possible because computers do a lot of the work. Yochai Benkler, a law professor at Harvard, says that at such massive scale, even if there were humans helping out, there would be mistakes.
Benkler says that even if Facebook and Google blocked sites like 4chan, it wouldn't solve the problem. "Tomorrow in another situation like this someone will find some other workaround," Benkler says. "It's not realistic to imagine perfect filtering in real time in moments of such crisis."
But, for the man who spent hours being accused of mass murder, the technical problems at Google and Facebook probably aren't much comfort. And they won't be much comfort to the next person who lands in the crosshairs of fake news.
Copyright 2017 NPR. To see more, visit http://www.npr.org/.