KELLY MCEVERS, HOST:
In the hours after the massacre in Las Vegas, fake news about it started showing up on Google and Facebook. A man was falsely accused of being the shooter. His name bubbled up on a Facebook safety check site and at the top of Google search results. And all of that was automated. NPR's Laura Sydell reports that as these powerful tech companies continue to be a main destination for news, this problem is likely to happen again.
LAURA SYDELL, BYLINE: His name first appeared on a message board on a site called 4chan. 4chan is known as a gathering spot for underground hackers and the alt-right. Everyone who posts is anonymous. And we're not saying the man's name because he's been through enough. Shortly after the shooting, the police announced that a woman named Marilou Danley was a person of interest. She'd been living with the shooter in Nevada. On a message board called /pol/ - Politically Incorrect - someone said it was her ex-husband who was the shooter. His Facebook page indicated he was a liberal, and the far-right trolls on /pol/ went to work to spread the word.
Even after police identified the shooter, the wrong man's name appeared for hours in tweets. On Facebook, it appeared on an official safety check page for the Las Vegas shooting, which displayed a post from a site called Alt-Right News. And on Google, the top searches linked to places that said he was the shooter. When you searched his name, a 4chan thread about him was promoted as a top story. So why did parts of these hugely powerful companies continue to point to a totally innocent man?
Bill Hartzer is an expert on search. He says Google is constantly searching the web and picking up new information as it appears. The innocent man went from hardly having anything online to having a whole bunch of stuff.
BILL HARTZER: Google has not had the time to really vet the search results yet. So what they'll do is they will show what they know about this particular name or this particular keyword.
SYDELL: In a statement, Google said the results should not have appeared. And it will, quote, "continue to make algorithmic improvements to prevent this from happening in the future." One improvement that Greg Sterling thinks Google should make is putting less weight on certain websites, like 4chan. Sterling's a contributing editor at Search Engine Land.
GREG STERLING: In this particular context, had they weighted sites that were deemed credible more heavily, you might not have seen that. So if news sites, for example, were given some sort of preference in this context, you might not have seen that.
SYDELL: Unfortunately, it seemed like Facebook was giving these same sites credibility. In a statement, Facebook said it was working on a way to fix the issue that caused the fake news to appear. But Sterling thinks part of the problem with having these companies determine what's news is that they're run by engineers.
STERLING: For the most part, the engineers and the people who are running Google Search don't think like journalists. They think like engineers running a product that's very important.
SYDELL: And there is this scale of what Google and Facebook do. They're massive. Computers have to do a lot of the work. And with such huge scale, even if there were humans, there would be mistakes, says Yochai Benkler, a law professor at Harvard who studies online news. Benkler thinks if Facebook and Google were to block sites like 4chan, it would not solve the problem.
YOCHAI BENKLER: So tomorrow, in another situation like this, someone will find some other workaround. It's not realistic to imagine perfect filtering in real time in moments of such crisis.
SYDELL: But for the man who spent hours being accused of mass murder, the technical problems at Google and Facebook probably aren't much comfort. And they won't be much comfort for the next person who lands in the crosshairs of fake news. Laura Sydell, NPR News, San Francisco.
(SOUNDBITE OF THE BARR BROTHERS' SONG, "STATIC ORPHANS")