Facebook has been growing its "News Feed Integrity Team" since the 2016 presidential election. (Sam Harnett/KQED)
Every day people post around 1 billion pieces of content to Facebook, some of which could be misleading, manipulative or straight-up malicious. In advance of the midterm elections, Facebook has grown what it's calling its News Feed integrity team to try to curtail misinformation.
At Facebook headquarters in Menlo Park, there is no easy way to identify the News Feed integrity team. It’s a collection of academics, engineers and product managers. They’re scattered among thousands of Facebook employees in the largest open-floor workspace in the world.
Tessa Lyons is a product manager on the News Feed integrity team. That term, “integrity,” is a big buzzword here.
“When we talk about integrity,” Lyons said, “we’re talking about any attempts to abuse our platform in order to create bad experiences for people.”
Lyons said the team is trying to deal with three big problems. “There are problems of bad actors like fake accounts or foreign interference. There are problems of behavior, which is coordinated inauthentic activity, spam. And there’s a problem of content — fake claims, sensationalism, clickbait.”
The first two issues are more straightforward. The team tries to identify and remove fake accounts or posts that don’t comply with Facebook’s community guidelines, which it recently made public.
But the content problem is trickier. Facebook users can post any kind of story they want on the site, and judging the quality of a story requires people at Facebook to make a subjective call.
Lyons showed me a post with a headline that falsely said, “Julia Roberts claims Michelle Obama isn’t fit to clean Melania’s toilet.”
Facebook has an estimated 2 billion monthly active users, and the company doesn't have an editorial staff to review everything that is posted. Instead, an algorithm flags problematic stories, which are then sent to fact-checking organizations like Snopes and PolitiFact. Humans there do research and reporting, which can take three to five days. Meanwhile, millions of people could have shared the story.
Outrageous content sells. Just think about old-school newspaper headlines, said Sarah Sue, another product manager on the News Feed integrity team. “When you see the headline with all caps and the exclamation points,” Sue said, “the only reason that headline exists is to get you to stop and buy the paper.”
Traditional news organizations take on the job of trying to distinguish between fact and fiction. Facebook leaves it up to the users. Part of what the News Feed integrity team does is try to create tools that improve the “news literacy” of Facebook’s users.
“I feel really strongly that we needed to be doubling down on our efforts to help people develop the muscles to identify high-quality news themselves,” Sue said.
Even if Facebook determines a story is filled with false information, it does not remove the story. Instead, the News Feed integrity team demotes it: the story is displayed smaller and appears less frequently in news feeds. The team may also add a link to the Wikipedia article about the story’s source.
Alex Levitt is a former academic on the News Feed integrity team. “What I’m really excited about is hopefully having people engage with these tools more so people can be a little more self-conscious about the things they see,” Levitt said, “and then hopefully that will help shift their behaviors a bit more and make them conscious of what’s good and what might be bad.”
Emotional, exaggerated and potentially misleading stories get high engagement on Facebook. They stick out on a website where competition for attention is fierce: news stories appear next to advertisements, quizzes and posts from friends, and many look the same. It’s hard to stand out in this endless scroll of content, so many news organizations employ teams who work on ways to make their stories and headlines do exactly that.
Facebook has always said that it is not a news organization, which would make it liable for the content posted on its site. But now, according to the Pew Research Center, more than two-thirds of us get some of our news from Facebook. And the website has been used to spread hoaxes, propaganda and misinformation into the minds of millions of Americans.
The upcoming midterm elections will be a test of whether Facebook’s News Feed integrity team has made any impact since voters cast their ballots in 2016.