As Facebook Pivots to Private Platforms, How Will We Monitor Fake News and Hate Speech?
Last week, Facebook used its developer conference F8 to unveil a new mantra: The future is private.
CEO Mark Zuckerberg promised to refocus the company on its private messaging products like Messenger, WhatsApp and Stories. "As the world gets bigger and more connected, we need that sense of intimacy more than ever," he said.
But in this brave new world of encrypted messaging, how are researchers, regulators and even Facebook itself supposed to keep track of things like fake news, harassment and race-baiting? Especially when Silicon Valley's social media giants have so far proven ineffective at doing so on public platforms?
This week alone, Facebook is fending off embarrassing headlines about vaccine misinformation on Instagram and automatically generated videos celebrating extremist imagery.
A number of civil liberties advocates say it’s about time. Global Voices Executive Director Ivan Sigal wrote, "Strong encryption and privacy is a bedrock concept. Those who want to see encryption weakened are usually state security agencies, police forces and authoritarian governments."
In that context, Sigal argued, "Journalists, human rights groups and similar need privacy and strong encryption to do their work."
Ian Sherr, editor-at-large of CNET News, said it’s not such a bad thing for us to switch to platforms where it’s on us to police ourselves. "What do we feel is appropriate, what isn’t, and how do we draw these lines? Right now, we’re abdicating that to Facebook and Twitter and YouTube and Instagram and all these other people."
It's also true that social media giants make most of their money from advertising, and advertising favors virality. So, the thinking goes, platforms that don't incentivize virality might weaken Silicon Valley's structural incentive to help misinformation spread.
WhatsApp is already encrypted, and Facebook has promised Messenger and Instagram Direct will be, too. It's not impossible for social media giants to eavesdrop on encrypted conversations; they could, for instance, save a record of everything in a non-encrypted environment, where artificial intelligence can look for keywords and trends.
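To make that concrete, here is a minimal, hypothetical sketch of the kind of keyword scan such a system might run over a plaintext message record. The flagged phrases, message format and function names are illustrative assumptions for this example, not any platform's actual pipeline, and a real system would rely on trained models rather than a hand-picked word list.

```python
from collections import Counter

# Illustrative keyword list; purely an assumption for this sketch.
FLAGGED_TERMS = {"miracle cure", "crisis actor"}

def scan_messages(messages):
    """Return IDs of messages containing flagged terms, plus term counts."""
    flagged_ids = []
    term_counts = Counter()
    for msg_id, text in messages:
        lowered = text.lower()
        matches = [term for term in FLAGGED_TERMS if term in lowered]
        if matches:
            flagged_ids.append(msg_id)
            term_counts.update(matches)
    return flagged_ids, term_counts

# A toy "non-encrypted record" of two messages.
log = [(1, "Saw a miracle cure for measles online"),
       (2, "Lunch at noon?")]
print(scan_messages(log))  # ([1], Counter({'miracle cure': 1}))
```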
But keeping up with problematic content is going to be a lot harder unless somebody inside the conversation flags it for Facebook.
"Closed messaging apps are a completely different beast to regulate and watch," said Robyn Caplan, a media and information policy scholar at the Data and Society research institute in New York. "You can have a WhatsApp group that has two people, and you can have a WhatsApp group that has 250 people. The latter means that disinformation can often spread a lot faster."
Facebook has taken steps to slow the spread of fake news on WhatsApp, which has fueled mob violence in countries like India, Myanmar and Mexico, even though 90% of WhatsApp messages are between just two users. In January, WhatsApp limited message forwarding to five chats at a time.
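A forwarding cap of that sort is a simple client-side check. Below is a minimal sketch, assuming the five-chat limit WhatsApp announced; the function and chat names here are invented for illustration, not WhatsApp's actual code.

```python
MAX_FORWARD_CHATS = 5  # the limit WhatsApp reportedly set in January 2019

def forward_message(message, target_chats):
    """Fan a message out to chats, refusing forwards past the cap."""
    if len(target_chats) > MAX_FORWARD_CHATS:
        raise ValueError(
            f"can only forward to {MAX_FORWARD_CHATS} chats at a time")
    return [(chat, message) for chat in target_chats]

# Forwarding to two chats succeeds; a sixth target would raise ValueError.
print(forward_message("viral clip", ["family group", "book club"]))
```

The point of such a cap isn't to block sharing outright but to add friction, slowing how quickly a single message can cascade through many groups.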
Riana Pfefferkorn, associate director of Surveillance and Cybersecurity at the Stanford Center for Internet and Society, points out that old-fashioned gumshoe work still has value.
"Right now, when researchers/law enforcement/etc. want to keep tabs on groups that use private messaging channels (to organize, plot, etc.), they infiltrate encrypted chat channels (e.g. Telegram) so that they can observe, as 'plainclothes' participants, the activity of others in the channel," she wrote.
Brian Levin, director of the Center for the Study of Hate and Extremism, added that individuals who want to make a big splash online will still seek out public platforms over private ones.
"Most of these guys don't have a big social circle, even on affinity-based social media. In this new chapter in the holy book of evil, all they have to do is be violent and they're on TV and public radio," Levin said.
"These platforms are going to continue to police content, and what we need is more oversight to understand what they’re doing behind the scenes. Because if not, it’s really just left up to them entirely," Caplan said.
Or left up to us, as individuals using these private platforms, because social media companies have a history of dragging their feet as content screeners until they're publicly pressured to act.