"We've always had a set of community standards that the public can see," Facebook Vice President Monika Bickert told NPR's Steve Inskeep, "but now we're actually explaining how we define those terms for our review teams and how we enforce those policies."
Those new explanations are nothing if not comprehensive. They detail dozens of reasons posts can be removed and read more like the product of a team of lawyers than the words of an upstart tech company. The standards outline methods for categorizing content and provide specific definitions for terms like "hate speech," "terrorism" and "threats of violence."
"People define those things in different ways," Bickert said, "and people who are using Facebook want to know how we define it and I think that's fair."
Some objectionable content is classified into tiers, with Facebook's response matching the severity of the violation. Other content is removed if it satisfies multiple conditions in a points-based system. A threat of violence, for example, can be deemed "credible" and removed if it provides a target and "two or more of the following: location, timing, method."
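That "two or more of the following" rule can be sketched as a simple predicate. This is purely an illustration of the logic as described in the standards; the function name and boolean inputs are hypothetical, not Facebook's actual code.

```python
def is_credible_threat(has_target: bool, location: bool,
                       timing: bool, method: bool) -> bool:
    """Illustrative sketch: a threat is 'credible' if it names a
    target and supplies at least two of the three supporting
    details (location, timing, method)."""
    details = sum([location, timing, method])
    return has_target and details >= 2

# A post naming a target, a place, and a time would qualify for removal;
# a target alone, or details with no target, would not.
```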
Still other standards target particular categories of offensive posts. Content that "promotes, encourages, coordinates, or provides instructions" for eating disorders or self-harm is specifically mentioned. And under its "harassment" section, Facebook says it will not tolerate claims that survivors of traumatic events are "paid or employed to mislead people about their role in the event." Other standards prohibit advertising drugs, revealing the identities of undercover law enforcement officers, and depicting graphic violence.
When it comes to judging content, though, context is crucial. Facebook has been criticized in the past for its blundering approach to community moderation. In 2016, for example, the company reversed its decision to remove a post containing the Pulitzer-winning "napalm girl" photo, which depicted a nude and burned child in the Vietnam War.
Bickert says that example proves that exceptions are needed for newsworthy and culturally significant content.
Facebook's updated standards now list some exceptions for depictions of adult nudity, including "acts of protests," "breast-feeding" and "post-mastectomy scarring."
Still, questions remain over Facebook's content moderation program. Despite Zuckerberg's stated desire to use artificial intelligence to flag offensive content, the process remains very human. According to Bickert, the company has over 7,500 moderators who are stationed around the globe and work 24/7.
But conversations with those moderators paint a much bleaker image of Facebook's processes than the one Bickert provides. In 2016, NPR's Aarti Shahani detailed a workforce composed primarily of subcontractors who are stationed in distant countries and asked to review large quantities of posts every shift.
It's not hard to imagine how someone located thousands of miles away, who grew up in a different culture, and who is under immense pressure to review as many posts as possible, might mess up.
The Appeal of Appeals
Facebook is seeking to address that problem with its new appeals system. Now, if your post is removed for "nudity, sexual activity, hate speech, or violence," you will be presented with a chance to request a review.
Facebook promises that appeals will be reviewed within 24 hours by its Community Operations team. But it remains unclear what relationship the team has with Facebook and with its first-line reviewers. If appeals are reviewed under the same conditions that initial content decisions are made, the process may be nothing more than an empty gesture.
Facebook points out that the content review and appeals process is just one way to clean up your experience on the site. Users have the ability to unilaterally block, unfollow, or hide posts or posters they don't want to see.
For the social media giant, it's a question of balance. Balance between free speech and user safety. Balance between curbing "fake news" and encouraging open political discourse. And balance between Facebook's obligation to serve as a steward of a welcoming environment and the realities of running a for-profit, publicly owned corporation.
"We do try to allow as much speech as possible," Bickert said, "and we know sometimes that might make people uncomfortable."
Facebook says that Tuesday's announcements are just one step in a continuous process of improvement and adjustment to its standards and policies. How much of an improvement this step represents remains to be seen.
Copyright 2018 NPR. To see more, visit http://www.npr.org/.