During his presidency, Trump regularly made false and inflammatory statements about a wide range of matters. But Facebook removed only a small handful of them, as when the then-president made dangerous claims such as saying COVID-19 was less dangerous than the flu or that children were “almost immune from this disease.”
Facebook has previously defended its approach to such controversial and misleading statements, saying politicians like Trump should be allowed to say what they believe so the public knows what they think. Facebook CEO Mark Zuckerberg has also repeatedly insisted that Facebook is merely a platform, not the “arbiter of truth.”
But the documents suggest Facebook’s policy of treating influential people differently — codified in a VIP system called XCheck — was created in large part to prevent a public relations backlash from celebrities and other high-profile users.
The entire premise of the XCheck system, the Journal’s Jeff Horwitz told NPR in September, “is to never publicly tangle with anyone who is influential enough to do you harm.”
Facebook’s own oversight board sharply criticized the program last week, saying the company has not been forthcoming enough about its varying standards for content moderation.
A Facebook spokesperson told NPR in a statement that the company asked the board to review the program because it aims “to be clearer in our explanations to them going forward.”
3. Young people see Facebook content as “boring, misleading, and negative.”
For much of the past decade, older people have been the fastest-growing U.S. demographic on Facebook — a dramatic turnabout for a company whose founding mystique rests on the image of a hoodie-wearing coder creating a space for college kids to connect.
Over the same period, younger people have become less likely to join the site, a worrying trend for the company. Facebook insiders got an update on it this year in an internal presentation that is reflected in the documents.
“Most young adults perceive Facebook as a place for people in their 40s and 50s,” the company’s researchers said, according to The Verge. “Young adults perceive content as boring, misleading, and negative. They often have to get past irrelevant content to get to what matters.”
Along with that stumbling block, young users were found to have negative views of Facebook due to privacy concerns and its potential “impact to their wellbeing,” The Verge reports.
Haugen previously leaked a Facebook study that found that 13.5% of British teen girls in a survey said their suicidal thoughts became more frequent after they joined Instagram.
In addition to its namesake platform, Facebook owns Instagram and WhatsApp.
“It is clear that Facebook prioritizes profit over the well-being of children and all users,” Sen. Marsha Blackburn, R-Tenn., said during a Senate hearing this month in which Haugen testified.
4. Facebook’s global reach exceeds its grasp.
While much of the focus on Facebook in the U.S. has to do with its role in enabling and intensifying political divisions, the documents also fault the company for its activities in numerous other countries.
The documents portray Facebook as failing to deal with a number of social and language complexities stemming from its more than 2.8 billion users worldwide. The results have been especially dangerous and harmful in countries where unrest or rights abuses are common, the documents state.
“Two years ago, Apple threatened to pull Facebook and Instagram from its app store over concerns about the platform being used as a tool to trade and sell maids in the Mideast,” The Associated Press reports.
The company routinely struggles with posts and comments in Arabic, both on its main platform and on Instagram, according to the documents. Arabic is one of the world’s most widely spoken languages, but its many dialects are highly distinct from one another.
Facebook “doesn’t have anyone who can speak most of them or can understand most of them in terms of sort of the vernacular,” Horwitz told NPR. “And it also doesn’t have a system to route content in those dialects to the right people.”
The problem extends beyond Arabic and has a wide range of effects.
“In countries like Afghanistan and Myanmar, these loopholes have allowed inflammatory language to flourish on the platform,” the AP reports, “while in Syria and the Palestinian territories, Facebook suppresses ordinary speech, imposing blanket bans on common words.”
As similar stories emerged over the weekend about India and Ethiopia, Facebook said that it has more than 40,000 people “working on safety and security, including global content review teams in over 20 sites around the world reviewing content in over 70 languages.”