After years of keeping its content review guidelines internal, Facebook made them public on Tuesday. And while the 27 pages of guidelines clear up some questions about what constitutes offensive content, many artists whose work Facebook previously removed say the company's transparency is a case of too little, too late.
"What's frustrating is that the policies are administered by people who do not seem to have a grasp on what is or isn't actually 'offensive,'" Christopher Bickel said.
Bickel, a musician, writer and director of the film The Theta Girl, used to run the popular Facebook group Too Soon. He and four other friends started the page two years ago to share dark, funny memes, often original creations "poking fun at the cult of celebrity whenever a celebrity died." Bickel says it was immediately popular, and quickly grew to about 500,000 users.
But the page came down this past March, after Bickel posted an image marking the death of Stephen Hawking: a picture of Mark E. Smith, the late singer of the post-punk band The Fall, with the words "RIP Stephen Hawking."
Bickel says Too Soon appealed to Facebook, but administrators stood by their decision.
Bickel says his page's memes had been reported before, but the reported ones were always anti-Trump. This meme, featuring an obscure musician, was "too random to violate any kind of community standard," Bickel says.
"Compare this to one of the few times I ever reported anything as 'offensive': some white-power person had posted a cartoon drawing of a Klansman in full robes, laying in a hammock comprised of the bodies of two lynched black men," Bickel messaged on Facebook. "My report of this image was responded to the following day with a note that the image did not violate Facebook's community standards. Go figure."
It's Instagram, too
Facebook's guidelines and subjective policing apply to Instagram as well. But before Facebook bought Instagram in 2012 for $1 billion, the photo-centric app was a refuge for edgy artists like photographer Brian Henderson. Though Henderson's creepy, horror-themed photos are not erotic, they do feature nudity, which proved to be a problem for Facebook.
"I used to post on FB a lot but photos would get removed from time to time. It was usually when I would post an image of a nude male," Henderson wrote in an email. "I would post nude females all the time and then one penis shows up and boom, they would remove it. I contested a few times but just gave up because I heard nothing back."
When Instagram launched, Henderson started an account and found he could post without scrutiny. That changed after Facebook bought the app, when his photos began to be removed without warning and, he said, without explanation — "usually just threats that I would lose my account." Then, in one day, all three of his Instagram accounts were taken down. He received a notice for only one of them.
The only site where Henderson's photos can live now, he said, is Flickr.
"As an artist, if you do anything edgy or push any kind of boundary you won’t have a platform on FB or Instagram. But for some weird reason you can search through FB and Instagram and see that photos of a woman breastfeeding are being censored, yet very sexual nude imagery is left alone," Henderson wrote. "The whole thing seems very random and based on the opinions of whoever is heading up the witch hunt that day or time."
The appeals process
In what appears to be an attempt to assuage criticism over its subjective censorship policies, Facebook also announced Tuesday that it would set up an appeals process for removed posts. Previously, users could only appeal the removal of groups and pages.
But painter Kevin Davidson says he already found an appeals process outside of Facebook: the media.
Earlier this month, Davidson attempted to take out a Facebook ad featuring a painting he made in response to the Parkland shooting in February. Religious in tone, the painting "Parkland Pieta in Stained Glass" shows a mother praying over a gunshot victim who resembles Jesus Christ, with the faces of 17 victims of the massacre in the background.
Facebook denied the ad, telling Davidson that the picture was "too violent, too controversial, stuff like that." Davidson appealed to Facebook, and then he contacted Fox 40 News in Sacramento. On his way to the station for an interview, Davidson received an email saying his ad had been approved.
At the station, Davidson learned that a Fox 40 reporter, Nicole Comstock, had contacted Facebook, which responded quickly and informed her that the employee who denied Davidson's ad had made a mistake.
Davidson is convinced that Comstock's inquiry is what changed the administrators' decision.
"If they truly believed it was offensive, they would've stuck with it," Davidson said in a phone call Wednesday.
Two weeks later, Facebook released its guidelines, which Davidson interprets as the social media site saying, "We can do whatever we want." He's also convinced that the new appeals process is meant to stem bad publicity at a time when Facebook is still dealing with the fallout from Cambridge Analytica's access to its users' data.
As far as Facebook's censorship problems go, Davidson has no grand solutions for the company, nor does he want the government to step in. He thinks the website just needs to be smarter.
"If Facebook was smart, I don't think it would be a problem. It's common-sense stuff," Davidson said. "Just be transparent about it."