Not too long ago, the barriers to entry for disseminating written information on a mass scale included financial resources enough to pay for millions of dollars worth of reporters, editors, layout designers, press operators, printing equipment, delivery trucks and tweens on bikes with poor aim. Short of that, you're handing out pamphlets on a street corner.
But today, you can run your own mass communications operation with little more than a computer, a WordPress account and a little savvy about Facebook algorithms. Whether it's done for clicks, tricks or kicks, fake news has become a profitable cottage industry.
Wikipedia, a site that invites members of the public to write its content with no vetting of their qualifications, would seem ripe for an onslaught of material laden with inaccuracy and bias.
“We see people attempting to promote alternative medicine, we see people attempting to promote conventional medicine beyond the level of evidence that's available,” said Heilman, the medical editor.
But Wikipedia polices the creation of new entries and changes to existing pages with both machine and human interventions.
Pages are “analyzed by smart bots that deal with a large majority of low-level vandalism and many copyright issues,” Heilman wrote in an email. The New Pages Patrol, consisting of seasoned human editors who have made at least 500 uncontested edits to Wikipedia pages, reviews every freshly created page.
Wikipedia requires contributors to justify every assertion of fact in an article, and it has blacklisted certain sites from being used as verification this way. Wikipedians can also put individual pages on watch lists, which will trigger notifications any time an edit is made. For example, the HIV/AIDS page is currently monitored by almost a thousand editors, any of whom can jump in and quickly challenge changes they deem to be unreliable.
Finally, some administrators can “protect” articles, so that only established editors can make changes. Editors who create consistently problematic content can be kicked off the site.
When Heilman emailed me, he had just finished reverting a change to the “Adverse effects” section of Wikipedia's “Vaccine” page, restoring the article to its former state. The topic of vaccination has spawned a lot of misinformation about exaggerated or even nonexistent risks, driving down inoculation rates in some communities. Below is the short-lived edit, followed by the original version that Heilman restored:
There is growing evidence that vaccines trigger autoimmune disease. A mass Hepatitis B vaccination program was halted in France in 1994 following a significant increase in incidence of multiple sclerosis 1-2 years following vaccination. In 2009, GSK's Pandemrix influenza vaccine was proven to trigger narcolepsy in multiple cases across Europe leading to multiple legal claims. Despite this growing evidence the medical establishment maintain that vaccinations given during childhood are generally safe.
Reversion to previous entry:
Vaccination given during childhood is generally safe. The rate of side effects depends on the vaccine in question. Some common side effects include: fever, pain around the injection site, and muscle aches. Additionally, some individuals may be allergic to ingredients in the vaccine. MMR vaccine is rarely associated with febrile seizures.
The Election on Wikipedia
On Sunday, a 28-year-old North Carolina man fired his assault rifle in a Washington, D.C., pizza restaurant. Afterward he reportedly told police he wanted to find the truth about a viral fake news story claiming the eatery was involved in a Democratic child prostitution ring. Contrast the rapid metastasizing of the hoax with the ongoing Wikipedia discussion on whether to publish a draft page on the topic, and you can gain a little appreciation of the deliberate way Wikipedia contributors make decisions.
Another election article Wikipedians had to grapple with was “The Impeachment of Donald Trump,” created on Nov. 11. Just three hours after it went live, editors deleted it. The article, based on the opinions of two academics, said Trump's impeachment was “expected to occur sometime after inauguration.” A handful of editors quickly came to the conclusion that speculation by two people did not a valid Wikipedia entry make.
Humans vs. Algorithms
Facebook, where most of the fake news spread this election season, is now reportedly developing artificial intelligence to cope with the problem. Heilman doesn’t think that will be enough.
“People who are involved in generating fake news -- that's a higher level of vandalism than a computer program can take care of,” he said. “The only people who can manage to deal with that in my opinion is human beings.”
One reason, he said, is that spotting errors requires layers of knowledge and understanding of the context in which a statement is made.
“You need good faith people who have looked at the whole breadth of literature out there, who have researched the topic in question.”
Katherine Maher, executive director of the Wikimedia Foundation, thinks the main problem with Facebook’s newsfeed is that the reasons it serves up certain stories are opaque.
"Are you receiving the information because it’s accurate, timely or important?" she says. "Or just because a lot of other people have been sharing it, regardless of its validity?"
When people have that information, she believes, they're smart enough to make an informed decision about whether what's been placed before them is relevant or meaningless.
Heilman, too, thinks "Facebook and Twitter could learn a lot from the mechanisms Wikipedia has put into place."
But even Wikipedians acknowledge that they, too, have a lot of work to do when it comes to halting the spread of unvetted information. For example, only a small fraction of Wikipedia's health articles -- 63 in all -- have received the highest quality grade from section editors. A page called Reliability of Wikipedia points out the good and the bad. And the site keeps a list of hoaxes that have appeared on its pages, a number of which stayed visible for many years.
In the months and maybe years ahead, Wikipedians -- and anyone involved in the gathering and dissemination of fact-based information, actually -- could have their work cut out for them. If the president-elect's own proclivity for creating news out of whole cloth is any indication, this problem is not going away.