Anti-Vaccine Misinformation on Facebook Has Gained a Lot of Traction During Pandemic

Employees work in Facebook's nerve center for the fight against misinformation on the platform during a media demonstration on Oct. 17, 2018, in Menlo Park, California.  (Noah Berger/AFP via Getty Images)

As scientists begin to clear a path to a potential coronavirus vaccine, researchers and advocates are increasingly sounding the alarm over what they see as a looming threat: Facebook’s apparent inability to police dangerous falsehoods about vaccines.

Since the outset of the pandemic, vaccine-related falsehoods have ballooned on the platform — and recent research suggests some of those inaccurate posts are gaining traction among people who weren’t previously opposed to vaccinations. Part of the problem appears to be the way Facebook’s algorithms capitalize on divisive or extremist content.

Compounding the issue is Facebook’s history of hesitating to address misinformation until a particular subject has snowballed into an urgent problem. In the middle of a global measles outbreak last year spurred by low vaccination rates, Facebook rolled out a series of policies to curb vaccine misinformation. But the changes did little to prevent the problem from resurfacing amid the COVID-19 pandemic.

Concerned that Facebook’s failure to crack down on vaccine falsehoods will backfire at the worst possible time, advocates and scientists have called on Facebook co-founder Mark Zuckerberg to take action.

“COVID-19 misinformation is the equivalent of an ideological dirty bomb: It has the capacity to hurt tens of thousands of people when it detonates in the moment that vaccines are available,” said Imran Ahmed, founder and chief executive officer of the U.K.-based nonprofit Center for Countering Digital Hate, which recently assessed the growing influence of anti-vaccination content on social media platforms including Facebook since the outset of the pandemic.


Facebook did not respond to a request for comment about its efforts to combat COVID-19 vaccine misinformation. But in a July report, Ahmed’s organization assessed the influence of 409 English-language social media accounts sharing anti-vaccination content between May and June of 2020, in the early months of the pandemic. The report found that falsehoods about a COVID-19 vaccine have been prevalent on Facebook and other social media sites for months.

It’s a dire situation — and one that has only worsened with time.

“The first thing that came back from our researchers was, ‘Oh no. They’ve grown a lot,’” Ahmed said. In their sample, Facebook groups and pages spreading vaccination falsehoods had over 31 million followers, representing more than half of the combined following of all the social media accounts they studied.

Among the pages spreading falsehoods, the researchers identified one prominent category: people who sell or profit off of vaccine misinformation. These “anti-vaccination entrepreneurs” — who garnered a collective following of 28 million people — saw their followers grow by 854,000 between May and June. Zeroing in on groups, the researchers identified 64 that regularly shared vaccine misinformation, with a combined following of 1 million that has also kept growing.

Based on the research findings, Ahmed is deeply concerned that the growth in vaccine-related falsehoods will sow distrust in an eventual vaccine before one even arrives.

“Personally, I’m scared,” he said.

Since people have been asked to stay at home, more conversations have moved online — giving misinformation a chance to gain even more of a foothold on social media sites. Anti-vaccination content, in particular, has thrived.

“It’s basically like if you injected adrenaline into them,” said Neil Johnson, a professor of physics and researcher at the Institute for Data, Democracy, & Politics at George Washington University.

In May, Johnson and his colleagues published a study in Nature that showed a sizable uptick in the followers of pages promoting anti-vaccine rhetoric on Facebook between February and October of 2019. While pages spreading falsehoods about vaccines had fewer followers than pages that shared factual vaccine content, the researchers found that there were more pages spreading falsehoods, and those pages were faster-growing and increasingly connected to neutral pages where people did not yet have a clear leaning one way or the other. Johnson predicted that if the trend continues, anti-vaccination rhetoric will dominate online discussion by 2030.

He and his team have been studying the change on a rolling basis in the months since their study ended — and from their view, the problem is growing more pressing.

Based on what we’re seeing now, “it would only take something to go wrong with one of the vaccines and then that [change] would happen in a year,” Johnson said.

Critics say another possible factor in the social media giant’s failure to get rid of vaccine misinformation is Zuckerberg’s own approach to moderation. Zuckerberg has, in the past, presented the social network as a stalwart of free speech, even if that speech is potentially harmful — one of the reasons the company has repeatedly opted to permit lies in political ads.

In recent months, some of Zuckerberg’s dissenters have appealed to his more philanthropic intentions. In a June letter, hundreds of scientists affiliated with the Chan Zuckerberg Initiative — the research organization he and his wife, pediatrician Priscilla Chan, founded in 2015 — called the social network’s practices “antithetical” to their benefactor’s mission.

“The spread of news that is not vetted for factual accuracy leads to confusion and a mistrust of experts,” they wrote.

Facebook’s failure to eradicate scientific falsehoods is difficult to reconcile with the research efforts spearheaded by the Chan Zuckerberg Initiative. While Facebook has avoided public calls to uniformly crack down on misinformation, Zuckerberg and Chan have continued to funnel millions into CZI, including funding for several research efforts aimed at better understanding the spread of the coronavirus.

In April, Zuckerberg and Chan invested $13.6 million into a pair of coronavirus studies, led by CZI, to flag new infections and better understand how immunity to the virus develops.

“From my vantage point, we’re all just trying to figure out how to do our part to answer these questions and rebuild,” Chan said in an interview with STAT in April. At the same time, though, rampant falsehoods about the virus — including the dangerous notion that an eventual vaccine will be deadly — were spreading quickly across Facebook.

It’s not that Facebook is doing nothing about the issue; rather, its piecemeal approaches are simply too little, too late, according to researchers.

Last March, amid worldwide measles outbreaks, for example, the company instituted its first policy to combat vaccine misinformation, which involved using its algorithms to demote groups and pages that spread vaccine-related falsehoods and to exclude them from recommendations and predictions when users search within the platform. But as researchers pointed out at the time, demoting certain pages doesn’t stop anti-vaccination conversations from cropping up in others.

Among the pages where Johnson found anti-vaccine falsehoods, most were dedicated to things like hobbies and mutual interests. There were pages tailored to parents of kids who played soccer, people who like cats, and aficionados of organic produce, for example. But every so often, a strand of a conversation about something like the best way to treat a leg sprain would devolve into an inaccurate — and dangerous — diatribe against vaccines.

“Often in these communities, the number of what I’d call ‘bad’ posts is small — they’re mostly talking about cats or their favorite sports team or whatever, and then all of a sudden there’s a post stuck in there about vaccines,” Johnson said.

More recently, in response to the coronavirus, Facebook began targeting people who had interacted with certain COVID-19 falsehoods with generic messages that read, “Help friends and family avoid false information about COVID-19.” Those messages point users to the World Health Organization’s myth-busting page about the coronavirus.

While Facebook said the strategy was based on psychological research, the authors of the studies the platform cited as justification for its policies told STAT in May that Facebook seemed to have misinterpreted their findings. Rather than placing a generic message in people’s news feeds, Facebook should correct falsehoods by providing the facts, the researchers said.

Experts said that if Facebook is to combat misinformation in time for the arrival of an eventual coronavirus vaccine, there’s a significant amount of work that needs to happen quickly.

“It’s going to be a heavy lift,” Johnson said.


This story was originally published by STAT, an online publication of Boston Globe Media that covers health, medicine, and scientific discovery.