“The concept that vaccines harm — instead of help — is at the foundation of a lot of misinformation,” said Jeanine Guidry, assistant professor of public relations and director of the Media+Culture Lab at the Virginia Commonwealth University School of Media and Culture.
She added that, if enforced properly, the company's new policy could stop bad information from influencing, for example, a new parent who is using the internet to research whether or not to vaccinate their child.
Up until Wednesday, anti-vaccine influencers had used YouTube to stoke fears around vaccines that health experts point out have been safely administered for decades. The YouTube channel of an organization run by environmental activist Robert F. Kennedy, Jr. was one of several popular anti-vaccine accounts that were gone by Wednesday morning.
In an emailed statement to The Associated Press, Kennedy criticized the ban: "There is no instance in history when censorship and secrecy have advanced either democracy or public health."
YouTube declined to say how many accounts were removed in the crackdown. And the company, which is owned by Google, cautioned that broader removal of videos may take time as it works to implement and enforce the policy.
As a number of prominent social media companies, like YouTube and Facebook, have tightened their restrictions regarding vaccine misinformation over the last year, many conspiracy theorists have begun migrating to other less-regulated platforms. Rumble, another video-sharing site, has become a popular choice for far-right groups and others who are vaccine-resistant, Slate reported in March.
But many conservative pages that spread vaccine misinformation are still active on YouTube, and their videos continue to attract millions of views.
Despite tech companies announcing a string of new rules around COVID-19 and vaccine misinformation during the pandemic, falsehoods have still found big audiences on the platforms.
In March, Twitter began labeling content that made misleading claims about coronavirus vaccines and said it would ban accounts that repeatedly share such posts. Facebook, which also owns Instagram, had already prohibited posts claiming coronavirus vaccines cause infertility or contain tracking microchips, and in February announced it would similarly remove claims that vaccines are toxic or can cause health problems such as autism.
Yet popular anti-vaccine influencers remain on Facebook, Instagram and Twitter, where they actively use the platforms to sell books or videos. On Facebook and Instagram alone, a handful of anti-vaccine influencers still have a combined 6.4 million followers, according to the Center for Countering Digital Hate, a social media watchdog group. And coronavirus vaccine misinformation has been so pervasive on Facebook that President Biden in July accused influencers on the platform of "killing people" with falsehoods about the vaccine.
Other platforms have taken a harder line. Pinterest, for example, prohibited any kind of vaccine misinformation even before the pandemic began. Now, if users search for content about vaccines on the site, they are directed to authoritative websites operated by the Centers for Disease Control and Prevention and the World Health Organization.
Editor's note: Google is among NPR's financial supporters.
This post contains additional reporting from The Associated Press.
Copyright 2021 NPR. To see more, visit npr.org.