YouTube blocks all anti-vaccine content, including misinformation unrelated to COVID-19
We all know how rampant anti-vaxx misinformation has been on social media. Conspiracy theory groups and influencers have spread falsehoods repeatedly since the start of the pandemic, from at-home concoctions such as drinking cleaning solution as a supposed COVID-19 cure to rumors that the coronavirus vaccine causes infertility. Some anti-vaxx groups have even gone as far as encouraging members not to go to the emergency room.
As a result of these deadly conspiracy theories about vaccines, some social media platforms have finally decided to take action. YouTube, which formerly let anyone broadcast whatever they wanted about vaccines, has updated its policy and removed several prominent accounts linked to the spread of misinformation.
“We’ve steadily seen false claims about the coronavirus vaccines spill over into misinformation about vaccines in general, and we’re now at a point where it’s more important than ever to expand the work we started with COVID-19 to other vaccines,” the company said. The move follows similar actions by other platforms including Facebook.
The policy went into effect Wednesday and includes removing videos by Joseph Mercola, a vaccine critic with nearly half a million YouTube subscribers, and Children’s Health Defense, a group affiliated with Robert F. Kennedy Jr., The Washington Post reported. According to the Post, experts believe Mercola contributed to the declining vaccination rates across the country.
“YouTube is the vector for a lot of this misinformation. If you see misinformation on Facebook or other places, a lot of the time it’s YouTube videos. Our conversation often doesn’t include YouTube when it should,” said Lisa Fazio, an associate professor at Vanderbilt University who studies misinformation.
While the company said it will allow “scientific discussion” around vaccines, including videos about vaccine trials, results, and failures, it will remove baseless claims and theories. YouTube said it will also permit personal testimonies, such as a parent talking about their child’s experiences getting vaccinated. However, if testimonials make broader claims questioning vaccine efficacy, they will be removed.
“You create this breeding ground and when you deplatform it doesn’t go away, they just migrate,” said Hany Farid, a computer science professor and misinformation researcher at the University of California at Berkeley. “This is not one that should have been complicated. We had 18 months to think about these issues, we knew the vaccine was coming, why was this not the policy from the very beginning?”
According to the Post, researchers have for years tied anti-vaccine content on YouTube to growing skepticism about lifesaving vaccines around the world. In the U.S. in particular, the pace of vaccination has slowed, with only 56% of the population having received the full two doses of the COVID-19 vaccine, compared with a 71% vaccination rate in neighboring Canada.
According to YouTube’s vice president of global trust and safety, Matt Halprin, the company didn’t act sooner because it was focused on misinformation specifically about the coronavirus. As it noticed false claims about other vaccines and the harm they supposedly cause, it decided to expand its policy.
“Developing robust policies takes time,” Halprin said. “We wanted to launch a policy that is comprehensive, enforceable with consistency and adequately addresses the challenge.”
Since February 2020, YouTube said, it has taken down more than 1 million videos for COVID-19-related misinformation. The new policy applies not just to COVID-19 misinformation but to misinformation about any vaccine. Prior to the policy, claims about other vaccines, including those for measles or chickenpox, were not monitored.
“We’ll remove claims that vaccines are dangerous or cause a lot of health effects, that vaccines cause autism, cancer, infertility or contain microchips,” Halprin said. “At least hundreds” of moderators at YouTube are working specifically on medical misinformation in all the languages YouTube operates in, he said.
According to NBC News, anti-vaccine activists have taken to YouTube for over a decade to spread their fears and theories; during the pandemic, those have included the claim that COVID-19 is a tool for the government to control individuals. Movements opposing COVID-19 treatments and calling the virus a hoax have emerged as a result.
“Anti-vaccine activists have been very vocal about the fact that they saw Covid as an opportunity to undermine confidence in the childhood vaccine schedule,” Renée DiResta, who leads research on anti-vaccine disinformation at the Stanford Internet Observatory, told NBC News. “Seeing YouTube take this action is reflective of the fact that it seems to be aware that that tactic and dynamic was beginning to take shape.” Under YouTube’s new policy, not only will offending videos be removed; the accounts of anti-vaccine influencers will also be terminated.