By MaryGrace Lerin

YouTube Takes a Stricter Stand Against COVID Vaccine Misinformation with Revised Rules

It's taken a long time, but YouTube has now declared that it will take a firmer stand against COVID-19 misinformation, including deceptive content relating to COVID vaccinations, which could be fueling anti-vax movements around the world.


According to YouTube:


“Crafting policy around medical misinformation comes charged with inherent challenges and trade-offs. Scientific understanding evolves as new research emerges, and firsthand, personal experience regularly plays a powerful role in online discourse. Vaccines in particular have been a source of fierce debate over the years, despite consistent guidance from health authorities about their effectiveness. Today, we're expanding our medical misinformation policies on YouTube with new guidelines on currently administered vaccines that are approved and confirmed to be safe and effective by local health authorities and the WHO.”

The amended policy is as follows:


“Don’t post content on YouTube if it includes harmful misinformation about currently approved and administered vaccines on any of the following:

  • Vaccine safety - Content alleging that vaccines cause chronic side effects, outside of rare side effects that are recognized by health authorities

  • Efficacy of vaccines - Content claiming that vaccines do not reduce transmission or contraction of disease

  • Ingredients in vaccines - Content misrepresenting the substances contained in vaccines”

Videos that include any of these claims will now be removed from YouTube, with channels that distribute them receiving a warning first, then strikes. Any channel that receives three strikes in less than 90 days will be shut down.


To be clear, YouTube's policies already restrict certain kinds of medical misinformation, such as content that pushes dangerous cures, and the company says it has removed over 130,000 videos in the past year for breaching its COVID-19 vaccination policies.

The platform has sought to refine its approach over the course of the ongoing pandemic, but it has also been identified as a primary source of medical misinformation, helping to spread hazardous movements that run counter to health officials' efforts.


Last year, a conspiracy-fueled ‘Plandemic' clip on YouTube was seen over 7 million times before being taken down. The purpose of the video was to spread suspicions that the National Institute of Allergy and Infectious Diseases had concealed studies on how vaccines can harm people's immune systems.


Researchers from the universities of Oxford and Southampton found earlier this year that people who rely on social media for information - specifically YouTube - are less inclined to be vaccinated against COVID-19, and they urged governments and social media companies to act quickly.


According to the report:

“Trust in health institutions and experts and perceived personal threat are vital, with focus groups revealing that COVID-19 vaccine hesitancy is driven by a misunderstanding of herd immunity as providing protection, fear of rapid vaccine development and side effects, and beliefs that the virus is man-made and used for population control. In particular, those who obtain information from relatively unregulated social media sources - such as YouTube - that have recommendations tailored by watch history, and who hold general conspiratorial beliefs, are less willing to be vaccinated.”


Given this, YouTube's stricter stance is long overdue, and it's encouraging to see the company act more decisively against plainly inaccurate medical content, which can have serious real-world consequences for both individuals and society as a whole.

However, the change will not be welcomed by all, and it may put YouTube in confrontation with some regional governments.


The Russian government threatened to block YouTube this week unless the platform restored two German-language channels run by Russia's state media organization, RT, which were removed for spreading vaccine misinformation.


According to The Washington Post:

“Russia’s communications ministry, Roskomnadzor, said it had sent a letter to Google “demanding that all restrictions on YouTube channels RT DE and Der Fehlende Part, operated by the Russian media outlet Russia Today, be lifted as soon as possible,” the Interfax news agency reported. The ministry threatened to fully or partially restrict YouTube in Russia, or fine Google, if the channels were not restored.”


Many politicians and civic leaders support those who oppose vaccine mandates, and a platform as large as YouTube taking a firmer stance on the issue will once again stoke fears that private companies are trying to control the media narrative, and that they wield excessive control over what can and cannot be shared.


That is a legitimate concern, but as YouTube points out, it is following official health advice from both local health authorities and the World Health Organization. Its position is therefore grounded in the same guidelines that govern all of our health procedures and keep us safe.


Many of the competing perspectives are based on misunderstandings, and allowing them to spread is harmful because it could prolong the pandemic's disruptions, with devastating long-term consequences. It's therefore encouraging to see YouTube take a tougher stance and incorporate official health authorities' findings into its moderation approach.
