
YouTube ‘Unbanned’ Questioning Mask Effectiveness in The Latest Iteration of the Platform’s COVID-19 Misinformation Policy

CEO Susan Wojcicki said the policy was intended to 'raise up authoritative and trusted content'



YouTube removed several significant bans from its COVID-19 misinformation policy, rules that had previously led to the removal of hundreds of thousands of videos from the platform.


In the name of stopping the spread of misinformation, YouTube issued a series of statements regarding COVID-19 requirements and bannable offenses. These included a ban on content claiming that masks could cause oxygen levels to drop, lung cancer, brain damage, or COVID-19 infection. The rules led to accusations of censorship and a massive purge of videos deemed to violate the pandemic-era standards.

But recently, the platform quietly rescinded its stance on masking misinformation and walked back its definition of barred claims regarding vaccines.

YouTube’s policies governing permitted information regarding COVID-19 have fluctuated since the virus's global outbreak at the beginning of 2020.

In May of 2020, YouTube’s COVID-19 Medical Misinformation Policy did not include a specific prohibition on content that questioned the effectiveness of masks or social distancing. The policy at the time said YouTube would not permit the spread of misinformation that “contradicts the World Health Organization (WHO) or local health authorities’ medical information about COVID-19” regarding treatment, prevention, diagnosis, or transmission.

Examples of prohibited content included “claims that there’s a guaranteed vaccine for COVID-19” or “content that encourages the use of home remedies in place of medical treatment such as consulting a doctor or going to the hospital.”

In an August 2020 blog post that coincided with the release of a quarterly Community Guidelines Enforcement Report, YouTube said it uses a combination of “technology and people” to enforce its policies.

The company said that, following the onset of the pandemic, it had two options for enforcing its misinformation policies: either to “dial back our technology and limit our enforcement to only what could be handled” by the staff not impacted by COVID-19 lockdown restrictions, or to use “automated systems to cast a wider net so that the most content that could potentially harm the community would be quickly removed from YouTube” even though “many videos would not receive a human review” and it was likely that videos that did “not violate [YouTube’s] policies would be removed.”

YouTube elected to rely on technology and removed over twice as many videos as it had during the first quarter of the year. 

To minimize the disruption to creators, YouTube officials wrote, “We made a decision to not issue strikes on content removed without human review, except in cases where we have very high confidence that it violates our policies.”

YouTube expanded its policies in October of 2020 to specifically prohibit content that disputed self-isolation or social distancing recommendations. The company reported that general discussions of “broad concerns” would still be permitted, although videos deemed to contain “borderline” content would be removed.

The World Economic Forum reported on Oct. 15, 2020, that “conspiracy theories and misinformation about the new coronavirus vaccines have proliferated on social media during the pandemic, including through anti-vaccine personalities on YouTube and through viral videos shared across multiple platforms.”

In January 2021, YouTube’s CEO Susan Wojcicki announced that more than 500,000 videos had been removed from the platform for violating the ban on misinformation.

"It's a priority to continue to update our approach to responsibility so people find high-quality information when they come to our platform," Wojcicki wrote in a letter to video creators. “Our approach to responsibility is to remove content experts say could lead to real world harm, raise up authoritative and trusted content, reduce views of borderline content, and reward creators who meet our even higher bar for monetization.”

She noted that the platform would “remove egregious medical misinformation about COVID-19 to prohibit things like saying the virus is a hoax or promoting medically unsubstantiated cures in place of seeking treatment.”

Specifically, information that deviated from the “consensus” presented by the Centers for Disease Control and Prevention, the World Health Organization, and other select health experts was deemed to be misinformation. YouTube also announced partnerships with the American Public Health Association, the Cleveland Clinic, and the Mayo Clinic in 2021, with the aim of bringing more “authoritative health information” to the platform.

By the end of January 2021, YouTube had altered its Medical Misinformation Policy to prohibit content that claimed “an approved COVID-19 vaccine will cause death, infertility, or contraction of other infectious diseases,” that “an approved COVID-19 vaccine will contain substances that are not on the vaccine ingredient list, such as fetal tissue,” or that “any vaccine causes contraction of COVID-19.”

The platform also removed any claims that “a specific population will be required (by any entity except for a government) to take part in vaccine trials or receive the vaccine first.”

“We may allow content that violates the misinformation policies noted on this page if that content includes context that gives equal or greater weight to countervailing views from local health authorities or to medical or scientific consensus,” the policy stated at the time. “We may also make exceptions if the purpose of the content is to condemn or dispute misinformation that violates our policies.”

The policy was subsequently revised further, so that by May of 2021 video creators were explicitly barred from claiming that wearing a mask could harm an individual’s health or that masks did not prevent the contraction or transmission of COVID-19.

YouTube also explicitly prohibited recommending Ivermectin or Hydroxychloroquine as effective treatments for COVID-19. The claim that “achieving herd immunity through natural infection is safer than vaccinating the population” was included on the list of examples of prohibited content.

After more than a year of building an increasingly detailed list of prohibited sentiments, YouTube noticeably reversed course in the spring of 2022. Between early April and mid-May, the company removed all references to masks from its misinformation policy. The explicit ban on “claims that COVID-19 vaccines do not reduce risk of contracting COVID-19” was also removed, effectively opening the door for debate on YouTube about the role the vaccines play in preventing infection.

The misinformation policy page has instead included a prohibition on “claims that COVID-19 vaccines do not reduce risk of serious illness or death.”

The changes may have been made after the CDC revised its recommendations regarding masks. On April 20, changes to the agency's masking guidelines meant 70% of Americans were no longer required to wear a mask indoors or outdoors. The CDC cited the significant decrease in COVID-19 cases. The agency also relaxed social distancing guidelines for most areas of the country. 

“Mask mandates, which have been part of the COVID-19 infection prevention strategy at local, state, and national levels, have not been popular among everyone in the U.S., where individualism prevails,” wrote Kathy Katella of the Yale School of Medicine at the time.

The Washington Examiner reported on Aug. 9 that YouTube removed a dozen videos from local government agencies in Kansas, North Carolina, and Washington that included claims prohibited under its misinformation policies. Following backlash from officials involved, the company “reversed course and created an exception to its content moderation rules for local government meetings” where the “intention isn’t to promote misinformation.”

Several of the videos included public comment portions of local government meetings where community members had expressed their personal opinions. 

The current iteration of YouTube’s Medical Misinformation Policy prohibits “claims about COVID-19 vaccinations that contradict expert consensus from local health authorities or WHO.” Additionally, users are banned from stating Ivermectin and Hydroxychloroquine “are safe to use in the prevention of COVID-19” as well as from claiming categorically that “Ivermectin is an effective treatment for COVID-19.”

The platform also currently removes content that “claims that approved COVID-19 tests cannot diagnose COVID-19” and provides “instructions to counterfeit vaccine certificates, or offers of sale for such documents.”

YouTube’s leadership has not publicly commented on its decision to unban debate about masks and the effectiveness of COVID-19 vaccines from the platform.
