- YouTube is banning fake content about coronavirus vaccines, including unfounded claims that they will “kill people or cause infertility.”
- YouTube’s policies regarding COVID-19 misinformation, originally published in May 2020, bar content that contradicts facts from the World Health Organization.
- The company has been slow to ban COVID-19 misinformation altogether. Business Insider’s Paige Leskin found evidence of sellers peddling cures or treatments for the disease on YouTube back in April.
- “We’re very proactive in terms of removing it, and I think you’ll see us continue to be so,” YouTube CEO Susan Wojcicki told CNN’s Poppy Harlow regarding QAnon conspiracy theories, some of which have amplified COVID-19 misinformation.
YouTube will ban videos and other content that contain fake information about coronavirus vaccines.
The tech giant updated its policy on COVID-19 misinformation Wednesday to bar users from spreading false claims about a vaccine, including that it “will kill people or cause infertility,” or that “microchips will be implanted in people who receive the vaccine,” the company told Business Insider. CNET first reported the news.
YouTube’s policies regarding COVID-19 misinformation, originally published in May 2020, bar content that spreads false claims contradicting facts from the World Health Organization and local health authorities. The company said it has removed more than 200,000 videos containing misleading information about COVID-19 since early February.
The company said it will release more information on how it will “raise authoritative sources” related to COVID-19 vaccine content.
YouTube, like other large tech platforms such as Facebook and Twitter, has been slow to combat misinformation about COVID-19. Business Insider’s Paige Leskin found evidence of sellers peddling cures or treatments for the disease on YouTube back in April.
One study in the American Journal of Tropical Medicine and Hygiene found that nearly 6,000 people were hospitalized in the first three months of the year because of false medical information about COVID-19 spread on Facebook and Twitter.
Many false claims originate from the far-right conspiracy theory QAnon, which falsely claims that reported deaths from COVID-19 are fake. YouTube CEO Susan Wojcicki told CNN this week that she hasn't committed to banning QAnon on the platform.
"We're very proactive in terms of removing it, and I think you'll see us continue to be so," Wojcicki told CNN's Poppy Harlow.
This article originally appeared on z24.nl