The Facebook logo displayed on a phone screen in Poland on November 29, 2020. Jakub Porzycki/NurPhoto via Getty Images
  • Facebook has banned false claims about COVID-19 vaccines in its latest attempt to stop the spread of misinformation.
  • Under the new rules, any claims about the safety, efficacy, ingredients, or side effects of the vaccines that have been debunked will be removed, as will any conspiracy theories.
  • The news comes as the UK this week became the first Western nation to approve a COVID-19 vaccine, made by Pfizer.

Facebook has banned false claims about COVID-19 vaccines in its latest attempt to stop the rapid spread of misinformation on the platform.

Under the new rules, any claims about the safety, efficacy, ingredients, or side effects of the vaccines which have been debunked will be removed, as will any conspiracy theories.

Kang-Xing Jin, Head of Health at Facebook, said in a blog post on Thursday: “For example, we will remove false claims that COVID-19 vaccines contain microchips, or anything else that isn’t on the official vaccine ingredient list.

“This is another way that we are applying our policy to remove misinformation about the virus that could lead to imminent physical harm.”

One of the most widespread, and thoroughly debunked, conspiracy theories holds that Microsoft co-founder Bill Gates created the pandemic in order to implant microchips in humans through a COVID-19 vaccine.

The news comes as the UK this week became the first Western nation to approve a COVID-19 vaccine, made by Pfizer.

Facebook has taken other steps as well: it removed 12 million coronavirus-related misinformation posts between March and October, including a video of Donald Trump saying children are "virtually immune," according to AP.

Warning labels were placed on 167 million pieces of content in the same period, 50 million of them in April alone, and 95% of people who saw the labels did not click through to view the flagged content, the Independent reported.

In October, the company banned all ads discouraging vaccination, with an exception for ads about government vaccine policies, and promoted articles debunking fake news on its information center, AP added.

The social networking site did not regulate anti-vaxx content until 2019, when it introduced its first policy of deleting misinformation that could lead to physical harm, according to The Guardian.

Jin added that the new rules would not be enforced overnight: "Since it's early and facts about COVID-19 vaccines will continue to evolve, we will regularly update the claims we remove based on guidance from public health authorities as they learn more."

YouTube, owned by Google, and TikTok have also said they will remove false claims about COVID-19 vaccines, NPR reported.
