- On Thursday, YouTube defended its decision to leave up videos spreading false or misleading claims about the 2020 election, including that President Donald Trump won.
- “Like other companies, we’re allowing these videos because discussion of election results & the process of counting votes is allowed on YT,” YouTube tweeted, adding that it wasn’t actively recommending the videos to users.
- The tweet came a few hours after Bloomberg journalist Mark Bergen criticized YouTube’s slow moderation of videos that falsely claim “Trump won.”
- Two days after the election, YouTube said it would allow videos making false or misleading claims about election results but that it wouldn't run ads on them.
YouTube on Thursday defended its decision to leave up videos falsely claiming that President Donald Trump won the 2020 election.
It was responding to a tweet from Bloomberg journalist Mark Bergen criticizing what he described as YouTube's slow moderation of election videos.
Bergen had previously reported that YouTube decided to leave up videos from the cable outlet One America News Network (OAN) that falsely claimed "Trump won."
In a response to Bergen’s tweet, YouTube’s official account wrote: “Like other companies, we’re allowing these videos because discussion of election results & the process of counting votes is allowed on YT.
“These videos are not being surfaced or recommended in any prominent way.”
Two days after the election, YouTube said it would allow videos making false or misleading claims about election results but that it wouldn't run ads on them.
In a follow-up tweet, the tech company said that the most popular videos about the US election came from "authoritative news organizations," but didn't say what it considers authoritative.
"On average, 88% of the videos in top-10 results in the U.S. come from high-auth sources when people search for election-related content," it said.
—YouTubeInsider (@YouTubeInsider) November 12, 2020
YouTube said in a third tweet that it links its election video panels with a Google webpage displaying verified election results.
YouTube didn't immediately respond to a request for comment from Business Insider.
The online video-sharing site has faced increased scrutiny throughout the 2020 presidential election for how it handles videos spreading misinformation.
For example, OAN, a pro-Trump broadcast network, posted a video on November 4 claiming that "Trump won four more years in office last night."
The video, which had amassed nearly 500,000 views at the time of writing, does not violate YouTube's community guidelines, according to the company.
Ivy Choi, a YouTube spokesperson, told Insider in a statement: "Our Community Guidelines prohibit content misleading viewers about voting, for example content aiming to mislead voters about the time, place, means or eligibility requirements for voting, or false claims that could materially discourage voting. The content of this video doesn't rise to that level."
YouTube said that it had demonetized the video.
Major social media platforms such as Twitter and Facebook have labeled posts containing false or unverified claims about the election.
Twitter has labeled several of Trump's own tweets as "disputed," and Facebook has actively demoted misinformation on its platform.