Facebook CEO Mark Zuckerberg in Washington D.C. on Oct. 23, 2019
Andrew Harnik/AP
  • Facebook will reduce how much political content people see in their News Feeds, Axios reports.
  • The platform will specifically rely less on users' past engagement habits when deciding which posts to show them.
  • Facebook has tried to de-prioritize political posts on its site over the past year.

Facebook is rolling out another effort to curb the spread of potentially contentious political posts on its platform.

Axios reported Tuesday that Facebook will use negative user feedback to de-prioritize political and current-events content in users' feeds.

Specifically, the platform will stop relying so heavily on the algorithmic predictions that estimate how likely someone is to share or comment on a given post based on their past engagement, according to the outlet. Instead, it will lean on what users say they want to see through surveys and other direct feedback.
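Neither Axios nor Facebook has disclosed how the new ranking works internally, but the shift the report describes, from scoring posts by predicted engagement to weighting explicit survey feedback, can be sketched in a few lines. Every name, field, and weight below is hypothetical, invented purely to illustrate the two approaches:

```python
# Hypothetical sketch of the two ranking approaches described above.
# None of these names, fields, or weights come from Facebook; they are
# invented solely to illustrate the reported shift in signals.

from dataclasses import dataclass

@dataclass
class Post:
    id: str
    predicted_engagement: float  # model's estimate of share/comment likelihood
    survey_interest: float       # e.g., share of surveyed users who said they
                                 # want to see more content like this

def engagement_score(post: Post) -> float:
    """Old-style ranking: optimize for predicted shares and comments."""
    return post.predicted_engagement

def feedback_score(post: Post, engagement_weight: float = 0.3) -> float:
    """Reported new direction: down-weight engagement predictions and
    lean on what users say they want to see in surveys."""
    return (engagement_weight * post.predicted_engagement
            + (1 - engagement_weight) * post.survey_interest)

posts = [
    Post("divisive-political-take", predicted_engagement=0.9, survey_interest=0.2),
    Post("local-news-update", predicted_engagement=0.4, survey_interest=0.7),
]

# Under pure engagement ranking the divisive post ranks first; under the
# survey-weighted ranking the order flips.
print(sorted(posts, key=engagement_score, reverse=True)[0].id)  # divisive-political-take
print(sorted(posts, key=feedback_score, reverse=True)[0].id)    # local-news-update
```

The toy example makes the trade-off concrete: a post that is highly engaging but unwanted in surveys loses its ranking advantage once survey feedback dominates the score.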

The change, which the company plans to start testing in countries outside the US, could affect news publishers whose content focuses on politics. Facebook did not immediately respond to Insider's request for comment.

Facebook has been heavily criticized in recent years, especially in 2020, over its role in spreading political misinformation online. It has historically taken a hands-off approach to moderating all kinds of content in an attempt not to be the "arbiters of truth," as CEO Mark Zuckerberg has put it.

Critics have zeroed in on the algorithm specifically for pushing more extreme and partisan content in front of the people it deems most likely to engage with it, prompting them to spend more time on the platform.

Facebook's top 10 posts by engagement are often dominated by conservative content, according to data from Facebook-owned CrowdTangle. Earlier this month, the platform released its first-ever "Widely Viewed Content Report," which ranked popular content on the site by what people actually see in their feeds rather than by engagement. The New York Times later reported that Facebook, fearing criticism, shelved an earlier version of the report that showed the most-viewed link on the platform featured coronavirus misinformation.

The move detailed in the report isn't the first effort Facebook has made to limit the amount of political and potentially divisive content on its platform.

In June 2020, Zuckerberg wrote in a USA Today op-ed that the company would allow users to turn off political ads.

"Everyone wants to see politicians held accountable for what they say - and I know many people want us to moderate and remove more of their content," Zuckerberg wrote.

In January, after the US Capitol insurrection - whose participants were found to have organized in advance on Facebook and other websites - the company said it would stop recommending political groups to users for the "long term." A few days earlier, Sen. Ed Markey had written a scathing letter to Zuckerberg condemning Facebook groups as "breeding grounds for hate."

And in February, the company started testing a temporary reduction of political posts in some users' News Feeds in the US, Canada, and other countries. The move, according to Zuckerberg, came because many users didn't want their feeds to consist so heavily of political content.

But the company didn't start grappling with this problem in 2020. In late 2019, Facebook was fielding heavy blowback over its policy of not fact-checking political advertising.

"We don't believe, however, that it's an appropriate role for us to referee political debates and prevent a politician's speech from reaching its audience and being subject to public debate and scrutiny," Facebook's VP of global affairs and communications Nick Clegg said at the time.
