Facebook CEO Mark Zuckerberg.
Drew Angerer/Getty Images
  • Facebook will let people in groups become experts, whose posts will be amplified.
  • The idea is to designate "knowledgeable experts" who can serve as authoritative voices.
  • But the move is another way Facebook is skirting responsibility for what is posted on its site.

Facebook is once again enlisting someone else to do its work for it.

The company said this week that it would let the administrators of its groups appoint users as "experts."

That means the people who run those online communities – which span everything from housing opportunities to dogs to extremist movements – will choose people within those groups to serve as authoritative voices that others can rely on, an initiative partly designed to stamp out misinformation.

So, for example, an admin of a group devoted to conspiracy theories or white supremacy could designate someone within that space as an expert, whose posts would be amplified and whom others would be expected to trust.

The new move is reminiscent of Facebook's creation of the oversight board, often called its "supreme court." As the company faced mounting pressure over how strictly it moderates potentially harmful posts, it didn't fully take responsibility itself.

Instead, it poured $130 million into designating a group of people outside of the company to review Facebook's decisions.

That move backfired in May, when the group punted a case back to Facebook, telling it to do its own work, stop being "lazy," and make its own rules.

If the oversight board's scolding is any indication, Facebook's new "experts" tool could similarly produce a less-than-productive outcome.

And not only that: it could have harmful consequences if the platform ends up empowering certain users in online pockets devoted to topics that breed misinformation.

Groups were originally designed to curb misinformation

Pro-Trump protesters gather in front of the U.S. Capitol Building on January 6, 2021 in Washington, DC.
Brent Stirton/Getty Images

Facebook put a larger focus on groups after the 2016 presidential election, when the company started fielding intense backlash over how misinformation spread on its platform.

So CEO Mark Zuckerberg attempted to shift attention from Facebook's News Feed to its groups to "help connect one billion people with meaningful communities."

But issues have arisen with the feature, and Facebook has removed some groups that it said risked inciting violence.

A "Stop the Steal" group, for example, was created in November. 365,000 members joined that were convinced the 2020 presidential election was stolen from former President Donald Trump. But just two days later, Facebook removed it since it said the group was organized around "the delegitimization of the election process," and some members were making "worrying calls for violence.

And before that, in mid-2019, ProPublica reported on a 9,500-member private group of current and former Border Patrol agents who joked about immigrants dying and made crude comments about Rep. Alexandria Ocasio-Cortez.

Facebook said in its announcement that there are more than 70 million admins and moderators operating active groups around the globe.

Moderating that much content is no easy feat, but Facebook's experts tool shows that it still hasn't found an accountable way to stop falsehoods from spreading like weeds on its site.

Facebook did not respond to a request for comment.
