Meta shelves fact-checking for Community Notes model in major policy reversal
Social media giant Meta on Tuesday scrapped its US fact-checking program in favor of a community-based system similar to the one used by X, and eased curbs on discussions of contentious topics such as immigration and gender identity.
The move marks a reversal in Meta's content moderation policy as CEO Mark Zuckerberg has long championed active content moderation despite criticism from conservatives for alleged censorship on its platforms.
It also comes shortly after the company named prominent Republican policy executive Joel Kaplan as global affairs head and elected Dana White, CEO of Ultimate Fighting Championship and a close friend of President-elect Donald Trump, to its board.
"We've reached a point where it's just too many mistakes and too much censorship. It's time to get back to our roots around free expression," Zuckerberg said in a video.
"We're going to focus on reducing mistakes, simplifying our policies and restoring free expression on our platforms. We're going to tune our content filters to require much higher confidence before taking down content."
The end of the fact-checking program, started in 2016, caught some partner organizations by surprise.
"We didn't know that this move was happening and it comes as a shock to us. This is definitely going to affect us," said Jesse Stiller, managing editor at Check Your Fact.
Other partners, including Reuters, AFP and USA Today, did not immediately respond to requests for comment. Meta's independent Oversight Board welcomed the move.
The latest changes will affect Facebook, Instagram and Threads, three of the biggest social media platforms, which together have more than 3 billion users worldwide.
Zuckerberg has in recent months expressed regret over certain content moderation actions on topics including COVID-19. Meta also donated $1 million to Trump's inaugural fund, in a departure from its past practice.
Community Notes model
The success of the move, however, remains to be seen.
Elon Musk's X is already under European Commission investigation over dissemination of illegal content in the EU, and the effectiveness of measures taken to combat information manipulation, including the "Community Notes" system.
The Commission launched the probe in December 2023, several months after X rolled out its "Community Notes" feature.
Commission spokespeople were not immediately available for comment.
Meta said it would start phasing in Community Notes in the U.S. over the next couple of months and improve the model over the year.
The new model will allow users to call out posts that are potentially misleading and need more context, rather than placing the responsibility on independent fact-checking organizations and experts.
Meta will not decide which Community Notes show up on posts.
"It's a smart move by Zuck and something I expect other platforms will follow," X CEO Linda Yaccarino said in a post.
Meta will also move its trust and safety teams, which oversee content policies and review content, from California to Texas and other US locations.
Meta said it will focus its automated systems on illegal and high-severity violations, including terrorism and drugs. — Reuters