Meta Ditches Fact-Checkers, Bets On Free Speech And Embraces User-Led Moderation

Mark Zuckerberg announced Meta's shift to user-led moderation, reducing censorship and boosting political content while addressing bias concerns.

Meta CEO Mark Zuckerberg announced major changes to the company’s content moderation strategy, focusing on free speech and less censorship on platforms like Facebook, Instagram, and Threads. In a recent video message, Zuckerberg revealed plans to eliminate fact-checkers, opting instead for a community-driven moderation system akin to the one used by X, where users provide context and caveats to controversial posts.

Zuckerberg cited political bias as a primary reason for discontinuing Meta’s current fact-checking system, arguing it had eroded trust. He also stated that content moderation teams would be relocated from California to Texas to address concerns about bias. Acknowledging the risks, Zuckerberg admitted the new approach would likely result in fewer harmful posts being caught but stressed the importance of prioritizing free expression.

The reforms will further relax restrictions on sensitive subjects, such as immigration and gender issues, which Zuckerberg described as out of step with mainstream discourse. He vowed to work with President Donald Trump to push back against foreign governments that, in his view, are targeting American tech companies in order to censor Americans.

Zuckerberg also hit out at regulators in Europe and Latin America, accusing them of hampering innovation with censorship laws and secret court orders.

This comes after Nick Clegg, Meta’s president of global affairs, resigned; he will be replaced by Joel Kaplan, a well-known Republican. Meta’s Oversight Board welcomed the shift towards community moderation but emphasized that broad input from platform users is necessary to ensure effective and speech-friendly policies.

Zuckerberg framed the changes as a return to his 2019 position on free expression. He acknowledged the trade-off involved: by focusing automated enforcement solely on illegal, high-severity violations, more harmful content will inevitably slip through in exchange for fewer wrongful takedowns.

With this new moderation model, Meta hopes to foster more open discourse and leverage its huge user base to help manage content on its platforms.
