Meta Shifts Gears: Embracing Community-Based Content Moderation
Meta has announced the end of its U.S. fact-checking program, opting instead for a community-based content moderation system akin to the one used by X. The change marks a reversal of its moderation policy, allowing more open discussion of sensitive topics. Meta's Oversight Board supports the move.
In a significant policy shift, social media giant Meta has scrapped its U.S. fact-checking program in favor of a new community-based system for content moderation. The approach, similar to the system employed by X, marks a reversal for a company long known for strict content moderation under CEO Mark Zuckerberg.
The change follows the appointment of Joel Kaplan, a Republican policy executive, as Meta's new head of global affairs, as well as Dana White's election to the company's board. Zuckerberg emphasized the need to reduce errors, simplify policies, and restore free expression on the platform, pointing to past political pressure and criticism that the company had engaged in undue censorship.
The end of the fact-checking program, launched in 2016, shocked some partner organizations, while others, including the Oversight Board, welcomed the move. Meta plans to roll out a Community Notes system that lets users add context to potentially misleading posts. The shift affects Facebook, Instagram, and Threads, which together reach more than 3 billion users worldwide.
(With inputs from agencies.)