Meta, the parent company of Facebook and Instagram, is overhauling its content moderation approach by eliminating third-party fact-checkers in favor of user-generated “community notes.” This change, announced by CEO Mark Zuckerberg on Tuesday, follows a similar shift by Elon Musk on X (formerly Twitter). The new system, which will initially roll out in the United States, aims to address accusations of political bias in moderation, especially from conservative groups.
Zuckerberg’s Rationale: Reducing Bias and Restoring Trust
Zuckerberg explained that the shift was motivated by concerns that fact-checking systems had become politically biased, undermining trust in the platform. He acknowledged that while the move would reduce the oversight of harmful content, it was necessary to counteract the censorship that some felt targeted their views.
“What started as a movement to be more inclusive has increasingly been used to shut down opinions,” Zuckerberg said in a video. He emphasized that Meta was trying to balance free expression with the need to limit misinformation.
Political Influence and Shifting Leadership
Meta’s decision also coincides with a broader political shift within the company. The appointment of Joel Kaplan as Meta’s Chief of Global Affairs signals a more conservative direction, including closer ties to President-elect Donald Trump. Kaplan, a longtime Republican operative, acknowledged that Meta had been under political pressure to increase content moderation over the past few years.
In a move that reinforced these political shifts, Meta also named UFC CEO Dana White, a Trump ally, to its board and pledged $1 million to Trump’s inaugural fund. Kaplan admitted that these changes were made with the incoming administration in mind, describing it as more supportive of free speech.
Trump’s Support and Criticism of the Change
Trump, who had been a vocal critic of Meta’s previous moderation policies, praised the decision. At a Mar-a-Lago press conference, he noted that Meta had “come a long way” and suggested that the policy change was likely in response to his past criticisms. The president-elect expressed satisfaction with the company’s shift, signaling a potential alliance with Meta’s new direction.
On the other hand, some groups criticized the change. The Real Facebook Oversight Board, an independent organization focused on holding Meta accountable, condemned the decision, accusing the company of pandering to political interests. The group warned that the new policy could lead to an increase in harmful content across the platform.
Following Elon Musk’s Lead
Meta’s move to replace fact-checkers with community notes mirrors the approach taken by Musk after his acquisition of Twitter. Musk dismantled much of Twitter’s existing moderation apparatus and expanded Community Notes, a crowdsourced fact-checking feature, into the platform’s primary tool for contextualizing disputed posts. Kaplan applauded Musk’s role in shifting the debate on free expression, acknowledging that this model had influenced Meta’s decision.
X CEO Linda Yaccarino and Musk both praised the change, with Musk calling it “cool” and Yaccarino lauding it as a successful model for balancing free speech with content accountability.
Modifying Automated Systems and Redefining Policy Violations
Alongside the shift to community notes, Meta will scale back the automated systems that previously flagged content for policy violations. Zuckerberg admitted that these systems often resulted in over-enforcement, with legitimate content being mistakenly removed. Moving forward, automated systems will focus on detecting severe violations such as terrorism, child sexual exploitation, and fraud; other content will be reviewed only if flagged by users.
Zuckerberg acknowledged that this change might allow more harmful content to remain but insisted that it was necessary to reduce the number of wrongful removals.
Relaxing Restrictions on Sensitive Topics
Meta is also loosening restrictions on politically sensitive topics, including immigration and gender identity. The company will allow more political content in users’ feeds and ease moderation of these subjects. Additionally, Meta is relocating its trust and safety teams from California to Texas, a move Zuckerberg said was intended to address concerns about perceived bias within those teams.