Meta Ends Third-Party Fact-Checking Program, Expands Free Expression Policies

Meta is overhauling its content moderation policies, ending third-party fact-checking in the U.S. and introducing a user-driven Community Notes system.

Meta announced sweeping changes to its content moderation policies, including the end of its third-party fact-checking program in the United States. The company will transition to a Community Notes model, aiming to reduce censorship while maintaining transparency. These changes are part of a broader effort to prioritize free expression on its platforms, which include Facebook, Instagram, and Threads.

Meta’s Transition to Community Notes

The third-party fact-checking program, launched in 2016, faced criticism for perceived bias and overreach. Meta acknowledged that the program often led to the unintended censorship of legitimate political discourse.

The new Community Notes system, modeled after a similar initiative on X (formerly Twitter), will allow users to contribute context to posts deemed potentially misleading. These notes will be collaboratively written and rated by contributors from diverse perspectives. Meta stated it would not write or select the notes displayed on its platforms.
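Meta has not published how notes will be rated, but X's open-source Community Notes system uses a "bridging" approach: a note is surfaced only when contributors who usually disagree both rate it helpful. A toy sketch of that idea, with all names and thresholds as illustrative assumptions:

```python
# Toy sketch of "bridging"-style note rating (hypothetical; Meta has not
# published its algorithm). A note is shown only when raters from at least
# two distinct viewpoint clusters each find it helpful.

def note_is_shown(ratings, min_per_cluster=2, threshold=0.7):
    """ratings: list of (viewpoint_cluster, helpful: bool) tuples."""
    by_cluster = {}
    for cluster, helpful in ratings:
        by_cluster.setdefault(cluster, []).append(helpful)
    # Require enough raters in at least two distinct clusters...
    qualified = {c: votes for c, votes in by_cluster.items()
                 if len(votes) >= min_per_cluster}
    if len(qualified) < 2:
        return False
    # ...and a helpful majority within every qualified cluster.
    return all(sum(votes) / len(votes) >= threshold
               for votes in qualified.values())
```

Under this scheme, a note rated helpful only by one side of a divide never appears, which is the property that distinguishes bridging from simple majority voting.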

“Once the program is up and running, Meta won’t write Community Notes or decide which ones show up,” said Joel Kaplan, Meta’s Chief Global Affairs Officer. The company plans to phase in the program over the coming months, starting in the U.S.

Lifting Restrictions on Speech

Meta is also removing restrictions on several topics, such as immigration and gender identity, which it views as central to political discourse. The company acknowledged that its content moderation systems have been overly restrictive, leading to the wrongful removal of content and user frustration.

In December 2024 alone, Meta removed millions of pieces of content daily, but the company estimates that 10-20% of these actions may have been errors. To address this, Meta will focus automated systems on high-severity violations, including terrorism and fraud, while relying on user reports for less severe issues.

“We are in the process of getting rid of most [content] demotions and requiring greater confidence that the content violates [policies],” Kaplan noted.

Revisions to Enforcement and Appeals

Meta is revising its enforcement mechanisms to reduce errors. Changes include requiring multiple reviewers to agree before content is taken down and using large language models (LLMs) to provide second opinions on enforcement decisions.
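The two safeguards described above can be illustrated with a small sketch. The decision rule and function names here are assumptions for illustration, not Meta's actual pipeline; in particular, the article says the LLM provides a "second opinion," and this sketch assumes that opinion must concur before removal:

```python
# Hypothetical sketch of the safeguards the article describes: require a
# minimum number of human reviewers to agree, plus a concurring LLM
# second opinion, before content is taken down.

def should_remove(reviewer_votes, llm_second_opinion, min_reviewers=2):
    """reviewer_votes: list of bools (True = reviewer says it violates).
    llm_second_opinion: bool from a model review (stand-in for a real call)."""
    return sum(reviewer_votes) >= min_reviewers and llm_second_opinion
```

The point of the extra check is to trade some recall for precision, directly targeting the 10-20% error rate Meta cited for its automated takedowns.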

To improve the account recovery process, Meta is testing facial recognition technology and expanding its support teams to handle appeals more efficiently.

A Personalized Approach to Political Content

Meta plans to reintroduce more political and civic content to user feeds but with a personalized approach. The company’s previous efforts to reduce such content based on user feedback were deemed too broad.

Meta will now rank political content from followed accounts using explicit signals, such as likes, and implicit signals, like time spent viewing posts. Users will have expanded options to control how much political content appears in their feeds.
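A minimal sketch of blending the two signal types the article names, with weights, the formula, and the user-preference multiplier all illustrative assumptions rather than Meta's disclosed ranking:

```python
# Hypothetical scoring of a political post from a followed account, combining
# an explicit signal (likes) with an implicit one (dwell time). The weights
# and the user preference dial are assumptions for illustration only.

def political_content_score(likes, avg_view_seconds, user_political_pref=1.0):
    explicit = likes             # explicit signal: deliberate engagement
    implicit = avg_view_seconds  # implicit signal: time spent viewing
    base = 0.6 * explicit + 0.4 * implicit
    # user_political_pref in [0, 2]: 0 hides political content, 2 boosts it
    return base * user_political_pref
```

Setting the preference to zero would suppress political content entirely, which is one way the "expanded options" for user control could work in practice.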

This article, "Meta Ends Third-Party Fact-Checking Program, Expands Free Expression Policies," was first published on Small Business Trends.