Meta is ending its third-party fact-checking program and introducing a ‘community notes’ system similar to the one used by X. This change, announced by Meta CEO Mark Zuckerberg on Tuesday (January 7), is part of a broader shift in the company’s content moderation strategy.
The new system will let users contribute to the fact-checking process, with the aim of reducing errors and simplifying policies. Zuckerberg said the decision was influenced by recent elections and a desire to prioritize free speech. He argued that Meta’s previous moderation systems were too complex and prone to mistakes, leading to unnecessary censorship.
“We’re going to get back to our roots and focus on reducing mistakes, simplifying our policies, and restoring free expression on our platforms,” Zuckerberg said in a video. “More specifically, here’s what we’re going to do. First, we’re going to get rid of fact-checkers and replace them with community notes similar to X, starting in the U.S.”
The changes will affect Facebook, Instagram, and Threads, impacting billions of users worldwide. Meta will continue to focus on moderating content related to drugs, terrorism, and child exploitation while easing restrictions on political topics, immigration, and gender issues. The company will also relocate its trust and safety team from California to Texas.
Meta’s previous fact-checking program, launched in 2016, involved over 90 organizations worldwide. It used third-party fact-checkers to verify content accuracy and label misinformation. The shift to community-driven moderation reflects a growing trend among social media platforms to involve users in content oversight.