Meta Platforms has announced a major shift in its approach to content moderation in the United States, ending its fact-checking program. The parent company of Facebook, Instagram, and Threads, whose platforms collectively reach more than 3 billion users, will also relax restrictions on contentious subjects such as immigration and gender identity. The move comes just as President-elect Donald Trump prepares to take office.
Key Changes
Rather than relying on an official fact-checking program to address questionable claims across its platforms, Meta plans to implement a community-based system similar to the one on Elon Musk’s X (formerly Twitter). This new approach, known as “community notes,” will let users add context to posts and flag misinformation, shifting responsibility from the company to the community. Additionally, Meta will no longer proactively monitor hate speech and other policy violations; instead, posts will be reviewed only if users flag them.
Meta will also relocate its content moderation teams from California, where most of the company’s US operations are based, to Texas and other parts of the country.
A Shift In Policy
This overhaul is the most significant change to Meta’s content moderation strategy to date, and it signals a potential realignment with the incoming presidential administration. CEO Mark Zuckerberg appears to be steering the company back toward a more permissive stance on freedom of expression.
In a video statement, Zuckerberg said, “We’ve reached a point where it’s just too many mistakes and too much censorship. It’s time to get back to our roots around free expression.”
A Strategic Move
Meta’s decision follows its recent elevation of conservative figures within the company. Joel Kaplan, a former Republican political operative, was appointed head of global affairs, while Dana White, CEO of the UFC and a close ally of Trump, was named to the board. Zuckerberg has publicly expressed regret over some of Meta’s earlier content moderation decisions, particularly those related to COVID-19. The company also donated $1 million to Trump’s inauguration fund, a departure from its past practice.
A Backlash
The decision to end the fact-checking program, which began in 2016, was met with strong opposition from partner organizations. Angie Drobnick Holan, head of the International Fact-Checking Network, called it “a serious blow to the community,” emphasizing that fact-checkers did not censor posts but provided additional context and debunked false claims and conspiracies.
What’s Next?
For now, Meta plans to implement these changes only in the US market. It remains unclear whether similar changes will follow in the European Union, which has adopted stricter regulations through the Digital Services Act. The law, which came into force in 2023, requires large online platforms such as Facebook and X to address illegal content and risks to public safety. X’s “Community Notes” feature is already under scrutiny by the European Commission, which launched an investigation in December 2023.
If Meta or any other company violates EU regulations, it could face fines of up to 6% of its global annual revenue.