Meta is making sweeping changes to its content moderation policies, including abandoning third-party fact-checking in favor of crowdsourced “Community Notes” and loosening restrictions on topics like immigration and gender identity. Under the updated Hateful Conduct policy, for example, calling gay and trans people “mentally ill” is now allowed, while an explicit ban on referring to women as “household objects” has been removed.
New policy lead Joel Kaplan said that in pursuit of “More Speech and Fewer Mistakes,” Meta will focus more on preventing over-enforcement of its content policies and less on mediating potentially harmful but technically legal discussions on its platform.
The news comes just two weeks before President-elect Donald Trump is set to return to the White House, and CEO Mark Zuckerberg’s announcement echoed many of the incoming administration’s talking points. Zuckerberg has promised to move US content review from California to Texas, where he says there’s “less concern about the bias of our teams,” and said Meta would work with Trump to “push back on governments around the world that are going after American companies and pushing to censor more.”
-
Image: Alex Parkin / The Verge
As I write this, many social network users are wondering whether they should look for a new home. Over at X, Elon Musk has essentially become part of the incoming Trump administration, while various changes have made the once-popular social network a dark and forbidding forest for many of its former inhabitants.
Meanwhile, Meta’s announcement that it was abandoning third-party fact-checkers and moving its trust and safety teams from California to Texas is making some Facebook and Instagram members nervous. So nervous, in fact, that while we previously included Meta’s Threads social network in this article as a possible alternative to X, we’ve pulled it — at least for now.
-
Mark Zuckerberg is in Threads replies defending his content moderation changes.
The Meta CEO is pushing back on critics who say the company is only making its content policy changes because it’s “too hard for people to leave.” Zuckerberg shot back that he’s “counting on these changes actually making our platform better,” and while some may leave for “virtue signaling,” most users will enjoy the changes.
-
Meta’s third-party fact-checking contracts will reportedly end in March.
The ten fact-checking organizations will continue to receive payments until August, and those that haven’t signed 2025 contracts could get severance, Business Insider reports. Meta told members of the International Fact-Checking Network that their partnerships were ending just 45 minutes before it publicly announced sweeping changes to its content moderation and fact-checking policies.
-
Image: Cath Virginia / The Verge; Getty Images
Experts warn that Meta’s decision to end its third-party fact-checking program could allow disinformation and hate to fester online and permeate the real world.
The company announced today that it’s phasing out a program, launched in 2016, in which it partnered with independent fact-checkers around the world to identify and review misinformation across its social media platforms. Meta is replacing the program with a crowdsourced approach to content moderation similar to X’s Community Notes.
-
“Mark, Meta — welcome to the party.”
Onstage at CES, X CEO Linda Yaccarino commended Mark Zuckerberg’s move to ditch third-party fact-checking in favor of Community Notes-style moderation inspired by X. “It couldn’t be more validating,” Yaccarino said. “Mark and Meta realized that it’s the most effective, fastest fact checking, without bias.”
“Mark, Meta — welcome to the party,” she added.
-
Image: Cath Virginia / The Verge; Getty Images
As part of Meta’s sweeping changes to content moderation announced today, CEO Mark Zuckerberg wrote on Threads that the company will also move its content moderation teams from California to Texas to “help remove the concern that biased employees are overly censoring content.”
“We’re going to move our trust and safety and content moderation teams out of California, and our US-based content review is going to be based in Texas,” Zuckerberg says in a video about the changes. “As we work to promote free expression, I think that it will help us build trust to do this work in places where there’s less concern about the bias of our teams.”
-
Oversight Board to Meta: hey, remember us?
The Meta Oversight Board — a semi-independent body that interprets Meta’s rules and suggests changes — has responded to the recent dissolution of the company’s third-party fact-checking system. Its statement contains a series of gently worded reminders to Meta that it exists and would very much like to continue existing in the future, pretty please.
-
Image: Cath Virginia / The Verge
I have to commend Meta CEO Mark Zuckerberg and his new policy chief Joel Kaplan on their timing. It’s not hugely surprising that, as the pair announced early today, Meta is giving up on professional third-party fact-checking. The operator of Facebook, Instagram, and Threads has been backing off moderation recently, and fact-checking has always been contentious. But it’s probably smart to do it two weeks before President-elect Donald Trump takes office — and nominates a Federal Communications Commission head who’s threatened the company over it.
Trump’s FCC chairman pick (and current FCC commissioner), Brendan Carr, is a self-identified free speech defender with a creative interpretation of the First Amendment. In mid-November, as part of a flurry of lightly menacing missives to various entities, Carr sent a letter to Meta, Apple, Google, and Microsoft attacking the companies’ fact-checking programs.
-
Image: Laura Normand / The Verge
Facebook, Instagram, and Threads are ditching third-party fact-checkers in favor of a Community Notes program inspired by X, according to an announcement penned by Meta’s new Trump-friendly policy chief Joel Kaplan. Meta is also moving its trust and safety teams from California to Texas.
“We’ve seen this approach work on X – where they empower their community to decide when posts are potentially misleading and need more context, and people across a diverse range of perspectives decide what sort of context is helpful for other users to see,” Meta said. “We think this could be a better way of achieving our original intention of providing people with information about what they’re seeing – and one that’s less prone to bias.”