India’s Fact-Checkers React to Meta’s Policy Change

When Mark Zuckerberg suddenly abandoned his company’s third-party fact-checking program, I checked in with my old colleagues—who are reeling from the news.

January 13, 2025
Image source: Instagram / Illustration by Katie Kosma

In the wee hours of Tuesday morning, Silicon Valley time, Mark Zuckerberg, the CEO of Meta, announced sweeping changes to the company. “It’s time to get back to our roots around free expression,” he declared. Meta would loosen its filters, unleash more political content, and eliminate its third-party fact-checking program, starting in the United States. The last of these would scrap an effort begun in 2016 following Donald Trump’s election—an investment of more than a hundred million dollars—and replace it, on the eve of his return to the White House, with “community notes,” a system by which account holders on Meta’s platforms will decide what’s misleading. “The fact-checkers have just been too politically biased,” Zuckerberg said, “and have destroyed more trust than they’ve created.” When I awoke to the news, I thought first of my former colleagues at an Indian outlet called The Quint, one of around a dozen newsrooms in the country with which Meta has partnered on fact-checking.

The Quint, where I worked as deputy editor, is an organization of some forty journalists, publishing in Hindi and English, with a special division called Webqoof, focused on combating mis- and disinformation and improving media literacy. The name, a play on the Hindi word bewakoof, is a reference to the foolishness of falling for disinformation on the Web. Even as the size of The Quint’s newsroom has diminished somewhat in recent years—Indian media has been subject to many of the same economic and political challenges as the press in the US—the Webqoof team has remained fairly consistent, including six or seven fact-checkers who publish around ninety stories per month, or more than a thousand a year. That’s because their work has been funded reliably through grants from Meta’s third-party fact-checking program. Meta’s support has not only helped them combat falsehoods at scale, in what is arguably the world’s disinformation capital, by volume, but also made room for other important journalism: an immersive multimedia feature to help readers detect AI-generated images, a deeply reported video series on how online disinformation has led to real-world harm, and a series scrutinizing claims by politicians. Crucially, the fact-checking grants have been large enough to subsidize the work of other Quint colleagues.

That sense of stability has now abruptly been lost. Even though Zuckerberg’s announcement centered on the US, where he recently contributed a million dollars to Trump’s inaugural fund, Meta has been among the largest financial supporters of fact-checking operations worldwide, bankrolling efforts to combat disinformation in more than a hundred countries. Everyone can see what’s coming. “I was really tense all night,” a Quint journalist told me. “This beat that was thriving might not anymore due to a lack of funding. I might have to change my career trajectory.” Said another: “I am feeling hopeless and completely numb.” At the end of the week, The Quint joined dozens of other international fact-checking partners in signing an open letter to Meta, arguing that the decision to end its program “is a step backward for those who want to see an internet that prioritizes accurate and trustworthy information”—and pledging commitment to their work.

The deal has been that, through its third-party fact-checking program, Meta pays partner newsrooms such as The Quint to examine viral posts on its platforms and flag any that include misinformation, noting whether they’re false, have missing context, or include an altered photo or video. That information then appears with the posts in question, along with labels flagging that something has been reviewed by an independent source. Account holders have also been given the option to click on a “see why” box providing a link to a corresponding article by a fact-checking partner, explaining how its determination was reached. Each time a fact-checker has flagged a post as false, Meta has claimed, the company would “significantly reduce that content’s distribution so that fewer people see it, label it accordingly, and notify people who try to share it.” That has by no means been a perfect solution—the mis- and disinformation on Meta’s apps is just too rampant. But it has been a marriage of convenience, a consistent source of funding for outlets whose business models social media companies largely upended.

Pulling those resources could be particularly consequential in India, where the “community notes” approach would depend on the “wisdom” of a crowd that has repeatedly proven unreliable—and even fatal, as in 2018, when WhatsApp rumors of child abduction led more than a dozen people to be killed in mob attacks. A significant driver of the country’s disinformation crisis is anti-Muslim sentiment, targeting a minority community that is especially vulnerable as Narendra Modi, India’s prime minister and a Hindu nationalist, displays diminishing tolerance for other groups. As a case in point: In 2023, a deadly train accident was followed immediately by malicious online disinformation claiming that the station master where the tragedy occurred was at fault—and a Muslim. Fact-checkers at The Quint quickly reviewed those claims against evidence sourced from people who had been at the scene, from authorities, and from images, and concluded that the accusations circulating on social media were not true. Posts on Facebook that had shared those rumors were soon appended with warnings attesting to their falsehood. When I reached out to former colleagues, we agreed: Zuckerberg’s “community notes” would never have led to the same result—or credibility. “Having fact-checking partners was like community notes, but better,” an editor told me, “following the highest principles of not just fact-checking, but journalism.”

The range of mis- and disinformation that looms is vast—not only for politics, but for public health, religious freedom, education, and beyond. In the US, since Zuckerberg’s announcement, the way older fact-checks appear on Meta’s platforms has already begun changing. Journalists in India are following along anxiously. “It’s not just a scary situation for fact-checkers,” a former Quint colleague told me, “but also scary as an individual.”

Meghnad Bose is a Delacorte fellow at CJR.