Since Donald Trump’s reelection, Wikipedia has been catching right-wing bullets. There have been taunts from Elon Musk, prompted by a page that described his Nazi-style salute during the inauguration. “Defund Wikipedia,” he posted on X—which, under his ownership, has positioned itself as a competing source of “authoritative” information. (“Elon Musk salute controversy” now has its own Wikipedia entry.) Other tech executives followed suit, including Aravind Srinivas, the CEO of an artificial intelligence company called Perplexity, who put out a call for an “unbiased” encyclopedia reliant on his product. Around the same time, The Forward reported that the Heritage Foundation, the right-wing think tank, plans to “identify and target” Wikipedia editors, and the New York Post published an editorial citing a “bombshell new report” from a group called the Media Research Center: “For Wikipedia, ‘true’ is now synonymous with ‘left wing.’”
Amid the attention, a group of Wikipedia editors in New York convened for a “Wiki Wednesday” meeting, a regular gathering to chat about articles in progress, troubleshoot, and mingle. In a co-op workspace a few blocks north of Times Square, the crowd munched on toasted empanadas, beef patties, and chocolate chip cookies. The host—a longtime Wikipedia editor, also known as a Wikimedian—let everyone know that they could head upstairs and check out a temporary shelter for rescued turtles. He announced an upcoming Wiki-fashion show; the editors reported updates on articles they’d been working on (number of babies born on the New York subway: nine, as of February 12). Then they conferred about threats from the MAGAsphere.
Molly Stark, an editor, bemoaned the ire and expressed concern about its implications for the site's security. "The whole rest of the internet has been through 'enshittification'—and everything else is under paywalls, or it's under advertising of some sort, and Wikipedia is the last bastion of the old internet," she said.
"I think mostly it's just personal vendettas, right? Like people have an issue with their Wikipedia page or the way that they were portrayed, and then, you know, that kind of turns into 'Wikipedia is bad,'" said Pacita Rudder, the director of Wikimedia NYC, the official chapter for editors in the city. (Musk's anti-Wiki vitriol goes back years, to the time he complained about having been identified as an "early investor" in Tesla, instead of a founder. He offered up a billion dollars if the site changed its name to "Dickipedia.") "What people like Donald Trump and Elon Musk complain about are usually more contentious articles, like wars or US politics. I try to avoid these types of things."
Others didn't seem worried. "We've been fighting off attacks of one kind or another for twenty years, almost," Jim Henderson, an editor since 2006, said. Henderson, who is seventy-six and a former telephone operator, has been a prolific Wikipedia photographer, both shooting and, increasingly, tagging photos in Wikimedia Commons for several hours a day.
Fellow editors speedily recalled Wikipedia's brushes with foreign governments. Last year, in Russia, the Kremlin cloned the site, edited out anything deemed unfit, then banned the original. In late 2023, the Wikimedia Foundation announced that the internet service provider that carries most of Wikipedia's traffic in Gaza had suffered disruptions that cut readership dramatically. Back in 2013, the French secret service threatened an editor with arrest over an article that officials said contained classified information, pressuring him to delete it. (The page had been online since 2009, and was later restored.) For Ryan Ng, who is twenty-six and has been editing Wikipedia since middle school, all of that means the United States is, at least, unexceptional. "Wikipedia is a global project," he said. "It can't just be centered on one country."
But if the common feeling in the room was that Wikipedia wasn’t under existential threat, the editors still felt vulnerable. The creation of pages and revisions of entries are entirely transparent—just click “view history”—but editors tend to work under pseudonyms. The Heritage Foundation, according to The Forward, intends “to use facial recognition software and a database of hacked usernames and passwords in order to identify contributors to the online encyclopedia.” That is ostensibly in the interest of combating anti-Semitism—in June, a panel of Wikipedia editors categorized the Anti-Defamation League as a “generally unreliable” source of information on the Israeli-Palestinian conflict, limiting how it can be cited. In reality, the ADL is handled differently depending on the subject—a system fully explained, including the methodology behind the categorization, on Wikipedia’s “Reliable sources/Perennial sources” page. “When you choose to become an editor, it’s because you’re passionate about an issue or you’re passionate about making sure that knowledge exists and it’s free for people to use,” Rudder said. “You don’t get paid to do this, and you didn’t sign up to be attacked.”
No one was sure exactly what could be done to dox people or hack articles. Some suggested possibilities, which others shot down. It seemed a pain to think about, as one editor put it: “I don’t want Wikipedia to become, like, this center for resistance.”
For a few, the back-and-forth was reassuring. “If you meet three Wikipedia editors, you’d feel better about the long-standing stamina of Wikipedia,” Stark said. “Like, if you see all these editors, you’re just like, Oh, these are the people who are trying to make sure that Wikipedia is going to remain not just intact, but that we’re going to keep telling this evolving story of the internet.”