Analysis

New study gives State of the Union on science fact-checking

September 19, 2018
 


Last October, at a Palo Alto workshop held in advance of the 2017 World Conference of Science Journalists, Deborah Blum faced a roomful of science editors and asked a relatively simple question: How many of them worked regularly with fact-checkers? There were 25 editors from across the industry in attendance; only six or so hands went up.

Blum, the director of MIT’s Knight Science Journalism Program and publisher of its magazine Undark, was surprised to see such a small showing. She was curious whether it was representative of the industry, but she knew of no existing data.

So Blum hired Brooke Borel, a senior editor at Undark and author of The Chicago Guide to Fact-Checking, to compile some. The result is a report released last week called “The State of Fact-Checking in Science Journalism,” a systematic look at practices across the English-speaking science journalism world. Based on interviews with and surveys of editors, reporters, professors, and the fact-checkers themselves, the report reveals a healthy but inconsistent fact-checking practice, in which procedures, rules, and pay vary across publications, platforms, and types of stories. And while a 300-word aggregated post may not require the same fact-checking process as a multipart investigative feature, firming up science journalism’s approach to accuracy can only advance the field.


“If you have a robust process in place tailored to your publication, it can improve your output and improve your accuracy,” Borel says. “I think we all should be striving for fewer corrections and better reporting.”

Not everyone who responded to the survey felt that fact-checking was the most pressing need for science publications today, Borel says. But across the board, they generally agreed that fact-checking results in better-quality work, and many lamented the lack of resources available for it. One anonymous editor described “cringing” at stories published by their own publication that clearly hadn’t been fact-checked; Aeon editor Pamela Weintraub described losing any element of the editorial process, including fact-checking, as “taking away from the overall excellence.”


Science fact-checking can be resource intensive, demanding of the fact-checker a particular set of skills and knowledge. Fact-checkers need to be able to read and understand scientific literature that is intended for an audience of experts, and to evaluate an individual study in the context of the larger literature, often without academic expertise in the subject matter they are fact-checking. The report found that across publications, only 13 percent of fact-checkers had any kind of science degree, let alone an advanced degree in the specific subject of their stories.

“Fact-checkers who work on science and medical stories ideally should have some familiarity with scientific publishing and how peer review works,” says Laura Helmuth, president of the National Association of Science Writers and The Washington Post’s health, science, and environment editor. “It helps to understand clinical trials and some basic statistics. But the right attitude and approach are more important than an existing knowledge base: Any fact-checker can learn a field’s jargon and distinguish evidence from speculation.”

They also need to have a solid grounding in statistical concepts in order to properly evaluate claims. In medical journalism, for example, fact-checkers need to understand absolute versus relative risk. A treatment may accurately claim to decrease disease risk by 50 percent—a relative risk—while only reducing incidence from two patients out of 100 to one—the absolute risk.   
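To make the arithmetic concrete, here is a minimal sketch in Python of the two-in-100 example above; the figures are purely illustrative, not drawn from any real trial or from the report.

```python
# Illustrative only: the two-in-100 versus one-in-100 example from the
# paragraph above, not data from any real study.

def risk_reduction(control_events, treated_events, group_size):
    """Return (absolute, relative) risk reduction for a simple two-arm comparison."""
    control_risk = control_events / group_size   # e.g. 2 / 100 = 0.02
    treated_risk = treated_events / group_size   # e.g. 1 / 100 = 0.01
    absolute = control_risk - treated_risk       # one fewer case per 100 patients
    relative = absolute / control_risk           # the headline "50 percent" figure
    return absolute, relative

absolute, relative = risk_reduction(control_events=2, treated_events=1, group_size=100)
print(f"Absolute risk reduction: {absolute:.0%}")   # 1%
print(f"Relative risk reduction: {relative:.0%}")   # 50%
```

A story that reports only the relative figure is technically accurate, but it can badly overstate the practical benefit of a treatment.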

“It doesn’t take a PhD to do high-quality health care journalism reporting or science journalism reporting in general,” says Gary Schwitzer, founder and publisher of Health News Review, a website that evaluates stories in the medical press. “But it does require some knowledge or experience or training in certain key things that otherwise will mislead and misinform.”

Due to a lack of funding, Health News Review—which functions like a public fact-check on the healthcare media—is shuttering at the end of this year. On its way out, the publication is documenting instances when misreporting on medical studies harmed patients, like a couple who sank their life savings into opening a clinic as a last-ditch effort to treat the husband’s type 1 diabetes.

In an industry where misrepresenting results can lead to real harm for patients—and where clinical studies and press releases are rife with industry influence—fact-checkers who can catch these subtleties provide a true service to the public.

“It’s the nuance,” says Schwitzer. “You can technically be accurate and [still] you can be totally unhelpful and misleading.”

Schwitzer recommends all healthcare journalists and fact-checkers have good relationships with experts who can help them interpret claims—“have a friendly biostatistician that you buy a beer for every once in a while,” he says.

The report revealed a variety of ways in which publications could better support and utilize fact-checkers. While some receive excellent pay, many do not, and rates vary wildly across publications. The report found an average hourly rate of $27.76, with wages ranging from $15 to $50. (Some publications pay monthly or project-based flat fees.)

“Every time I’ve worked with a fact-checker, they’ve made a story better,” Helmuth says. “They catch details, ask questions a reporter or editor didn’t think of, and they notice when a chain of logic has some missing links.”

The industry also lacks set standards and training. Many outlets—65 percent of those without dedicated fact-checkers, according to the report—don’t provide written guidelines to fact-checkers, leading to inconsistencies in how publications handle parts of the fact-checking process, such as whether to show copy to sources before publication.

The debate over this practice has at times become contentious, both within the science journalism community and between journalists and scientists. In March, for example, an Undark reporter covering a Twitter poll about the practice found that none of the scientists discussing the issue openly online were willing to speak to her for a story, “citing a mistrust of journalists,” she wrote. Popular Science deputy editor Corinne Iozzio told KSJ that the inconsistency in policy had “cost us quality.”

“It’s kind of all over the place, which I wish it wasn’t,” Borel says. “I think it’s confusing for journalists and fact checkers and I think it’s confusing for sources.”

The mistrust arising from this inconsistency points to the larger role of fact-checking, beyond ensuring accuracy and helping to avoid lawsuits. Journalists trade on reliability: to their readers, to their sources, and to the public they serve. An independent fact-checking process provides another layer of assurance to all parties that a story best represents the truth, just as the peer-review process is designed to do in science.

“I’m hoping people will read this and think about their own processes,” Borel says of the report. “I hope also this will have some newsrooms rethinking . . . where they’re allocating their resources for this sort of thing.”



Laura Dattaro is a science journalist, writer, and producer based in New York.