research misconduct – The Journalist's Resource
https://journalistsresource.org

How to cover academic research fraud and errors: 4 big takeaways from our webinar
The Journalist's Resource, Dec. 5, 2023
https://journalistsresource.org/media/how-to-cover-academic-research-fraud-errors-webinar/

Read on for great tips from Ivan Oransky, Elisabeth Bik and Jodi Cohen, three experts who have covered research misconduct or have hands-on experience monitoring or detecting it.


In 2022, academic journals retracted more than 4,600 scientific papers, often because of ethical violations or research fraud, according to the Retraction Watch blog and database.

Although retractions represent a tiny fraction of all academic papers published each year, bad research can have tremendous impacts. Some studies involve new drugs, surgical procedures and disease prevention programs — all of which directly affect public health and safety. Also, government leaders rely on scholarly findings to help guide policymaking in areas such as crime, education, road safety, climate change and economic development.

On Nov. 30, The Journalist’s Resource hosted a free webinar to help journalists find and report on problematic research. Three experts who have covered research misconduct or have hands-on experience monitoring or detecting it offered a variety of tips and insights.

“How to Cover Academic Research Fraud and Errors” — a video of our Nov. 30 webinar

For those of you who missed the webinar, here are four of the big takeaways from our presenters, Ivan Oransky, a former president of the national Association of Health Care Journalists who teaches medical journalism at New York University and co-founded Retraction Watch; Elisabeth Bik, a microbiologist and science integrity consultant who has been called “the public face of image sleuthing;” and Jodi Cohen, an award-winning investigative reporter at ProPublica whose series “The $3 Million Research Breakdown” exposed misconduct in a psychiatric research study at the University of Illinois at Chicago.

1. Retraction Watch and PubPeer are two online resources that can help journalists identify and track research fraud and errors.

Retraction Watch, a blog launched in 2010, is a treasure-trove of information about research papers that have been removed from academic journals. The website features:

  • The Retraction Watch Database, which journalists can use to search for retractions connected to a specific researcher, university or research organization. Use it to look for patterns — for example, retractions among groups of researchers who tend to work together or among multiple researchers working at the same institution.
  • The Retraction Watch Leaderboard, an unofficial list of researchers with the highest number of paper retractions.
  • A list of scientific sleuths, including self-described “data thug” James Heathers and Michèle B. Nuijten, who, along with Chris Hartgerink, created statcheck, a tool designed to find statistical errors in psychology papers. Some of these experts use aliases to protect against retaliation and harassment.
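The pattern-hunting described above (grouping retraction records by researcher or institution) can be scripted once you have a CSV export of retraction records. This is a minimal sketch using only Python's standard library; the column names ("Institution", "Author", "Reason") are hypothetical, so check the headers of the export you actually download.

```python
import csv
import io
from collections import Counter

# A toy export illustrating the idea. These column names are hypothetical;
# inspect the real export's header row before adapting this.
sample_csv = """Institution,Author,Reason
University A,Jane Doe,Falsification of data
University A,John Roe,Falsification of data
University B,Ann Poe,Plagiarism
"""

def retractions_by_institution(csv_text: str) -> Counter:
    """Count retraction records per institution in a CSV export."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return Counter(row["Institution"] for row in reader)

counts = retractions_by_institution(sample_csv)
# Institutions with the most retractions come first.
print(counts.most_common())
```

Swapping `row["Institution"]` for `row["Author"]` gives the same tally per researcher, which is one quick way to spot the clusters of retractions the database can reveal.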

Retraction Watch helped Cohen report on and provide context for a ProPublica investigation into the work of prominent child psychiatrist Mani Pavuluri.

It “was a huge resource in trying to understand this,” Cohen told webinar viewers. “The amount of information there and the ability to use that database — completely amazing.”

In her series, co-published in The Chronicle of Higher Education in 2018, Cohen revealed that Pavuluri “violated research rules by testing the powerful drug lithium on children younger than 13 although she was told not to, failed to properly alert parents of the study’s risks and falsified data to cover up the misconduct, records show.” The University of Illinois at Chicago, Cohen wrote, “paid a severe penalty for Pavuluri’s misconduct and its own lax oversight.” The federal government required the school to return the $3.1 million the National Institutes of Health gave it to fund Pavuluri’s study.

PubPeer is a website where researchers critique one another’s work. Comments are public, allowing journalists to observe part of the scientific process and collect information that could be useful in a news story.

Bik noted during the webinar that PubPeer is “heavily moderated” to reduce the likelihood of name-calling and speculation about a researcher’s work. The website explains its commenting rules in detail, warning users to base their statements on publicly verifiable information and to cite their sources. Allegations of misconduct are prohibited.

“You cannot just say, ‘You’re a fraud,’” Bik explained. “You have to come with evidence and arguments similar to a peer review report.”

PubPeer played a key role in student journalist Theo Baker’s investigation of academic papers co-authored by Stanford University President Marc Tessier-Lavigne. Tessier-Lavigne ultimately resigned and Holden Thorp, the editor-in-chief of the Science family of journals, announced in late August that two of Tessier-Lavigne’s papers had been retracted.

The Journalist’s Resource created a tip sheet on using PubPeer in August. Tip #1 from that tip sheet: Install a free PubPeer browser extension. When you look up a published research paper, or when you visit a website that links to a research paper, the browser extension will alert you to any comments made about it on PubPeer.

2. Early in the reporting process, ask independent experts to help you confirm whether a research study has problems.

Getting guidance from independent experts is critical when reporting on research fraud and errors. Experts like Elisabeth Bik can help you gauge whether problems exist, whether they appear to be intentional and how serious they are.

During the webinar, Bik advised journalists to ask for help early in the reporting process and seek out experts with the specific expertise needed to assess potential problems. Bik specializes in spotting misleading and manipulated images. Others specialize in, for example, statistical anomalies or conflicts of interest.

Bik’s work has resulted in 1,069 retractions, 1,008 corrections and 149 expressions of concern, according to her Science Integrity Digest blog. Journal editors typically issue an expression of concern about an academic paper when they become aware of a potential problem, or when an investigation is inconclusive but there are well-founded indicators of misleading information or research misconduct.

Bik stressed the importance of journalists helping correct the scientific record and holding researchers accountable.

“It seems that there’s relatively very few papers that have big problems that get corrected or retracted,” she said. “Institutional investigations take years to perform and there’s very rarely an action [as a result]. And senior researchers, who are the leaders, the mentors, the supervisors and the responsible people for these things happening in their lab, they are very rarely held accountable.”

Oransky encouraged journalists to get to know the scientific sleuths, some of whom are active on X, formerly known as Twitter.

“You can find dozens of people who do this kind of work,” he said. “It’s like any kind of whistleblower or source that you can develop.”

Oransky also highlighted common types of misconduct that journalists can look out for:

  • Faked data.
  • Image manipulation.
  • Plagiarism.
  • Duplication or “self-plagiarism” — when researchers reuse their own writings or data, taking them from a study that has already been published and inserting them into a newer paper.
  • Fake peer review — a peer review process that has, in whole or in part, been fabricated or altered to ensure a paper gets published.
  • Paper mills — organizations that create and sell fraudulent or potentially fraudulent papers.
  • Authorship issues.
  • Publisher errors.

3. One of the best ways to get tips about research fraud is to report on research fraud.

Oransky shared that he and other people at Retraction Watch continually receive tips about research misconduct. Tipsters will come to journalists they think will report on the issue, he said.

“You write about it and then people come to you,” Cohen added. “They don’t know you’re there unless you’re covering it regularly. And not even regularly, but like you start writing about it and show it’s something you’re interested in, you’re going to get more ideas.”

Another place journalists can go to check for allegations of research misconduct: court records, including subpoenas. They can also ask public colleges and universities for copies of records such as investigative reports and written communication between researchers and their supervisors, Cohen pointed out. If the research involves human subjects, journalists could request copies of reports and communications sent to and from members of the Institutional Review Board, a group charged with reviewing and monitoring research to ensure human subjects’ safety and rights are protected.

Cohen suggested journalists ask local colleges and universities for records tied to research funding and any money returned to funders. The National Institutes of Health maintains a database of organizations that receive federal grant money to conduct biomedical research.

“You could just start digging around a little bit at the institutions you cover,” Cohen said. “Be skeptical and ask questions of the data and ask questions of the people you cover.”

4. Discuss with your editors whether and how you’ll protect the identities of whistleblowers and experts who want to remain anonymous.

Many experts who leave comments on PubPeer or raise questions about research on other online platforms use aliases because they don’t want their identities known.

“You can imagine that not everybody wants to work under their full name so some of them are using all kinds of pseudonyms, although recently some of these people have come out under their full names,” Bik said. “But it is work obviously that doesn’t leave you with a lot of fans. Especially the people whose work we criticize are sometimes very mad about that, understandably so. But some of them have sued or threatened to sue some of us.”

Oransky said he has no issues letting scientific sleuths stay anonymous. They can explain their concerns in detail and show journalists their evidence. As with any source, journalists need to check out and independently confirm information they get from an anonymous source before reporting on it.

“Anonymous sources that are vulnerable — which a whistleblower is, which someone in a lab who’s pointing out problems is, especially a junior person — as long as you know who they are, your editor knows who they are, that’s my rule,” he said. “We want to understand why they want anonymity, but it’s usually pretty obvious.”

Download Oransky’s slides from his presentation.

Download Bik’s slides from her presentation.

5 tips for using PubPeer to investigate scientific research errors and misconduct
The Journalist's Resource, Aug. 1, 2023
https://journalistsresource.org/home/pubpeer-research-misconduct-tips-journalists/

PubPeer, a website where scholars critique one another’s work, is an excellent investigative reporting tool. These five tips will help you make the best use of it.


PubPeer, a website where researchers critique one another’s work, has played a key role in helping journalists uncover scientific misconduct in several prominent investigative stories in recent years — including the student newspaper series that led to Stanford University President Marc Tessier-Lavigne’s recent resignation.

The platform was created 10 years ago to encourage discussion of individual academic studies and “accelerate the correction of science,” co-founder Boris Barbour, a neuroscience researcher in France, told The Journalist’s Resource. Conversations generally center on papers that already have been peer reviewed and published in academic journals. When a discussion of a paper begins, PubPeer automatically invites an author to respond, sometimes spurring lengthy, detailed exchanges.

Comments are public, allowing journalists to observe part of the scientific process and collect information that could become the basis of an important news story. The vast majority of comments, however, are made anonymously, allowing scholars to raise questions and concerns without risking retaliation, Barbour noted.

“Although all forms of scientific discussion are welcome on PubPeer, the site has become known as the channel by which an astonishing volume of research misconduct has come to light,” he wrote in an email.

Last year, Stanford student journalist Theo Baker discovered allegations of altered images in Tessier-Lavigne’s research on PubPeer and began to investigate. Had journalists checked the website earlier, they would have found criticisms dating back to at least 2015, Baker writes in a July 30 essay for The New York Times.

“Reporters did not pick up on the allegations, and [academic] journals did not correct the scientific record,” he writes. “Questions that should have been asked, weren’t.”

While PubPeer is an excellent reporting tool for journalists across beats, it’s crucial they use the information they find there responsibly. For example, don’t assume a comment is accurate or a calculation is correct because it came from an academic. Likewise, don’t shrug off serious accusations because the people making them don’t use their real names.

The platform is relatively easy to navigate. On the homepage, you’ll find a list of discussions organized according to their most recent comment. You can search the PubPeer database using an author’s name, keywords, a paper’s title or a paper’s numeric or alphanumeric identifier, such as a Digital Object Identifier, or DOI.

This tip sheet aims to help journalists make the most of the website. We created the five tips listed below based on advice several journalists familiar with PubPeer shared with us during phone and email interviews.

Keep reading to learn more from Julia Belluz, a former senior health correspondent at Vox who’s working on a book about nutrition and obesity; Stephanie M. Lee, a senior reporter at The Chronicle of Higher Education who covers research and the academic community; and Charles Piller, an investigative journalist for Science magazine and founding board member of the Center for Public Integrity.

1. Install the PubPeer browser extension.

PubPeer offers browser extensions for the four major web browsers. Once installed, the extension will alert you if PubPeer comments exist for any research paper you’re reading online.

There’s also a PubPeer plugin for Zotero, a web tool some journalists use to organize and share research. The plugin lets you see whether there are PubPeer comments on any of the papers saved in your Zotero collection.

2. Don’t publish a news story simply to point out that people have made negative comments about a research study.

The fact that an individual study has drawn a certain number of probing comments on PubPeer is not, on its own, newsworthy, Lee says. She uses PubPeer as a barometer of sorts, a quick way to get a cursory read on the credibility of certain researchers and studies.

“Basically, when I start looking into a group of scientists or a scientist whose work I’m interested in for whatever reason, PubPeer is the first place I’ll go to see if questions have been raised,” says Lee, who received the 2022 Victor Cohn Prize for Excellence in Medical Science Reporting for her investigative reporting about scientific misconduct.

Because PubPeer allows the whole research community to weigh in on a paper, it’s not surprising some scholars have spotted problems peer reviewers missed. The peer review process typically involves a small number of academics whose evaluation is limited to, for example, making sure the authors write clearly, their research design is sound, they cite other researchers’ work correctly and that what they have written corresponds with the information presented in their tables and figures.

“We know peer review can fail to catch errors or even outright fraud in research before it’s published,” Belluz wrote.

The website, she added, has become “a place where scientists go to whistleblow about problematic research, so anyone interested in scientific misconduct might find sources or ideas for stories on PubPeer.”

3. Verify all claims made on PubPeer before relying or reporting on them.

Piller describes PubPeer as “more of a tip sheet than an authoritative source.” 

“I’m not casting aspersions on it — it’s great,” he says. “It’s extremely valuable, and a lot of information on it is correct. The problem is, there’s a lot of anonymous sources.”

Barbour, who helped create PubPeer, estimated that “maybe 90%” of comments are posted anonymously.

“Any criticism of the work of colleagues tends to be badly received, and you never know who will be making — often anonymous — decisions about your future,” Barbour wrote in an email to The Journalist’s Resource. “So, understandably, people worry. Furthermore, even an implicit (but convincing) suggestion of misconduct immediately creates an extraordinarily high-stakes situation. A career might be on the line. It’s unsurprising that those threatened would fight with every weapon at their disposal.”

Piller urges journalists to thoroughly verify everything they find on PubPeer before using it. He recommends enlisting the help of multiple experts with deep experience, including scholars with subject-area expertise and technical experts such as Elisabeth Bik, a well-known image analyst.

Bik helped Baker, the Stanford student journalist, with his investigation. Last year, she helped Piller look into possible image tampering in dozens of Alzheimer’s studies.

“You need to treat anonymous postings [on PubPeer] the same as you would treat anonymous postings of any kind,” Piller notes. “You have no way of knowing if that person has some sort of ulterior motive.”

4. Use caution when describing the likelihood that researcher misconduct happened.

Piller warns journalists to carefully choose words and phrases to convey how certain they are that a study’s findings are erroneous or that a researcher has intentionally participated in some form of misconduct. It can be difficult to establish with complete confidence that data or images have been fabricated or manipulated, especially if journalists lack access to the original data and cannot compare the images that appear in an academic article against the unedited, uncropped, high-resolution, earliest version of those images.

Piller suggests using language that reflects some level of uncertainty — for example, “apparent fabrication” or “potential errors.”

“You can wreck someone’s career, so you have to be really careful and really fair-minded about it, obviously,” he says. “Even if the evidence appears to be incontrovertible, I would use qualifiers.”

He stresses the importance of journalists explaining and making available the evidence they have found.

“Show evidence,” he advises. “It’s better to let the evidence speak for itself most of the time.”

5. No matter how small a role a researcher plays in your story, check PubPeer before including them.

“If you’re looking at a scientist as a source — a credible source of authority — it’s good to know whether that person is regarded as above reproach,” Piller points out.

Many journalists already know they need to look into possible conflicts of interest.

“Another [red flag] would be credible evidence a person has engaged in data manipulation or image manipulation or other ways of operating scientifically that would cause you to want to question their credibility as a source for a story,” Piller adds.

Lee’s advice: Make a habit of checking PubPeer for the researchers and studies you’re considering including in your coverage, keeping in mind that a lack of comments does not mean there aren’t problems.

“If [journalists] come across research that is influential or controversial in whatever field they’re covering, it’s a good, proactive step to plug the URL into PubPeer and see what they get,” Lee says.

Because PubPeer’s search function does not work for all URLs, Barbour recommended using a DOI or other unique identifier.
