Johns Hopkins Study Finds Majority of Clinical Trials Ignore Relevant Research

Scientists conducting new research on a given drug, device or procedure routinely ignore the vast majority of relevant clinical trials already published on the same topic, a new Johns Hopkins study suggests.

The authors of the findings, reported in the Jan. 4 issue of Annals of Internal Medicine, argue that these omissions potentially skew scientific results, waste taxpayer money on redundant studies and involve patients in unnecessary research.

In an analysis of published studies, the Johns Hopkins team concluded that researchers, on average, cited less than 21 percent of the previously published, relevant studies in their papers. Among papers with at least five prior publications available for citation, one-quarter cited only one previous trial and another quarter cited none at all. Those proportions stayed roughly the same even as the number of papers available for citation increased, and larger studies were no more likely to be cited than smaller ones.
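The metric behind these figures is straightforward: for each new trial report, divide the number of prior relevant trials it cited by the number that were available to cite. A minimal sketch of that calculation, using made-up counts rather than the study's actual data:

```python
from statistics import median

# Hypothetical numbers, for illustration only: for each new trial report,
# (number of prior relevant trials available, number actually cited).
papers = [(12, 2), (8, 1), (15, 0), (6, 1), (20, 4)]

# Per-paper citation coverage: share of the available prior trials cited.
coverages = [cited / available for available, cited in papers]

print(f"median coverage: {median(coverages):.0%}")
low_citers = sum(1 for _, cited in papers if cited <= 1)
print(f"papers citing one prior trial or none: {low_citers} of {len(papers)}")
```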

"The extent of the discrepancy between the existing evidence and what was cited is pretty large and pretty striking," said Karen Robinson, Ph.D., an assistant professor of medicine at the Johns Hopkins University School of Medicine and co-author of the research with Steven N. Goodman, M.D., M.H.S., Ph.D., a Hopkins epidemiology and biostatistics professor. "It's like listening to one witness as opposed to the other 12 witnesses in a criminal trial and making a decision without all the evidence. Clinical trials should not be started--and cannot be interpreted--without a full accounting of the existing evidence."

Robinson and Goodman searched Web of Science, a citation database, for meta-analyses published in 2004 that pooled randomized, controlled trials on such common topics as a cancer treatment or a heart procedure. A meta-analysis is a systematic procedure for statistically combining the results of many different studies of a similar question to determine the effectiveness of medical interventions.
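The article does not detail how the reviewed meta-analyses pooled their trials, but one standard approach is fixed-effect, inverse-variance weighting: each trial's effect estimate is weighted by the inverse of its variance, so larger, more precise trials count for more. A minimal sketch with hypothetical effect sizes, not the method of any specific meta-analysis in the study:

```python
import math

# Hypothetical per-trial effect estimates (e.g., log odds ratios) and
# their standard errors, for illustration only.
trials = [(-0.35, 0.20), (-0.10, 0.15), (-0.25, 0.30)]

weights = [1 / se**2 for _, se in trials]  # w_i = 1 / SE_i^2
pooled = sum(w * est for (est, _), w in zip(trials, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled effect: {pooled:.3f} (95% CI {lo:.3f} to {hi:.3f})")
```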

The researchers ultimately looked at 227 meta-analyses comprising 1,523 separate clinical trials in 19 different fields, including oncology, neurology and pediatrics. Of 1,101 peer-reviewed publications for which there had been at least five previous relevant papers, 46 percent acknowledged the existence of no more than one previous trial.

"Accurate representation of the prior cumulative evidence is necessary to both ethically justify a trial and to make proper inferences," they wrote. Studying prior research can also lead to study designs that are more likely to fill gaps in the evidence, the team said, noting that although the presence of a citation "does not tell us how information from that trial was used, the absence of a citation almost guarantees that it was not."

The Hopkins researchers could not say why prior trials went uncited, or whether uncited trials had nonetheless been taken into account at earlier stages, such as trial design or grant requests to the National Institutes of Health.

Robinson said researchers often contend that their work is so "unique" that there are no relevant studies to cite, even though someone else has grouped it with similar trials in a meta-analysis. Others claim there simply isn't room to cite past relevant research. But Robinson says one reason for the omissions could be the self-interest of researchers trying to get ahead.

"To get published, journals are looking for novelty, uniqueness," she said. Leaving out references to prior similar research can make findings seem more like a breakthrough, she adds. In her publications study, Robinson found several papers that claimed to be the first even when many trials on the subject preceded them.

There are currently no barriers to funding, conducting or publishing a clinical trial without proof that the prior literature has been adequately searched and evaluated, she said. Such requirements have, however, been instituted by some European funding agencies, the medical journal The Lancet, and the U.S. Centers for Medicare & Medicaid Services, which requires that a covered trial not "unjustifiably duplicate existing studies," Robinson writes.

Robinson said funders, institutional review boards and journals need to take steps to ensure that prior research is considered. To do otherwise, she says, encourages this "unethical" behavior to continue.

"Trials being done may not be justified, because researchers are not looking at or at least not reporting what is already known," she said. "We may be wasting resources when we fund trials for which we already know the answer. And we may be coming to incorrect conclusions about what works in medicine."

In some cases, patients who volunteer for clinical trials may be getting a placebo for a medication that a previous researcher has already determined works or may be getting a treatment that another researcher has shown is of no value. In rare instances, patients have suffered severe side effects and even died in studies because researchers were not aware of previous studies documenting a treatment's dangers.

For more information, visit www.hopkinsmedicine.org/gim/faculty/robinson.html.
