Plan to replicate 50 high-impact cancer papers shrinks to just 18
An ambitious project that set out nearly 5 years ago to replicate experiments from 50 high-impact cancer biology papers has gradually shrunk in scope and now expects to complete just 18 studies.
“I wish we could have done more,” says biologist Tim Errington, who runs the project from the Center for Open Science in Charlottesville, Virginia. But, he adds, “There is an element of not truly understanding how challenging it is until you do a project like this.”
The Reproducibility Project: Cancer Biology (RP:CB) began in October 2013 as an open effort to test replicability after two drug companies reported they had trouble reproducing many cancer studies. The work was a collaboration with Science Exchange, a company based in Palo Alto, California, that found contract labs to reproduce a few key experiments from each paper. Funding included a $1.3 million grant from the Laura and John Arnold Foundation, enough for about $25,000 per study. Experiments were expected to take 1 year.
The project quickly drew criticism from authors of the original studies and others who worried that the replication studies would inevitably fail because the contract labs lacked the expertise needed to replicate the work.
Costs rose and delays ensued as organizers realized they needed more information and materials from the original authors; a decision to have the proposed replications peer reviewed also added time. Organizers whittled the list of papers to 37 in late 2015, then to 29 by January 2017. In the past few months, they decided to discontinue 11 of the remaining ongoing replications, Errington says. (Elizabeth Iorns, president of Science Exchange, says total costs for the 18 completed studies averaged about $60,000, including two high-priced “outliers.”)
One reason for cutting off some replications was that it was taking too long to troubleshoot or optimize experiments to get meaningful results, Errington says. For example, deciding what density of cells to plate for an experiment required testing a range of cell densities. Although “these things happen in a lab naturally,” Errington says, this work could have proceeded faster if methodological details had been included in the original papers. The project also spent a lot of time obtaining or remaking reagents such as cell lines and plasmids (DNA that is inserted into cells) that weren’t available from the original labs.
One of the effort’s lessons: Disclosing more protocol details and making materials freely available directly from the original lab or through services like Addgene would speed scientists’ ability to build on the work of others. “Communication and sharing are low-hanging fruit that we can work on to improve,” Errington says. Another problem, Iorns adds, is that academic labs rarely validate their assays, making it difficult to know whether a positive result is real or “just noise.”
The project has already published replication results for 10 of the 18 studies in the journal eLife. The bottom line is mixed: Five were mostly repeatable, three were inconclusive, and two were negative, although the original findings in those two cases have been confirmed by other labs. In fact, many of the initial 50 papers have been confirmed by other groups, as some of the RP:CB’s critics have pointed out.
The RP:CB team is now writing up the remaining eight completed studies and a meta-analysis and summary of the project. The 11 incomplete studies, which will be published in brief form, will still “have a lot of value, but not as much” as the completed replications, Errington says.