By John McLaughlin
The ability to reproduce experimental findings is a keystone of the scientific method; it is a major part of what makes modern science such a successful social activity. In the past few years, however, there has been growing alarm over what is being called a “reproducibility crisis” in science, particularly the biomedical sciences.
One especially high-profile example was discussed in a Nature commentary two years ago: the biotech company Amgen, before investing resources in a new drug program, attempted to reproduce the findings of 53 papers it considered “landmark” publications in cancer biology, and succeeded for only six of them. This raises the question: are resources being misguidedly invested in therapeutics based on flawed results? And more importantly, is this problem unique to pre-clinical research, or is it more pervasive?
The replication problem is certainly receiving attention in both the popular and scientific press. Several of the world’s most prestigious scientific journals, including Nature and Science, have recently published editorials calling for answers. Unsurprisingly, the proposed solutions have varied. Some push for more drastic measures, such as hiring independent, third-party laboratories to reproduce a paper’s findings before publication. Other suggestions are more modest: journals should require greater transparency in the description of experimental methods, and raw data should be deposited in open-access repositories where they can be scrutinized more closely.
The call for more rigorous standards of reproducibility is already evoking concrete responses. Last year, several organizations, including PLOS One, the Science Exchange, and Mendeley, together launched the Reproducibility Initiative, which bills itself as an effort to “reward high quality reproducible research”. The basic idea: scientists confidentially submit their experiments for replication (for a fee), choosing among a network of labs with expertise in the relevant technique. If the findings are confirmed, the authors can boast an “Independently Validated” badge upon publication of the results. The initiative has already received a $1.3 million grant to reproduce 50 of the “most impactful” cancer biology studies published during 2010–2012.
But if this practice becomes the norm, it may place further financial burdens on labs that are already struggling for funds. Are there more modest, practical changes we can begin making in our own labs to combat this problem? Part of the solution may be improved graduate training in the day-to-day use of statistics: which types of analysis are appropriate for a given experiment, what sample sizes are needed, and what conclusions can reasonably be drawn. Miscommunication between scientists may be a factor as well. Today’s biological science involves complicated experimental techniques and highly complex animal and cell culture models; more intimate knowledge of the methods may be needed to faithfully replicate the results.
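To make the sample-size point concrete, here is a small illustrative simulation, a sketch only: it uses a simplified known-variance z-test (not any particular study’s analysis), and the effect size, sample sizes, and trial counts are arbitrary choices for illustration. The idea is that when a real but modest effect is studied with too few samples per group, even a “significant” initial finding often fails to replicate in an identical follow-up experiment.

```python
import math
import random

def z_test_p(sample_a, sample_b, sigma=1.0):
    """Two-sided p-value for a two-sample z-test with known sigma (simplified)."""
    n = len(sample_a)
    diff = sum(sample_b) / n - sum(sample_a) / n
    z = diff / (sigma * math.sqrt(2.0 / n))
    # Normal-tail probability via the error function.
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def replication_rate(effect=0.5, n=10, trials=2000, alpha=0.05, seed=1):
    """Simulate `trials` experiments with a true group difference `effect`.

    Returns (number of significant initial findings,
             number of those that also replicate in a fresh experiment).
    """
    rng = random.Random(seed)
    significant = replicated = 0
    for _ in range(trials):
        a = [rng.gauss(0.0, 1.0) for _ in range(n)]
        b = [rng.gauss(effect, 1.0) for _ in range(n)]
        if z_test_p(a, b) < alpha:
            significant += 1
            # Re-run the identical experiment with new samples.
            a2 = [rng.gauss(0.0, 1.0) for _ in range(n)]
            b2 = [rng.gauss(effect, 1.0) for _ in range(n)]
            if z_test_p(a2, b2) < alpha:
                replicated += 1
    return significant, replicated

if __name__ == "__main__":
    for n in (10, 100):
        sig, rep = replication_rate(n=n)
        print(f"n={n}: {sig} significant findings, {rep} replicated")
```

With a small n the replication fraction among “significant” results is low even though the effect is genuinely real; with a larger n it rises sharply. The lesson is not that the effect was false, but that underpowered designs make positive findings unreliable.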
On the flip side, are institutional and cultural issues also playing a role? The frantic competition for academic faculty positions and grant funding may skew incentives, encouraging post-docs and PIs to cut corners and push for publication as quickly as possible, in high-tier journals. Nobel Laureate Randy Schekman called attention to this problem last year, and vowed to boycott publishing in “glamour” journals like Nature, Cell, and Science.
Whether or not you agree there is a replication crisis in biomedical science, it surely can’t hurt to encourage more openness, transparency, and improved training. The next generation of young scientists would benefit from making these practices a cultural norm.