It’s a familiar complaint: Much of basic and preclinical research cannot be reproduced. What’s more, most irreproducible studies remain in the literature, leading other scientists to waste time and resources attempting to repeat the findings. One solution is to make it easier to publish negative data, but past efforts to do so have had at best limited success. Now, a wave of new forums is taking on the challenge with innovative publishing models. The new outlets fill different niches. One is a channel at Faculty of 1000 Research that focuses on preclinical research, particularly from industry; one is an online journal—Science Matters—for single observations; another is a database where preclinical studies are graded. The new efforts appear amid a push for greater transparency in research. Some researchers have begun to post their lab notebooks, and other venues encourage scientists to upload unpublished or preprint data.
It is too early to know if these fledgling ventures will thrive. Researchers express enthusiasm for the initiatives in principle, but whether they will follow through and submit their data to these sites remains to be seen.
Regardless of whether these particular outlets take off, researchers agree that changes in science publishing are desperately needed. “A lack of reproducibility across the entire research spectrum is one of the biggest issues we are facing in the biological sciences,” Lorenzo Refolo at the National Institute on Aging, Bethesda, Maryland, told Alzforum. An alarming 2012 study by scientists at Amgen in Thousand Oaks, California, reported that they were unable to replicate key findings from 47 of 53 publications, and other studies have cast similar doubt on the reliability of the literature (see Begley and Ellis, 2012; Vasilevsky et al., 2013; Prinz et al., 2011).
In response to such reports, the National Institutes of Health and other groups have established new guidelines calling for more openness as well as more rigorous research methods and data analysis (see May 2013 news; Jan 2014 news; Jul 2015 news).
Thus far, many calls for reform have focused on cleaning up the literature. Part of the problem, researchers say, is that scientists have little incentive to submit contradictory or confirmatory data for publication. “Science rewards people who make new discoveries. Those who correct the literature don’t get credit,” Bruce Alberts at the University of California, San Francisco, told Alzforum. Alberts is a former editor in chief of Science magazine and president of the National Academy of Sciences. Those who do try to submit such data often run into barriers. Lawrence Rajendran at the University of Zurich went through two years and multiple rounds of revision before Nature rejected a paper of his that challenged a high-profile finding. Other papers have met a similar fate, Rajendran told Alzforum. “Clearly, something is not working in science publishing.”
Some groups tried to correct this problem more than a decade ago by starting outlets for contradictory findings. They include the Journal of Negative Results, founded by Bjorn Olsen at Harvard, and the negative results section added by Neurobiology of Aging (see May 2003 news; Sep 2004 news). However, these resources remain underused. Scientists told Alzforum that cultural barriers remain. “There was a stigma to publishing negative results. It wasn’t seen as worthwhile,” Refolo told Alzforum. Alberts added that politics plays a role as well. “People don’t like to alienate powerful figures in science,” he said.
Bringing Industry Research into the Open
The newest publishing efforts attempt to overcome some of these problems by lowering the barriers to publication. They make it quick and easy to submit findings while encouraging open commentary and peer review. For example, in 2013, the Faculty of 1000 launched the online, open-access journal F1000 Research to rapidly publish findings in biology and medicine. Articles are posted within days, after an editor checks them for proper formatting and basic standards. After publication, at least two referees openly peer-review the paper. Authors can respond to criticism and make corrections. Papers that receive two or more positive reviews are submitted to PubMed and other online indexes.
Alberts thought this format would be a good fit for publishing replicative studies. Together with Alexander Kamb at Amgen, he inaugurated a new F1000 Research channel, Preclinical Reproducibility and Robustness, on February 4 to provide a forum for data that confirms or contradicts published preclinical studies. “It is our hope that, both through this format and others, a vigorous new publishing culture can be established to enhance the crucial self-correcting feature of science,” they wrote in their initial editorial. The launch was covered by science journals and other media (see stories in Nature, Science, and The Economist).
In particular, the researchers hope this channel will tap into the wealth of unpublished data accumulated by industry scientists who attempt to replicate academic findings. “There’s a huge amount of privately funded research that should be part of the scientific literature,” Alberts told Alzforum. Amgen scientists kicked off the effort by publishing three such studies. Kamb is working with other companies to encourage submissions, Alberts said. Amgen declined to make Kamb available for an interview.
Industry scientists normally have little incentive to publish in-house data. The new channel may change that, Alberts said. He believes publication will benefit industry scientists both by correcting the literature, saving time that would otherwise be spent in futile experiments, and by pointing out instances where replication failed due to faulty methods. The authors of the original study are invited to comment on the papers and note any methodological problems.
Other scientists praised the new forum. “[This is] a valuable asset to the research community. I strongly support the initiative,” Sangram Sisodia at the University of Chicago wrote to Alzforum. Rita Guerreiro at University College London, U.K., noted that post-publication peer review works well for the most part. “Because the editorial and reviewing process is completely transparent, it increases confidence in the review and levels the process for everyone,” she wrote.
Some researchers, however, pointed out potential problems. Gary Landreth at Case Western Reserve University, Cleveland, wrote, “The work must be subject to rigorous peer review to establish whether the reproduction study was in fact a genuine attempt to reproduce the original experiments … A significant issue with ‘failure to replicate’ studies is that while they gain attention for challenging published findings, they are rarely subject to the same scientific scrutiny.” (See full comment below.) In one of the first three articles on the channel, Amgen scientists reported a failure to replicate previous findings by Landreth and colleagues on the ability of the cancer drug bexarotene to lower Aβ levels (see Feb 2012 news; May 2013 news; Feb 2016 news). For his part, Landreth pointed out on the F1000 channel that Amgen scientists used a formulation of the drug that has different pharmacokinetic properties than the standard therapeutic version. “The ability to post comments on the F1000 site is a valuable feature of this forum,” he noted.
Observations, Not Stories
Another new outlet takes a different tack. Rajendran saw problems with the emphasis traditional science publishing places on telling complete stories. He believes this tendency introduces bias. “You can’t have plot spoilers or negative data,” Rajendran said. He started the online journal Science Matters in November 2015 to address this. This forum will publish single observations, including negative, confirmatory, and orphan data. Related observations will be linked on the site, with green lines indicating confirmatory data and red lines contradictions. “If a node has many green arrows, you can visually see it’s a better target for trials,” Rajendran said. This journal launch also attracted media coverage (see Science, Vox Science and Health).
Rajendran emphasizes that it is easy to publish through this venue. Submissions can be written using an online template, and then go through an editorial office that checks formatting and removes the authors’ identifying information. The anonymized paper goes to the editorial board, which selects a handling editor who sends the paper out for review. The identities of the authors, editors, and reviewers are all hidden from each other, so that politics can play no role in reviews. Reviewers score the paper on a scale of 1 to 10 for technical quality, with a score of 4 or higher required for publication. They usually turn the paper around within two weeks. Accepted papers are indexed on PubMed. If a paper scores less than 4, the reviewers must suggest ways to improve the study, and the authors have the option to resubmit with additional data. Rajendran believes this triple-blind, quantitative review process will help remove bias and allow more types of research to be published. “It’s democratizing the way we publish science,” he told Alzforum.
Rajendran said initial reaction has been positive. More than 400 scientists have joined the editorial board, including leaders in the area of reproducibility such as Thomas Südhof at Stanford University and Brian Nosek, the founder of the Center for Open Science in Charlottesville, Virginia. The site has received around 60 submissions so far, with the first 14 papers published in February. Rajendran used the forum to publish his own findings questioning the conclusions of a high-profile paper that identified γ-secretase activating protein (GSAP) as a modulator of Aβ production (see Jan 2014 news), after Nature rejected them repeatedly. He plans to publish additional data that extend the finding further. Observations published on the site are like LEGO bricks, Rajendran noted. “The [science] story develops naturally. You basically open up your lab notebook.”
A Shift Toward Openness
These publishing ventures reflect a zeitgeist that increasingly values openness. Scientists in some fields have literally opened up their lab notebooks online, for example on the Open Source Malaria site. Neurodegeneration researchers have been slow to follow this trend, but at least one is trying it out. Rachel Harding, a Huntington’s disease researcher at the University of Toronto, announced in February that she would blog about her research in real time and upload all her methods and raw data to the data-sharing site Zenodo. The project was initiated by her funding agency, the Cure for Huntington’s Disease Initiative (CHDI) Foundation, which supports openness and collaboration in research, she told Alzforum.
“This is an experiment to see if releasing real-time, warts-and-all data fosters new collaborations within the field, and helps us do more effective, efficient science,” Harding said. She hopes to receive valuable suggestions on how to improve her experiments, as well as give Huntington’s patients insight into the scientific process. The risk, she acknowledged, is that other researchers could replicate her findings prepublication and scoop her data. “A lot of researchers think it’s a crazy idea. But we want to answer the big scientific questions as quickly as we can.”
Other venues are also making unpublished data available. BioRxiv (pronounced “bio-Archive”), run by Cold Spring Harbor Laboratory, New York, posts preprint manuscripts that have not yet been submitted to journals. This allows authors to receive suggestions on drafts, as well as make data immediately available to the community. Several journals accept manuscripts directly from bioRxiv, and most of the published preprints eventually appear in peer-reviewed journals, according to an article at Phys.org. BioRxiv started in November 2013.
Research Ideas and Outcomes (RIO), founded in September 2015 by the academic publishing company Pensoft, goes a step further. This open-access journal pledges to publish all stages of research, from grant proposals, methods, and raw data to final results, as well as posters, conference abstracts, and thesis projects. Articles can be peer-reviewed either before or after submission. This model allows authors to receive credit for their work and ideas and find potential collaborators, the publishers suggest.
The National Institute on Aging is getting in on the act, as well. To improve research reproducibility, NIA scientists are developing a database of preclinical Alzheimer’s research. Institute researchers upload published papers and then grade them based on best-practice guidelines for preclinical studies. Every study receives an Experimental Design report card that checks for such features as whether experiments were properly blinded and balanced for gender, whether the researchers reported drug dose and formulation, and whether they included a power calculation to determine if the sample size was large enough to observe the predicted effect. Many studies fail these basic tests. Refolo, who runs the project, noted that of the first 100 or so studies uploaded to the database, only one included a power calculation. The database is currently in beta form and will be released to the public this summer, he said.
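For readers unfamiliar with the power calculations the report card looks for, the idea can be sketched with the standard normal approximation for a two-group comparison. This is a simplified illustration of the general technique, not the NIA’s actual grading tool; the function name and defaults are our own.

```python
from math import ceil
from statistics import NormalDist  # Python 3.8+ standard library

def n_per_group(effect_size: float, alpha: float = 0.05, power: float = 0.8) -> int:
    """Approximate sample size per group for a two-sided, two-sample comparison,
    using the normal approximation n = 2 * ((z_alpha + z_beta) / d)^2."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # critical value for the significance level
    z_beta = NormalDist().inv_cdf(power)           # quantile for the desired power
    return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# Even a "large" effect (Cohen's d = 0.8) needs about 25 animals per group
# at 80 percent power; smaller effects need far more.
print(n_per_group(0.8))  # → 25
print(n_per_group(0.5))  # → 63
```

The point the report card drives at: without such a calculation up front, a study with only a handful of animals per group may simply be unable to detect the effect it claims to test.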
The database will also include an option for researchers to register studies and upload unpublished data. As with other online papers, the data will receive a digital object identifier (DOI) so they can be cited and authors will receive credit for the work, Refolo noted. He hopes this will encourage the publication of negative and confirmatory findings that might otherwise go unreported.
Will all these ventures complement each other? Guerreiro supports the idea of different channels where distinct types of scientifically valid research can be published. She noted that in genetics, researchers usually cannot publish data on known mutations because the findings are not considered novel. Yet this information is important because it describes phenotypes associated with each mutation, and helps scientists assess whether a mutation is pathogenic and dissect the role of genetic variability in disease. Publication of such findings in a database could advance the field, she suggested. “The most expensive research projects are those that are performed and not published,” Guerreiro wrote to Alzforum.
As all these new outlets spring up, the question remains: Will scientists use them? Many researchers say these initiatives are a great idea, but have reservations about whether they adequately protect researchers’ ideas. Promotions and grants are still based on a scientist’s ability to publish original findings in top-tier journals, commenters noted. “The big issue [for new outlets] will be doing outreach and getting buy-in from the community. We’re talking about a cultural change,” Refolo said. Guerreiro, meanwhile, suggested the new forums will evolve in response to user feedback and become better utilized with time. “Change is inevitable. These new publication formats are the future,” she wrote.—Madolyn Bowman Rogers
- Guidelines at Nature Aim to Stem Tide of Irreproducibility
- National Institutes of Health Tackles Irreproducibility Problem
- New Journal Guidelines Aim to Boost Transparency in Research
- That Should Have Worked! Where to Publish? Try the New Journal of Negative Results
- Neurobiology of Aging to Publish Negative Results—Call for Manuscripts
- Upping Brain ApoE, Drug Treats Alzheimer's Mice
- Bexarotene Revisited: Improves Mouse Memory But No Effect on Plaques
- Bexarotene—First Clinical Results Highlight Contradictions
- GSAP Revisited: Does It Really Play a Role in Processing Aβ?
- Begley CG, Ellis LM. Drug development: Raise standards for preclinical cancer research. Nature. 2012 Mar 28;483(7391):531-3. PubMed.
- Vasilevsky NA, Brush MH, Paddock H, Ponting L, Tripathy SJ, Larocca GM, Haendel MA. On the reproducibility of science: unique identification of research resources in the biomedical literature. PeerJ. 2013;1:e148. Epub 2013 Sep 5. PubMed.
- Prinz F, Schlange T, Asadullah K. Believe it or not: how much can we rely on published data on potential drug targets? Nat Rev Drug Discov. 2011 Sep;10(9):712. PubMed.