2011 saw some incredible breakthroughs in science and medicine, from the colour of meteorites to the secrets of ageing (in mice at least). It also saw two low points in the reporting of science. Sadly, the pressure to publish is a constant threat to researchers, and it can mean that research questions are not stringently tested. The peer-review process of publication in reputable journals should be able to put the data through the wringer, but sometimes the work is so specialist that it can be difficult for outsiders to follow.
My own research group is in just this position. We have done a good piece of work, which we know answers our research question, but whose analysis is so complicated that we are finding it difficult to describe in a manuscript. The techniques used in our work are commonplace, but we adopted a novel method of extracting the data in order to account for the biological diversity of the region of genomic DNA we are interested in. The manuscript will almost certainly be returned for clarification. Indeed, our raw data is currently being moderated by clever scientists at Miamexpress. They have asked us some very in-depth questions about our experiments, which we must endeavour to answer before we submit our manuscript.
And the research groups who had to retract their manuscripts at the end of last year will have been just as thorough in their preparation. Unfortunately, that's not enough. Much of the data produced is digital; that is, there is no physical picture of the result for us to examine. And the nature of digital data is that it can be amended. I'm not saying that is what happened, but if the data cannot be reproduced independently, then questions will be raised.
The research papers in question here cover two exciting outcomes. The first is the finding that a murine leukaemia (or related) virus (MLV) was detected in patients with chronic fatigue syndrome. But, as the number of samples was limited and the data not reproducible, the authors had no choice but to retract their publication. That's not to say they aren't right. They simply need to find another way to prove it to their peers.
The second paper, on stem cell lineage, published in the journal Blood, is arguably more topical. The researchers acknowledge that some of the data may not reflect the published data analysis. The paper was published in 2008 and has been cited 13 times in other papers, so it may be argued that, although there were errors in assembling the manuscript for publication, the authors stand by their findings and the interpretation thereof.
So, where does that leave the rest of us, struggling to publish our blood, sweat and tears? I think it leaves us a little tainted in the public eye, and we must work harder to make sure our science stands up to rigorous scrutiny by our peers. As research funding decreases, the strongest research questions and protocols will rise to the top. Let's hope that exciting science does not drop away altogether.
Follow @Shackleford_LB