Enzo submitted his manuscript for review at one of those well-known, Earth-science-niche journals. However, shortly thereafter he received an email from his editor saying it had been rejected, the reason being that not enough of the supporting research material was included with the manuscript. Furious, he bellows:
When is my research open enough to be published?
Dear Enzo,
I know, right? It seems that over the past few years the need to give away everything you’re doing, scripting, collecting, plotting, inferring, inverting, simulating, annealing, fitting or predicting in order to get your manuscripts published has been continually increasing. We’re teetering on the brink of researchers having to turn in their daily log-book as a prerequisite to go through the motions of the peer-review process. By this logic, I doubt we’ll have a job in a couple of years. It will all be copy-paste if we follow the trend the journals have set out for us.
I appreciate that you might find this train of thought ever so slightly too far-fetched. Although this wording may be a wee bit fulsome, it does seem like this is a possibility if we follow the trend. First, more or less up to the nineties, journals readily accepted manuscripts (after proper peer review, of course) without a lofty need for supporting data, and plentiful trust was put in the authors to be truthful and ethical. Through the noughties, with the advent of more computing power and satellite data, numerical model codes became more readily available (at least upon request) and the underlying data grew bigger and bigger. Concurrently, journals’ requirements have gone up and up, to the point that we now need to supply scripts.
Well, I cannot help myself and have to say: enough. Enough is enough. What’s the point of a career in science anymore? Are we supposed to only create new software, so that the rest of the community can copy-paste us? Has the expectation that (future) scientists have more than half a brain cell really been abandoned? Why do we need to spell out every single (sub)routine? Surely those reading the papers are able to actually put two and two together and do some calculations/interpretations themselves using their own common sense, scripts or models. Science is supposed to be about development and progress, not about writing and publishing research in a way that even humanities BSc students can reproduce the results.
Yes, I applaud the way journals bring science to the masses. Yes, there is a necessity for plain-language summaries. Yes, scientists can do better at presenting their work in a way that is more accessible to other researchers. HOWEVER, it should simply be possible to reproduce inferences and hypotheses by providing (some of) the essential data, the workflow and proper citations for underlying/older techniques. Oftentimes, alternative scripts or model codes should be able to reproduce some of the inferences as long as the data are there, unless a completely cutting-edge technique is presented. Not reproduce them line by line by copy-pasting the actual scripts people have used. It’s beyond me why you should hand in your entire workflow, literally. Are scientists really that shifty, shady and sordid? Sure, I know that plagiarism is a thing, and people make up data. This is not the way to approach that problem, though. You cannot stifle questionable ethics by requiring a zip file of the researchers’ hard drive. It will only result in new problems, mainly that the next generation of scientists will never be required to actually think (and struggle, which is not a bad thing) and to reproduce/understand/take advantage of published results and inferences. To answer your question: if you have added to the data and used proper techniques that you have described well, your research is open enough to be published, whatever some pesky editor tries to tell you.
Yours truly,
The Sassy Scientist
PS: This post was written without the need to report any hidden scripts, underlying data or secret citations.