The many dimensions of reproducible research: A call to find your fellow advocates.

Blogpost by steering group member Tamarinde Haven.

Various definitions of reproducibility and its sister concepts (replicability, replication) are floating around [Royal Netherlands Academy of Arts and Sciences 2018, Goodman et al 2016]. While there are relevant (disciplinary) differences between them, they generally share a focus on the technical parts of reproducible research.

 

With the technical focus increasingly taking centre stage [e.g., Piccolo & Frampton, 2016], one might assume technical solutions are the panacea. Call it techno-optimism in the reproducibility debate. The thinking goes something like this: provided that institutions facilitate the relevant infrastructure and tools, researchers employed at those institutions will carry out reproducible research [Knowledge Exchange, 2024].

To be clear, making reproducible practices possible is a necessary step. But it is one of many [Nosek, 2019]. Now that you have enabled more reproducible practices, how are you going to ensure it is picked up by researchers?  

Back in 2021, I defended my thesis on fostering a responsible research climate. One of the key barriers to responsible science that researchers flagged was a lack of support. Our participants were bogged down by inefficient support systems that connected them to support staff who were generalists and could not efficiently help them [Haven et al., 2020].

Many Dutch research-performing organisations nowadays employ data stewards, code experts, and product owners of various types of software that have been recently developed to promote reproducibility. These experts maintain software packages and they select the relevant infrastructure to support a reproducible data flow. They advise on which tools are suitable for a given project and ensure these are up to date. They implement new standards through workshops and training. And they do so much more. 

During the launch of the Netherlands Reproducibility Network last October, we heard of various disconnects at institutions. We learned about meticulously trained data stewards sitting around, waiting for researchers to find them. Those who found them returned, but that is only partially good news. I am not aware of their exact reward mechanisms, but many organisations follow the flawed logic that if no one requests a solution, the underlying problem must not exist, which is false for many different reasons*. Part of this disconnect may simply be a matter of time: culture change is a process that typically moves much more slowly than its driving forces would like.

In my personal experience, these professionals have been highly knowledgeable. I found reproducibility advocates who helped me draft a coherent data management plan for my funding proposals, advised on relevant metadata standards, wrote the Python code to connect my data with existing databases, and finally connected me with yet other specialists who maintain the newly created data archiving infrastructure in the Netherlands.

As a network, it is in our DNA to exchange information but also – crucially – contacts for professional purposes. That is why we, as the Netherlands Reproducibility Network, want to focus on promoting connections between researchers and research support staff. What kind of strategies are currently being used to connect these groups? How can we learn from successful efforts or institutions where these parties seamlessly find one another? 

And no, we don’t plan on falling into the same trap the participants in my thesis research talked about. General, one-size-fits-all solutions likely won’t cut it. That is why we hope to support ongoing pilots, and launch new ones, that investigate these connections. But as so often with such efforts, the best possible world is one in which these kinds of pilots are not necessary. So, to all my research colleagues: Please find your fellow reproducibility advocates in your institution. Acknowledge their help in your research products. And make sure to share their valuable expertise with your lab members; we should honour the crucial human dimension of reproducible research.
 

PS. Do you know of any ongoing pilots? Get in touch!

* Just because no one has reported a social safety issue to the confidential advisor, does that mean there are none, and that we do not need confidential advisors?

References: 

Goodman, S. N., Fanelli, D., & Ioannidis, J. P. A. (2016). What does research reproducibility mean? Science Translational Medicine, 8(341), 341ps12. https://doi.org/10.1126/scitranslmed.aaf5027

Royal Netherlands Academy of Arts and Sciences (KNAW). (2018). Replication studies: Improving reproducibility in the empirical sciences. KNAW.

Haven, T., Pasman, H. R., Widdershoven, G., Bouter, L., & Tijdink, J. (2020). Researchers’ Perceptions of a Responsible Research Climate: A Multi Focus Group Study. Science and engineering ethics, 26(6), 3017–3036. https://doi.org/10.1007/s11948-020-00256-8 

Piccolo, S. R., & Frampton, M. B. (2016). Tools and techniques for computational reproducibility. GigaScience, 5(1), 30. https://doi.org/10.1186/s13742-016-0135-4 

 
