Advancing reproducibility and Open Science one workshop at a time – community-building in the Netherlands

This is a crosspost from the codecheck website.

This blog post is the final one in our series of posts about the “CODECHECKing goes NL” project, which has been running since May 2024. We have been working hard to promote reproducibility and Open Science in the Netherlands through a series of workshops, and we are excited to share our experiences and future plans with you.

Find the full series of posts on our project page.

Why is this important?

A paper’s code rarely gets checked – everyone in academia knows about peer review of articles, but few people engage in reproducibility checks of the materials behind a paper. Reproducibility checks are less about vetting research (e.g., catching fraud, finding errors) and more about ensuring the reusability of research. It is an extension of the thought that if we want to stand on the shoulders of giants, those giants had better be standing on solid ground. And solid ground for computational workflows means good documentation that is understandable outside the inner circle of the authors of a research article.

Reproducibility Checks at the Center for Digital Humanities at Leiden University on 14th February 2025.

A reproducibility check asks whether one can reproduce the results reported in a paper (i.e., the statistics, tables, figures, maps, or graphs) from the provided data, code, or other materials. The CHECK-NL project focuses on the computational reproducibility of published research and tries to answer the question: “Can someone else recreate the output on their own hardware using the materials and documentation provided by the authors?” We call this type of reproduction a CODECHECK.

Who did what?

A group of enthusiasts for Open Science and Reproducible Research from the University of Twente, TU Delft, and UMCG Groningen applied for funding from NWO via its Open Science funding scheme to organize four in-person events at their respective institutions and beyond. Through these events, they intended to jump-start a Dutch reproducibility-checking community. The project proposal also included work on the CODECHECK website and registry to better present codechecks and the codecheckers behind them.

Along the way, the group of enthusiasts grew, and instead of the planned four events, there were a total of six in-person events: one additional conference workshop (AGILE in Dresden) and another event (at TU Eindhoven) organized by attendees of the first event (exactly what this project was aiming for!). At the events, we also connected with representatives of a data repository, diamond open access publishers, and digital competence centers who are considering their own version of computational reproducibility checks.

The four events in Delft, Enschede, Rotterdam, and Leiden brought in a total of 40 researchers, many of whom opened up their own work to be assessed by others, and who together codechecked 15 papers. The additional events in Eindhoven and Dresden introduced an international crowd to the CODECHECK principles. Each event had a different topic, focusing on a different part of the research landscape, which resulted in different challenges and learning opportunities. While the groups in Delft and Enschede mainly faced problems with computing environments, documentation, and high computational loads (too big for laptops or the workshop time), the group in Rotterdam raised the issue that reproducibility checks can be pretty dry at their core and may be almost trivial if only heavily summarized data can be shared. At the final event in Leiden, we brought linguists and digital humanists together. One of the questions raised was: how do we start a reproducibility crisis in the humanities? (Because maybe we need one to raise awareness of this important topic in the field?)

What are the results? What did we learn?

One clear lesson was how different the crowds from different disciplines are – although the advertisement for the events and their setup and schedule were quite similar, they played out quite differently. Another important lesson is that you need a group of enthusiastic participants to drive such events – fortunately, we always had those!

There were people with a wide range of coding skills at the events. The wrap-up sessions always gave us the impression that all of them took something home and learned something. Working with someone else’s code and reproducing another researcher’s workflow requires craftsmanship and a hands-on, can-do attitude that is rarely taught in typical university classes. The workshops and the experienced mentors who participated, however, could provide such a setting.

The four main in-person events required attendees to invest an entire workday into this topic. In retrospect, this might have prevented interested people from joining. For raising awareness, shorter, more targeted events might be a suitable alternative.

Getting the certificates was a nice by-product but certainly not the only outcome. Authors whose projects didn’t pass the reproducibility check were given feedback so that they can still make their work reproducible. Participants got the chance to learn from other people’s workflows and software stacks.

Another surprise was how difficult it still is to convince colleagues to submit their work to a reproducibility check. The social layer of this otherwise rather technical question is the biggest challenge for the project team and for people working with reproducibility checks. The technological challenges are less exciting than the positive experiences and potential benefits – see, e.g., this blog post about an author’s experience of being “codechecked”.

From discussions, we distilled the notion that the best time to get a reproducibility check is at the preprint stage or during peer review – then people are still motivated to fix issues before publication. Also, a certificate is a positive signal towards peer reviewers (at least, that’s what we hope). Once work is published, authors need to be very motivated to improve documentation or fix bugs, especially if those are hidden at some deeper level of the code.

Concrete outcomes:

What are the next steps?

The CODECHECK and reproducibility-check community in the Netherlands is growing. We met with the wider community to evaluate the project and make new plans. 4TU Research Data is planning to offer codechecks as part of its services as a data repository and is working closely with the four technical universities.

The community in the Netherlands will continue to meet and work on topics like reproducibility checks as a service or as part of teaching curricula, and academic culture around code checking. Internationally, we have reached out to colleagues in Bristol and Cologne.

Preregistration For Student Assignments

How can we integrate open science practices into our curricula? In this webinar, two lecturers told us how they include preregistrations in their students’ curricula.

“When you preregister your research, you’re simply specifying your research plan in advance of your study and submitting it to a registry.” (Quote from the website of the Open Science Framework)

Ewout Meijer started the webinar and told us how he convinced thesis coordinators to include preregistration in their courses. As an Open Science Ambassador for his faculty, Ewout was backed by the dean in finding out where open science practices could be integrated in the various courses. Preregistrations are just one example of such practices.

He shared several tips on how he convinced colleagues to integrate open science practices:

  • Make it easy for your colleagues – minimize the extra workload by sharing templates and offering introduction lecture materials
  • Make them want it – top-down mandates to include open science practices don’t work well, but if your colleagues are convinced that this is the right thing to do, they will follow
  • Don’t be overambitious – first the science, then open science. Finding the balance between what students need to know in terms of scientific content and what they need to know about the scientific system is difficult and will differ between courses and student populations.
  • Searching for “project proposals” as a required component for students to pass a course is a good way to find courses where preregistration can be taught. You just need to replace the proposal with a registration. 
  • Including new (open science) content means kicking out some other content – see what can be replaced and what can be tossed

At Maastricht University, students are asked to use the AsPredicted template and submit it as a PDF (i.e., they don’t upload it to aspredicted.com). Ewout mentioned that not all internship projects are suited to this format, so students might have to adjust it or come up with a project just to fill in the template and pass this grading component.

Students get exposed to the idea of preregistration, and the same goes for workgroup tutors. Tutors come from a wide range of research groups and learn about preregistrations themselves while helping students with their thesis work.

Elen Le Foll asked her seminar students to preregister their term paper analyses. Adding this component required some extra time investment to make sure students understood what was expected of them, and for extra feedback rounds. The preregistration adds at least one round of feedback to the term paper and requires students to plan ahead and submit their preregistration on time, so that they have enough time left to incorporate the feedback into their final data analysis. On the positive side, students can learn from the feedback and include it in their work. For regular term papers, students only get feedback at the end, when they can no longer use it to improve their work.

As Elen’s course is an advanced course for master students, some of her students want to turn their term paper into a research paper. For them, the preregistration is an excellent way to get a timestamp for their analysis. 

In the discussion with Andrea and other attendees, we discussed how the AsPredicted format can be used by students and whether a full registered report might be even more suitable. We briefly touched upon the difficulty of grading preregistrations and how much detail we should ask of students. Another point of discussion was how we can sell preregistration to students who are not interested in becoming researchers. This led to a discussion on how to balance academic training with content and applications outside of academia.

Thanks to our presenters:

Elen Le Foll is a post-doctoral researcher and lecturer in linguistics at the Department of Romance Studies at the University of Cologne. She likes to integrate Open Science principles and practices in her seminars and recently asked her students to pre-register a study as part of a term paper assignment.

Ewout Meijer works at Maastricht University and coordinates the thesis module for the research master in psychology. He introduced preregistrations for thesis projects. 

Useful Links:

AsPredicted: aspredicted.org

OSF Preregistration Templates: www.cos.io/initiatives/prereg

Making reproducibility work for qualitative research methods – and not the other way around.

This blogpost is inspired by the Open Qualitative Research Symposium at VU on 28th March 2025. We thank the organizers and speakers for this interesting and energizing meeting.

The blogpost is a summary of and reflection on some key issues that were raised during the event, how those relate to reproducibility in a broader context and how the NLRN works on solving some of those issues. 

Authors: Tamarinde Haven and Daniela Gawehns; Photo credits: QOS organising team

Open and Qualitative – an epistemic clash?

How to open and share data collected with qualitative research methods was a central point of discussion and featured prominently in the workshop program. Tools for anonymizing data and for sharing data such as videos and pictures were presented and discussed. A recent review on enablers of reproducibility in qualitative research reflects this emphasis, finding that more than half of the included studies addressed Open Data as a topic. One of the authors explained that these papers are not all about how to share data; many of them instead attempt to explain how the requirement to open data does not align with the needs of qualitative research:

While familiar privacy concerns also apply in quantitative research, qualitative researchers face an additional, epistemic challenge: most require access to un-anonymized, rich data as a basis for interpretation. And this type of data is difficult to share. The epistemic requirements clash with the broader push for transparency.

The discussion around Open Data requirements for qualitative research highlights a mismatch between the epistemics of a majority voice in the Open Science discussion and the needs of a smaller group of researchers. At the NLRN, we aim to include as many voices as possible in the reproducibility debate, and to take a range of epistemic challenges into account. 

Agata Bochynska gave a presentation on how they support qualitative researchers at the library of the University of Oslo.

Qualitative stepping stones

The second main topic addressed how qualitative research methods can help quantitative scientists work more reproducibly. Reflexivity, a central aspect of qualitative research, can serve as a stepping stone toward process reproducibility and transparency in quantitative research:

“Reflexivity is the process of engaging in self-reflection about who we are as researchers, how our subjectivities and biases guide and inform the research process, and how our worldview is shaped by the research we do and vice versa” (quote from Reflexivity in quantitative research: A rationale and beginner’s guide – Jamieson – 2023). In their paper, Jamieson and colleagues make the case for reflexivity as a basic first step towards reproducibility: a reflection on the research process makes opening that process up much easier.

The NLRN brings organisations together to share best practices and learn from each other. We would love to amplify and share cases where methods from qualitative research helped quantitative researchers approach the reproducibility of their own non-qualitative work in new ways and make it more transparent.

Parallel Communities of Practice

Several speakers voiced concern about duplication of efforts as parallel communities around open qualitative methods form. The recent call for a European community of qualitative researchers is an answer to this fear and will hopefully create synergies that prevent duplicated efforts from slowing down the process of change. 250 people have already expressed their interest in such a community. We will cross-post updates on this initiative on LinkedIn and via our newsletter.

Bogdana Huma and Sam Heijnen from VU were the core organizing team of the symposium and guided a co-creation session at the end of the day.

The NLRN will organize a symposium in collaboration with the Dutch-Belgian Context Network for Qualitative Methodology to continue the discussion on transparency at the national level and foster learning from other, non-qualitative methodologies.

Open to the citizen – with methodological consequences

A topic that is rarely touched upon in the reproducibility discussion is participatory or community-driven research. The goal of this approach is to make research more relevant by including citizens not only as participants or data collectors, but also as researchers who guide the research process.

With reproducibility being a way to share research processes, questions around methods and research pipelines arise: Are those methods flexible enough to accommodate evolving consent forms, fluid management plans and research designs that are sourced from the impacted communities themselves? Is there a way to pre-register those changing plans and how would we go about it? How can we be transparent about changes and present them as an integral part of the research design, rather than as flaws in planning or execution?  

Links: 

Materials of the Symposium: Events | Community Of Practice (for slide decks) and Events | Community Of Practice (for recordings)

Library used for the mentioned review on reproducibility of qualitative research: A context-consent meta-framework for designing open (qualitative) data studies | Reproducibility of qualitative research – an integrative review | Zotero 

Review Paper: MetaArXiv Preprints | Reproducibility and replicability of qualitative research: an integrative review of concepts, barriers and enablers

Platform for Young Meta-Scientists (PYMS): Discussing the Future of Meta-Science  

Blogpost by Cas Goos

The goal of our organization, the Platform for Young Meta-Scientists (PYMS), is to bring together Early Career Researchers (ECRs) working on meta-science, while providing a place to network with peers and discuss research. We consider this an important initiative as ECRs have been at the forefront of many reform initiatives, and because despite the increasing number of meta-science ECRs, many still work disconnected from other researchers with shared interests. Bringing ECRs together is a crucial step in enabling large scale research and strengthening reform initiatives that are key to improving science. 

The PYMS Meeting 

To further these goals, the most recent PYMS meeting, which took place at the UMCG on December 5th 2024 as a pre-symposium to the NLRN symposium on December 6th, invited an audience of ECRs from a variety of backgrounds. We are glad to have achieved a broad representation of ECRs, including from abroad. Presentations spanned a variety of meta-science topics such as spin, equivalence testing, and reproducibility. Not only work from the social sciences, but also work in the computational sciences, sports and exercise science, and nanobioscience was presented and discussed.

Map of PYMS attendees and their areas of expertise/interest. Map backdrop was constructed by Tamarinde Haven. 

Furthermore, we created many opportunities for discussion and networking throughout the day, including invitations for collaborations.

We were also fortunate to have Tracey Weissgerber give a keynote talk on having a career in meta-science, followed by a series of provocative statements on the same topic. Both led to a practical and informative discussion for attendees about careers in meta-science.

The Future of PYMS 

Since the end of last year, the board for PYMS has been renewed. Four new ECRs have joined the board to carry the torch. The new members are Sajedeh Rasti from Eindhoven University, Raphael Merz from Ruhr University Bochum, and Anouk Bouma and myself from Tilburg University. Like the previous board, we plan to continue creating a community for meta-science ECRs with informal networking opportunities. However, we also plan to expand the reach of PYMS in new ways. 

As a first step, PYMS will go international at our next meeting. In this way, we expand our network beyond the Netherlands to prompt broader collaboration in our field. This is especially important here, since projects requiring large investments of time and effort from multiple parties are crucial for investigating and improving scientific practice effectively. The meeting will be held this summer and will offer plenty of opportunities to discuss future projects, as well as financial compensation for international presenters to encourage a broad representation of young meta-scientists.

Interested ECRs can join our mailing list, through which they will receive a link to our Discord server to connect and communicate with fellow ECRs on meta-science, as well as keep up to date about future events, including the next PYMS meeting this summer.

The Future of Reproducibility…  

… lies at the end of long corridors 

Under the theme of “The Future of Reproducibility”, about 100 scholars, policymakers, funders, researchers, science enthusiasts and facilitators met at the UMCG for a day of interaction, discussion and new connections.  

Casper Albers, the head of the Open Science Program in Groningen, welcomed us and kicked off the plenary morning session, underscoring the importance of retaining the focus on rigorous, open science, even in a future where funding may be scarcer. The keynote speakers presented challenges and developments in research reproducibility from two different domains: biomedical research and history. Tracey Weissgerber shared how open research evolved in the biomedical sciences and called for a look beyond reproducibility to foster responsible and robust research results. Pim Huijnen and Pieter Huistra presented their work on reproducibility in history. They shared their insights from an interpretative field and highlighted how the domain of history needs to find its own definition of what reproducible scholarship means – and how this affects the choice of tools to achieve that reproducibility.

The seemingly endless corridors of the medical center brought people together to discuss the presentations while reaching their individual step counts. The lunch area was filled with the chatter of mingling attendees as new connections formed around poster topics, including a chatbot for training materials and a crowdfunding platform for replication studies.

In the afternoon, participants attended workshops on reproducibility-related topics. The participants of the LEGO workshop experienced hands-on what it means to work with limited metadata. Attendees could also get a glimpse of the current state of interventions to promote reproducibility and think about how they would promote reproducibility in their own groups. The discussions on research integrity and applied research carried over into the coffee break, where attendees mingled and had a chance to meet the poster presenters.

The lively closing session featuring early career professionals sparked discussions around preventing dreaded futures of science communication, including how to avoid weaponization of skepticism to stir distrust in science.  

You can find the materials (posters and slide decks) here on Zenodo and the recording of the keynotes here.

Inventorying Reproducibility

Network Nodes Meeting, 4th October 2024

What happens if you ask representatives from 11 different research-performing and research-supporting institutions to think about how reproducibility-ready their own institution is?
On 4th October, the contact persons of our node members met in Utrecht to learn from and get to know each other during NLRN’s first network meeting.

We used the framework of the Knowledge Exchange (KE) report on reproducibility at research-performing organizations to systematically think through enablers of and hindrances to reproducible research.

In small groups, we first categorized our own institutions by how reproducibility-ready they are. The KE report suggests three levels of readiness: 1) there are some pockets of excellence; 2) efforts are partially coordinated; and 3) there is organizational-level commitment with coordinated processes. We concluded that these levels depend on which disciplines, departments, and research methodologies you are considering. We often found differences between management levels and researchers: university-wide management may set policies to foster reproducible research, but these might not trickle down to an individual researcher’s work. This can even lead to window dressing or “open washing”, where institutions present themselves as committed to open research practices while the culture within the institution has not changed.

The second discussion exercise was about enablers of reproducible working, such as training, mentorship, or recognition. In small groups, we tried to identify which enablers are already in place, at which level, and whether they indeed function as enablers. The KE report comes with an assessment worksheet for institutions, which some of the participants tested.

Visualization of enablers at a sample institution. Each enabler is shown with its current state and its target state.

During the final discussion, we tried to figure out how the tools of the report can be used to further reproducibility in the node institutions.

The general consensus was that the tools as presented wouldn’t work as a universal instrument for all areas of scholarship and research. They would work largely for quantitative methodologies and would need tweaking for interpretative, qualitative, arts-based, and action-based research methodologies.

The idea was raised to find an institution that would use the framework to assess its current state and work towards becoming more reproducibility-ready. This process could be followed and presented as a concrete example of how the framework works.

We didn’t have enough time to talk in detail about each of the enablers. One question that was raised was whether there is a hierarchy of enablers and whether an institution should aim for the highest score on all of them or just some.

We spent a lot of time discussing training for researchers as an enabler. There were ideas of adding reproducibility-related courses to the mandatory curriculum at graduate schools. Others remarked that a lot of training modules are already offered, but they don’t seem to reach everyone.

The budget cuts for higher education were also discussed during this meeting, with the conclusion that we need to work in an even more collaborative and coordinated way to make the most of the means that are already available. NLRN could play an important role in this.

In conclusion, the network event was a great opportunity to get to know each other, with a lot of engagement from all participants in the discussions on the current state of reproducible working and the future directions to strengthen it. In our next meeting, we will focus on concrete steps towards that future.

Platform for Young Meta-Scientists (PYMS): Empowering the Future of Meta-Science 

Meta-science, the study of scientific practice itself, is a field crucial for fostering and monitoring research transparency, reproducibility, and integrity. Recognizing the need for a community among early-career meta-scientists in the Netherlands (and its neighbouring countries), the Platform for Young Meta-Scientists (PYMS) formed in 2018. PYMS is dedicated to supporting and connecting young meta-scientists, providing them with a collaborative environment to share resources and discuss new research ideas.

NLRN and PYMS: strong bonds 

Collaborating with meta-scientists is one of the focus areas of the NLRN. Evidence-based interventions and monitoring strategies are pivotal to a successful move towards more reproducible science. It is the meta-sciences that can generate such evidence. We are therefore happy to work together with PYMS and wholeheartedly support their mission.   

PYMS map of expertise

Highlights from the PYMS Meeting in Tilburg: May 31, 2024 

The recent PYMS meeting at the Meta-Research Center at Tilburg University was a testament to the vibrant and dynamic nature of the PYMS network. The program featured both formal presentations and informal networking opportunities. A crucial part of the meeting was the brainstorming session on the future of PYMS, where attendees provided valuable feedback and ideas for future events and organizational strategies. 

Here are a few examples of presentations to illustrate the wide range of expertise that participants brought to the meeting: 

  • Signe Glaesel (Leiden University): Discussed the challenges surrounding data sharing, including misinterpretations, intellectual property concerns, and the impact of data policies on participant willingness in sensitive studies. 
  • Michele Nuijten (Tilburg University): Shared a four-step robustness check for research replicability, highlighting the prevalence of reproducibility problems and strategies to improve research robustness. 
  • Ana Barbosa Mendez (Erasmus University and Promovendi Netwerk Nederland (PNN)): Spoke on best practices in Open Science, mapping the needs of PhD students, and the holistic approach required for effective science communication and community building. 

Looking Ahead: The Future of PYMS 

The enthusiasm and engagement at the Tilburg meeting underscored the need for regular PYMS gatherings. Participants expressed interest in a more holistic approach, broadening the scope to include researchers from other scientific fields. Formalizing PYMS through stronger links with organizations like PNN and NLRN was also a key takeaway. 

A concrete outcome of the brainstorming session in the afternoon is the plan for a satellite event on December 5th, the day before the NLRN symposium in Groningen. All early-career researchers in the field of meta-science are invited to join this event. More information will be shared via the NLRN newsletter and on social media.

For more information on upcoming PYMS events and how to get involved, visit metaresearch.nl and PYMS (metaphant.net).

Perspectives on Reproducibility – looking back at the NLRN launch symposium

On 27th October 2023, we welcomed more than 100 researchers, policy makers, and research facilitators to our launch event. The aim of the day was to exchange perspectives on reproducibility and to work towards prioritizing actions for the NLRN for the coming year(s).

During the morning, we heard how the UK Reproducibility Network propelled changes in the UK research landscape, and we discussed how we can learn from each other across disciplines to improve research transparency in our own fields. In the afternoon, participants followed workshops on the topics of education, infrastructure, community building, and research practices. The results of those workshops were discussed in the closing panel discussion of the day, where participants and panel members suggested topics and actions for the NLRN to focus on.

Marcus Munafò giving the keynote lecture on collaborative approaches to improving research culture in practice

The symposium brought together a diverse set of researchers and stakeholders – diverse in terms of roles in the research process, but also in terms of disciplines. It was important to take stock of the current reproducibility landscape in the Netherlands. We noted that most people were very familiar with the current state of their own field, but that an overview of the entire landscape was lacking. The NLRN can act as a connector to enable communities to learn from the challenges and advances in seemingly distant fields.

The interactions during the plenary sessions and workshops showed how research domains differ in their challenges and in the current status of reproducibility. The workshop hosts were asked to work towards three focus areas or agenda points for the NLRN to work on. Concrete ideas included creating training materials for researchers on how to use existing digital infrastructure or how to make executable figures. The community-building workshop suggested that the NLRN should coordinate national codecheck events. During the infrastructure workshop, participants saw a need for determining at which level research infrastructures should be organized (local, national, or international), and for discussing how research outputs and processes differ between research areas, which in turn influences the required reproducibility infrastructure. Participants from the education workshop suggested lobbying for teaching reproducible research practices from the bachelor level onwards and showcasing existing efforts in teaching team science.

The steering group is now tasked to see which suggestions fit best with the overall goals of the network and how to prioritize them. We will select a few agenda points first while also extending and growing the network. 

Stay tuned! We will share our progress on this blog, in our newsletter and on our social media (LinkedIn and X). 

You can find the presentation slides on Zenodo and rewatch the keynote lecture on our website.

Welcome to the NLRN Blog!

Hi there, welcome to our blog! We are currently setting up this blog and planning our first posts.

Within the next few weeks, you can expect a post about our launch event last month and about our first network partners. Sign up for our newsletter for general news and follow our social media (on LinkedIn and X).

Group Picture of the Steering Group and all present Advisory Board members during the Launch of the NLRN on 27 October 2023