Open Science training must be integrated into ongoing research
Joachim Winter talks about his experience with Open Science
Three key learnings:
- Doctoral students must be familiar with the constantly evolving methods of empirical economic research. Open Science practices are now part of the methodology toolbox you need to master.
- There is growing recognition for producing data that other researchers can also use.
- KonsortSWD aims to improve access to registry data. It also supports policymakers with legislative measures to reform access to registry data.
How important is Open Science for you?
JW: My Open Science engagement was triggered by a number of events and activities related to administrative tasks. The first was the editorship of MPRA (Munich Personal RePEc Archive). Ekkehard Schlicht, who founded MPRA, asked me in 2010 to take over its editorship. The idea of MPRA is to offer a discussion paper series within the framework of RePEc where everyone can submit papers, which are checked only for formal criteria and then made openly accessible. Given the publishing culture in economics, where preprints play an essential role, this offer is very important for researchers whose home institutions do not maintain their own discussion paper series. That was my first contact with Open Science. Next, we ran an international PhD school, “Evidence-Based Economics”, in Munich from 2013 until 2021 together with some other Bavarian universities, funded by Elitenetzwerk Bayern. I was the spokesperson. For this school we set up a training programme for doctoral students in empirical economic research, including courses about the documentation and reuse of datasets. Within the Collaborative Research Centre TRR “Rationality and Competition”, which has been running since 2017, I was responsible during the application process for the central information project, which bundles the activities around data collection, documentation and reuse, among other things.
What attracted you to the editorship of MPRA?
JW: My motivation was that we in Munich have a responsibility to continue the idea of an open discussion paper series. And I also think it’s interesting to interact with other people in the context of RePEc who are committed to Open Access publishing.
What is the role of the discussion about reproducibility and replicability in the training of doctoral students?
JW: First let me say that the necessity of making research findings reproducible naturally exists in economics too, and this is widely accepted. There have been deficits in the past; and since providing data in the interest of reproducibility and reuse requires effort, our research processes sometimes take a long time to adapt. The pressure is rising because of the eminently important leading journals (the “Top 5” journals) in our discipline, which matter for individual careers. For a few years now, several of these leading journals have required authors to provide datasets and code for published articles. That means it has de facto become established in the discipline: if you want to publish in such journals, you must document your programming code and share your datasets wherever legally possible. There’s no way around it any longer, neither for PhD candidates nor for established scientists.
And do the PhD candidates have this skill set?
JW: I would say that they should all be able to do it. In my experience, each new generation of doctoral students has no problem familiarising themselves with state-of-the-art methods. Ambitious students know exactly which skills they need to be successful in the long run. The task of universities is to offer such courses and to include them in the curriculum for PhD studies.
You would describe the situation as satisfactory?
JW: The situation is satisfactory insofar as the necessity of ensuring the reproducibility of research findings has now reached the whole breadth of the discipline – not least because of the requirements of the leading journals. Of course there are second- or third-tier journals that cannot implement these requirements in the same way. Therefore there will still be publications in the foreseeable future that cannot readily be reproduced. But I think it’s just a question of time. Here in Munich, and especially in our TRR, we oblige all project leaders and doctoral students to document the data they collect themselves in an internal project database.
Do doctoral students in economics share their data and their code because it is an external requirement, or do you see an intrinsic motivation?
JW: I wouldn’t like to make such a polar distinction. If you want to do research as a doctoral student, you have to master the corresponding methods. And reproducibility and Open Science are now part of the toolbox that you need to master. You can no longer say that they are completely irrelevant. The majority of doctoral students understand this. Those who want to continue in research surely do not want to find themselves in a situation, a few years from now, where the research findings underpinning their careers are not robust. Beyond that, we have intrinsically motivated researchers in all disciplines, ours included, and they are often prominently active in the community. But that doesn’t mean that reproducibility and Open Science are implemented in every discipline by everyone with the same enthusiasm.
What have been your Best Practice experiences in the context of Open Science?
JW: The most important experience is that an early exchange of research findings is very important for the progress of empirical economic research. Discussion papers are very common in our discipline, and this dynamic has intensified significantly over the last years. Some of my most important research papers were cited long before they were published in journals, simply because we have this culture of sharing findings in advance. That distinguishes our discipline strongly from some of the other social sciences. In our discipline we have the idea that you share the results as soon as the project is concluded. That has accelerated the dissemination of knowledge. This dynamic is now spreading from sharing research results to sharing data.
What’s your opinion on giving non-academic, commercial research institutions access to data?
JW: It depends on the area of research. In some areas, for instance pharmaceutics, the majority of research is undertaken outside of universities. Not sharing data with such institutions would be problematic from a research-ethics standpoint, because data would have to be collected repeatedly. This engenders costs, and every clinical study bears risks for the patients involved in it. But difficult questions do ensue: How do you divide the resulting profits? How do you handle patents that may result? Still, not sharing data because of this is probably the worse way to go.
Is data sharing only a matter of ethics for you or have there been positive effects for your career?
JW: I wouldn’t say that it is only for ethical reasons, but instead a question of the credibility of research. Our research findings must be credible, and therefore reproducible. Regarding acknowledgement for commitment to data collection and data use: There are colleagues who have spent a large share of their careers on developing survey studies. They spent a lot of time building such studies. They could have published papers in this time. You often hear these colleagues say that they haven’t always received appropriate acknowledgement for the production of data that can be used by many. That looks a lot better today – acknowledgement for the production of widely usable data is growing.
Would you recommend that your students spend time on making data accessible when it’s only the publications that count in the application procedure?
JW: For doctoral students, Open Science – and especially data sharing, documenting the analysis programmes and ensuring reproducibility – is part of the work they must do. Because as we have already seen, non-reproducible empirical research will hardly be publishable prominently in the future. Therefore, the question in a narrower sense does not arise.
The latest ZBW awareness and image poll showed that the majority of economic researchers have never heard of NFDI, KonsortSWD or BERD. How do you explain this?
JW: There are two answers. The first is that at the moment the various parts of the NFDI, the individual consortia, are very much driven by the participating institutions. People who work in or with these institutions are of course familiar with the issue and the work programme, but people who work for research institutions not involved in a consortium may not have had any contact with the NFDI yet. The second is that many of them don’t need the resources provided by the NFDI in their daily work, or they don’t know that they could access these NFDI-provided resources such as data or methods.
What is your role in the context of NFDI, KonsortSWD?
JW: I am an elected member of the RatSWD (German Data Forum) and thus a member of the KonsortSWD Advisory Board. We meet twice a year. Ours is a classic oversight function: we supervise and make suggestions. But to return to your previous question: among us economists, it is quite possible that some early career researchers use services of RatSWD and/or KonsortSWD without knowing it. Perhaps they are using confidential data in a research data centre. The infrastructure of the research data centres, which are extremely important for empirical economic and social research, is supported by RatSWD through certification. Without RatSWD, this form of infrastructure would not exist. Another topic addressed by RatSWD is research ethics. This raises questions such as: Do we need ethics committees in economics? If yes, where? Should such committees be location-specific or discipline-specific? What are the uniform standards?
Will NFDI be more widely known after completion?
JW: As an umbrella organisation for many initiatives, NFDI is probably two or three levels removed from those scientists who use the resources it provides in their daily work. Yet I think it’s inevitable that NFDI becomes more widely known. However, I don’t see awareness as decisive; it’s the output, the resources provided and the initiatives that will take research ahead. It will be a great success if doctoral students at small locations are informed about these initiatives, if they know how to deal with the question of whether their project requires a vote by an ethics committee, which may not be available at their location, or if they have access to registry data, which is only possible via research data centres. Whether these students know that two levels above all this there is the NFDI is, in my opinion, irrelevant.
When you talk in your community about KonsortSWD or NFDI, what response do you get from economic researchers?
JW: Many colleagues first ask what benefits initiatives such as the NFDI bring to their own daily work. Then I refer to my own area of work and tell them how we at RatSWD aim to improve access to registry data: we try to support policy in legislative measures in order to improve access to registry data and the possibilities for linking different data collections. Right now such activities are running in the background and are not perceived widely.
What’s your stand on Open Peer Review?
JW: I edit a journal which traditionally works with Single Blind Peer Review. For me, the concept of Open Peer Review is quite attractive, but it brings about a radical change. Therefore it will probably not prevail quietly on its own, but will require an impulse from the leading journals. If the top journals switched to Open Peer Review, others would follow. Whether the discipline will come to realise that Open Peer Review has advantages and can be implemented technically remains to be seen.
Would you personally endorse a truly open process?
JW: I can’t give a general answer. The biggest problem with the reviewing process in economics in my opinion is that the review and often multiple revisions of manuscripts simply take too long. The problem could possibly be solved by publishing studies that are imperfect by current standards and leaving the discussion and evaluation to a public forum. Such a forum should be moderated, however, which would require considerable effort. At present we assume that the quality of the journal in which you publish is a good signal for the quality of the researcher. That is why they are so heavily weighted in application procedures. If you changed over to Open Peer Review, the quality of the journal would no longer play such an enormous role. You would then need a different standard for quality. That’s quite a challenge.
What are your tips for early career researchers who want to learn Open Science practices?
JW: You can start with online resources; there’s a lot available by now. There are online courses, for instance, for programming languages that have not yet been included in basic curricula, such as Python. There are more and more offerings in the domain of data management and project management. If you want to, you can learn a lot by yourself online. If the question is how to motivate yourself: my recommendation to all young researchers is to look at what the top journals are doing and how requirements for publications are changing. Our top journals reflect the state of the art or even define it, especially with regard to methodology. And that’s not rocket science – everyone can do this!
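As a small illustration of the reproducibility practice the interview keeps returning to – documenting analysis code so that published numbers can be traced back to the data – here is a minimal, hypothetical Python sketch. The dataset, its column names and all values are invented for illustration; a real project would load a versioned, documented data file rather than an inline string.

```python
# Minimal sketch of a self-documenting, reproducible analysis script.
# All data below are invented for illustration purposes only.

import csv
import io
import statistics

# Inline stand-in for a documented dataset. In practice this would be a
# versioned file whose source and collection date are recorded in the
# project documentation (e.g. a data availability statement).
RAW_DATA = """respondent_id,income_eur
1,32000
2,41000
3,28500
4,52000
"""


def load_incomes(raw: str) -> list:
    """Parse the CSV text and return the income column as floats."""
    reader = csv.DictReader(io.StringIO(raw))
    return [float(row["income_eur"]) for row in reader]


def summarise(incomes: list) -> dict:
    """Compute the summary statistics a (hypothetical) paper would report."""
    return {
        "n": len(incomes),
        "mean": statistics.mean(incomes),
        "median": statistics.median(incomes),
    }


if __name__ == "__main__":
    stats = summarise(load_incomes(RAW_DATA))
    # Printing (or writing to a results file) from the script itself, rather
    # than copy-pasting numbers by hand, keeps the reported figures traceable
    # to the code that produced them.
    print(stats)
```

The point is not the statistics but the practice: the data source, the processing steps and the reported numbers stay connected in one rerunnable, documented script, which is essentially what the journal requirements for code and data described above ask for.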
Do you have concrete plans for your own personal engagement?
JW: In many areas where I am active, the main question is how to integrate these skills into the training of junior researchers. What offers are actually needed? What can be offered across locations?
Do you have any idea who could provide this?
JW: For one thing, you need people willing to invest time in building such infrastructures; for another, you need funding for new offerings, which means acquiring third-party funds. Right now such initiatives are often built into proposals for large-scale projects, but this doesn’t cover all needs. It’s also important not to separate the teaching of methodology from actual research. We need to integrate methodological advances, including those in Open Science, into ongoing research. If training offerings stray too far from actual research, they will necessarily become less attractive. A single training programme rolled out for all fields of economics would hardly work. You need to integrate Open Science practices appropriately into the training of junior researchers and into ongoing research.
Dr Doreen Siegfried conducted the interview on 16 August 2022.
About Professor Joachim Winter
Professor Joachim Winter is a professor of economics and has held the chair of empirical economic research at LMU Munich since 2004.
He is a founding member of the Open Science Centre Munich and also co-coordinator of the Data Centre at TRR “Rationality and Competition” and the LMU-ifo Economics & Business Data Centre (EBDC). Professor Joachim Winter is a member of the Advisory Board of the NFDI Consortium KonsortSWD.