Didactics Meets Openness

Malika Ihle offers a glimpse behind the scenes of the LMU Open Research Hybrid Summer School 2023

The Open Science Center at LMU Munich held a hybrid Open Research Summer School this summer. How did you select and put together the content of the Summer School?

MI: I had a clear idea of the topics I wanted covered, which I considered the essential basic open research skills. As I had run several open science summer schools before (e.g. the Oxford|Berlin summer school, from the Oxford side), I only added topics I thought were missing from my previous ones.

I wanted participants to come away understanding the causes of the replicability crisis across disciplines, e.g. the lack of appropriate incentives and policies and the “lack of professionalisation of the academic research profession” (as Prof Richard McElreath puts it), as well as how open science intersects with equity and with the philosophy of science. I wanted them to learn to (1) specify statistical analysis plans in advance of collecting data to prevent biases in analyses, with the help of preregistration and data simulation; (2) create computationally reproducible workflows, so they can be more efficient and spot mistakes in data wrangling or analyses, through programming, creating dynamic reports, and version controlling scripts; (3) share data, materials, code, and articles appropriately, using the Findable, Accessible, Interoperable, Reusable (FAIR) principles, sensible data anonymisation techniques, and adequate repositories and licences, so they can give their research more impact by allowing others to build upon or replicate their work; and (4) follow guidelines to report results adequately and conduct systematic reviews, so they can facilitate the synthesis of scientific evidence.
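
For readers unfamiliar with data simulation as a planning tool, here is a minimal sketch in R of the idea behind point (1). The two-group design, effect size, and sample size are illustrative assumptions, not material from the school: simulating data under assumed parameters lets you write and test the complete analysis script before any real data exist, and repeating the simulation estimates statistical power.

```r
# Minimal sketch: simulate data for a planned two-group comparison.
# The design, effect size, and sample size are illustrative assumptions.
set.seed(42)

simulate_once <- function(n_per_group = 30, effect = 0.5) {
  control   <- rnorm(n_per_group, mean = 0, sd = 1)
  treatment <- rnorm(n_per_group, mean = effect, sd = 1)
  t.test(treatment, control)$p.value  # the planned analysis, run on simulated data
}

# Repeating the simulation estimates power under these assumptions
p_values <- replicate(1000, simulate_once())
mean(p_values < 0.05)  # proportion of significant runs = estimated power
```

Rerunning the sketch with different assumed effect or sample sizes shows the trade-offs of the planned design, exactly the kind of decision a preregistration asks researchers to commit to in advance.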

I then recruited experts and well-known guests from the German-speaking world to talk about these topics, so attendees could naturally internalise that the problems within academic research surround them in their own German universities (rather than being endemic to English-speaking countries, for example).

Why is the focus on research data management? Is this the most exciting topic for everyone?

MI: I wouldn’t say the focus was on research data management (RDM). There were several sessions on how to share data in practice because multiple aspects need to be considered (the FAIR principles, licences, anonymity, RDM plans). Participants found those sessions useful, especially the contributions offering the most practical, rather than conceptual, advice.

However, my impression is that the subjects that excited them the most were those where the scientific mind is most needed (e.g. preregistration, which consists of designing and carefully planning a study, combining creativity and rigour) and those that improve the efficiency of their daily workflow (e.g. version control, literate programming).
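
As an illustration of literate programming (a sketch, not the school's actual material): the idea is that narrative and analysis live in a single document, so every number and figure in a report is regenerated from the data on each render instead of being copied in by hand. In R, this can be done with R Markdown or, as below, with knitr's "spin" format, where narrative lines start with #':

```r
# Minimal literate-programming sketch using knitr's "spin" format:
# narrative lines start with #', code runs as-is, and the script renders
# to a report whose numbers and figures are computed from the data.

#' # Sleep study report
#' The statistics and figure below are recomputed on every render.

data(sleep)           # built-in example dataset shipped with R
summary(sleep$extra)  # summary statistics, regenerated from the data

#' Effect of each drug on extra sleep:
boxplot(extra ~ group, data = sleep)

# To render the report (assuming this script is saved as report.R):
# knitr::spin("report.R")
```

Because the report is rebuilt from the raw data each time, correcting a data-wrangling mistake automatically propagates to every reported number and figure.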

How did you manage to implement an interdisciplinary approach in the Summer School in order to meet the needs of early career researchers from different disciplines?

MI: I believe an open science summer school can be interdisciplinary as long as it (1) welcomes novices in open research practices who (2) do research in the sciences (in the English sense, i.e. as opposed to the humanities).

Apart from one session on data anonymity, which is relevant only to those working with sensitive data on humans, all sessions were relevant to all science disciplines and would have been taught the same way within each discipline. When examples are needed for specific concepts (e.g. types of biases in study design or analysis), I think everyone can visualise a high-level scientific question related to humans (e.g. the impact of smoking on cancer, or of a drug on depression) and then adapt it to their own field.

More advanced topics (e.g. deeper practices for anonymising data such as synthetic datasets, analysing the intersection between theory building and preregistration, using data simulation to perform power analysis for complex statistical designs, designing specific kinds of experiments) are more discipline-specific and were not covered during this summer school. We offer such training as standalone two-hour or half-day workshops for members of the departments for which it is relevant.

Were there any particularly striking “aha” moments or breakthroughs that participants experienced during the workshop?

MI: From the feedback received and my impression of the reactions in the rooms (both in-person and virtual), there were two sorts of light-bulb moments for participants:

First, when they realised how shockingly widespread the lack of replicability of research is, and how flawed the academic system is in terms of incentives and professionalisation – thanks to lectures by researchers high up in the academic hierarchy (e.g. directors of research organisations).

Second, when they completed a very hands-on workshop that allowed them to upgrade their research project and render it more efficient and reliable, using methods that initially may have seemed complicated and abstract to them, but which can be adopted after only 2 or 3 hours of learning (e.g. version control, simulation of data, literate programming).

What pedagogical approaches or methods did you use to engage participants and promote learning? What role do interactive elements such as workshops, practical sessions or group work play in the overall programme?

MI: To promote learning, we created several self-paced tutorials which participants could navigate in various ways depending on their background knowledge, their interests, and their operating system. Participants really appreciated how practical this type of workshop was, as they could directly implement what they learned in their day-to-day workflow.

We also had two sessions promoting interactions amongst participants: (1) a hybrid networking session where online and in-person participants navigated between breakout rooms to meet subgroups of participants. The idea was that they would find like-minded people to keep in touch with and support one another. They also had access to all participants’ contact details in a central document. (2) A discussion about how to integrate everything they had learned back into their research groups, considering the challenges they may face (e.g. lack of awareness, understanding, or knowledge, or outright resistance from their peers or supervisor; lack of infrastructure or norms in their field), and clarifying where they can find support to alleviate such challenges.

We also had regular breaks, suggested a place where in-person participants could have lunch together, and organised a group dinner.

However, some respondents to the feedback survey did indicate they would have appreciated even more discussion groups, and one suggested that a participant could volunteer their project ahead of the summer school, to be gradually and properly opened up during the school as a kind of group exercise and illustrative example.

In future iterations, I will consider skipping some of the lectures that provide the most redundant information and including more such facilitated group discussions in those slots instead.

Did you develop materials or resources specifically aimed at supporting the sustainability of learning, e.g. follow-up materials or follow-up opportunities?

MI: We created several online self-paced tutorials that can be followed again asynchronously at any time, as well as reused by participants to teach their own teammates, students, or colleagues.

All the training materials of the summer school (i.e. slides, recordings, workshop material) were made publicly available under open licences, at the time of each session, on a dedicated repository. Participants appreciated being able to go through the slides at the same time as the presenters and to explore the links they contained immediately. The repository was also visited around 1,000 times on the very day the material was advertised, right after the summer school.

We will also organise a one-hour follow-up hybrid event with interested participants, to discuss, in six months, the challenges they have faced, the progress they have made so far, and what they think their next steps should be. Half of the school’s participants signed up for this.

What methods have you used to prepare participants for different standards and expectations in different subject areas?

MI: To manage participants’ expectations, I created a summer school website very early on, detailing the target audience. I then required an application in which applicants indicated their motivation and their current level of knowledge on a specific list of topics. I selected applicants who I thought were not already too knowledgeable (so they would not be disappointed by our content targeting novices) but who had expressed a concrete need to learn about good research methodology. These criteria were explicitly communicated to applicants.

What feedback mechanisms did you include to understand whether the didactic methods and content were effective for researchers from different disciplines?

MI: The feedback questionnaire asked, for each workshop, whether the didactic method was thought appropriate for the content. Respondents were also asked how they found the difficulty level of the workshop and how much prior knowledge they had of the topic. Out of 149 ratings, 79% indicated that the workshop’s didactic format was appropriate, and 74% indicated that the level of the workshop was just right, even for participants with basic or intermediate pre-existing knowledge.

I believe these relatively good scores are at least partly due to the self-paced tutorial format of several of the workshops, which allows participants to choose the content that is useful for them and to go at their own pace. For instance, in the version control workshop, two participants finished the tutorial in 30–40 minutes, one finished at the three-hour mark, and the last one finished at home that evening. Those finishing early were encouraged to upgrade their current projects using the tool they had learned and to explore more advanced features with one of the facilitators.

How did you manage to minimise the barriers to entry, e.g. for programming and version control, for early career researchers from non-technical disciplines?

MI: The programming session in R was for absolute novices, taking participants, through a self-paced tutorial, all the way to the essential knowledge of R programming. For those already familiar with R, which is increasingly common in STEM fields, a Julia programming workshop was offered in parallel.
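
For a flavour of what such essentials look like (an illustrative sketch, with made-up data, not the tutorial's actual content):

```r
# A taste of the basics a novice R tutorial typically covers
# (illustrative only; the data below are made up for this example)

# Vectors: the basic building block
weights <- c(12.3, 11.8, 13.1, 12.7)
mean(weights)

# Data frames: the standard table structure for research data
birds <- data.frame(
  id      = 1:4,
  weight  = weights,
  treated = c(TRUE, FALSE, TRUE, FALSE)
)

# Subsetting: keep only the treated individuals
birds[birds$treated, ]

# A first plot
plot(weight ~ id, data = birds)
```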

Similarly, the version control workshop was a self-paced tutorial designed for people who had never used Git before and who were possibly introduced to the concept for the first time during this summer school. The self-paced format allows people already familiar with the concepts, or even with some aspects of the practice, to move rapidly through the first parts of the workshop and spend more time on the more advanced parts.
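
To give a sense of the basic cycle such a tutorial builds up to, here is a minimal sketch driven from within R using the gert package; this is only one of several ways to interact with Git, and not necessarily the interface the workshop taught.

```r
# Minimal version control cycle using the gert package
# (assumes Git is installed and a Git user name/email are configured)
library(gert)

git_init(".")  # turn the project folder into a Git repository

writeLines("x <- rnorm(10)", "analysis.R")
git_add("analysis.R")                    # stage the new script
git_commit("Add first analysis script")  # snapshot it with a descriptive message

# Edit, stage, and commit again as the project evolves;
# git_log() then shows the full, browsable history of the project
git_log()
```

The payoff of this small habit is that every committed version of every script can later be inspected, compared, or restored, which is what makes mistakes easy to trace.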

Psychologists, social scientists, ecologists, etc. don’t usually choose these disciplines in order to learn computer science. However, basic computing skills are required to produce reproducible research. Even when this is not part of their official curriculum, many come into contact with it, either learning these skills on their own or envying others who use them and are more efficient in their work for it. I do try to provide general introductions to open research practices across departments to raise awareness of the relevance of these practices even within disciplines traditionally thought of as ‘non-technical’. However, people who were never exposed to these practices, even though the practices might still be helpful to them, would indeed likely not have applied to this summer school, perhaps thinking it wasn’t relevant to them.

For this summer school, we did not advertise to, target through our wording, or select purely qualitative researchers, such as researchers in the humanities. While programming and version control are very relevant for some of them (e.g. in linguistics or some digital humanities), they are not for others (e.g. history, theology), and overall, the rest of the programme wasn’t designed for research fields outside the empirical sciences. We are, however, offering other events targeting researchers in the humanities, whose needs in terms of open research practices differ widely.

How did you structure the Summer School to make it accessible and attractive to early career researchers?

MI: To make it accessible to early career researchers with various responsibilities (e.g. medical doctors on call in the evening), the summer school was hybrid and participants could join in a mixed format (i.e. some would attend in person some days, and online on other days).

The feedback was overwhelmingly positive about this format. Online full-school participants and online attendees of the public lectures felt included: their questions were addressed, and they could participate in the group activities as much as the in-person participants. We also made sure the sound was always of good quality (we had many table microphones) and that the video feed was immersive (we had 360° cameras, focusing on the presenters during their talks and showing the whole room during the Q&As).

Mixed-format attendees appreciated the flexibility, which accommodated caring duties, commitments outside the summer school hours that made commuting every day impossible, the need to reduce overstimulation from social interactions, and other physical or mental needs.

Participants also appreciated that the flexibility of the format allowed us to maintain the integrity of the programme instead of cancelling parts of it. Specifically, people who had planned to attend in person (a mildly injured presenter, a presenter with child-caring duties, an attendee who had tested positive for COVID-19 but was feeling reasonably well, and myself on the last day, as I was feeling unwell) all switched to online participation, and did so very smoothly, without disturbing the programme or the audience.

Are there any particular didactic challenges in organising a hybrid event and how did you deal with them?

MI: The didactics of hybrid workshops differ somewhat from those of purely online or purely in-person workshops. The largest difference concerned the computing-skills workshops: we prioritised self-paced tutorials, with a team of facilitators (minimally one online and one in person) answering questions one-on-one, through screen sharing in Zoom breakout rooms or in person. I also encouraged instructors to hold regular group check-ins, to see whether all participants had reached a specific point by a specific time, asking them to put green/red stickers on their laptops or to use green-tick/red-cross reactions on Zoom. We could then approach participants who were slightly behind and offer personalised support.

For all other, more synchronised or interactive sessions, I sent detailed instructions to presenters in advance so as to be inclusive of both audiences. These instructions included adding, in advance, all URLs needed to a central document; using polling or brainstorming tools that all attendees can access even if not on Zoom (e.g. whiteboards, Slido polls, Google Docs, or simply asking everyone to raise their (virtual) hands); always using the mouse to point at parts of a slide; projecting the online attendees during group discussions; etc. Both attendees and presenters were also instructed to use microphones at all times, not to talk over one another but instead wait for a speaker to finish, and to use non-gendered terms and the neutral pronoun ‘they’ to refer to an unnamed attendee whose pronouns they could not read on a badge or Zoom name.

How do you measure the success of the Summer School?

MI: I think there are several measures of success for such an event:

  1. Interest:
    • number of applications, particularly from one’s own organisation (when it is a well-known one), as opposed to from all over the world when advertised on social media. We received 95 applications (58% of them from LMU) for 35 seats.
    • number of registrations for online attendance of the public lectures. We received over 300 registrations (30% from LMU) for the public lectures.
  2. Feedback from participants. In our feedback form, respondents were asked to
    • score each of the sessions from 0 (I do not recommend it) to 5 (I fully recommend it). Across all sessions, the mean score was 4.3 and the median score was 4. The lowest per-session median was also 4, and 13 out of 24 sessions had a median score of 5!
    • indicate what they gained from attending. 100% of the respondents indicated they gained new skills and empowerment to adopt open research practices, 95% indicated they gained new insights or understanding, 71% gained a sense of community and support network, and 62% gained empowerment to become an ambassador for open science! 
    • describe, with free text, their highlights and suggestions for improvement. Highlights were very extensive; they related to the content, the expertise of the presenters and instructors, the practicality of the workshops, the hybrid format organisation, and the joy of being able to network with like-minded researchers. I also received 10+ unsolicited emails expressing gratitude from some of the full participants or attendees to the public lectures. Suggestions for improvement were to provide more time for interactions between participants, including during the workshops based on self-paced tutorials, to avoid redundancy in the lectures, and to include bananas in the snack options!
  3. Impact on the participants’ actual workflow. While we won’t conduct an actual meta-research study on this question, we will collect some anecdotal evidence at the follow-up event in 6 months, to assess whether or not, and to what extent, they’ve integrated the practices learned during the summer school into their daily research workflow.

Do you have any advice for other universities or organisations wishing to run similar programmes?

MI: Here are some pieces of advice for those who wish to organise such an event, combining my experience and the feedback received:

  1. Define a target audience and a programme that matches this audience, and be very explicit about both when advertising.
  2. Plan slots for facilitated open discussions amongst participants.
  3. Invite a wide diversity of presenters in terms of gender and discipline, including at least a few presenters with tried-and-tested, appreciated training material and known pedagogical skills. I would recommend a combination of motivational stories from well-known professors and very practical tutorials from dynamic early career researchers.
  4. Impress upon your guest presenters and instructors that the most useful and appreciated contributions are those that provide practical implementation advice, as opposed to purely conceptual ones, as well as those that actively engage participants. Do provide explicit examples of engagement, especially if the event is hybrid, so that presenters understand which types of options are inclusive of both audiences.
  5. Coordinate content amongst instructors, give feedback on newly created material with the target audience in mind, and organise meetings amongst presenters of adjacent topics to maximise complementarity over redundancy in their content. While invited guests are experts on their topics, the event organiser knows the needs of the target audience and should communicate clearly with the presenters and instructors about them.
  6. Communicate all logistical instructions to presenters and attendees, to align expectations and create a professional atmosphere inclusive of all. For instance, (i) request that all participants (i.e. instructors and attendees) use a central document to add links to tutorials, slides, and other resources they want to share across both audiences, in advance of and during each session; (ii) inform everyone that, with the hybrid format, it is particularly important that each session starts on time.
  7. And perhaps what should be the first piece of advice: make an extensive checklist of everything that needs doing, with a matching timeline, from putting a programme together and advertising the call for applications, through booking a venue and testing the technical set-up, to creating a feedback form and advertising the course content, before actually starting the organisation!

Thank you!

About Malika Ihle:

Malika Ihle is an accomplished open research coordinator and a former biologist specialising in the evolution of animal behaviour. Her career has been marked by a deep commitment to improving the reliability, reproducibility and overall credibility of scientific research.

Her professional journey in promoting open science began when she took on the role of the first Reproducible Research Coordinator at the University of Oxford. In this pioneering role, she championed methods and practices aimed at improving the trustworthiness of scientific work, leaving a lasting impact on the institution.

Currently, Malika Ihle is the coordinator of the LMU Open Science Center, where she is responsible for coordinating and organising events and grassroots initiatives that promote open and reproducible research. In addition to event management, she designs and delivers lectures and workshops on open research practices, contributing to the dissemination of knowledge in this critical area.

Malika Ihle is a co-founder of the Society for Open, Reliable, and Transparent Ecology and Evolutionary Biology (SORTEE), an organisation dedicated to promoting transparency and accessibility in scientific research.



