Building Community: BMERG Journal Club Review, Medical Education Research Labs

The BMERG blog series on building community continues to grow, with a review of our recent journal club publication. Our BMERG Journal Club lead Dr Claire Hudson reflects on the discussion from our May journal club on the establishment of medical education research labs.

Paper reviewed: Gisondi, Michael A. et al. The Purpose, Design, and Promise of Medical Education Research Labs. Academic Medicine, 97(9):1281-1288, September 2022. https://journals.lww.com/academicmedicine/toc/2022/09000

Since my colleagues launched the Bristol Medical Education Research Group (BMERG), our discussions have focused on creating a productive research environment and increasing the impact of our work as education researchers.

Education research often struggles to attract the recognition and funding afforded to basic and clinical sciences research, and many education researchers believe their institutions hold basic science research in higher esteem.

This paper resonated with members of the BMERG Journal Club, as the authors echo some of these concerns and challenges. They offer their perspective on the significance of medical education research labs, together with a practical roadmap for their establishment and success.

Publication overview

The paper falls under the category of ‘Scholarly Perspective’, and we discussed that it shouldn’t be interpreted as an objective literature review or primary research. The team of authors have presented a collection of case studies from their own experiences, identifying five main medical education research structures:

  • single principal investigator (PI) labs
  • multiple PI labs
  • research centres
  • research collaboratives
  • research networks

The contributors were assembled through existing professional relationships, so we questioned whether the categories presented fully reflect the range of medical education research structures. However, we accepted this was their ‘Scholarly Perspective’, and we think they effectively conveyed their vision for the future of medical education research, with research labs central to it.

What is a medical education research lab?

This is an important question! The authors define a lab as:

A distinct team within a department or institution led by single or multiple PIs who focus on specific educational problems

Labs differ from larger research centres, collaboratives, and networks in their scale and scope. The paper provides illustrative case examples to demonstrate how different research structures function in practice, and we found this information both useful and well-presented. As all authors are based in the US, we questioned whether the same structures could be identified in the UK.

What are the benefits of a medical education research lab?

The authors outline several key elements that they consider contribute to the success of medical education research labs:

  1. Lab Identity: The lab should have a focussed line of research that can validate the career path of the PI(s).
  2. Lab designation: The ‘lab’ brand helps signal the importance and legitimacy of the research being conducted, since the lab structure is generally well-understood within medicine. The identity and designation together can attract collaborators, funding, and institutional support.
  3. Infrastructure: Proper infrastructure is crucial, and includes not only physical space and administrative support but also access to necessary research tools and technologies.
  4. Training: Research labs should serve as incubators for new talent. They should provide training and mentorship for students and junior staff, fostering the next generation of medical education researchers.

Did we agree?

Point 2 above, on lab designation, sparked our next discussion: do we agree with using the term “lab” in the context of medical education?

We had an interesting debate about the appropriateness of making comparisons to a scientific research environment, and interestingly there was a split of opinion between our qualitative and quantitative colleagues!

We certainly didn’t agree that this nomenclature was essential for research legitimacy (as suggested by the authors), and we descended into brainstorming for other potential terminology for a collection of education researchers; “hub”, “village”, “incubator”, “collective” and even “tribe” were suggested!

Overall reflections

In summary, the authors present a compelling argument for the establishment of research labs as a means to overcome the challenges faced by medical education researchers: providing structured support, fostering collaboration, training new researchers, enhancing research productivity, and elevating the status of medical education research within academic institutions. The paper offers practical insights into the design of these labs, making it a useful resource for anyone involved in medical education research. It would be interesting to find out more about whether the institutional barriers to establishing such groups are the same in the US as in the UK, and within the BMERG Journal Club, we are still on the fence about the word ‘lab’!


More about this blog’s author:

Dr Claire Hudson is a Lecturer on the Teaching and Scholarship Pathway within the Bristol Medical School. Her early research career was in biomedical sciences, but she has now made a transition to pedagogic research. She has a special interest in student autonomy and the use of reflective practice in developing academic skills, as well as exploring MSc student skills development in different demographic groups.


Read more of our journal club reflections:


Building Community: BMERG Journal Club Review, Playful Learning

The BMERG blog series on building community continues to grow, with our journal club meeting bi-monthly. This month our BMERG Journal Club lead Dr Claire Hudson reflects on the discussion from our March journal club on Playful Learning.

Paper reviewed: Macdonald I, Malone E, Firth R. How can scientists and designers find ways of working together? A case study of playful learning to co-design visual interpretations of immunology concepts. Studies in Higher Education. 2022;47(9):1980-96. https://doi.org/10.1080/03075079.2021.2020745

I was intrigued by this paper for quite simple reasons; the terms ‘playful learning’ and ‘co-design’ grabbed my attention, as well as the reference to ‘scientists’. Although I am also an educator, I am a scientist at heart. Before everyone with a clinical background switches off, the paper actually discusses concepts that could apply to all disciplines, and it certainly provoked some fruitful discussion within our group.  

At the University of Bristol, we design our academic programmes to align with a Curriculum Framework, which includes a set of six interconnected dimensions that convey the educational aspirations of the University. Ideas of how to embed these dimensions within our teaching are always welcome, and this paper aligned with at least two of these dimensions: Disciplinary and Interdisciplinary (allowing students to engage beyond their discipline) and Inspiring and innovative (challenging, authentic and collaborative learning). So, I read this paper hoping to find some inspiration.

What was the research?

In summary, the authors designed an interdisciplinary activity with Biological Science students and Product Design students, aiming to communicate an immunology concept (for example allergies, vaccination or transplantation) using digital storytelling. Initially, the scientists pitched their immunology concepts to the designers, and then both sets of students took part in regular co-design workshops held in the design studios to create their final products. The researchers conducted semi-structured interviews with the students and collected Likert questionnaire data to explore their “preconceptions, experience and future learnings of working in interdisciplinary groups”, analysing the responses using thematic analysis.

What were the findings?

Four themes emerged from their research, summarised below:

1. The influence of environment – Being in the design studio fostered creativity in the Science students and developed different ways of thinking.

2. Playfulness as a creative approach – Freedom from assessment (this activity was outside of the curriculum) allowed for risk taking.

3. Storytelling as a means of expression – Translating information in a visual form enhanced understanding of the immunology material.

4. Recognition of the value of interdisciplinary working – Relevance to authentic working relationships, exploiting individual strengths.

What did we think?

Limitations of the study

We did have some concerns about the study, such as the lack of explicit research objectives and the possibility of confirmation bias. At the end of the introduction the authors state “This study aimed to use interdisciplinary co-design workshops to create opportunities for bringing scientists and designers to work together”; this may have been the purpose of the learning activity, but it doesn’t explain the objectives of their research. What did they want to find out?

We discussed the limitations of case studies; however, we agreed that this type of study is useful for disseminating practice and generating ideas, provided the researchers are transparent about the wider relevance. We noted that the findings closely matched the themes presented in the introduction, reconfirming previous assumptions rather than generating novel data, which led us to question the depth of the thematic analysis. This confirmation bias could also have arisen from the nature of the sample: participation was voluntary, and the students who took part were likely to be highly motivated.

How could this be relevant to our own practice?

We all agreed that this was an interesting learning experience for the students, and I love hearing about novel ideas for communicating complex scientific concepts. Often, we retain and understand information with the use of a good metaphor, so perhaps we should all integrate more storytelling into our teaching!

However, since this activity was purely extra-curricular, how relevant is it? Do we really have the time/scope to create these opportunities ‘just for fun’? Creating a genuine interdisciplinary task within a curriculum seems challenging, with potential inter-Programme/School/Faculty logistics to navigate. Some of these perceived obstacles arise from imagining a summative task; however, we all agreed that creating formative interdisciplinary tasks would be simpler and, in agreement with the authors, would allow students the freedom to experiment and be ‘playful’, stepping out of their comfort zones without being assessed. A great example of this freedom is the ‘creative piece’ produced by our medical students during year 1 Foundations of Medicine. Students are required to take part, but are not awarded an explicit grade, which enables risk taking.

Overall reflections

This paper certainly sparked some great discussion about interdisciplinary and group working (clinical perfusion and medical students, medical and nursing students…), but how do we measure the benefit of such collaborations? At BMERG, our focus is turning these ideas into opportunities for research, so watch this space!




Building Community: BMERG Journal Club, Cultural Competency

Adding to our BMERG Journal Club series, this month Dr Claire Hudson reflects on the discussion from our January journal club focussing on Cultural Competency.

Paper reviewed: Liu J, Miles K, Li S. Cultural competence education for undergraduate medical students: An ethnographic study. Frontiers in Education, 2022;7. https://www.frontiersin.org/articles/10.3389/feduc.2022.980633/full

This paper was chosen by my colleague, Assoc. Prof Liang-Fong Wong, who has a combined interest in cultural competency and medical education, being Year 4 co-lead for our undergraduate MBChB programme and Associate Pro Vice-Chancellor for Internationalisation.  Both Liang and I are keen to develop our qualitative research skills, and at first glance, this paper seemed like an excellent example of a qualitative study.

What is ‘Cultural Competency’?

Liu et al suggest culturally competent healthcare professionals should “communicate effectively and care for patients from diverse social and cultural backgrounds, and to recognize and appropriately address racial, cultural, gender and other sociocultural relevant biases in healthcare delivery”; others have defined attributes of cultural competency including “cultural awareness, cultural knowledge, cultural skill, cultural sensitivity, cultural interaction, and cultural understanding”. These concepts were explained effectively at the start of the paper; I felt the authors provided me with context for my subsequent reading.

What was the research?

The authors perceived that the teaching of cultural competency is inconsistent across medical schools, and that there is a paucity of evidence for how effective the teaching is and how students actually develop their cultural competency throughout their training. They aimed to describe students’ experiences of learning and developing cultural competency, using an ethnographic approach. They carried out student observations, interviews and focus groups, recruiting participants from a central London medical school.

What were the findings?

There is a wealth of qualitative data and discussion presented in the paper, and the authors could perhaps have summarised their overall findings more clearly. They suggest that students develop cultural competency in stages: in the pre-clinical years they have formal teaching opportunities, and as their clinical exposure increases, the cultural content becomes embedded in and derived from other learning experiences, including intercalation and placements. They highlight the importance of learning from patients’ lived experiences, from peers and from other (non-medical) student communities.

What did we think?

  • Clear descriptions: I come from a quantitative, scientific background, therefore I find reading qualitative papers quite challenging; the terminology used is noticeably different and somewhat out of my ‘comfort zone’! Having said that, the authors very clearly explained the basis of ethnography and reflexivity, which really helped us understand the rationale for them adopting these approaches. Data collection and analysis were explained in detail which reassured us that these were robust and valid. However, thorough descriptions mean a long paper; and it could be more concise in places.
  • Awareness of limitations: A strength of this research was the authors’ transparency about some of its limitations. For example, they acknowledged a potential bias in participant recruitment due to the main author’s own cultural background, but described ways to mitigate this. We found it really interesting that the authors observed different dynamics in the interviews and focus groups depending on the facilitator. In those conducted by a PhD student, a rapport was built such that the students were relaxed and open with their communication, allowing them to be critical about the cultural competency teaching they had received. Conversely, in those conducted by a medical school academic, students were more reserved and tended to be positive about the teaching, highlighting an obvious teacher-student power dynamic. Importantly, this was acknowledged, and adjustments were made. Our biggest take-home message: Carefully consider who facilitates interviews and focus groups so there are no conflicts of interest, and trust is fostered between participants and researchers. Otherwise, students may just tell you what you want to hear!
  • Evaluation to recommendations: We also remarked that the authors have been clever in the way they present this study for publication. Essentially, they have carried out an internal evaluation of cultural competency teaching in their own medical school, but they have externalised this by making a series of recommendations. They benefit from a very diverse student population, and showcase some really good practice in cultural competency teaching which could be adopted by medical schools.

Overall reflections

Reading this paper made us reflect on non-clinical teaching on other programmes; it is important to remember that diverse student populations increase cultural awareness in all settings. Widening participation schemes and overseas students are important for this. During group work, I try to make the groups as diverse as possible, and I believe this is a positive experience.

The study highlighted different levels of student engagement with cultural competency teaching: some thought it was ‘pointless’ as they considered themselves already culturally competent, or dismissed the skills as ‘soft’ and would rather be learning facts, while others found it really valuable. This is familiar when teaching other skills in other disciplines: there is a constant battle to get ‘buy-in’ from students, highlighting the need to always explain ‘why’ certain teaching is important.

This study is a good showcase for qualitative research, and I made a mental note to refer back to this paper when developing my own qualitative research in the future, which must be a good sign!


Read our previous Journal club review on Self-regulated learning here: https://bmerg.blogs.bristol.ac.uk/2023/11/24/journal_club_publication_review1/


Building Community: BMERG Journal Club Review, Self-regulated Learning

Adding to the BMERG blog series on building community, our BMERG Journal Club lead Dr Claire Hudson reflects on the discussion from our recent BMERG journal club session focussing on Self-regulated Learning.

Paper reviewed: Zarei Hajiabadi Z, Sandars J, Norcini J and Gandomkar R, 2023. The potential of structured learning diaries for combining the development and assessment of self-regulated learning. Advances in Health Sciences Education, pp. 1-17. https://doi.org/10.1007/s10459-023-10239-6

As the first journal club hosted by BMERG, I wanted to choose a research topic that focussed on medical students, but also assessed teaching and learning strategies applicable to other student groups. As someone who predominantly teaches MSc students within the Bristol Medical School, I have my own interest in student autonomy of learning, whether this is self-regulated learning (SRL) or self-directed learning (SDL) – there is a difference, explained by Gandomkar and Sandars in their paper, “Clearing the confusion about self-directed learning and self-regulated learning“[1].

The general premise of SRL is a cycle of planning, performing and evaluating, but in the context of a specific task; at least that’s my very simple interpretation.

What was the research?

The purpose of the main research study was to determine whether an SRL intervention could help academically low-achieving medical students perform better in a specific exam. The SRL intervention consisted of Structured Self-Regulated Learning (SSRL) diaries accompanied by SRL training over a 4-week period, delivered to 20 students who subsequently sat the exam. The SSRL diaries consisted of 21 questions based around constructs aligned with the SRL model proposed by Zimmerman (2002) [2]. The scores in this exam, and a broader measure of academic attainment across the year, were compared to a matched group of students from a previous year, who did not receive the intervention. In an earlier publication, Zarei Hajiabadi et al (2022) reported that the exam grade was higher in the intervention group compared to the ‘quasi’ control, but overall attainment (GPA score) was not different [3].

In the 2023 publication, the authors sought to determine:

  1. whether the SSRL diaries can act as a reliable measurement of SRL development over time
  2. what the efficacy of the intervention (SSRL diaries + training) was for developing SRL skills

They determined 1) by conducting internal consistency and generalisability analyses of the SSRL entries and 2) by taking the mean scores for different SRL attributes from the SSRL diaries and determining their change over time using ANOVA.

To summarise, the authors documented good generalisability scores, and they conclude that their intervention increased the students’ self-reported SRL abilities.

What did we think?

Firstly, the complexity of the aforementioned consistency and generalisability analyses went over the heads of most in our discussion group, and we felt that the paper was overly complicated. We wish we’d read the 2022 paper first (linked below), which is a simpler, more interesting and more pertinent publication, so I advise doing that!

The ‘quasi’ control group from a previous cohort was a study limitation, however this is a common study design when trying to measure the efficacy of a teaching intervention. There are issues with a classical experimental design, i.e., a control versus intervention group; if you hypothesise that your intervention will benefit the students then the control group may be unfairly disadvantaged.

We also questioned the authors’ conclusion that the intervention increased the students’ self-reported SRL abilities. Students rated their SRL abilities via the SSRL diary over a 4-week timescale; however, since the students were also studying for the exam during this period, these skills may naturally have increased in the lead-up to the exam, irrespective of the SRL training and diary. It is very hard to determine cause and effect in this instance.

Overall reflections

This paper provoked some interesting discussion; with the diversity of our student populations it is natural that some students require more additional support than others. However, we questioned whether it is appropriate to ‘target’ lower-achieving students, and whether labelling students in such ways could be demoralising. Providing additional support as an optional resource also has its limitations, since the students who don’t engage are often the ones who would benefit most; I think most educators are familiar with this problem. However, for students who have performed poorly, for example those who have failed an assessment at the first attempt, interventions to help them study more effectively for a second attempt should be encouraged.

The SSRL diaries provide good suggestions for questions/prompts that encourage goal setting, self-monitoring and self-evaluation practices; these could be incorporated into a diverse range of learning activities such as clinical skills training, exam revision or to provide momentum during MSc research projects. Overall, I enjoyed reading about this study, and it has sustained my interest in nurturing SRL and structured reflections in my students; the more ideas, the better!

References:

  1. Gandomkar R, Sandars J, 2018. Clearing the confusion about self-directed learning and self-regulated learning. Medical Teacher, 40:8, 862-863. DOI: 10.1080/0142159X.2018.1425382. Epub 2018 Jan 12. PMID: 29327634.
  2. Zimmerman B, 2002. Becoming a Self-Regulated Learner: An Overview. Theory Into Practice, 41:2, 64-70. DOI: 10.1207/s15430421tip4102_2
  3. Zarei Hajiabadi Z, Gandomkar R, Sohrabpour A & Sandars J, 2022. Developing low-achieving medical students’ self-regulated learning using a combined learning diary and explicit training intervention. Medical Teacher, 45:5, 475-484. DOI: 10.1080/0142159X.2022.2152664