Effective team building and leadership are crucial to running a safe, effective healthcare system with improved patient care and clinical outcomes, yet formal leadership training remains in high demand throughout the medical education curriculum. We constructed an interactive team-building activity based on gamification theory: the Zoom game. The activity requires a team of learners to arrange a set of sequential images, each of which contains a "zoomed out" section of the previous image, into the correct order within a set time frame. Given the unique and approachable nature of this team-based activity, we proposed the following: 1) to introduce the Zoom game as an exercise that fosters team building and communication in undergraduate medical education, and 2) to assess the baseline teamwork skills of first-year medical students through an immersive gaming experience. Accordingly, 260 first-year medical students (class of 2020) at an urban medical school were enrolled in the Zoom Team Building Activity as part of their orientation. Students were randomly assigned to 11 teams of 23-24 students, each with two faculty facilitators, and completed the activity within the allotted time frame. The average time to complete the Zoom game was 24 minutes, and all teams successfully placed the pictures in the correct order. Facilitators noted that the game strongly encouraged friendly interaction, collegiality, mutual respect, confidence, and trust among teammates. Students identified take-home points such as selecting a leader, designating specific roles, and encouraging closed-loop communication. Overall, the Zoom game is an interactive, fun, and easily accessible exercise for fostering team building and communication in undergraduate medical education.
Further study of the Zoom game exercise is needed to determine whether it has a lasting effect on the development of team-building skills among medical students.
Burnout among emergency medicine (EM) residents is gaining increasing attention. The authors designed a workshop to assess EM residents' resilience using a validated scale to prompt personal reflection. The workshop then shifted to peer-to-peer conversation and sharing, using images from Visual Explorer (VE) to further reflect on resilience. Overall, resident resilience scores were below those of the US general population, with postgraduate year (PGY)-2 residents having the lowest scores. The workshop was well received by residents; data from the Critical Incident Questionnaire (CIQ) suggested that residents felt engaged during discussion of the images. Further study is needed to assess the correlation between resilience scores and burnout.
Introduction A primary aim of residency training is to develop competence in clinical reasoning. However, there are few instruments that can accurately, reliably, and efficiently assess residents' clinical decision-making ability. This study aimed to externally validate the script concordance test in emergency medicine (SCT-EM), an assessment tool designed for this purpose. Methods Using established methodology for the SCT-EM, we compared EM residents' performance on the SCT-EM to that of an expert panel of emergency physicians at three urban academic centers. We performed adjusted pairwise t-tests to compare differences between all residents and attending physicians, as well as among resident postgraduate year (PGY) levels. We tested the correlation between SCT-EM scores and Accreditation Council for Graduate Medical Education Milestone scores using Pearson's correlation coefficients. Inter-item covariances for SCT items were calculated using Cronbach's alpha statistic. Results The SCT-EM was administered to 68 residents and 13 attendings. There was a significant difference in mean scores among all groups (mean ± standard deviation: PGY-1, 59 ± 7; PGY-2, 62 ± 6; PGY-3, 60 ± 8; PGY-4, 61 ± 8; attendings, 73 ± 8; p < 0.01). Post hoc pairwise comparisons demonstrated that significant differences in mean scores occurred only between each PGY level and the attendings (p < 0.01 for PGY-1 through PGY-4 vs the attending group). Performance on the SCT-EM was not significantly correlated with EM Milestone scores (r = 0.12, p = 0.35). Internal reliability of the exam, determined using Cronbach's alpha, was 0.67 for all examinees and 0.89 in the expert-only group. Conclusion The SCT-EM has limited utility in reliably assessing clinical reasoning among EM residents. Although the SCT-EM was able to differentiate clinical reasoning ability between residents and expert faculty, it did not differentiate between PGY levels, nor did it correlate with Milestone scores.
Furthermore, several limitations threaten the validity of the SCT-EM, suggesting further study is needed in more diverse settings.