Background: Performance feedback is considered essential to clinical skills development. Formative objective structured clinical examinations (F-OSCEs) often include immediate feedback from standardized patients. Students can also be given access to performance metrics, including scores, checklists, and video recordings, after the F-OSCE to supplement this feedback. How often students choose to review these data, and how review affects future performance, has not been documented.

Objective: We suspect that student review of F-OSCE performance data is variable. We hypothesize that students who review these data perform better on subsequent F-OSCEs than those who do not. We also suspect that the frequency of data review can be improved through faculty involvement in the form of student-faculty debriefing meetings.

Design: Simulation recording software tracks and time-stamps student review of performance data. We investigated a cohort of first- and second-year medical students from the 2015-16 academic year. Basic descriptive statistics were used to characterize the frequency of data review, and a linear mixed-model analysis was used to determine relationships between data review and future F-OSCE performance.

Results: Students reviewed scores (64%), checklists (42%), and videos (28%) in decreasing frequency. Review frequency across all metrics and modalities improved when student-faculty debriefing meetings were conducted (p < .001). Among 92 first-year students, checklist review was associated with improved performance on subsequent F-OSCEs (p = 0.038) by 1.07 percentage points on a 0-100 scale. Among 86 second-year students, no review modality was associated with improved performance on subsequent F-OSCEs.

Conclusion: Medical students review F-OSCE checklists and video recordings less than 50% of the time when not prompted. Student-faculty debriefing meetings increased student data review. First-year students' review of F-OSCE checklists was associated with improved performance on subsequent F-OSCEs; however, this outcome was not observed among second-year students.
Multiple experts in clinical skills remediation recommend early identification to support struggling learners, but there is minimal documentation on the implementation of such programs. We share one school's outcomes-based research using the formative assessment-for-learning model to identify, early in the pre-clerkship phase, students struggling with clinical skills on formative OSCEs (F-OSCEs). Student scores were monitored over longitudinal F-OSCE experiences as part of a curricular innovation. Points toward early identification accumulated each time a student's score fell below the 80% threshold for a section of an OSCE. Students who accumulated enough points were advised of the need for intervention, and coaching was recommended. Students were surveyed about their experiences with the program. The objective was to explore whether this early identification program and coaching intervention had a positive impact on subsequent OSCE performance. Of 184 students in 2 cohorts who completed F-OSCEs, 38 (20.7%) were flagged for early identification. Of these, 17 (44.7%) sought additional help by voluntarily participating in the coaching program. Students who participated in extra clinical skills coaching demonstrated statistically significant improvements in performance on subsequent F-OSCEs, as did early-identified students who did not participate in extra coaching. The greatest impact of the coaching intervention was noted in the physical examination domain. This program was effective in identifying students struggling with clinical skills on formative OSCEs. Early-identified students demonstrated improvements in subsequent OSCE performance, with those who sought coaching faring slightly better. Development of robust early identification programs as formative assessments of clinical skills, together with follow-up coaching programs to guide skills development, is an important implication of this work. Monitoring of short- and long-term outcomes for students identified through this approach is planned, to determine whether improvement is sustained.
Background: Most medical schools in the United States report having a 5- to 10-station objective structured clinical examination (OSCE) at the end of the core clerkship phase of the curriculum to assess clinical skills. We set out to investigate an alternative OSCE structure in which each clerkship has a 2-station OSCE. This study sought to determine the reliability of clerkship OSCEs in isolation, to inform composite clerkship grading, as well as their reliability in aggregate, as a potential alternative to an end-of-third-year examination.

Design: Clerkship OSCE data from the 2017-2018 academic year were analyzed: the generalizability coefficient (ρ²) and index of dependability (φ) were calculated for clerkships in isolation and in aggregate using variance components analysis.

Results: In all, 93 students completed all examinations. The average generalizability coefficient for the individual clerkships was .47. Most often, the largest variance component was the interaction between the student and the station, indicating inconsistency in students' performance between the 2 stations. Aggregate clerkship OSCE analysis demonstrated good reliability for consistency (ρ² = .80). About one-third (33.8%) of the variance could be attributed to students, 8.2% to the student-by-clerkship interaction, and 42.6% to the student-by-block interaction, indicating that students' relative performance varied by block.

Conclusions: Two-station clerkship OSCEs have poor to fair reliability, which should inform the weighting of the composite clerkship grade. Aggregating the data yields good reliability. The largest source of variance in the aggregate was student by block, suggesting that testing over several blocks may have advantages over a single-day examination.
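For readers unfamiliar with generalizability theory, the two reliability indices above can be sketched from estimated variance components. This is a minimal illustration assuming a simple persons-by-stations (p × s) design; the variance-component values used here are hypothetical, not figures from the study.

```python
# Sketch: generalizability coefficient (rho^2) and index of dependability (phi)
# for a persons x stations (p x s) design. In generalizability theory,
# rho^2 treats station effects as irrelevant to relative (rank-order) decisions,
# while phi also counts station variance as error for absolute decisions.
# All variance-component values below are hypothetical illustrations.

def g_coefficients(var_p: float, var_s: float, var_ps_e: float, n_stations: int):
    """Return (rho2, phi) for a p x s design.

    var_p     -- variance due to persons (students)
    var_s     -- variance due to stations
    var_ps_e  -- person-by-station interaction confounded with residual error
    """
    # Relative error: only components that change students' rank ordering.
    rho2 = var_p / (var_p + var_ps_e / n_stations)
    # Absolute error: station main effect also contributes.
    phi = var_p / (var_p + (var_s + var_ps_e) / n_stations)
    return rho2, phi

rho2, phi = g_coefficients(var_p=30.0, var_s=10.0, var_ps_e=60.0, n_stations=2)
print(f"rho^2 = {rho2:.2f}, phi = {phi:.2f}")  # rho^2 = 0.50, phi = 0.46
```

With only 2 stations, a large person-by-station interaction (as the study reports) drives both indices down, which is why the individual clerkship OSCEs show poor to fair reliability while aggregating many stations across clerkships improves it.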