Background
Training of examiners is essential to ensure the quality of the objective structured clinical examination (OSCE). We aimed to study the perceived effectiveness of a tutor-student partnership in a practice OSCE module, as reported by novice OSCE tutors and medical students.

Method
We implemented a practice OSCE at a medical faculty in France with novice tutors and third-year medical students as partners. Each tutor (n = 44) served as a partner for a group of five students in the conception of the scenario and as an evaluator of the tutored station. Students (n = 303) were involved in the conception of a case and took on the roles of physician, evaluator, and simulated patient. Data were obtained through self-assessment questionnaires. Descriptive statistics were used to analyze the questionnaire items. Free-form answers were coded and analyzed thematically.

Results
A total of 36 tutors (82%) and 185 students (61%) responded to the questionnaires. The intervention was well received. Thirty-two percent of the tutors reported some difficulty in assessing student performance and were willing to receive further training. Fifty-five percent of the students considered participation in OSCE case development appropriate to their level of knowledge, and 70% perceived it as beneficial, allowing them to set their learning goals.

Conclusion
This initiative provides a relevant method that benefits OSCE tutors, medical students, and the faculty. Tutors learn how to assess student performance according to expected achievement levels, and students are engaged as partners in the co-creation of learning and teaching.
What should the evaluator in a simulated learning experience do when the assessment being scored yields a critical result? We searched for guidance, recommendations, or standards on an evaluator becoming a provider and were unable to find any published work addressing our experiences. Numerous articles address standardized patients. 1,2,3,4 An article by Simon and Dove 5 encourages a professional evaluation of the standardized patient before assessment by learners, thus preventing the discovery of urgent or emergent results during the skills assessment. Neither of our experiences used standardized patients: one used classmate-partners and the other a family volunteer to serve as the patient. We offer the following two experiences to initiate a discussion on serving as an evaluator in skills testing and the added challenge of serving as a remote evaluator for learners in other jurisdictions. Each of these experiences is based on real evaluator interactions but has been modified to protect the identity of the learners.

The first assessment was an in-person assessment of basic patient physical assessment skills as part of a national point-of-care testing certificate training program. Both evaluators were pharmacists licensed in the jurisdiction of the testing, with assistance from trained student-assessors who had previously achieved the national certificate. During blood pressure assessment, a pharmacy student learner with an unusually high reading (182/94 mmHg) was identified. 6 The student-assessor requested help from the pharmacist-evaluator. While taking the patient history, the pharmacist-evaluator identified previous treatment for rhabdomyolysis. 7,8 The student was transported to the emergency department and was subsequently admitted for renal failure.
The pharmacist-evaluator accompanied the student to the emergency department and stayed until a family member arrived.

The second assessment was a synchronous, online assessment of a diabetic foot examination. The learner was a pharmacist licensed in the jurisdiction where the testing was performed; the evaluator, also a pharmacist, was licensed in a different state from the one where the assessment was conducted. The learner was performing a microfilament test, and the simulated patient, a family member, was unable to feel any of the attempts at stimulation with the microfilament. The simulated patient had no history of diabetes, leg, or spinal injury and was able to feel digital stimulation when the learner used a finger to create sensation. 9 The pharmacist-evaluator recommended a consultation with the patient's provider at the earliest possible opportunity; this advice was confirmed by the pharmacist-learner.

Evaluators do not expect to find problems during simulated skills assessments. Generally, the simulated patients are healthy students, faculty, staff, or standardized patients with known conditions. In these two instances, however, the simulated patient immediately became a real patient. The ...