The Nightmares Course demonstrated feasibility and acceptability, and is applicable to a broad array of postgraduate medical education programs. The entrustment-based assessment identified several residents who did not meet a minimum competency threshold and directed them to additional training.
Purpose: Within competency‐based medical education, self‐regulated learning (SRL) requires residents to leverage self‐assessment and faculty feedback. We sought to investigate the potential for competency‐based assessments to foster SRL by quantifying the relationship between faculty feedback and entrustment ratings, as well as the congruence between faculty assessment and resident self‐assessment. Materials and methods: We collected comments in (a) an emergency medicine objective structured clinical examination group (objective structured clinical examinations [OSCE] and emergency medicine OSCE group [EMOG]) and (b) a first‐year resident multidisciplinary resuscitation "Nightmares" course assessment group (NCAG) and OSCE group (NOG). We assessed comments across five domains: Initial Assessment (IA), Diagnostic Action (DA), Therapeutic Action (TA), Communication (COM), and entrustment. Analyses included structured qualitative coding and (non)parametric and descriptive analyses. Results: In the EMOG, faculty's positive comments in the entrustment domain corresponded to lower entrustment score Mean Ranks (MRs) for IA (<11.1), DA (<11.2), and entrustment (<11.6). In the NOG, faculty's negative comments resulted in lower entrustment score MRs for TA (<11.8 and <10) and DA (<12.4), and positive comments resulted in higher entrustment score MRs for IA (>15.4) and COM (>17.6). In the NCAG, faculty's positive IA comments were negatively correlated with entrustment scores (ρ = −.27, P = .04). Across programs, faculty and residents made similar domain‐specific comments 13% of the time. Conclusions: Minimal and inconsistent associations were found between narrative and numerical feedback. Performance monitoring accuracy and feedback should be included in assessment validation.
Over the past decade, simulation-based education has emerged as a new and exciting adjunct to traditional bedside teaching and learning. Simulation-based education seems particularly relevant to emergency medicine training, where residents must master a very broad skill set and may not have sufficient real clinical opportunities to achieve competence in every skill. In 2006, the Emergency Medicine program at Queen's University set out to enhance our core curriculum by developing and implementing a series of simulation-based teaching sessions with a focus on resuscitative care. The sessions were developed in such a way as to satisfy the four conditions associated with optimum learning and improvement of performance: appropriate difficulty of skill, repetitive practice, motivation, and immediate feedback. The content of the sessions was determined with consideration of the national training requirements set out by the Royal College of Physicians & Surgeons of Canada. Sessions were introduced in a stepwise fashion, starting with a cardiac resuscitation series based on the AHA ACLS guidelines and leading up to a more advanced resuscitation series as staff became more adept at teaching with simulation and as residents became more comfortable with this style of learning. The result is a longitudinal resuscitation curriculum that begins with fundamental skills of resuscitation and crisis resource management (CRM) in the first 2 years of residency and progresses through increasingly complex resuscitation cases in which senior residents are expected to play a leadership role. This paper documents how we developed, implemented, and evaluated this resuscitation-based simulation curriculum for Emergency Medicine postgraduate trainees, with discussion of some of the challenges encountered.
Objectives: To address the increasing demand for the use of simulation for assessment, our objective was to review the literature pertaining to simulation-based assessment and develop a set of consensus-based, expert-informed recommendations on the use of simulation-based assessment as presented at the 2019 Canadian Association of Emergency Physicians (CAEP) Academic Symposium on Education. Methods: A panel of Emergency Medicine (EM) physicians from across Canada, with leadership roles in simulation and/or assessment, was formed to develop the recommendations. An initial scoping literature review was conducted to extract principles of simulation-based assessment. These principles were refined via thematic analysis and then used to derive a set of recommendations for the use of simulation-based assessment, organized by the Consensus Framework for Good Assessment. This was reviewed and revised via a national stakeholder survey, and the recommendations were then presented and revised at the consensus conference to generate a final set of recommendations on the use of simulation-based assessment in EM. Conclusion: We developed a set of recommendations for simulation-based assessment, using consensus-based, expert-informed methods, across the domains of validity, reproducibility, feasibility, educational and catalytic effects, acceptability, and programmatic assessment. While the precise role of simulation-based assessment will be a subject of continued debate, we propose that these recommendations be used to assist educators and program leaders as they incorporate simulation-based assessment into their programs of assessment.
Introduction: Simulation is becoming a popular educational modality for physician continuing professional development (CPD). This study sought to characterize how simulation-based CPD (SBCPD) is being used in Canada and what academic emergency physicians (AEPs) desire in an SBCPD program. Methods: Two national surveys were conducted from March to June 2018. First, the SBCPD Needs Assessment Survey was administered online to all full-time AEPs across 9 Canadian academic emergency medicine (EM) sites. Second, the SBCPD Status Survey was administered by telephone to the department representatives (DRs)—simulation directors or equivalent—at 20 Canadian academic EM sites. Results: Response rates for the SBCPD Needs Assessment Survey and the SBCPD Status Survey were 40% (252/635) and 100% (20/20), respectively. Sixty percent of Canadian academic EM sites reported using SBCPD, although only 30% reported dedicated funding support. Academic emergency physician responses demonstrated a median annual SBCPD of 3 hours. Reported incentivization for SBCPD participation varied, with AEPs reporting less incentivization than DRs. Academic emergency physicians identified time commitments outside of shift, lack of opportunities, and lack of departmental funding as their top barriers to participation, whereas DRs thought AEPs' fear of peer judgment and inexperience with simulation were substantial barriers. Content areas of interest for SBCPD were as follows: rare procedures, pediatric resuscitation, and neonatal resuscitation. Lastly, interprofessional involvement in SBCPD was valued by both DRs and AEPs. Conclusions: Simulation-based CPD programs are becoming common in Canadian academic EM sites. Our findings will guide program coordinators in addressing barriers to participation, selecting content, and determining the frequency of SBCPD events.
Competency-based curricula require the development of novel simulation-based programs focused on the assessment of entrustable professional activities. The design and delivery of simulation-based programs are labor-intensive and expensive. Furthermore, they are often developed by individual programs and are rarely shared between institutions, resulting in duplicate efforts and the inefficient use of resources. The purpose of this study is to demonstrate the feasibility of implementing a previously developed simulation-based curriculum at a second institution. We sought to demonstrate comparable program-level outcomes between our two study sites. A multi-disciplinary, simulation-based, resuscitation skills training curriculum developed at Queen's University was implemented at the University of Saskatchewan. Standardized simulation cases, assessment tools, and program evaluation instruments were used at both institutions. Across both sites, 87 first-year postgraduate medical trainees from 14 different residency programs participated in the course and the related research. A total of 226 simulated cases were completed in over 80 sessions. Program evaluation data demonstrated that the instructor experience and learner experience were consistent between sites. The average confidence score (on a 5-point scale) across sites for resuscitating acutely ill patients was 3.14 before the course and 4.23 (p < 0.001) after the course. We have described the successful implementation of a previously developed simulation-based resuscitation curriculum at a second institution. With the growing need for competency-based instructional methods and assessment tools, we believe that programs will benefit from standardizing and sharing simulation resources rather than developing curricula de novo.
Introduction: Patient-centered care is a core principle of the Canadian healthcare system. To facilitate patient-centered care, documentation of a patient's medical goals and expectations is important, especially in the event of acute decompensation, when an informed conversation with the patient may not be possible. The 'Goals of Care Discussion Form (GCF)' at Kingston Health Sciences Centre (KHSC) documents goals of care discussions between patients and healthcare providers. All patients admitted to the Internal Medicine service are expected to have this form completed within 24 hours of admission. Formal measurement of form completion at our center has not previously been done, though anecdotally this form is often incomplete. The purpose of this study is to quantify the rate of completion and assess the quality of documentation of the GCF at KHSC. Methods: This prospective chart review took place between August 25, 2018, and March 25, 2019. Charts were reviewed for the presence of a completed GCF, and the quality of notation was assessed, as appropriate. Given that there are no existing tools for assessing the quality of a document such as the GCF, authors TC and JM created one de novo for this study. Extracted data included the amount of time elapsed between admission and completion of the GCF, whether the 'yes/no cardiopulmonary resuscitation (CPR)' order in the patient's chart aligned with their wishes as outlined on the GCF, and whether or not a patient's GCF was uploaded to the hospital's electronic medical record (EMR). Results: Two hundred sixteen charts were reviewed. Of these, 136 (63.0%) had a complete GCF. The mean GCF quality score was 3.4/7 (95% CI [3.2, 3.6]). The mean time elapsed from admission to completion of the GCF was 1.5 days (95% CI [0.6, 2.4]). There were 130 charts with both a complete GCF and a 'yes/no CPR' order, and of these, 20 (15.4%) showed a discrepancy. Eighty-six (63.2%) of the completed GCFs were uploaded to the EMR.
Discussion and conclusions: The rate of GCF completion at KHSC is noticeably higher than expected based on the previous literature. However, our assessment of the quality of completion indicates that there is room for improvement. Most concerning, discrepancies were found between the 'yes/no CPR' order in a patient's chart and their stated wishes on the GCF. Furthermore, less than two-thirds of completed GCFs were found to have been uploaded to the hospital's EMR. Given the emphasis on patient-centered care in the Canadian healthcare system, our findings suggest that improvement initiatives are needed with respect to documenting goals of care discussions with patients.