In 2012, the Accreditation Council for Graduate Medical Education (ACGME) introduced the Next Accreditation System (NAS) for residency program accreditation. With implementation of the NAS, residents are assessed according to a series of new emergency medicine (EM)-specific performance milestones, and the frequency of assessment reporting is increased. These changes are driving the development of new assessment tools for the NAS that can be feasibly implemented by EM residency programs and that produce valid and reliable assessment data. This article summarizes the recommendations of the writing group on assessment of observable learner performance at the 2012 Academic Emergency Medicine consensus conference on education research in EM that took place on May 9, 2012, in Chicago, Illinois. The authors define an agenda for future assessment tool research and development that was arrived at by consensus during the conference. ACADEMIC EMERGENCY MEDICINE 2012; 19:1354-1359 © 2012 by the Society for Academic Emergency Medicine

In 1999 the Accreditation Council for Graduate Medical Education (ACGME) introduced the Outcomes Project,1 a multiyear process to accredit residency programs based on the assessment of individual resident performance within a framework of six core competency domains: 1) patient care, 2) medical knowledge, 3) practice-based learning and improvement (PBLI), 4) interpersonal and communication skills (ICS), 5) professionalism, and 6) systems-based practice (SBP). Since 2001, the medical education community has passed through the implementation phases of the Outcomes Project and now routinely assesses learners according to this framework.2 In 2012 the ACGME introduced the Next Accreditation System3 (NAS), which builds on the principles of the Outcomes Project by defining a continuum of performance milestones that culminate in full achievement of competency in each domain (Table 1).
Emergency medicine (EM) is an early adopter of the NAS and will begin program accreditation according to this framework in 2013. The NAS differs from the previous accreditation system by requiring more frequent collection and biannual submission of resident assessment data, while reducing the frequency of formal site visits. Because both the assessment standards (milestones) and the frequency of reporting of resident assessment will change with the implementation of the NAS, there is an imperative to develop assessment tools that can feasibly be implemented by multiple residency programs and that produce valid and reliable assessment data. This article summarizes the recommendations of the breakout group on assessment of observable learner performance at the 2012 Academic Emergency Medicine consensus conference "Education Research in Emergency Medicine: Opportunities, Challenges, and Strategies for Success," which took place on May 9, 2012, in Chicago, Illinois. We define an agenda for future assessment tool research and development that was arrived at by consensus during the conference.

THE CONSENSUS BUILDING ...
Interpersonal and communication skills (ICS) are a key component of several competency-based schemata and a key competency among the six Accreditation Council for Graduate Medical Education (ACGME) core competencies. With the shift toward a competency-based educational framework, the importance of robust learner assessment becomes paramount. The journal Academic Emergency Medicine (AEM) hosted a consensus conference to discuss education research in emergency medicine (EM). This article summarizes the initial preparatory research that was conducted to brief consensus conference attendees and reports the results of the consensus conference breakout session as it pertains to ICS assessment of learners. The goals of this consensus conference session were twofold: 1) to determine the state of assessment of observable learner performance and 2) to determine a research agenda within the ICS field for medical educators. The working group identified six key recommendations for medical educators and researchers.
Professionalism is one of the six Accreditation Council for Graduate Medical Education (ACGME) core competencies on which emergency medicine (EM) residents are assessed. However, very few assessment tools exist that have been rigorously evaluated in this population. One goal of the 2012 Academic Emergency Medicine consensus conference on education research in EM was to develop a research agenda for testing and developing tools to assess professionalism in EM residents. A literature review was performed to identify existing assessment tools. Recommendations on future research directions are presented.
There is an established expectation that physicians in training demonstrate competence in all aspects of clinical care prior to entering professional practice. Multiple methods have been used to assess competence in patient care, including direct observation, simulation-based assessments, objective structured clinical examinations (OSCEs), global faculty evaluations, 360-degree evaluations, portfolios, self-reflection, clinical performance metrics, and procedure logs. A thorough assessment of competence in patient care requires a mixture of methods, taking into account each method's costs, benefits, and current level of evidence. At the 2012 Academic Emergency Medicine (AEM) consensus conference on educational research, one breakout group reviewed and discussed the evidence supporting various methods of assessing patient care and defined a research agenda for the continued development of specific assessment methods based on current best practices. In this article, the authors review each method's supporting reliability and validity evidence and make specific recommendations for future educational research. ACADEMIC EMERGENCY MEDICINE 2012; 19:1379-1389 © 2012 by the Society for Academic Emergency Medicine

In 2001, the Accreditation Council for Graduate Medical Education (ACGME) introduced a timeline for the implementation of training and assessment in six core competencies that form the foundation of clinical competence.
Introduced in 1996, the Canadian CanMEDS manager competency correlates to the ACGME patient care competency, broadly defined as "the active engagement in decision-making in the operation of the healthcare system."1 The patient care competency for emergency medicine (EM) has been defined by a previous Academic Emergency Medicine (AEM) consensus conference,2 and further elaborated on by the milestones in training,3 as being able to efficiently gather and synthesize medical and diagnostic information, prioritize tasks, and implement management plans on multiple patients, as well as perform essential invasive procedures competently. There is an explicit expectation that physicians in training demonstrate competence in various aspects of clinical care prior to graduation and professional practice.4 While this accountability falls squarely on the shoulders of residency training programs, it is mirrored by commensurate expectations of maintenance of competency during ongoing professional practice. The goals of the 2012 AEM consensus conference patient care working group were to describe the current state of evidence for assessment of competence in patient care and define a research agenda for the further development of specific assessment methods based on current best practices.

METHODS
A search was conducted in MEDLINE (1996 to present) using the key word search terms "assessment," "patient care," "competency," "competence," "assess*," "emergency," and "education," limited to humans and English language [boolean search: ((assessment and patient care AND (competency or competence)) OR (assess* a...
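A search strategy like the one described above can also be assembled programmatically, for example against NCBI's public E-utilities esearch endpoint. The sketch below is illustrative only: the full boolean string is truncated in the text, so the way the listed terms are combined here is an assumption, not the authors' exact strategy.

```python
from urllib.parse import urlencode

# Key word terms listed in the abstract; the exact boolean combination
# is truncated there, so this grouping is a hypothetical reconstruction.
query = (
    '(("assessment" AND "patient care" AND ("competency" OR "competence")) '
    'OR assess*) AND emergency AND education'
)

# Limits applied in the review: humans, English language, 1996 to present
params = {
    "db": "pubmed",
    "term": f'{query} AND humans[MeSH Terms] AND english[Language]',
    "datetype": "pdat",
    "mindate": "1996",
}

# Standard NCBI E-utilities search endpoint
url = (
    "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?"
    + urlencode(params)
)
print(url)
```

Submitting the resulting URL returns matching PubMed IDs as XML; the point of the sketch is simply that the limits quoted in the abstract map directly onto query parameters.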
The Residency Review Committee in Emergency Medicine requires residency programs to deliver at least 5 hours of weekly didactics. Achieving at least a 70% average attendance rate per resident is required for residency program accreditation and is used as a benchmark for residency graduation in our program. We developed a web-based, asynchronous curriculum to replace 1 hour of synchronous didactics, and hypothesized that the curriculum would be feasible to implement, well received by learners, and would improve conference participation. This paper describes the feasibility and learner acceptability of a longitudinal asynchronous curriculum and its impact on postgraduate year-1 (PGY-1) resident conference participation and annual in-training examination scores. Using formal curriculum design methods, we developed modules and paired assessment exercises to replace 1 hour of weekly didactics. We measured feasibility (development and implementation time and costs) and learner acceptability (measured on an anonymous survey). We compared pre- and post-intervention conference participation and in-service training examination scores using a two-sample t test. The asynchronous curriculum proved feasible to develop and implement. PGY-1 resident conference participation improved compared to the pre-intervention year (85.6% vs. 62%; 95% CI 0.177-0.295; p < 0.001). We were unable to detect a difference in in-training examination results, either among PGY-1 residents or across all residents, following the introduction of this intervention. 18/31 (58%) residents completed the post-intervention survey; 83% reported satisfaction with the curriculum changes. Strengths of the curriculum included clarity and timeliness of assignments. Weaknesses included technical difficulties with the online platform. Our curriculum is feasible to develop and implement. Despite technical difficulties, residents report high satisfaction with this new curriculum.
Among PGY-1 residents, conference participation improved compared with the prior year.
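The participation comparison above (85.6% vs. 62%, with a confidence interval around the difference) can be illustrated with a simple two-proportion calculation. This is a sketch only: the abstract reports a two-sample t test and does not give the underlying session counts, so the denominators below are hypothetical and the resulting interval will not exactly match the published one.

```python
import math

def two_proportion_ztest(x1, n1, x2, n2):
    """Two-sided z-test for the difference between two proportions.

    Returns the estimated difference, its 95% confidence interval,
    and the z statistic computed with a pooled standard error.
    """
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    # Unpooled standard error for the confidence interval
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    ci = (diff - 1.96 * se, diff + 1.96 * se)
    # Pooled standard error for the hypothesis test
    p = (x1 + x2) / (n1 + n2)
    se_pooled = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = diff / se_pooled
    return diff, ci, z

# Hypothetical attendance counts chosen to mirror the reported rates
# (85.6% post-intervention vs. 62% pre-intervention); the actual
# denominators are not given in the abstract.
diff, ci, z = two_proportion_ztest(428, 500, 310, 500)
print(f"diff={diff:.3f}, 95% CI=({ci[0]:.3f}, {ci[1]:.3f}), z={z:.2f}")
```

A z statistic this large corresponds to p < 0.001, consistent with the significance level the abstract reports for the participation difference.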