Background The first year of graduate medical education is an important period in the professional development of physicians. Disruptive behavior interferes with safe and effective clinical practice. Objective To determine the frequency and nature of disruptive behavior perceived by interns and attending physicians in a teaching hospital environment. Methods All 516 interns at Partners HealthCare (Boston, MA) during the 2010 and 2011 academic years were eligible to complete an anonymous questionnaire. A convenience nonrandom sample of 40 attending physicians also participated. Results A total of 394 of 516 eligible interns (76.4%) participated. Attendings and interns each reported that their team members generally behaved professionally (87.5% versus 80.4%, respectively). A significantly greater proportion of attendings than interns felt respected at work (90.0% versus 71.5%, respectively; P = .01). Disruptive behavior was experienced by 93% of interns; 54% reported that they experienced it once a month or more. Interns reported disruptive behavior significantly more frequently than attending physicians, including increased reports of condescending behavior (odds ratio [OR], 5.46 for interns compared with attendings; P < .001), exclusion from decision making (OR, 6.97; P < .001), and berating (OR, 4.84; P = .02). Inappropriate jokes, abusive language, and gender bias were also reported, but they were not significantly more frequent among interns than attending physicians. Interns most frequently identified nurses as the source of disruption, and were significantly more likely than faculty to identify nurses as the source of disruptive behavior (OR, 10.40; P < .001). Attendings reported other physicians as the most frequent source of disruption. Conclusions Although interns generally feel respected at work, they frequently experience disruptive behavior. Interns described more disruptive behaviors than a convenience sample of attending physicians at our institution.
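The odds ratios reported above (e.g., OR = 5.46 for condescending behavior) compare the odds of a report in one group against another, computed from a 2×2 contingency table. A minimal sketch of that calculation follows; the counts used in the example are hypothetical, not the study's data:

```python
def odds_ratio(group1_yes, group1_no, group2_yes, group2_no):
    """Odds ratio from a 2x2 table of counts.

    OR = (group1_yes / group1_no) / (group2_yes / group2_no)
       = (group1_yes * group2_no) / (group1_no * group2_yes)
    """
    return (group1_yes * group2_no) / (group1_no * group2_yes)

# Hypothetical example: 60 of 100 interns vs 20 of 100 attendings
# reporting a behavior gives an odds ratio of 6.0.
example_or = odds_ratio(60, 40, 20, 80)
```

In practice, reported ORs come with P values or confidence intervals (as in the abstract), which require an additional significance test such as a chi-square or Fisher exact test rather than the point estimate alone.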
Background Professionalism is one of the Accreditation Council for Graduate Medical Education's core competencies. Residency programs must teach residents about ethical principles, which is an essential component of professionalism. Objectives We aimed to formally develop a valid and reliable test of ethics knowledge that effectively discriminated among learners in pediatric residency training and to improve methods for measuring outcomes of resident education in medical ethics. Methods We created an instrument with 36 true/false questions that tested knowledge in several domains of pediatric ethics: professionalism, adolescent medicine, genetic testing and diagnosis, neonatology, end-of-life decisions, and decision making for minors. All questions and their correct answers were derived from published statements from the American Academy of Pediatrics Committee on Bioethics. We invited a range of participants from novices to experts to complete the test. We evaluated the instrument's reliability and explored item discrimination, omitting 13 items with the least discriminatory power. Score differences between the 3 categories of examinees were evaluated. Results The 23-item test, completed by 54 participants, demonstrated good internal reliability (Kuder-Richardson 20 statistic = 0.73). The test was moderately difficult and had a mean overall score of 17.3 (±3.3 standard deviation). Performance appropriately improved with degree of expertise: median scores for medical students, postgraduate year-3 residents, and ethicists were 15 (65%, range, 11–19), 19 (83%, range, 14–23), and 22 (96%, range, 20–23), respectively. Ethicists' scores were significantly higher than those of medical students (P < .001) and residents (P = .007). Moreover, residents performed significantly better than medical students (P = .001). 
Conclusions We developed a standardized instrument, entitled Test of Residents' Ethics Knowledge for Pediatrics (TREK-P), to evaluate residents' knowledge of pediatric ethics. The TREK-P is easy to administer, reliably discriminates among learners, and highlights content areas in which knowledge may be deficient.
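The Kuder-Richardson 20 statistic cited above (0.73) measures internal consistency for tests of dichotomously scored (true/false) items. A minimal sketch of the formula follows, using made-up 0/1 response data rather than the study's:

```python
def kr20(responses):
    """Kuder-Richardson 20 reliability coefficient.

    responses: list of examinees, each a list of 0/1 item scores.
    KR-20 = (k / (k - 1)) * (1 - sum(p_i * q_i) / total_score_variance)
    where p_i is the proportion answering item i correctly and q_i = 1 - p_i.
    """
    n = len(responses)          # number of examinees
    k = len(responses[0])       # number of items
    # Proportion correct for each item
    p = [sum(r[i] for r in responses) / n for i in range(k)]
    pq_sum = sum(pi * (1 - pi) for pi in p)
    # Population variance of total scores
    totals = [sum(r) for r in responses]
    mean = sum(totals) / n
    variance = sum((t - mean) ** 2 for t in totals) / n
    return (k / (k - 1)) * (1 - pq_sum / variance)

# Hypothetical 4 examinees x 3 items
data = [[1, 1, 1], [1, 0, 0], [0, 0, 0], [1, 1, 0]]
reliability = kr20(data)
```

Values of roughly 0.7 and above are conventionally taken to indicate acceptable internal consistency, which is consistent with the abstract's description of the 23-item test as reliable.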
Introduction: The pandemic created new demands on the accredited continuing medical education (CME) community. Facing economic, resource, and personal challenges, educators had to cancel or repurpose in-person learning, and design and deliver effective online education. This short report analyzes the effect of this pandemic on CME in the United States. Methods: Organizations accredited by the Accreditation Council for Continuing Medical Education are required to submit detailed data about their educational programs annually. This report compares the 2019 and 2020 data sets to evaluate pandemic-related changes in the availability of, formats of, and participation in CME. Results: After years of comparative stability, 2020 saw significant shifts in CME. Compared with 2019, the number of accredited organizations, activities, hours of instruction, and revenue declined in 2020. In contrast, engagement in CME by physicians and other health care professionals increased to record levels. Virtual learning formats predominated. Almost half of accredited organizations delivered activities addressing pandemic-related topics, mostly in online formats. Discussion: Educators anticipate continuing to offer activities in online and hybrid formats. This transformation presents new challenges and opportunities for CME. It is important that institutional leaders appropriately resource CME staff and faculty to design and deliver education targeting ongoing pandemic-related issues such as vaccine hesitancy, medical misinformation, and clinician burnout.
Background: Traditionally, the role of certifying boards has been to hold physicians accountable for demonstrating standards of competence. In recent years, the authority of continuing board certification has been challenged, due to multiple factors that have shifted the dynamics. The breadth and depth of new information, combined with the pressures of system barriers and administrative burdens, can make it challenging for clinicians to stay current and maintain their own competency. Absent feedback about their performance, physicians presume they are practicing effectively. The resulting gap between confidence and competence can also lead physicians to make errors of which they may be unaware. In this environment, assessment and accountability are more important than ever. Four Key Areas: The authors present four key areas to address to move forward with a board certification system that is effective, relevant, and respected. First, boards should set and communicate the specific expectations of specialists. Second, boards should use technology to create practice-relevant assessments. Third, they should collaborate with educators, while maintaining their distinct role as assessors. Fourth, boards need to establish and meet standards for professionalism and ethics that reflect their position as regulatory bodies. Conclusion: Boards have a critical role in professional self-regulation. They should not compromise on their primary responsibility to set and evolve standards for competence and to conduct rigorous assessments of physicians. The methods boards use for assessments should evolve to meet the changing needs of physicians. Collaboration between educators and assessors provides more educational choice, relieves burdens, and supports physicians' commitment to lifelong learning. By working together with physicians, educators and assessors advance their shared goal of supporting physicians to work at the top of their capability and, ultimately, to optimize patient care.
(J Am Board Fam Med 2020;33:S10-S14.)