Rationale: Competency-based education (CBE) is currently being implemented across Canadian postgraduate medical education programmes through Competence by Design (CBD).1 Queen's University received permission to initiate CBE in all programmes simultaneously starting in 2017, an institutional initiative termed competency-based medical education (CBME).2 We describe our initial experiences to highlight perceptions and barriers and to facilitate implementation at other centres. Methods: Anonymous online surveys were administered to faculty and residents transitioning to CBE (138 respondents), including (a) Queen's programme leaders (Programme Directors and CBME Leads) [n = 27], (b) Queen's residents [n = 102], and (c) Canadian neurology programme directors [n = 9], and were analysed using descriptive and inferential statistical techniques. Results: Perceptions were favourable (mean = 3.55/5, SD = 0.71) and 81.6% perceived that CBE enhanced training; however, perceptions were more favourable among faculty. Queen's programme leaders indicated that CBE did not improve their ability to provide negative feedback. Queen's residents did not perceive improved quality of feedback. National Canadian neurology programme directors did not perceive that their institutions had adequately prepared them. There was variability in the barriers perceived across groups. Queen's programme leaders were concerned about resident initiative. Queen's residents felt that assessment selection and faculty responsiveness to feedback were barriers. Canadian neurology programme directors were concerned about access to information technology. Recommendations: Our results indicate that faculty were concerned about the reluctance of residents to actively participate in CBE, while residents were hesitant to assume such a role because of a lack of familiarity and perceived benefit.
This discrepancy indicates attention should be devoted to (a) institutional administrative/educational supports, (b) faculty development around feedback/assessment, and (c) resident development to foster ownership of their learning and familiarity with CBE.
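The perception results above (mean = 3.55/5, SD = 0.71; 81.6% favourable) are simple descriptives of Likert-scale survey items. A minimal sketch of how such summaries are computed, using hypothetical responses and an assumed convention (not stated in the abstract) that ratings of 4 or 5 count as favourable:

```python
# Minimal sketch: descriptive statistics for 5-point Likert survey responses.
# The data and the >=4 "favourable" cutoff are hypothetical assumptions.
from statistics import mean, stdev

def summarize_likert(responses, favourable_cutoff=4):
    """Return mean, sample SD, and % of responses at/above the cutoff."""
    m = mean(responses)
    sd = stdev(responses)
    pct_favourable = 100 * sum(r >= favourable_cutoff for r in responses) / len(responses)
    return m, sd, pct_favourable

# Hypothetical responses from one respondent group.
scores = [4, 3, 5, 4, 2, 4, 3, 5, 4, 4]
m, sd, pct = summarize_likert(scores)
print(f"mean={m:.2f}, SD={sd:.2f}, favourable={pct:.1f}%")
```

In practice such group-level summaries would be compared across respondent groups (programme leaders, residents, programme directors) with the inferential tests the abstract mentions.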
The use of quantitative intercoder reliability measures in the analysis of qualitative research data has often generated acrimonious debates among researchers who view quantitative and qualitative research methodologies as incompatible due to their unique ontological and epistemological traditions. While these measures are invaluable in many contexts, critics point out that the use of such measures in qualitative analysis represents an attempt to import standards derived for positivist research. Guided by extant research and our experience in qualitative research, we argue that it is possible to develop a qualitative-based measure of intercoder reliability that is compatible with the interpretivist epistemological paradigm of qualitative research. We present eight qualitative research process-based guidelines for evaluating and reporting intercoder reliability in qualitative research and anticipate that these recommendations will particularly guide beginning researchers in the coding and analysis processes of qualitative data analysis.
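A representative example of the quantitative intercoder reliability measures at issue is Cohen's kappa, which corrects raw coder agreement for agreement expected by chance. A minimal pure-Python sketch, with hypothetical codes from two coders:

```python
# Minimal sketch of Cohen's kappa for two coders assigning nominal codes
# to the same items. The code labels below are hypothetical.
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """kappa = (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    labels = set(coder_a) | set(coder_b)
    # Chance agreement: probability both coders pick the same label independently.
    expected = sum(freq_a[lbl] * freq_b[lbl] for lbl in labels) / (n * n)
    return (observed - expected) / (1 - expected)

a = ["barrier", "barrier", "support", "support", "barrier", "support"]
b = ["barrier", "support", "support", "support", "barrier", "support"]
print(round(cohens_kappa(a, b), 3))
```

The critique summarized above is precisely that reducing coding agreement to a single chance-corrected coefficient like this imports positivist standards into interpretivist analysis; the sketch is included only to make the target of that critique concrete.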
Purpose: The Royal College of Physicians and Surgeons of Canada (RCPSC) has mandated the transition of postgraduate medical training in Canada to a competency-based medical education (CBME) model divided into 4 stages of training. As part of the Queen’s University Fundamental Innovations in Residency Education proposal, Queen’s University in Canada is the first institution to transition all of its residency programs simultaneously to this model, including Diagnostic Radiology. The objective of this report is to describe the Queen’s Diagnostic Radiology Residency Program’s implementation of a CBME curriculum. Methods: At Queen’s University, the novel curriculum was developed using the RCPSC’s competency continuum and the CanMEDS framework to create radiology-specific entrustable professional activities (EPAs) and milestones. In addition, new committees and assessment strategies were established. As of July 2015, 3 cohorts of residents (n = 9) have been enrolled in this new curriculum. Results: EPAs, milestones, and methods of evaluation for the Transition to Discipline and Foundations of Discipline stages, as well as the opportunities and challenges associated with the implementation of a competency-based curriculum in a Diagnostic Radiology Residency Program, are described. Challenges include the increased frequency of resident assessments, establishing stage-specific learner expectations, and the creation of volumetric guidelines for case reporting and procedures. Conclusions: Development of a novel CBME curriculum requires significant resources and dedicated administrative time within an academic Radiology department. This article highlights challenges and provides guidance for this process.
Background: In North America, there are limited data to support deliberate application strategies for postgraduate residency training. There is significant interest in determining which factors play a role in Canadian medical graduates (CMGs) matching to their first-choice discipline, and heightened concern about the number of students going unmatched altogether. Methods: We analyzed matching outcomes of CMGs based on seven years (2013-2019) of residency application data (n = 13,499) from the Canadian Residency Matching Service (CaRMS) database using descriptive and binary logistic regression modeling techniques. Results: The sample was 54% female, with 60% between the ages of 26 and 29, and 60% attended medical schools in Ontario. Applicants who received more rankings from residency programs were more likely (OR = 1.185, p < 0.001) to match. A higher number of research activities (OR = 0.985, p < 0.001) and of applications submitted (OR = 0.920, p < 0.001) were associated with a reduced likelihood of matching. Number of volunteer activities and self-reported publications did not significantly affect matching. Being male (OR = 0.799, p < 0.05), aged <25 (OR = 0.756, p < 0.05), and from Eastern (OR = 0.497, p < 0.01) or Western (OR = 0.450, p < 0.001) Canadian medical schools were predictors of remaining unmatched. Conclusions: This study identified several significant associations between demographic and application factors and matching outcomes. The results will help better inform medical student application strategies and highlight possible biases in the selection process.
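The odds ratios above come from multivariable logistic regression, which adjusts each estimate for the other covariates. A simpler, unadjusted version of the same quantity is the odds ratio from a 2x2 table of matched vs. unmatched outcomes; the sketch below uses entirely hypothetical counts and the standard log-scale Wald confidence interval, and is not a reconstruction of the study's model:

```python
# Minimal sketch: unadjusted odds ratio with a 95% Wald CI from a 2x2 table.
# Counts are hypothetical; the study's ORs are covariate-adjusted.
from math import exp, log, sqrt

def odds_ratio(a, b, c, d):
    """2x2 table [[exposed matched a, exposed unmatched b],
                  [unexposed matched c, unexposed unmatched d]].
    Returns (OR, (95% CI lower, upper))."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) for a 2x2 table.
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo, hi = exp(log(or_) - 1.96 * se), exp(log(or_) + 1.96 * se)
    return or_, (lo, hi)

# Hypothetical: 900 of 1000 male and 1880 of 2000 female applicants matched.
or_, ci = odds_ratio(900, 100, 1880, 120)
print(f"OR={or_:.3f}, 95% CI=({ci[0]:.3f}, {ci[1]:.3f})")
```

An OR below 1 here would indicate lower odds of matching for the "exposed" group; the regression approach in the study additionally controls for age, school region, and application volume.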
Purpose: Within competency-based medical education, self-regulated learning (SRL) requires residents to leverage self-assessment and faculty feedback. We sought to investigate the potential for competency-based assessments to foster SRL by quantifying the relationship between faculty feedback and entrustment ratings, as well as the congruence between faculty assessment and resident self-assessment. Materials and methods: We collected comments in (a) an emergency medicine objective structured clinical examination (OSCE) group (EMOG) and (b) a first-year resident multidisciplinary resuscitation "Nightmares" course assessment group (NCAG) and its OSCE group (NOG). We assessed comments across five domains: Initial Assessment (IA), Diagnostic Action (DA), Therapeutic Action (TA), Communication (COM), and entrustment. Analyses included structured qualitative coding and (non)parametric and descriptive analyses. Results: In the EMOG, faculty's positive comments in the entrustment domain corresponded to lower entrustment score Mean Ranks (MRs) for IA (<11.1), DA (<11.2), and entrustment (<11.6). In the NOG, faculty's negative comments resulted in lower entrustment score MRs for TA (<11.8 and <10) and DA (<12.4), and positive comments resulted in higher entrustment score MRs for IA (>15.4) and COM (>17.6). In the NCAG, faculty's positive IA comments were negatively correlated with entrustment scores (ρ = −.27, P = .04). Across programs, faculty and residents made similar domain-specific comments 13% of the time. Conclusions: Minimal and inconsistent associations were found between narrative and numerical feedback. Performance monitoring accuracy and feedback should be included in assessment validation.
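The correlation reported for the NCAG (ρ = −.27) is Spearman's rank correlation, a nonparametric measure suited to ordinal entrustment scales. A minimal pure-Python sketch with tie-aware average ranks; the comment-valence and entrustment data below are hypothetical, not the study's:

```python
# Minimal sketch of Spearman's rank correlation (Pearson on average ranks),
# the nonparametric statistic reported in the abstract. Data are hypothetical.
def rank(values):
    """1-based ranks, with tied values sharing their average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # Extend j over the run of values tied with values[order[i]].
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    rx, ry = rank(x), rank(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Hypothetical: net positive-comment counts vs. entrustment ratings (1-5).
valence = [2, -1, 0, 3, 1, -2, 0, 2]
entrust = [4, 2, 3, 5, 3, 1, 2, 4]
print(round(spearman_rho(valence, entrust), 3))
```

A negative ρ, as in the NCAG finding, would mean that raters who wrote more positive comments in a domain tended to assign lower entrustment scores.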