Introduction: As scholarship moves into the digital sphere, applicants and promotion and tenure (P&T) committee members lack formal guidance on evaluating the impact of digital scholarly work. The P&T process requires the appraisal of individual scholarly impact in comparison to scholars across institutions and disciplines. As dissemination methods evolve in the digital era, we must adapt traditional P&T processes to include emerging forms of digital scholarship. Methods: We conducted a blended, expert consensus procedure using a nominal group process to create a consensus document at the Council of Emergency Medicine Residency Directors Academic Assembly on April 1, 2019. Results: We discussed consensus guidelines for the evaluation and promotion of digital scholarship with the intent to develop specific, evidence-supported recommendations for P&T committees and applicants. These recommendations included the following: demonstrate scholarship criteria; provide external evidence of impact; and include digital peer-review roles. As traditional scholarship continues to evolve within the digital realm, academic medicine should adapt how that scholarship is evaluated. P&T committees in academic medicine are at the epicenter of supporting this changing paradigm in scholarship. Conclusion: P&T committees can critically appraise the quality and impact of digital scholarship using specific, validated tools. Applicants for appointment and promotion should prepare and highlight their digital scholarship to specifically address quality, impact, breadth, and relevance. It is our goal to provide specific, timely guidance for both stakeholders to recognize the value of digital scholarship in advancing our field.
Background: With the rapid proliferation of online medical education resources, quality evaluation is increasingly critical. The Medical Education Translational Resources: Impact and Quality (METRIQ) study evaluated the METRIQ-8 quality assessment instrument for blogs and collected feedback to improve it. Methods: As part of the larger METRIQ study, participants rated the quality of five blog posts on clinical emergency medicine topics using the eight-item METRIQ-8 score. Next, participants used a 7-point Likert scale and free-text comments to evaluate the METRIQ-8 score on ease of use, clarity of items, and likelihood of recommending it to others. Descriptive statistics were calculated, and comments were thematically analyzed to guide the development of a revised METRIQ (rMETRIQ) score. Results: A total of 309 emergency medicine attendings, residents, and medical students completed the survey. The majority of participants felt the METRIQ-8 score was easy to use (mean ± SD = 2.7 ± 1.1 out of 7, with 1 indicating strong agreement) and would recommend it to others (2.7 ± 1.3 out of 7, with 1 indicating strong agreement). The thematic analysis suggested clarifying ambiguous questions, shortening the 7-point scale, specifying scoring anchors for the questions, eliminating the "unsure" option, and grouping related questions. This analysis guided changes that resulted in the rMETRIQ score. Conclusion: Feedback on the METRIQ-8 score contributed to the development of the rMETRIQ score, which has improved clarity and usability. Further validity evidence on the rMETRIQ score is required. With the increasing expansion of emergency medicine (EM) blogs and podcasts, residents frequently use these open educational resources to supplement and potentially replace traditional tools. 1-4 Unlike textbooks and journals, these online resources are rarely peer-reviewed 5-7 and critics raise concerns that learners
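To make the scoring approach concrete, the following is a minimal sketch of aggregating METRIQ-8-style ratings: each rater scores a blog post on eight items using a 7-point scale (1 indicating strong agreement), and totals are summarized with descriptive statistics. The ratings below are hypothetical and the item structure is simplified; this is not the study's data or instrument wording.

```python
import statistics

# Hypothetical ratings for one blog post: three raters, eight items each,
# scored 1-7 (1 = strongly agree). Values are for illustration only.
ratings_per_rater = [
    [2, 3, 2, 1, 3, 2, 2, 3],
    [1, 2, 2, 2, 4, 3, 1, 2],
    [3, 3, 2, 2, 3, 2, 2, 4],
]

# Total score per rater, then mean and SD across raters.
totals = [sum(r) for r in ratings_per_rater]
print(f"mean total = {statistics.mean(totals):.1f}, "
      f"SD = {statistics.stdev(totals):.1f}")
```

Summing items into a single total per rater, then reporting mean ± SD across raters, mirrors the descriptive-statistics reporting style used in the abstract.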
Introduction: Asynchronous online training has become an increasingly popular educational format in the new era of technology-based professional development. We sought to evaluate the impact of an online asynchronous training module on the ability of medical students and emergency medicine (EM) residents to detect electrocardiogram (ECG) abnormalities of an acute myocardial infarction (AMI). Methods: We developed an online ECG training and testing module on AMI, with emphasis on recognizing ST elevation myocardial infarction (MI) and early activation of cardiac catheterization resources. Study participants included senior medical students and EM residents at all post-graduate levels rotating in our emergency department (ED). Participants were given a baseline set of ECGs for interpretation. This was followed by a brief interactive online training module on normal ECGs as well as abnormal ECGs representing an acute MI. Participants then underwent a post-test with a set of ECGs in which they had to interpret and decide appropriate intervention, including catheterization lab activation. Results: 148 students and 35 EM residents participated in this training in the 2012–2013 academic year. Students and EM residents showed significant improvements in recognizing ECG abnormalities after taking the asynchronous online training module. The mean score on the testing module for students improved from 5.9 (95% CI [5.7–6.1]) to 7.3 (95% CI [7.1–7.5]), with a mean difference of 1.4 (95% CI [1.12–1.68]) (p<0.0001). The mean score for residents improved significantly from 6.5 (95% CI [6.2–6.9]) to 7.8 (95% CI [7.4–8.2]) (p<0.0001). Conclusion: An online interactive training module improved the ability of medical students and EM residents to correctly recognize the ECG evidence of an acute MI.
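The mean difference with a 95% confidence interval reported above can be illustrated with a short sketch. The pre/post scores below are hypothetical (the study's raw data are not available here), and the interval uses a simple normal approximation rather than whatever exact method the authors used.

```python
import math
import statistics

def mean_ci(values, z=1.96):
    """Mean and approximate 95% CI using a normal approximation."""
    m = statistics.mean(values)
    se = statistics.stdev(values) / math.sqrt(len(values))
    return m, (m - z * se, m + z * se)

# Hypothetical paired pre/post test scores (0-10 scale), illustration only.
pre = [5, 6, 6, 7, 5, 6, 7, 5, 6, 6]
post = [7, 7, 8, 8, 6, 7, 8, 7, 7, 8]

# Paired design: analyze the per-participant improvement, not the raw scores.
diffs = [b - a for a, b in zip(pre, post)]
m, ci = mean_ci(diffs)
print(f"mean improvement = {m:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```

If the confidence interval for the mean improvement excludes zero, as in the abstract's reported 1.4 (95% CI [1.12–1.68]), the improvement is statistically significant at roughly the 0.05 level.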