The formula for survival in resuscitation describes educational efficiency and local implementation as key determinants in survival after cardiac arrest. Current educational offerings in the form of standardized online and face-to-face courses are falling short, with providers demonstrating a decay of skills over time. This translates to suboptimal clinical care and poor survival outcomes from cardiac arrest. In many institutions, guidelines taught in courses are not thoughtfully implemented in the clinical environment. A current synthesis of the evidence supporting best educational and knowledge translation strategies in resuscitation is lacking. In this American Heart Association scientific statement, we provide a review of the literature describing key elements of educational efficiency and local implementation, including mastery learning and deliberate practice, spaced practice, contextual learning, feedback and debriefing, assessment, innovative educational strategies, faculty development, and knowledge translation and implementation. For each topic, we provide suggestions for improving provider performance that may ultimately optimize patient outcomes from cardiac arrest.
Background: Simulation-based research (SBR) is rapidly expanding but the quality of reporting needs improvement. For a reader to critically assess a study, the elements of the study need to be clearly reported. Our objective was to develop reporting guidelines for SBR by creating extensions to the Consolidated Standards of Reporting Trials (CONSORT) and Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) Statements. Methods: An iterative multistep consensus-building process was used on the basis of the recommended steps for developing reporting guidelines. The consensus process involved the following: (1) developing a steering committee, (2) defining the scope of the reporting guidelines, (3) identifying a consensus panel, (4) generating a list of items for discussion via online premeeting survey, (5) conducting a consensus meeting, and (6) drafting reporting guidelines with an explanation and elaboration document. Results: The following 11 extensions were recommended for CONSORT: item 1 (title/abstract), item 2 (background), item 5 (interventions), item 6 (outcomes), item 11 (blinding), item 12 (statistical methods), item 15 (baseline data), item 17 (outcomes/estimation), item 20 (limitations), item 21 (generalizability), and item 25 (funding). The following 10 extensions were recommended for STROBE: item 1 (title/abstract), item 2 (background/rationale), item 7 (variables), item 8 (data sources/measurement), item 12 (statistical methods), item 14 (descriptive data), item 16 (main results), item 19 (limitations), item 21 (generalizability), and item 22 (funding).
An elaboration document was created to provide examples and explanation for each extension. Conclusions: We have developed extensions for the CONSORT and STROBE Statements that can help improve the quality of reporting for SBR (Sim Healthcare 00:00-00, 2016). Electronic supplementary material: The online version of this article (doi:10.1186/s41077-016-0025-y) contains supplementary material, which is available to authorized users.
As simulation is increasingly used to study questions pertaining to pediatrics, it is important that investigators use rigorous methods to conduct their research. In this article, we discuss several important aspects of conducting simulation-based research in pediatrics. First, we describe, from a pediatric perspective, the 2 main types of simulation-based research: (1) studies that assess the efficacy of simulation as a training methodology and (2) studies where simulation is used as an investigative methodology. We provide a framework to help structure research questions for each type of research and describe illustrative examples of published research in pediatrics using these 2 frameworks. Second, we highlight the benefits of simulation-based research and how these apply to pediatrics. Third, we describe simulation-specific confounding variables that serve as threats to the internal validity of simulation studies and offer strategies to mitigate these confounders. Finally, we discuss the various types of outcome measures available for simulation research and offer a list of validated pediatric assessment tools that can be used in future simulation-based studies.
Acquisition of competency in procedural skills is a fundamental goal of medical training. In this Perspective, the authors propose an evidence-based pedagogical framework for procedural skill training. The framework was developed based on a review of the literature using a critical synthesis approach and builds on earlier models of procedural skill training in medicine. The authors begin by describing the fundamentals of procedural skill development. Then, a six-step pedagogical framework for procedural skills training is presented: Learn, See, Practice, Prove, Do, and Maintain. In this framework, procedural skill training begins with the learner acquiring requisite cognitive knowledge through didactic education (Learn) and observation of the procedure (See). The learner then progresses to the stage of psychomotor skill acquisition and is allowed to deliberately practice the procedure on a simulator (Practice). Simulation-based mastery learning is employed to allow the trainee to prove competency prior to performing the procedure on a patient (Prove). Once competency is demonstrated on a simulator, the trainee is allowed to perform the procedure on patients with direct supervision, until he or she can be entrusted to perform the procedure independently (Do). Maintenance of the skill is ensured through continued clinical practice, supplemented by simulation-based training as needed (Maintain). Evidence in support of each component of the framework is presented. Implementation of the proposed framework presents a paradigm shift in procedural skill training. However, the authors believe that adoption of the framework will improve procedural skill training and patient safety.
The quality of pediatric resuscitative care delivered across the spectrum of emergency departments (EDs) in the United States is poorly described. In a recent study, more than 4000 EDs completed the Pediatric Readiness Survey (PRS); however, the correlation of PRS scores with the quality of simulated or real patient care has not been described. OBJECTIVE To measure and compare the quality of resuscitative care delivered to simulated pediatric patients across a spectrum of EDs and to examine the correlation of PRS scores with quality measures. DESIGN, SETTING, AND PARTICIPANTS This prospective multicenter cohort study evaluated 58 interprofessional teams in their native pediatric or general ED resuscitation bays caring for a series of 3 simulated critically ill patients (sepsis, seizure, and cardiac arrest). MAIN OUTCOMES AND MEASURES A composite quality score (CQS) was measured as the sum of 4 domains: (1) adherence to sepsis guidelines, (2) adherence to cardiac arrest guidelines, (3) performance on seizure resuscitation, and (4) teamwork. Pediatric Readiness Survey scores and health care professional demographics were collected as independent data. Correlations were explored between CQS and individual domain scores with PRS. RESULTS Overall, 58 teams from 30 hospitals participated (8 pediatric EDs [PEDs], 22 general EDs [GEDs]). The mean CQS was 71 (95% CI, 68-75); PEDs had a higher mean CQS (82; 95% CI, 79-85) vs GEDs (66; 95% CI, 63-69) and outperformed GEDs in all domains. However, when using generalized estimating equations to estimate CQS controlling for clustering of the data, PED status did not explain a higher CQS (β = 4.28; 95% CI, −4.58 to 13.13) while the log of pediatric patient volume did explain a higher CQS (β = 9.57; 95% CI, 2.64-16.49). The correlation of CQS to PRS was moderate (r = 0.51; P < .001). 
The domain-level correlations with PRS were weak for cardiac arrest (r = 0.24; P = .07), moderate for sepsis (ρ = 0.45; P < .001) and seizure (ρ = 0.43; P = .001), and strong for teamwork (ρ = 0.71; P < .001). CONCLUSIONS AND RELEVANCE This multicenter study noted significant differences in the quality of simulated pediatric resuscitative care across a spectrum of EDs. The CQS was higher in PEDs compared with GEDs. However, when controlling for pediatric patient volume and other variables in a multivariable model, PED status did not explain a higher CQS while pediatric patient volume did. The correlation of the PRS with simulation-based measures of quality was moderate.
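The analysis above sums four domain scores into a composite quality score (CQS) per team and then rank-correlates it with the hospital's PRS score. A minimal sketch of that computation is shown below; the team data, the 0-25 domain scaling, and the helper functions are invented for illustration and are not the study's actual code or data.

```python
# Illustrative sketch only: correlating a composite quality score (CQS),
# summed over four domains, with Pediatric Readiness Survey (PRS) scores.
# All team data below are hypothetical.

def ranks(xs):
    """Rank values 1..n (assumes no ties, which suffices for this sketch)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for rank, i in enumerate(order, start=1):
        r[i] = float(rank)
    return r

def spearman_rho(x, y):
    """Spearman rank correlation via 1 - 6*sum(d^2) / (n*(n^2 - 1))."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical per-team domain scores (sepsis, cardiac arrest, seizure,
# teamwork; each scaled 0-25) and the team's hospital PRS score.
teams = [
    (20, 18, 22, 21, 85),
    (15, 12, 16, 14, 90),
    (22, 20, 23, 24, 92),
    (10,  9, 12, 11, 48),
    (18, 16, 19, 17, 74),
]

cqs = [sum(row[:4]) for row in teams]  # composite quality score per team
prs = [row[4] for row in teams]        # readiness survey score per team

rho = spearman_rho(cqs, prs)
print(f"Spearman rho = {rho:.2f}")     # prints 0.70 for this toy data
```

In the study itself, the multivariable result (PED status vs. pediatric patient volume) came from generalized estimating equations that account for clustering of teams within hospitals, which a simple rank correlation like this does not capture.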