Background: Online patient simulations (OPS) are a novel method for teaching clinical reasoning skills to students and could contribute to reducing diagnostic errors. However, little is known about how best to implement and evaluate OPS in medical curricula. The aim of this study was to assess the feasibility, acceptability and potential effects of eCREST, the electronic Clinical Reasoning Educational Simulation Tool. Methods: A feasibility randomised controlled trial was conducted with final year undergraduate students from three UK medical schools in academic years 2016/2017 (cohort one) and 2017/2018 (cohort two). Student volunteers were recruited in cohort one via email and on teaching days; in cohort two, eCREST was also integrated into a relevant module in the curriculum. The intervention group received three patient cases and the control group received teaching as usual; the allocation ratio was 1:1. Researchers were blind to allocation. Clinical reasoning skills were measured using a survey after 1 week and a patient case after 1 month. Results: Across schools, 264 students participated (18.2% of all eligible). Cohort two had greater uptake (183/833, 22%) than cohort one (81/621, 13%). After 1 week, 99/137 (72%) of the intervention group and 86/127 (68%) of the control group remained in the study. eCREST improved students' ability to gather essential information from patients over controls (OR = 1.4; 95% CI 1.1-1.7, n = 148). Of the intervention group, most (80/98, 82%) agreed eCREST helped them to learn clinical reasoning skills. Conclusions: eCREST was highly acceptable and improved data gathering skills that could reduce diagnostic errors. Uptake was low but improved when integrated into course delivery. A summative trial is needed to estimate effectiveness.
Background Improving clinical reasoning skills—the thought processes used by clinicians to formulate appropriate questions and diagnoses—is essential for reducing missed diagnostic opportunities. The electronic Clinical Reasoning Educational Simulation Tool (eCREST) was developed to improve the clinical reasoning of future physicians. A feasibility trial demonstrated acceptability and potential impacts; however, the processes by which students gathered data were unknown. Objective This study aims to identify the data gathering patterns of final year medical students while using eCREST and how eCREST influences those patterns. Methods A mixed methods design was used. A trial of eCREST across 3 UK medical schools (N=148) measured the potential effects of eCREST on data gathering. A qualitative think-aloud and semistructured interview study with 16 medical students from one medical school identified 3 data gathering strategies: Thorough, Focused, and Succinct. Some students had no strategy. Reanalysis of the trial data identified the prevalence of data gathering patterns and compared patterns between the intervention and control groups. Patterns were identified based on 2 variables measured in a patient case 1 month after the intervention: the proportion of essential information students identified (Essential) and the proportion of relevant information they gathered (Relevant). Those who scored in the top 3 quartiles for Essential but in the lowest quartile for Relevant displayed a Thorough pattern. Those who scored in the top 3 quartiles for Relevant but in the lowest quartile for Essential displayed a Succinct pattern. Those who scored in the top 3 quartiles on both variables displayed a Focused pattern. Those whose scores were in the lowest quartile on both variables displayed a Nonspecific pattern. Results The trial results indicated that students in the intervention group were more thorough than those in the control group when gathering data.
The qualitative data identified data gathering strategies and the mechanisms by which eCREST influenced data gathering. Students reported that eCREST promoted thoroughness by prompting them to reflect continuously and allowing them to practice managing uncertainty. However, some found eCREST less useful, and they gathered information randomly. Reanalysis of the trial data revealed that the intervention group was significantly more likely than controls to display a Thorough data gathering pattern (21/78, 27% vs 6/70, 9%) and less likely to display a Succinct pattern (13/78, 17% vs 20/70, 29%; χ²₃=9.9; P=.02). Other patterns were similar across groups. Conclusions Qualitative data suggested that students applied a range of data gathering strategies while using eCREST and that eCREST encouraged thoroughness by continuously prompting students to reflect and manage their uncertainty. Trial data suggested that eCREST led students to demonstrate more Thorough data gathering patterns. Virtual patients that encourage thoroughness could help future physicians avoid missed diagnostic opportunities and enhance the delivery of clinical reasoning teaching.
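The quartile rule above can be expressed as a short classifier. This is a minimal sketch, not the study's analysis code: the function name, score lists, and cut-off handling (treating "top 3 quartiles" as any score above the first-quartile cut point) are our assumptions for illustration.

```python
# Sketch of the quartile-based pattern classification described above.
# Each student has two proportions: essential information identified
# ("Essential") and relevant information gathered ("Relevant").
import statistics

def classify(essential_scores, relevant_scores):
    """Label each student Thorough / Focused / Succinct / Nonspecific."""
    # First-quartile cut points; scores at or below fall in the lowest quartile.
    q1_ess = statistics.quantiles(essential_scores, n=4)[0]
    q1_rel = statistics.quantiles(relevant_scores, n=4)[0]
    labels = []
    for ess, rel in zip(essential_scores, relevant_scores):
        high_ess, high_rel = ess > q1_ess, rel > q1_rel
        if high_ess and high_rel:
            labels.append("Focused")       # high on both variables
        elif high_ess:
            labels.append("Thorough")      # high Essential, low Relevant
        elif high_rel:
            labels.append("Succinct")      # high Relevant, low Essential
        else:
            labels.append("Nonspecific")   # lowest quartile on both
    return labels
```

With illustrative scores, `classify([0.0, 0.0, 0.9, 0.9], [0.9, 0.0, 0.0, 0.9])` would label the students Succinct, Nonspecific, Thorough and Focused in turn, mirroring the four patterns defined in the Methods.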
Interprofessional learning (IPL), involving various professions within healthcare, has been shown to improve the quality of patient care by encouraging collaboration between professionals. Careful consideration of appropriate educational tools and content is required to facilitate effective IPL. This study aimed to explore medical and pharmacy students' preconceptions of the role of virtual patients (VPs) as a learning tool for IPL within their education. A secondary aim was to elicit feedback to inform the development of new VP cases. Two focus groups (one with medical students and the other with pharmacy students), each consisting of six students, were recruited. Participant perceptions regarding VP-based IPL were explored. Data were analysed using a thematic approach. Participants thought that there were some potential learning benefits of using VPs as part of their curriculum. Pharmacy students placed greater value on VPs because of their limited access to patients during their education. Medical students challenged the role of VPs in their clinical development and were concerned that VPs lack the flexibility doctors need to use their judgement and work with uncertainty. Limited understanding of team members' roles in patient care and self-reported ignorance of the overlap in curricula appear to be key barriers to students valuing the knowledge base of each other's profession and the possible benefits of using VPs in joint learning. This study generated a number of key implications that need to be considered when introducing VP-based IPL.
Background: At least 14.6% of paediatric hospital episodes in England were due to infections in 2014/15. The clinical coding of infection aetiology is required for healthcare planning, epidemiology and financial reimbursement. The degree to which causative agents are recorded in infection coding datasets is unknown, and its importance for remuneration and other secondary uses has not been studied. Methods: An audit of bronchiolitis admissions to a London children's hospital was performed between 1 August 2014 and 31 July 2015. The agreement between the discharge documentation, coding reports and results of a Respiratory Virus Molecular Panel was assessed by clinicians and a coding professional. The impact of errors on data quality and finances was reviewed. Results: Of the 74 admissions, 52 (70.3%) did not have identified causative agents accurately represented in the coding, with inadequate clinical documentation being the leading cause (53.8%). In total, 29 hospital admissions were assigned to an incorrect Healthcare Resource Group, indicating a potential further financial gain of £25,388, a 38.5% increase in gross reimbursement. Conclusions: Causative agents were not reliably reflected in the coding dataset, with negative effects on secondary uses. This model could be applied to other infections to improve the veracity of infection coding.
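The core audit step, cross-checking the coded causative agent against the laboratory-confirmed virus for each admission, can be sketched as below. The record structure, field names, and sample data are illustrative assumptions, not data or code from the study.

```python
# Hypothetical sketch of the agreement check described above: for each
# admission, compare the agent in the coding dataset with the virus
# detected by the molecular panel, and tally accurate vs inaccurate coding.
def audit(admissions):
    """Return (accurately coded, inaccurately coded) admission counts."""
    accurate = sum(
        1 for a in admissions if a["coded_agent"] == a["lab_agent"]
    )
    return accurate, len(admissions) - accurate

# Illustrative records: one admission has no causative agent coded at all,
# mirroring the documentation gaps the audit found.
admissions = [
    {"coded_agent": "RSV", "lab_agent": "RSV"},
    {"coded_agent": None,  "lab_agent": "rhinovirus"},
    {"coded_agent": "RSV", "lab_agent": "RSV"},
]
```

On these sample records, `audit(admissions)` reports two accurately and one inaccurately coded admission; in the study, the same comparison across 74 admissions found 52 inaccurately represented.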