To characterize the variability in usability and safety of EHRs from two vendors across four healthcare systems (two Epic and two Cerner). Twelve to fifteen emergency medicine physicians participated from each site and completed six clinical scenarios. Keystroke, mouse-click, and video data were collected. From the six scenarios, two diagnostic imaging, laboratory, and medication tasks were analyzed. There was wide variability in task completion time, clicks, and error rates. For certain tasks, there was on average a nine-fold difference in time and an eight-fold difference in clicks. Error rates varied by task (X-ray: 16.7% to 25%; MRI: 0% to 10%; Lactate: 0% to 14.3%; Tylenol: 0% to 30%; Taper: 16.7% to 50%). The variability in time, clicks, and error rates highlights the need for improved implementation optimization. EHR implementation, in addition to vendor design and development, is critical to usable and safe products.
Objective: Decisions made during electronic health record (EHR) implementations profoundly affect usability and safety. This study aims to identify gaps between the current literature and key stakeholders' perceptions of usability and safety practices and the challenges encountered during the implementation of EHRs. Materials and Methods: Two approaches were used: a literature review and interviews with key stakeholders. We performed a systematic review of the literature to identify usability and safety challenges and best practices during implementation. A total of 55 articles were reviewed through searches of PubMed, Web of Science, and Scopus. We used a qualitative approach to identify key stakeholders' perceptions; semi-structured interviews were conducted with a diverse set of health IT stakeholders to understand their current practices and challenges related to usability during implementation. We used a grounded theory approach: data were coded and sorted, and emerging themes were identified. Conclusions from both sources of data were compared to identify areas of misalignment. Results: We identified six emerging themes from the literature and stakeholder interviews: cost and resources, risk assessment, governance and consensus building, customization, clinical workflow and usability testing, and training. Across these themes, there were misalignments between the literature and stakeholder perspectives, indicating major gaps. Discussion: Major gaps identified from each of the six emerging themes are discussed as critical areas for future research, opportunities for new stakeholder initiatives, and opportunities to better disseminate resources to improve the implementation of EHRs. Conclusion: Our analysis identified practices and challenges across six emerging themes, illustrated important gaps, and suggests critical areas for future research and dissemination to improve EHR implementation.
Citation: Ratwani R et al.: Review to identify usability and safety challenges and practices during EHR implementation.
Our analysis highlights important areas of usability and safety policy from other industries that can better inform ONC policies on EHRs.
Objectives: Electronic health records (EHRs) continue to have significant usability challenges, in part due to differences in workflow. The objective of this study was to examine workflow pattern variations for one specific task: emergency physicians placing a magnetic resonance imaging (MRI) order. Methods: A between-subjects usability study was conducted using two different major EHR vendor products across four different provider sites (n = 55). A clinical scenario concerning for spinal cord compression was read to participants, who then completed an ordering task in a training environment representative of their native EHR. The primary outcome measures were accuracy, time on task, and number of clicks. Results: We identified four different workflows to complete the same order. One workflow required two steps (enabled at one site), one workflow required four steps (enabled at two sites), and two workflows required six steps to complete the task (available at all sites). Of the 12 physicians who employed the two-step workflow, 8 (67%) had the correct order and correct indication; the average time on task was 29.65 (standard deviation [SD] = 13.77), and the mean number of clicks was 13.5 (SD = 18.87). In contrast, of the 43 physicians who employed the other workflows, 7 (21%) had the correct order and correct indication, with an average time on task of 73.1 (SD = 30.12) and mean clicks of 27.64 (SD = 13.25) (p < 0.01 for all three comparisons). Discussion: These different approaches were made possible by technical specifications that leave multiple workflow options available to physicians in the EHR environment. EHR design that maximizes usability can reduce work effort and improve the accuracy of physician ordering.