Background: Patients are increasingly turning to Web-based symptom checkers to obtain diagnoses. However, little is known about the characteristics of the patients who use these resources, their rationale for use, and whether they find them accurate and useful.

Objective: The study aimed to examine patients' experiences using an artificial intelligence (AI)–assisted online symptom checker.

Methods: An online survey was administered from March 2 through March 15, 2018, to US users of the Isabel Symptom Checker within 6 months of their use. User characteristics, experiences of symptom checker use, experiences discussing results with physicians, and prior personal history of experiencing a diagnostic error were collected.

Results: A total of 329 usable responses were obtained. The mean respondent age was 48.0 (SD 16.7) years; most respondents were women (230/304, 75.7%) and white (271/304, 89.1%). Patients most commonly used the symptom checker to better understand the causes of their symptoms (232/304, 76.3%), followed by deciding whether to seek care (101/304, 33.2%) or where to seek it (eg, primary or urgent care: 63/304, 20.7%), obtaining medical advice without going to a doctor (48/304, 15.8%), and understanding their diagnoses better (39/304, 12.8%). Most patients reported receiving useful information for their health problems (274/304, 90.1%), with half reporting positive health effects (154/302, 51.0%). Most patients perceived it to be useful as a diagnostic tool (253/301, 84.1%) and as a tool providing insights leading them closer to correct diagnoses (231/303, 76.2%), and most reported they would use it again (278/304, 91.4%). Patients who discussed findings with their physicians (103/213, 48.4%) more often felt physicians were interested (42/103, 40.8%) than not interested in learning about the tool's results (24/103, 23.3%) and more often felt physicians were open (62/103, 60.2%) than not open (21/103, 20.4%) to discussing the results. Compared with patients who had not previously experienced diagnostic errors (missed or delayed diagnoses: 123/304, 40.5%), patients who had previously experienced diagnostic errors (181/304, 59.5%) were more likely to use the symptom checker to determine where they should seek care (15/123, 12.2% vs 48/181, 26.5%; P=.002) but less often felt that physicians were interested in discussing the tool's results (20/34, 59% vs 22/69, 32%; P=.04).

Conclusions: Despite ongoing concerns about symptom checker accuracy, a large patient-user group perceived an AI-assisted symptom checker as useful for diagnosis. Formal validation studies evaluating symptom checker accuracy and effectiveness in real-world practice could provide additional useful information about their benefit.
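The between-group comparisons above (eg, 15/123 vs 48/181; P=.002) are consistent with a standard chi-square test on a 2×2 contingency table. A minimal sketch using the abstract's counts and a Pearson chi-square without continuity correction — the abstract does not state which test the authors actually used, so this is an illustrative reconstruction, not their analysis:

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square (1 df, no continuity correction) for the 2x2 table
    [[a, b], [c, d]]; returns (statistic, two-sided p-value)."""
    n = a + b + c + d
    stat = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    # With 1 df, the chi-square statistic is the square of a standard normal,
    # so the two-sided p-value is erfc(sqrt(stat / 2)).
    p = math.erfc(math.sqrt(stat / 2))
    return stat, p

# Used the checker to decide where to seek care:
# 15 of 123 patients without a prior diagnostic error vs 48 of 181 with one.
stat, p = chi2_2x2(15, 123 - 15, 48, 181 - 48)
print(f"chi2 = {stat:.2f}, p = {p:.3f}")  # p on the order of .002-.003
```

The resulting p-value is in line with the P=.002 reported in the abstract.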
Background: The 21st Century Cures Act mandates patients' access to their electronic health record (EHR) notes. To our knowledge, no previous work has systematically invited patients to proactively report diagnostic concerns while documenting and tracking their diagnostic experiences through EHR-based clinician note review.

Objective: To test whether patients can identify concerns about their diagnosis through structured evaluation of their online visit notes.

Methods: In a large integrated health system, patients aged 18–85 years who were actively using the patient portal and were seen between October 2019 and February 2020 were invited to respond to an online questionnaire if an EHR algorithm detected any recent unexpected return visit following an initial primary care consultation ("at-risk" visit). We developed and tested an instrument (Safer Dx Patient Instrument) to help patients identify concerns related to several dimensions of the diagnostic process, based on note review and recall of recent "at-risk" visits. Additional questions assessed patients' trust in their providers and their general feelings about the visit. The primary outcome was a self-reported diagnostic concern. Multivariate logistic regression tested whether the primary outcome was predicted by instrument variables.

Results: Of 293 566 visits, the algorithm identified 1282 eligible patients, of whom 486 responded. After applying exclusion criteria, 418 patients were included in the analysis. Fifty-one patients (12.2%) identified a diagnostic concern. Patients were more likely to report a concern if they disagreed with the statements "the care plan the provider developed for me addressed all my medical concerns" (odds ratio [OR] 2.65; 95% confidence interval [CI] 1.45–4.87) and "I trust the provider that I saw during my visit" (OR 2.10; 95% CI 1.19–3.71) and agreed with the statement "I did not have a good feeling about my visit" (OR 1.48; 95% CI 1.09–2.01).

Conclusion: Patients can identify diagnostic concerns based on a proactive online structured evaluation of visit notes. This surveillance strategy could potentially improve transparency in the diagnostic process.
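The odds ratios and confidence intervals reported above follow the standard exponentiation of logistic-regression coefficients. A minimal sketch of that transformation, using an illustrative coefficient and standard error chosen to roughly reproduce the first reported OR (2.65; 95% CI 1.45–4.87) — the beta and SE values here are assumptions for illustration, not figures from the study:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient and its standard error
    into an odds ratio with a Wald 95% confidence interval."""
    return (math.exp(beta),          # point estimate
            math.exp(beta - z * se),  # lower bound
            math.exp(beta + z * se))  # upper bound

# Hypothetical coefficient/SE approximating the reported OR 2.65 (1.45-4.87).
or_, lo, hi = odds_ratio_ci(beta=0.975, se=0.309)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # OR 2.65 (95% CI 1.45-4.86)
```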
Failure to communicate test results to patients remains a persistent problem leading to delays in diagnosis and management,[1][2][3] with up to 62% of abnormal laboratory results and 36% of abnormal radiology results lacking timely follow-up.4 The Department of Veterans Affairs (VA) developed a national policy in 2015 stating that practitioners authorized to order laboratory tests (referred to in the policy as providers), or their designees, must communicate abnormal test results to patients within 7 days if action is required and within 14 days if no action is required or results are normal.5 Acceptable communication modes include face-to-face, telehealth, telephone, secure messages, or letters. To assess policy adherence, the VA implemented a quality measurement system for feedback and improvement. We analyzed the first full year of these measures to determine the timeliness of test result communication to patients.

Methods: This cross-sectional study had VA institutional review board approval and was exempt from informed consent because no patient-identifiable information was included in the data set. This study follows the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) reporting guideline. Data collection was implemented through the External Peer Review Program (EPRP), the VA's performance measurement system used for quality improvement and benchmarking. Computerized algorithms randomly sample a set of abnormal and normal results related to 8 predetermined tests for each VA facility and each quarter (Table). Trained EPRP chart reviewers evaluated documentation of the communication of results to patients in medical records. The sample included data from October 2018 to September 2019 for all 141 VA facilities. EPRP measures examined the timeliness of communication to patients of abnormal results (≤7 days if action required; ≤14 days if no action required), normal results (≤14 days), and all test results (≤30 days). Additionally, we evaluated 1 item from the Survey of Healthcare Experiences of Patients (SHEP) during the same timeframe (ie, "in the last 6 months, when this provider ordered a blood test, x-ray, or other test for you, how often did someone from this provider's office follow up to give you those results?"). We used descriptive statistics to examine EPRP and SHEP data and a Pearson correlation between the EPRP measure for all test results and the SHEP item. Statistical analysis was conducted in Stata version 15.1 (StataCorp) using a 2-tailed significance threshold of P < .05. Data were given to us between February and May 2021 and were analyzed between March and November 2021.

Results: EPRP measures showed timely communication for 5925 of 8372 abnormal results (70.8%) (ie, within 7 days if action was required and within 14 days if no action was required) and for 9472 of 11 784 normal results (80.4%).
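The facility-level comparison described above — a Pearson correlation between the EPRP all-results measure and the SHEP survey item — pairs two percentages per facility. A minimal sketch of that computation; the five facility values below are purely illustrative (the study analyzed 141 VA facilities):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical facility-level values: EPRP timely-communication rate (%)
# paired with the SHEP "someone followed up with results" response rate (%).
eprp = [62.1, 70.8, 75.4, 80.2, 88.9]
shep = [55.0, 60.2, 63.1, 66.8, 72.5]
print(f"r = {pearson_r(eprp, shep):.2f}")
```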