As academic health sciences libraries assume larger roles in informatics instruction within medical school curricula, librarians are challenged to develop useful and accurate measures for assessing the effectiveness of instructional approaches. The need for this evaluation has intensified as medical schools increase their emphasis on integrating curriculum content and shift toward competency-based education and assessment of medical students. This paper reports on a pilot project developed at Dahlgren Memorial Library, Georgetown University Medical Center, involving two courses that used an instructional intervention and a tailored assignment to assess student competencies.
An MR is a tool, and the MR-EBM that we describe can be useful for developing or evaluating a curriculum in EBM. The MR tool is particularly compatible with the objectives of EBM training and practice, and it can be applied to create or evaluate a curriculum using any topical KSA framework. The MR-EBM we describe could be adopted, or adapted to represent other institutional objectives for EBM training.
Organizations in underserved settings are implementing or upgrading electronic health records (EHRs) in hopes of improving quality and meeting federal goals for meaningful use of EHRs. However, much of the research on health information technology does not study use in underserved settings, or does not include EHRs. We conducted a structured literature search of MEDLINE to find articles supporting the contention that EHRs improve quality in underserved settings. We found 17 articles published between 2003 and 2011. Most of the studies were set in urban areas, and most were descriptive in design. The articles provide evidence that EHRs can improve documentation, process measures, guideline adherence, and (to a lesser extent) outcome measures. Providers and managers believed that EHRs would improve the quality and efficiency of care. The limited quantity and quality of the evidence point to a need for ongoing research in this area.
Summary Objective: Clinical decision support (CDS) has been shown to improve process outcomes, but overalerting may not produce incremental benefits. We analyzed providers' responses to preventive care reminders to determine whether reminder response rates varied when a primary care provider (PCP) saw their own patients as compared with a partner's patients. Secondary objectives were to describe variation in PCP identification in the electronic health record (EHR) across sites and to determine its accuracy. Methods: We retrospectively analyzed responses to preventive care reminders during visits over a three-month period to outpatient primary care sites where an EHR was used. Data on clinician requests for reminders, viewing of preventive care reminders, and response rates were stratified by whether the patient visited their own PCP, the PCP's partner, or a provider when no PCP was listed in the EHR. We calculated the proportion of visits with an identified PCP across sites and the agreement of the identified PCP with an external standard. Results: Of 84,937 visits, 58,482 (68.9%) were with the patient's PCP, 10,259 (12.1%) were with the PCP's partner, and 16,196 (19.1%) had no listed PCP. Compared with PCP-partner visits, visits with the patient's PCP were associated with more requested reminders (30.9% vs 22.9%), viewed reminders (29.7% vs 20.7%), and responses to reminders (28.7% vs 12.6%); all comparisons p<0.001. Visits with no listed PCP had the lowest rates of requests, views, and responses. There was good agreement between the EHR-listed PCP and the provider seen for a plurality of visits over the prior year (κ=0.917). Conclusions: An established PCP relationship during a visit was associated with higher use of preventive care reminders, and the lack of a listed PCP was associated with lower use of CDS. Targeting reminders to the PCP may be desirable, but further studies are needed to determine which strategy achieves better patient care outcomes.
scite is a Brooklyn-based organization that helps researchers better discover and understand research articles through Smart Citations: citations that display the context in which an article is cited and indicate whether the citing article provides supporting or contrasting evidence. scite is used by students and researchers around the world and is funded in part by the National Science Foundation and the National Institute on Drug Abuse of the National Institutes of Health.