Summary Background Patients admitted to hospital can acquire multidrug-resistant organisms and Clostridium difficile from inadequately disinfected environmental surfaces. We determined the effect of three enhanced strategies for terminal room disinfection (disinfection of a room between occupying patients) on acquisition and infection due to meticillin-resistant Staphylococcus aureus, vancomycin-resistant enterococci, C difficile, and multidrug-resistant Acinetobacter. Methods We did a pragmatic, cluster-randomised, crossover trial at nine hospitals in the southeastern USA. Rooms from which a patient with infection or colonisation with a target organism was discharged were terminally disinfected with one of four strategies: reference (quaternary ammonium disinfectant except for C difficile, for which bleach was used); UV (quaternary ammonium disinfectant and disinfecting ultraviolet [UV-C] light except for C difficile, for which bleach and UV-C were used); bleach; and bleach and UV-C. The next patient admitted to the targeted room was considered exposed. Every strategy was used at each hospital in four consecutive 7-month periods. We randomly assigned the sequence of strategies for each hospital (1:1:1:1). The primary outcomes were the incidence of infection or colonisation with all target organisms among exposed patients and the incidence of C difficile infection among exposed patients in the intention-to-treat population. This trial is registered with ClinicalTrials.gov, NCT01579370. Findings 31 226 patients were exposed; 21 395 (69%) met all inclusion criteria, including 4916 in the reference group, 5178 in the UV group, 5438 in the bleach group, and 5863 in the bleach and UV group. 115 patients had the primary outcome during 22 426 exposure days in the reference group (51·3 per 10 000 exposure days). 
The incidence of target organisms among exposed patients was significantly lower after adding UV to standard cleaning strategies (n=76; 33·9 cases per 10 000 exposure days; relative risk [RR] 0·70, 95% CI 0·50–0·98; p=0·036). The primary outcome was not significantly lower with bleach (n=101; 41·6 cases per 10 000 exposure days; RR 0·85, 95% CI 0·69–1·04; p=0·116) or with bleach and UV (n=131; 45·6 cases per 10 000 exposure days; RR 0·91, 95% CI 0·76–1·09; p=0·303) among exposed patients. Similarly, the incidence of C difficile infection among exposed patients did not change after adding UV to cleaning with bleach (n=38 vs 36; 30·4 vs 31·6 cases per 10 000 exposure days; RR 1·0, 95% CI 0·57–1·75; p=0·997). Interpretation A contaminated health-care environment is an important source of pathogen acquisition; enhanced terminal room disinfection decreases this risk. Funding US Centers for Disease Control and Prevention.
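The reference-group incidence can be reproduced directly from the reported counts and exposure days. A minimal sketch follows; note that the published RRs are model-adjusted estimates, so a crude ratio of rates is only an approximation of the reported RR of 0·70:

```python
# Reproduce the crude incidence rates reported in the BETR trial abstract.
# Counts and exposure days come from the text; the published RRs are
# adjusted estimates, so the crude ratio differs slightly from them.

def rate_per_10000(cases: int, exposure_days: int) -> float:
    """Incidence per 10 000 exposure days."""
    return cases / exposure_days * 10_000

# Reference strategy: 115 outcomes over 22 426 exposure days
reference_rate = rate_per_10000(115, 22_426)
print(round(reference_rate, 1))  # 51.3, as reported

# Crude rate ratio for the UV group (33.9 per 10 000, from the abstract)
crude_rr = 33.9 / reference_rate
print(round(crude_rr, 2))  # ~0.66, versus the adjusted RR of 0.70
```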
Background While the majority of healthcare in the US is provided in community hospitals, the epidemiology and treatment of bloodstream infections in this setting are unknown. Methods and Findings We undertook this multicenter, retrospective cohort study to 1) describe the epidemiology of bloodstream infections (BSI) in a network of community hospitals and 2) determine risk factors for inappropriate therapy for bloodstream infections in community hospitals. 1,470 patients were identified as having a BSI in 9 community hospitals in the southeastern US from 2003 through 2006. The majority of BSIs were community-onset, healthcare-associated (n = 823, 56%); 432 (29%) patients had community-acquired BSI, and 215 (15%) had hospital-onset, healthcare-associated BSI. BSIs due to multidrug-resistant pathogens occurred in 340 patients (23%). Overall, the three most common pathogens were S. aureus (n = 428, 28%), E. coli (n = 359, 24%), and coagulase-negative staphylococci (n = 148, 10%), though the type of infecting organism varied by location of acquisition (e.g., community-acquired). Inappropriate empiric antimicrobial therapy was given to 542 (38%) patients. Proportions of inappropriate therapy varied by hospital (median = 33%, range 21–71%). Multivariate logistic regression identified the following factors independently associated with failure to receive appropriate empiric antimicrobial therapy: hospital where the patient received care (p<0.001), assistance with ≥3 activities of daily living (p = 0.005), Charlson score (p = 0.05), community-onset, healthcare-associated infection (p = 0.01), and hospital-onset, healthcare-associated infection (p = 0.02). An important interaction was observed between Charlson score and location of acquisition. Conclusions Our large, multicenter study provides the most complete picture of BSIs in community hospitals in the US to date. The epidemiology of BSIs in community hospitals has changed: community-onset, healthcare-associated BSI is most common, S. aureus is the most common cause, and 1 of 3 patients with a BSI receives inappropriate empiric antimicrobial therapy. Our data suggest that appropriateness of empiric antimicrobial therapy is an important and needed performance metric for physicians and hospital stewardship programs in community hospitals.
Summary Background The hospital environment is a source of pathogen transmission. The effect of enhanced disinfection strategies on the hospital-wide incidence of infection has not been investigated in a multicentre, randomised controlled trial. We aimed to assess the effectiveness of four disinfection strategies on hospital-wide incidence of multidrug-resistant organisms and Clostridium difficile in the Benefits of Enhanced Terminal Room (BETR) Disinfection study. Methods We did a prespecified secondary analysis of the results from the BETR Disinfection study, a pragmatic, multicentre, crossover cluster-randomised trial that assessed four different strategies for terminal room disinfection in nine hospitals in the southeastern USA. Rooms from which a patient with a specific infection or colonisation (due to the target organisms C difficile, meticillin-resistant Staphylococcus aureus, vancomycin-resistant enterococci (VRE), or multidrug-resistant Acinetobacter spp) was discharged were terminally disinfected with one of four strategies: standard disinfection (quaternary ammonium disinfectant, except for C difficile, for which 10% hypochlorite [bleach] was used; reference); standard disinfection and disinfecting ultraviolet light (UV-C), except for C difficile, for which bleach and UV-C were used (UV strategy); 10% hypochlorite (bleach strategy); and bleach and UV-C (bleach and UV strategy). We randomly assigned the sequence of strategies for each hospital (1:1:1:1), and each strategy was used for 7 months, including a 1-month wash-in period and 6 months of data collection. The prespecified secondary outcomes were hospital-wide, hospital-acquired incidence of all target organisms (calculated as the number of patients with hospital-acquired infection with a target organism per 10 000 patient days), and hospital-wide, hospital-acquired incidence of each target organism separately. BETR Disinfection is registered with ClinicalTrials.gov, number NCT01579370.
Findings Between April, 2012, and July, 2014, there were 271 740 unique patients with 375 918 admissions. 314 610 admissions met all inclusion criteria (n=73 071 in the reference study period, n=81 621 in the UV study period, n=78 760 in the bleach study period, and n=81 158 in the bleach and UV study period). 2681 incident cases of hospital-acquired infection or colonisation occurred during the study. There was no significant difference in the hospital-wide risk of target organism acquisition between standard disinfection and the three enhanced terminal disinfection strategies for all target multidrug-resistant organisms (UV study period relative risk [RR] 0.89, 95% CI 0.79–1.00; p=0.052; bleach study period 0.92, 0.79–1.08; p=0.32; bleach and UV study period 0.99, 0.89–1.11; p=0.89). The decrease in risk in the UV study period was driven by decreases in risk of acquisition of C difficile (RR 0.89, 95% CI 0.80–0.99; p=0.031) and VRE (0.56, 0.31–0.996; p=0.048). Interpretation Enhanced terminal room disinfection with UV in a targeted subset of high-risk rooms led to a decr...
OBJECTIVE Describe the epidemiology of carbapenem-resistant Enterobacteriaceae (CRE) and examine the effect of lower carbapenem breakpoints on CRE detection. DESIGN Retrospective cohort. SETTING Inpatient care at community hospitals. PATIENTS All patients with CRE-positive cultures were included. METHODS CRE isolated from 25 community hospitals were prospectively entered into a centralized database from January 2008 through December 2012. Microbiology laboratory practices were assessed using questionnaires. RESULTS A total of 305 CRE isolates were detected at 16 hospitals (64%). Patients with CRE had symptomatic infection in 180 cases (59%) and asymptomatic colonization in the remainder (125 cases; 41%). Klebsiella pneumoniae (277 isolates; 91%) was the most prevalent species. The majority of cases were healthcare associated (288 cases; 94%). The rate of CRE detection increased more than fivefold from 2008 (0.26 cases per 100,000 patient-days) to 2012 (1.4 cases per 100,000 patient-days; incidence rate ratio (IRR), 5.3 [95% confidence interval (CI), 1.22–22.7]; P = .01). Only 5 hospitals (20%) had adopted the 2010 Clinical and Laboratory Standards Institute (CLSI) carbapenem breakpoints. The 5 hospitals that adopted the lower carbapenem breakpoints were more likely to detect CRE after implementation of breakpoints than before (4.1 vs 0.5 cases per 100,000 patient-days; P < .001; IRR, 8.1 [95% CI, 2.7–24.6]). Hospitals that implemented the lower carbapenem breakpoints were more likely to detect CRE than were hospitals that did not (3.3 vs 1.1 cases per 100,000 patient-days; P = .01). CONCLUSIONS The rate of CRE detection increased fivefold in community hospitals in the southeastern United States from 2008 to 2012. Despite this, our estimates are likely underestimates of the true rate of CRE detection, given the low adoption of the carbapenem breakpoints recommended in the 2010 CLSI guidelines.
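The fivefold increase can be checked from the two reported rates. A small sketch; the published IRR of 5.3 and its confidence interval come from a regression model fit to unrounded data, so the crude ratio of the rounded rates differs slightly:

```python
# Crude incidence rate ratio for CRE detection, 2012 vs 2008, using the
# rates reported in the abstract (cases per 100,000 patient-days).
rate_2008 = 0.26
rate_2012 = 1.4

irr = rate_2012 / rate_2008
print(round(irr, 1))  # 5.4 from the rounded rates; the paper reports 5.3
```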
Based on our survey's results, we propose an FTE-to-bed ratio that can be used as a starting point to guide discussions regarding necessary resources for antibiotic stewardship programs with executive leadership. Prospective audit and feedback should be the cornerstone of stewardship programs, and both physician leadership and pharmacists with expertise in stewardship are crucial for success.
Antimicrobial stewardship programs (ASPs) positively impact patient care, but metrics to assess ASP impact are poorly defined. We used a modified Delphi approach to select relevant metrics for assessing patient-level interventions in acute-care settings for the purposes of internal program decision making. An expert panel rated 90 candidate metrics on a 9-point Likert scale for association with 4 criteria: improved antimicrobial prescribing, improved patient care, utility in targeting stewardship efforts, and feasibility in hospitals with electronic health records. Experts further refined, added, or removed metrics during structured teleconferences and re-rated the retained metrics. Six metrics were rated >6 in all criteria: 2 measures of Clostridium difficile incidence, incidence of drug-resistant pathogens, days of therapy over admissions, days of therapy over patient days, and redundant therapy events. Fourteen metrics rated >6 in all criteria except feasibility were identified as targets for future development. Keywords: antimicrobial stewardship; patient safety; process measure; outcome measure; quality metrics. We aimed to gain expert consensus on a list of metrics both useful for assessing the impact of patient-level antimicrobial stewardship interventions and feasible to measure in acute-care hospitals with an electronic health record. The goals of this study were not to identify quality metrics to be used for external comparisons or value-based incentives, but rather to identify metrics most pertinent for internal ASP decisions. METHODS We performed a modified Delphi, expert consensus-building process to identify metrics useful for tracking the impact of patient-level antimicrobial stewardship interventions. The method differed from the Delphi process developed by the RAND Corporation because it did not include face-to-face meetings [5].
Rather, Web-based teleconferences and electronic surveys enabled the geographically diverse group of experts to participate without logistical barriers. The steps of the process included a comprehensive literature review to develop a candidate metrics list, 2 rounds of electronic surveys for metric rating, data collection, analyses, and feedback to the panel members, and structured, Web-based teleconference discussions between the electronic survey rounds.
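The retention rule described above (keep only metrics rated >6 on every criterion) can be sketched in a few lines. The metric names, ratings, and the use of the median as the panel summary are illustrative assumptions, not the panel's actual data or scoring procedure:

```python
# Illustrative sketch of the Delphi retention rule: a metric is kept
# only if its summary panel rating exceeds 6 on ALL four criteria.
# Metric names, ratings, and the median summary are hypothetical.
from statistics import median

CRITERIA = ("prescribing", "patient_care", "targeting", "feasibility")

# metric -> {criterion -> expert ratings on a 9-point Likert scale}
ratings = {
    "days_of_therapy_per_1000_patient_days": {
        "prescribing": [8, 7, 9], "patient_care": [7, 7, 8],
        "targeting": [8, 8, 7], "feasibility": [9, 8, 8],
    },
    "redundant_therapy_events": {
        "prescribing": [7, 8, 7], "patient_care": [7, 6, 8],
        "targeting": [7, 7, 7], "feasibility": [5, 4, 6],  # fails feasibility
    },
}

retained = [
    name for name, scores in ratings.items()
    if all(median(scores[c]) > 6 for c in CRITERIA)
]
print(retained)  # ['days_of_therapy_per_1000_patient_days']
```

Metrics that clear every criterion except feasibility (like the second example) would land in the "targets for future development" group the abstract describes.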
OBJECTIVE To evaluate seasonal variation in the rate of surgical site infections (SSI) following commonly performed surgical procedures. DESIGN Retrospective cohort study. METHODS We analyzed 6 years (January 1, 2007, through December 31, 2012) of data from the 15 most commonly performed procedures in 20 hospitals in the Duke Infection Control Outreach Network. We defined summer as July through September. First, we performed 3 separate Poisson regression analyses (unadjusted, multivariable, and polynomial) to estimate prevalence rates and prevalence rate ratios of SSI following procedures performed in summer versus nonsummer months. Then, we stratified our results to obtain estimates based on procedure type and organism type. Finally, we performed a sensitivity analysis to test the robustness of our findings. RESULTS We identified 4,543 SSI following 441,428 surgical procedures (overall prevalence rate, 1.03/100 procedures). The rate of SSI was significantly higher during the summer compared with the remainder of the year (1.11/100 procedures vs 1.00/100 procedures; prevalence rate ratio, 1.11 [95% CI, 1.04–1.19]; P =.002). Stratum-specific SSI calculations revealed higher SSI rates during the summer for both spinal (P =.03) and nonspinal (P =.004) procedures and revealed higher rates during the summer for SSI due to either gram-positive cocci (P =.006) or gram-negative bacilli (P =.004). Multivariable regression analysis and sensitivity analyses confirmed our findings. CONCLUSIONS The rate of SSI following commonly performed surgical procedures was higher during the summer compared with the remainder of the year. Summer SSI rates remained elevated after stratification by organism and spinal versus nonspinal surgery, and rates did not change after controlling for other known SSI risk factors.
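The overall prevalence rate and the summer-versus-nonsummer comparison follow directly from the reported counts and rates. A quick check; the confidence interval in the abstract comes from Poisson regression, which this arithmetic does not reproduce:

```python
# Crude SSI prevalence rates from the counts reported in the abstract.
ssi_cases, procedures = 4543, 441_428
overall_rate = ssi_cases / procedures * 100
print(round(overall_rate, 2))  # 1.03 per 100 procedures, as reported

# Prevalence rate ratio, summer vs rest of year (rates from the abstract)
prr = 1.11 / 1.00
print(round(prr, 2))  # 1.11, matching the reported ratio
```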