BACKGROUND
Severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection can spread rapidly within skilled nursing facilities. After identification of a case of Covid-19 in a skilled nursing facility, we assessed transmission and evaluated the adequacy of symptom-based screening to identify infections in residents.
METHODS
We conducted two serial point-prevalence surveys, 1 week apart, in which assenting residents of the facility underwent nasopharyngeal and oropharyngeal testing for SARS-CoV-2, including real-time reverse-transcriptase polymerase chain reaction (rRT-PCR), viral culture, and sequencing. Symptoms that had been present during the preceding 14 days were recorded. Asymptomatic residents who tested positive were reassessed 7 days later. Residents with SARS-CoV-2 infection were categorized as symptomatic with typical symptoms (fever, cough, or shortness of breath), symptomatic with only atypical symptoms, presymptomatic, or asymptomatic.
RESULTS
Twenty-three days after the first positive test result in a resident at this skilled nursing facility, 57 of 89 residents (64%) tested positive for SARS-CoV-2. Among 76 residents who participated in point-prevalence surveys, 48 (63%) tested positive. Of these 48 residents, 27 (56%) were asymptomatic at the time of testing; 24 subsequently developed symptoms (median time to onset, 4 days). Samples from these 24 presymptomatic residents had a median rRT-PCR cycle threshold value of 23.1, and viable virus was recovered from 17 residents. As of April 3, of the 57 residents with SARS-CoV-2 infection, 11 had been hospitalized (3 in the intensive care unit) and 15 had died (mortality, 26%). Of the 34 residents whose specimens were sequenced, 27 (79%) had sequences that fit into two clusters with a difference of one nucleotide.
CONCLUSIONS
Rapid and widespread transmission of SARS-CoV-2 was demonstrated in this skilled nursing facility. More than half of residents with positive test results were asymptomatic at the time of testing and most likely contributed to transmission. Infection-control strategies focused solely on symptomatic residents were not sufficient to prevent transmission after SARS-CoV-2 introduction into this facility.
The Hosmer-Lemeshow test is a commonly used procedure for assessing goodness of fit in logistic regression. It has, for example, been widely used to evaluate risk-scoring models. As with any statistical test, its power increases with sample size; this can be undesirable for goodness-of-fit tests because, in very large data sets, small departures from the proposed model will be deemed significant. By considering how power depends on the number of groups used in the Hosmer-Lemeshow test, we show how the power may be standardized across different sample sizes in a wide range of models. We provide mathematical derivations and confirm them through simulation and through analysis of data on 31,713 children from the Collaborative Perinatal Project. We make recommendations on how to choose the number of groups in the Hosmer-Lemeshow test based on sample size and provide example applications of the recommendations.
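The grouping-and-comparison logic behind the Hosmer-Lemeshow statistic can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the function name, the simple equal-size grouping, and the tie handling are all assumptions.

```python
# Minimal sketch of the Hosmer-Lemeshow statistic; the equal-size grouping
# and naming here are illustrative assumptions, not the paper's code.

def hosmer_lemeshow(probs, outcomes, g=10):
    """Chi-square statistic comparing observed vs. expected event counts
    in g groups of cases sorted by predicted probability."""
    pairs = sorted(zip(probs, outcomes))               # sort by predicted risk
    n = len(pairs)
    stat = 0.0
    for k in range(g):
        group = pairs[k * n // g : (k + 1) * n // g]   # near-equal-size groups
        if not group:
            continue
        m = len(group)
        expected = sum(p for p, _ in group)            # sum of predicted risks
        observed = sum(y for _, y in group)            # count of observed events
        pbar = expected / m
        denom = m * pbar * (1 - pbar)
        if denom > 0:                                  # skip degenerate groups
            stat += (observed - expected) ** 2 / denom
    return stat  # compare with a chi-square on g - 2 degrees of freedom
```

The statistic is referred to a chi-square distribution on g − 2 degrees of freedom; the abstract's point is that, for a fixed degree of model misfit, the statistic grows with n, which is why the choice of g relative to sample size matters.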
Background
Treating patients with infections due to multidrug-resistant pathogens often requires substantial healthcare resources. The purpose of this study was to report estimates of the healthcare costs associated with infections due to multidrug-resistant bacteria in the United States (US).
Methods
We performed retrospective cohort studies of patients admitted for inpatient stays in the Department of Veterans Affairs healthcare system between January 2007 and October 2015. We performed multivariable generalized linear models to estimate the attributable cost by comparing outcomes in patients with and without positive cultures for multidrug-resistant bacteria. Finally, we multiplied these pathogen-specific, per-infection attributable cost estimates by national counts of infections due to each pathogen from patients hospitalized in a cohort of 722 US hospitals from 2017 to generate estimates of the population-level healthcare costs in the US attributable to these infections.
Results
Our analysis cohort consisted of 16 676 patients with community-onset infections and 172 712 matched controls and 8246 patients with hospital-onset infections and 66 939 matched controls. The highest cost was seen in hospital-onset invasive infections, with attributable costs (95% confidence intervals) ranging from $30 998 ($25 272–$36 724) for methicillin-resistant Staphylococcus aureus to $74 306 ($20 377–$128 235) for carbapenem-resistant (CR) Acinetobacter. The highest attributable costs for community-onset invasive infections were seen in CR Acinetobacter ($62 396; $20 370–$104 422). Treatment of these infections cost an estimated $4.6 billion ($4.1 billion–$5.1 billion) in 2017 in the US for community- and hospital-onset infections combined.
Conclusions
We found that antimicrobial-resistant infections led to substantial healthcare costs.
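The final extrapolation step described in the Methods above (per-infection attributable cost multiplied by national infection counts) reduces to a weighted sum. The sketch below uses the hospital-onset cost estimates reported in the Results, but the national case counts are hypothetical placeholders, not figures from the study.

```python
# Sketch of the population-level extrapolation: per-infection attributable
# cost multiplied by national case counts, summed over pathogens.
# The counts below are hypothetical placeholders, NOT figures from the study.

attributable_cost_usd = {
    "MRSA": 30_998,               # hospital-onset estimate from the Results
    "CR Acinetobacter": 74_306,   # hospital-onset estimate from the Results
}
national_counts = {               # hypothetical national infection counts
    "MRSA": 10_000,
    "CR Acinetobacter": 1_000,
}

total_cost = sum(
    attributable_cost_usd[p] * national_counts[p] for p in attributable_cost_usd
)
print(f"${total_cost:,}")  # summed attributable cost under these assumptions
```

In the study itself, this sum was taken over pathogen-specific estimates and 2017 infection counts from 722 US hospitals, yielding the $4.6 billion total.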
Background
Approaches to controlling emerging antibiotic resistance in health care settings have evolved over time. When resistance to broad-spectrum antimicrobials mediated by extended-spectrum β-lactamases (ESBLs) arose in the 1980s, targeted interventions to slow spread were not widely promoted. However, when Enterobacteriaceae with carbapenemases that confer resistance to carbapenem antibiotics emerged, directed control efforts were recommended. These distinct approaches could have resulted in differences in spread of these two pathogens. CDC evaluated these possible changes along with initial findings of an enhanced antibiotic resistance detection and control strategy that builds on interventions developed to control carbapenem resistance.
Methods
Infection data from the National Healthcare Safety Network from 2006–2015 were analyzed to calculate changes in the annual proportion of selected pathogens that were nonsusceptible to extended-spectrum cephalosporins (ESBL phenotype) or resistant to carbapenems (carbapenem-resistant Enterobacteriaceae [CRE]). Testing results for CRE and carbapenem-resistant Pseudomonas aeruginosa (CRPA) are also reported.
Results
The percentage of ESBL phenotype Enterobacteriaceae decreased by 2% per year (risk ratio [RR] = 0.98, p<0.001); by comparison, the CRE percentage decreased by 15% per year (RR = 0.85, p<0.01). From January to September 2017, carbapenemase testing was performed for 4,442 CRE and 1,334 CRPA isolates; 32% and 1.9%, respectively, were carbapenemase producers. In response, 1,489 screening tests were performed to identify asymptomatic carriers; 171 (11%) were positive.
Conclusions
The proportion of Enterobacteriaceae infections that were CRE remained lower and decreased more over time than the proportion that were ESBL phenotype. This difference might be explained by the more directed control efforts implemented to slow transmission of CRE than those applied for ESBL-producing strains. Increased detection and aggressive early response to emerging antibiotic resistance threats have the potential to slow further spread.
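The per-year risk ratios in the Results describe a multiplicative annual trend in the resistant proportion. As a hedged illustration (the starting proportion below is a hypothetical value, not one from the report), the implied trajectory can be computed directly:

```python
# Illustration of a per-year risk ratio as a multiplicative annual trend;
# the starting proportion p0 is hypothetical, not a figure from the report.

def projected_proportion(p0, rr_per_year, years):
    """Resistant proportion after `years` annual multiplicative changes."""
    return p0 * rr_per_year ** years

# With the CRE estimate RR = 0.85 per year, the proportion falls by 15%
# each year; with the ESBL estimate RR = 0.98, by only 2% per year.
```

This makes concrete why the 0.85 versus 0.98 ratios correspond to the "decreased more over time" contrast drawn in the Conclusions.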