The Working Party makes more than 100 tabulated recommendations on antimicrobial prescribing for the treatment of infections caused by multidrug-resistant (MDR) Gram-negative bacteria (GNB), and suggests further research and algorithms for hospital and community antimicrobial use in urinary infection. The international definition of MDR is complex and unsatisfactory, and hinders the setting and monitoring of improvement programmes; we therefore give a new definition of multiresistance. The background information on the mechanisms, global spread and UK prevalence of antibiotic prescribing and resistance has been systematically reviewed. The treatment options available in hospitals using intravenous antibiotics and in primary care using oral agents are reviewed, ending with a consideration of antibiotic stewardship and recommendations. The guidance has been derived from current peer-reviewed publications and expert opinion with open consultation. Methods for systematic review were NICE compliant and in accordance with the SIGN 50 Handbook; critical appraisal was applied using AGREE II. Published guidelines were used as part of the evidence base and to support expert consensus. The guidance includes recommendations for stakeholders (including prescribers) and antibiotic-specific recommendations. The clinical efficacy of different agents is critically reviewed. We found very few good-quality comparative randomized clinical trials to support treatment regimens, particularly for older licensed agents. Susceptibility testing of the MDR GNB causing an infection, used to guide treatment, needs critical enhancements. Carbapenem MICs should be tested urgently for meropenem- or imipenem-resistant Enterobacteriaceae, and any carbapenemase class should be identified; mandatory reporting of these isolates from all anatomical sites and specimens would improve risk assessments. Broth microdilution methods should be adopted for colistin susceptibility testing.
Antimicrobial stewardship programmes should be instituted in all care settings, based on resistance rates and audit of compliance with guidelines, and augmented by improved surveillance of outcomes in Gram-negative bacteraemia and feedback to prescribers. Local and national surveillance of antibiotic use, resistance and outcomes should be supported, and antibiotic prescribing guidelines should be informed by these data. The diagnosis and treatment of both presumptive and confirmed cases of GNB infection should be improved. This guidance, together with infection control measures to arrest increases in MDR, should be used to improve the outcome of infections with such strains. Anticipated users include medical, scientific, nursing, antimicrobial pharmacy and paramedical staff; the recommendations can be adapted for local use.
Objective: To identify clinical markers available within the first 48 hours of admission that are associated with poor outcome in infective endocarditis. Design: Retrospective cohort study. Setting: Teaching hospital. Patients: 208 of 220 patients with infective endocarditis. Methods: Consecutive patients with infective endocarditis presenting between 1981 and 1999 to a tertiary centre were studied. Clinical, echocardiographic, and haematological data recorded within 48 hours of admission were obtained. Data were analysed using logistic regression models. Main outcome measures: Mortality at discharge and at six months. Results: Data were obtained for 93% of patients who were eligible for inclusion. 194 (93%) met the Duke criteria. Mean age was 52 (1.2) years, and 138 (66%) were men. 82 (39%) were transferred from other hospitals. 181 (87%) were blood culture positive, and 47 (23%) infections were caused by Staphylococcus aureus. The infection was located on the aortic (n = 85, 41%), mitral (n = 77, 37%), tricuspid (n = 18, 9%), and multiple valves (n = 20, 10%). 67 (32%) had prosthetic valve endocarditis. 48% of the cohort were managed with antibiotics alone. Mortality was 18% at discharge and 27% at six months. Duration of illness before admission, age, sex, valve infected, infecting organism, and left ventricular function were not predictors of mortality. However, abnormal white cell count, serum albumin concentration, serum creatinine concentration, or cardiac rhythm, the presence of two major Duke criteria, or visible vegetation conferred a poor prognosis. Conclusions: Conventional prognostic factors in this study did not appear to predict outcome early during hospital admission. However, simple clinical indices, which are readily available, are reliable, cheap, and potentially powerful predictors of poor outcome.
Summary Objectives Staphylococcus aureus bacteraemia is a common, often fatal infection. Our aim was to describe how its clinical presentation varies between populations and to identify common determinants of outcome. Methods We conducted a pooled analysis of 3395 consecutive adult patients with S. aureus bacteraemia. Patients were enrolled between 2006 and 2011 in five prospective studies in 20 tertiary care centres in Germany, Spain, the United Kingdom, and the United States. Results The median age of participants was 64 years (interquartile range 50–75 years) and 63.8% were male. 25.4% of infections were associated with diabetes mellitus, 40.7% were nosocomial, and 20.6% were caused by methicillin-resistant S. aureus (MRSA), although these proportions varied significantly across studies. Intravenous catheters were the commonest identified infective focus (27.7%); 8.3% of patients had endocarditis. Crude 14- and 90-day mortality was 14.6% and 29.2%, respectively. Age, MRSA bacteraemia, nosocomial acquisition, endocarditis, and pneumonia were independently associated with death, but the strongest association was with an unidentified infective focus (adjusted hazard ratio for 90-day mortality 2.92; 95% confidence interval 2.33 to 3.67, p < 0.0001). Conclusion The baseline demographic and clinical features of S. aureus bacteraemia vary significantly between populations. Mortality could be reduced by assiduous MRSA control and early identification of the infective focus.
The glycopeptide antibacterial teicoplanin has become increasingly popular over the last decade with the rise in infections caused by methicillin-resistant Staphylococcus aureus. Teicoplanin has 6 major and 4 minor components. It is predominantly (90%) bound to plasma proteins. Of the several methods available to measure concentrations in serum, fluorescence polarisation immunoassay has high reliability and specificity. Teicoplanin is not absorbed orally, but intravenous and intramuscular administration are well tolerated. Teicoplanin is eliminated predominantly by the kidneys, and only 2 to 3% of an intravenously administered dose is metabolised. Total clearance is 11 ml/h/kg. Steady state is reached only slowly (93% after 14 days of repeated administration). Elimination is triexponential, with half-lives of 0.4 to 1.0, 9.7 to 15.4 and 83 to 168 hours. Volumes of distribution are 0.07 to 0.11 (initial phase), 1.3 to 1.5 (distribution phase) and 0.9 to 1.6 (steady state) L/kg. A standard dosage regimen of 6 mg/kg every 12 hours for 3 doses, then daily, will produce therapeutic serum concentrations of ≥ 10 mg/L in most patients. Higher dosages may be required in certain patients, for example intravenous drug abusers or those with burns, because of unpredictable clearance. Concentrations in bone reach 7 mg/L at 12 hours after a dose of teicoplanin 6 mg/kg, but reach only 3.5 mg/L in cartilage. Doses of 10 mg/kg are necessary to achieve adequate bone concentrations. There is little penetration into cerebrospinal fluid or the aqueous or vitreous humour. In fat, concentrations may be subtherapeutic (0.5 to 5 mg/L) after a dose of 400 mg. A single prophylactic dose of 12 mg/kg is sufficient to maintain therapeutic concentrations during cardiopulmonary bypass or burns surgery. High loading doses reduce the delay in attaining therapeutic concentrations.
Premature neonates require a loading dose of 15 mg/kg and a maintenance dosage of 8 mg/kg daily to ensure therapeutic serum concentrations. Children need loading with 10 mg/kg every 12 hours for 3 doses followed by maintenance with 10 mg/kg/day. Clearance is reduced predictably in renal failure, and dosage adjustments can be based on the ratio of impaired clearance to normal clearance. In patients on haemodialysis, 3 loading doses of 6 mg/kg at 12-hour intervals followed by maintenance doses every 72 hours produced trough plasma concentrations of 8 mg/L in most patients at 48 hours. The monitoring of serum concentrations is not necessary to avoid toxicity, but can be helpful in certain patient groups to ensure therapeutic concentrations are present, especially in those not responding to treatment.
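The weight-based regimens above can be expressed as a simple calculation. The sketch below is purely illustrative (not clinical guidance): the mg/kg loading and maintenance figures for adults, children and premature neonates, and the renal-adjustment rule (scale the maintenance dose by the ratio of impaired to normal clearance), are taken from the text; the function name, the dictionary layout and the assumption that only the maintenance dose is scaled are illustrative choices.

```python
# Illustrative sketch only -- not clinical guidance. Encodes the mg/kg
# regimens described in the abstract above; structure and names are assumed.

def teicoplanin_schedule(weight_kg, group="adult", clearance_ratio=1.0):
    """Return (loading_doses_mg, maintenance_dose_mg_per_day).

    clearance_ratio = impaired clearance / normal clearance (1.0 = normal),
    applied to the maintenance dose only, per the renal-failure rule above.
    """
    regimens = {
        # group: (loading mg/kg, number of q12h loading doses, maintenance mg/kg/day)
        "adult":   (6.0, 3, 6.0),    # 6 mg/kg q12h x3, then 6 mg/kg daily
        "child":   (10.0, 3, 10.0),  # 10 mg/kg q12h x3, then 10 mg/kg/day
        "neonate": (15.0, 1, 8.0),   # 15 mg/kg load, then 8 mg/kg daily
    }
    load_mgkg, n_loads, maint_mgkg = regimens[group]
    loading = [round(load_mgkg * weight_kg)] * n_loads
    maintenance = round(maint_mgkg * weight_kg * clearance_ratio)
    return loading, maintenance

print(teicoplanin_schedule(70))                       # ([420, 420, 420], 420)
print(teicoplanin_schedule(70, clearance_ratio=0.5))  # ([420, 420, 420], 210)
```

For a 70 kg adult this reproduces the standard regimen (three 420 mg loading doses, then 420 mg daily), and halved clearance halves the maintenance dose while leaving the loading doses unchanged.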
Objective To assess the level of agreement between common definitions of wound infection that might be used as performance indicators. Design Prospective observational study. Setting London teaching hospital group receiving emergency cases as well as tertiary referrals. Participants 4773 surgical patients staying in hospital at least two nights. Main outcome measures Numbers of wound infections based on purulent discharge alone, on the Centers for Disease Control (CDC) definition of wound infection, on the nosocomial infection national surveillance scheme (NINSS) version of the CDC definition, and on the ASEPSIS scoring method. Results 5804 surgical wounds were assessed during 5028 separate hospital admissions. The mean percentage of wounds classified as infected differed substantially with different definitions: 19.2% (95% confidence interval 18.1% to 20.4%) with the CDC definition, 14.6% (13.6% to 15.6%) with the NINSS version, 12.3% (11.4% to 13.2%) with pus alone, and 6.8% (6.1% to 7.5%) with an ASEPSIS score > 20. The agreement between definitions with respect to individual wounds was poor. Wounds with pus were automatically defined as infected with the CDC, NINSS, and pus alone definitions, but only 39% (283/714) of these had ASEPSIS scores > 20. Conclusions Small changes made to the CDC definition or even in its interpretation, as with the NINSS version, caused major variation in estimated percentage of wound infection. Substantial numbers of wounds were differently classified across the grades of infection. A single definition used consistently can show changes in percentage wound infection over time at a single centre, but differences in interpretation prevent comparison between different centres.
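The disagreement the study describes follows from the decision rules themselves. The sketch below is a simplified illustration, not the full CDC or NINSS criteria: it encodes only the property the abstract highlights (purulent discharge is sufficient under the pus-alone, CDC and NINSS definitions, while ASEPSIS requires a score above 20). The field names are assumptions for illustration.

```python
# Simplified illustration of the competing wound-infection definitions above.
# Not the full CDC/NINSS criteria -- only the pus-sufficiency property and
# the ASEPSIS > 20 cut-off reported in the study. Field names are assumed.

def classify(wound):
    """Return the set of definitions under which this wound counts as infected."""
    infected = set()
    if wound.get("pus", False):
        # Purulent discharge alone satisfies three of the four definitions.
        infected |= {"pus-alone", "CDC", "NINSS"}
    if wound.get("asepsis_score", 0) > 20:
        infected.add("ASEPSIS")
    return infected

# A purulent wound with a low ASEPSIS score is "infected" under three
# definitions but not under ASEPSIS -- the disagreement the study quantifies
# (only 39% of purulent wounds scored > 20).
print(sorted(classify({"pus": True, "asepsis_score": 12})))
```

The same wound can therefore be counted as infected or not depending solely on which definition a centre adopts, which is why the authors conclude that inter-centre comparisons are unsafe.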
Background Empyema is an increasingly frequent clinical problem worldwide, with substantial morbidity and mortality. Our objectives were to identify the clinical, surgical and microbiological features, and management outcomes, of empyema. Methods A retrospective observational study over 12 years (1999–2010) was carried out at The Heart Hospital, London, United Kingdom. Patients with empyema were identified by screening the hospital electronic 'Clinical Data Repository'. Demographics, clinical and microbiological characteristics, underlying risk factors, peri-operative blood tests, treatment and outcomes were identified. Univariable and multivariable statistical analyses were performed. Results Patients (n = 406) were predominantly male (74.1%); median age was 53 years (IQR 37–69). Most empyemas were community-acquired (87.4%) and right-sided (57.4%). Microbiological diagnosis was obtained in 229 (56.4%) patients, and included streptococci (16.3%), staphylococci (15.5%), Gram-negative organisms (8.9%), anaerobes (5.7%), pseudomonads (4.4%) and mycobacteria (9.1%); 8.4% were polymicrobial. Most (68%) cases were managed by open thoracotomy and decortication. Video-assisted thoracoscopic surgery (VATS) reduced hospitalisation from 10 to 7 days (P = 0.0005). The all-cause complication rate was 25.1%, and 28-day mortality was 5.7%. Predictors of early mortality included older age (P = 0.006), major co-morbidity (P = 0.01), malnutrition (P = 0.001), elevated red cell distribution width (RDW, P < 0.001) and serum alkaline phosphatase (P = 0.004), and reduced serum albumin (P = 0.01) and haemoglobin (P = 0.04). Conclusions Empyema remains an important cause of morbidity and hospital admissions. Microbiological diagnosis was achieved in just over half of cases, and tuberculosis is a notable causative organism. Treatment of empyema with VATS may reduce duration of hospital stay. Raised RDW appears to be associated with early mortality.
BACKGROUND: Vaccination against Epstein-Barr virus (EBV), inducing an antibody response to the envelope glycoprotein gp350, might protect EBV-negative children with chronic kidney disease from lymphoproliferative disease after transplantation. METHODS: A phase I trial recruited children with chronic kidney disease to two successive cohorts given three injections of 12.5 μg (n = 6) and 25 μg (n = 10) recombinant gp350/alhydrogel vaccine over 6 to 8 weeks. RESULTS: One patient in each cohort acquired wild-type EBV before the week 28 evaluation. Both doses were similarly immunogenic, inducing an IgG response in all 13 evaluable patients. Neutralizing antibodies were detected in four recipients (1/4 in the 12.5 μg and 3/9 in the 25 μg cohort). Median time from first vaccination to transplantation was 24 weeks. Immune responses declined rapidly and were unlikely to affect posttransplant events. DISCUSSION: The vaccine was immunogenic, but a prolonged vaccine schedule up to the time of transplantation, or improved adjuvants, will be required in future trials to reduce posttransplant EBV load and the risk of lymphoproliferative disease.