Background and objectives
Central venous catheters have traditionally provided access for urgent hemodialysis, but are also sometimes advocated as an option for older or more comorbid patients. Adverse effects of this type of dialysis access include central venous stenosis, for which the risk factors and consequences are incompletely understood.

Design, setting, participants, & measurements
We conducted two studies within the same population cohort, comprising all patients starting hemodialysis in a single center from January 2006 to December 2013. First, patients were retrospectively analyzed for the presence of central venous stenosis; their access outcomes are described and their survival compared with matched controls drawn from the same population. Second, a subset of patients with a history of catheter access within this cohort was analyzed to determine risk factors for central venous stenosis.

Results
Among 2811 patients, central venous stenosis was diagnosed in 120 (4.3%), at a median dialysis vintage of 2.9 (interquartile range, 1.8-4.6) years. Compared with matched controls, patients with central venous stenosis had similar survival (median 5.1 versus 5.2 years; P=0.54). Among a subset of 500 patients, all with a history of catheter use, 34 (6.8%) developed central venous stenosis, at a rate of 2.2 per 100 patient-years. The incidence of central venous stenosis was higher with a larger number of previous catheters (relative risk [RR], 2.2; 95% confidence interval [95% CI], 1.6 to 2.9) and with pacemaker insertion (RR, 3.9; 95% CI, 1.7 to 8.9), and was lower with older age (RR, 0.7 per decade; 95% CI, 0.6 to 0.8). In a Cox proportional hazards model, catheter number, pacemaker insertion, and younger age at dialysis initiation were all significant independent risk factors for central venous stenosis.

Conclusions
Central venous stenosis occurred in a minority of patients on hemodialysis and was associated with compromised future access but unchanged survival. Among patients with a history of catheter use, risk related to both the number of catheters and the total catheter duration, although nondialysis factors such as pacemakers were also important. Central venous stenosis risk was lower in older patients, supporting the selective use of tunneled catheters in this group.

Published online ahead of print. Publication date available at www.cjasn.org.
Background
Kidney transplant recipients often receive large volumes of intravenous fluid replacement in the perioperative period. Administration of 0.9% saline has previously been associated with acidosis, hyperkalaemia and acute kidney injury. The perioperative use of physiologically balanced replacement fluids may reduce the incidence of post-operative renal replacement therapy and hyperkalaemia.

Methods
A retrospective review of consecutive renal transplants before and after a change in perioperative fluid prescription from 0.9% saline to Plasma-Lyte 148.

Results
A total of 97 patients were included in the study, 59 receiving exclusively 0.9% saline and 38 receiving exclusively Plasma-Lyte. Patients in the Plasma-Lyte group were less likely to require emergency postoperative dialysis than those receiving 0.9% saline [odds ratio (OR) 0.15 (95% confidence interval 0.03-0.48), P = 0.004], and these patients had more favourable biochemical parameters, with less hyperkalaemia, less acidosis and better diuresis. Patients in the Plasma-Lyte group also had a shorter length of hospital stay (7 days versus 11 days; P < 0.0001) and better graft function at 3 months postoperatively (estimated glomerular filtration rate 51 versus 44 mL/min/1.73 m2; P = 0.03); however, there was no difference in graft function at 1 year.

Conclusions
Plasma-Lyte in the perioperative period is safe in renal transplantation and is associated with a favourable biochemical profile, including a reduced incidence of hyperkalaemia, better diuresis and less frequent use of renal replacement therapy early after surgery. In patients receiving Plasma-Lyte, graft function was better at 3 months, but this difference did not persist up to 1 year after transplantation.
Results (continued)
Median carbon dioxide was 30 [27-35] mmHg and median temperature 37.1 [36.8-37.3]°C. After removal of artefacts, the mean monitoring time was 22 h 08 min (SD 8 h 54 min). All patients had impaired cerebral autoregulation during their monitoring time. The mean IAR index was 17 (9.5)%. During the H0-H6 and H18-H24 windows, the majority of patients (53% and 71%, respectively) had an IAR index > 10%.

Conclusion
According to our data, patients with septic shock had impaired cerebral autoregulation within the first 24 hours of their ICU admission. In our patients, the distribution of impaired autoregulation varied over time.

References
Schramm P, Klein KU, Falkenberg L, et al. Impaired cerebrovascular autoregulation in patients with severe sepsis and sepsis-associated delirium. Crit Care 2012; 16: R181.
Aries MJH, Czosnyka M, Budohoski KP, et al. Continuous determination of optimal cerebral perfusion pressure in traumatic brain injury. Crit Care Med 2012.