Background: Capillary refill time (CRT) may improve more rapidly than lactate in response to increments in systemic flow, so it can be assessed more frequently during septic shock (SS) resuscitation. Hyperlactatemia, in contrast, recovers more slowly in SS survivors, probably because non-hypoperfusion-related sources resolve late. Targeting lactate normalization may therefore be associated with impaired outcomes. The ANDROMEDA-SHOCK trial compared CRT-targeted versus lactate-targeted (LT) resuscitation in early SS. CRT-targeted resuscitation was associated with lower mortality and organ dysfunction, but the underlying mechanisms were not investigated. CRT was assessed every 30 min and lactate every 2 h during the 8-h intervention period, allowing a first comparison between groups at 2 h (T2). Our primary aim was to determine whether SS patients evolving with normal CRT at T2 after randomization (T0) exhibited higher mortality and organ dysfunction when allocated to the LT arm than when randomized to the CRT arm. Our secondary aim was to determine whether patients with normal CRT at T2 had received more therapeutic interventions when randomized to the LT arm. To address these questions, we performed a post hoc analysis of the ANDROMEDA-SHOCK dataset.

Results: Patients randomized to the LT arm at T0 who evolved with normal CRT at T2 exhibited significantly higher mortality than patients with normal CRT at T2 initially allocated to the CRT arm (40 vs. 23%, p = 0.009). These results were replicated at T8 and T24. The LT arm received significantly more resuscitative interventions (fluid boluses: 1000 [500-2000] vs. 500 [0-1500] mL, p = 0.004; norepinephrine test in previously hypertensive patients: 43 (35) vs. 19 (19), p = 0.001; inodilators: 16 (13) vs. 3 (3), p = 0.003).
A multivariate logistic regression in patients with normal CRT at T2, including APACHE-II, baseline lactate, cumulative fluids administered since emergency admission, source of infection, and randomization group, confirmed that allocation to the LT group was a statistically significant determinant of 28-day mortality (OR 3.3; 95% CI 1.5-7.1; p = 0.003).
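As a side note on the statistics reported above, an unadjusted odds ratio and its Woolf (log-scale) 95% confidence interval can be computed directly from a 2×2 outcome table. The sketch below is purely illustrative: the counts are hypothetical (they are not the trial's raw data, which the abstract does not report), and `odds_ratio_ci` is our own helper, not part of the study's analysis.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Woolf (log-scale) 95% CI from a
    2x2 table: a = deaths in group 1, b = survivors in group 1,
    c = deaths in group 2, d = survivors in group 2."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for illustration only (40% vs. 23% mortality
# in two groups of 100 patients each):
or_, lo, hi = odds_ratio_ci(40, 60, 23, 77)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

Note that the adjusted OR reported in the abstract comes from a multivariate model, so it will generally differ from the unadjusted estimate such a table yields.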
Background: Persistent hyperlactatemia has been considered a signal of tissue hypoperfusion in septic shock patients, but multiple non-hypoperfusion-related pathogenic mechanisms could be involved. Pursuing lactate normalization may therefore carry a risk of fluid overload. Peripheral perfusion, assessed by the capillary refill time (CRT), could be an effective alternative resuscitation target, as recently demonstrated by the ANDROMEDA-SHOCK trial. We designed the present randomized controlled trial to address the impact of a CRT-targeted (CRT-T) vs. a lactate-targeted (LAC-T) fluid resuscitation strategy on fluid balances within 24 h of septic shock diagnosis. In addition, we compared the effects of both strategies on organ dysfunction, regional and microcirculatory flow, and tissue hypoxia surrogates.

Results: Forty-two fluid-responsive septic shock patients were randomized to the CRT-T or LAC-T group. Fluids were administered until target achievement during the 6-h intervention period, or until safety criteria were met. CRT-T aimed at CRT normalization (≤ 3 s), whereas in LAC-T the goal was lactate normalization (≤ 2 mmol/L) or a 20% decrease every 2 h. Multimodal perfusion monitoring included sublingual microcirculatory assessment; plasma disappearance rate of indocyanine green; muscle oxygen saturation; central venous-arterial pCO2 gradient/arterial-venous O2 content difference ratio; and lactate/pyruvate ratio. There was no difference between CRT-T and LAC-T in 6-h fluid boluses (875 [375-2625] vs. 1500 [1000-2000] mL, p = 0.3) or balances (982 [249-2833] vs. 1580 [740-6587] mL, p = 0.2). CRT-T was associated with higher achievement of the predefined perfusion target (62 vs. 24%, p = 0.03). No significant differences in perfusion-related variables or hypoxia surrogates were observed.

Conclusions: CRT-targeted fluid resuscitation was not superior to lactate-targeted resuscitation in terms of fluid administration or balances.
However, it was associated with comparable effects on regional and microcirculatory flow parameters and hypoxia surrogates, and with faster achievement of the predefined resuscitation target. Our data suggest that stopping fluids in patients with CRT ≤ 3 s appears safe in terms of tissue perfusion. Trial registration: ClinicalTrials.gov Identifier NCT03762005 (retrospectively registered on December 3, 2018).
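The two per-arm stopping rules described in this trial can be written down as simple predicates. The sketch below is a minimal illustration using the thresholds stated in the abstract (CRT ≤ 3 s; lactate ≤ 2 mmol/L or a ≥ 20% fall versus 2 h earlier); the function names are ours, not the study protocol's.

```python
def crt_target_met(crt_seconds: float) -> bool:
    """CRT-T arm: fluid boluses stop once CRT normalises (<= 3 s)."""
    return crt_seconds <= 3.0

def lactate_target_met(current_mmol_l: float, previous_mmol_l: float) -> bool:
    """LAC-T arm: lactate normalised (<= 2 mmol/L) or decreased by at
    least 20% relative to the value measured 2 h earlier."""
    return current_mmol_l <= 2.0 or current_mmol_l <= 0.8 * previous_mmol_l
```

One practical consequence of the difference in these rules, noted in the Background above, is that the CRT target can be re-checked every 30 min, whereas the lactate rule is inherently tied to a 2-h sampling interval.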
Background: The spontaneous breathing trial (SBT) assesses the risk of weaning failure by evaluating physiological responses to the massive increase in venous return imposed by discontinuing positive pressure ventilation. The trial can be very demanding for some critically ill patients, inducing excessive physical and cardiovascular stress, including muscle fatigue, cardiac ischemia, and eventually cardiac dysfunction. Extubation failure with emergency reintubation is a serious adverse consequence of a failed weaning process. Some data suggest that as many as 50% of patients who fail weaning do so because of cardiac dysfunction. Unfortunately, monitoring cardiovascular function at the time of the SBT is complex. The aim of our study was to explore whether central venous pressure (CVP) changes were related to weaning failure after starting an SBT. We hypothesized that an early rise in CVP could signal cardiac failure when handling the massive increase in venous return that follows discontinuation of positive pressure ventilation, and that this rise could identify a subset of patients at high risk of extubation failure.

Methods: Two hundred and four mechanically ventilated patients in whom an SBT was decided were subjected to a monitoring protocol that included blinded assessment of CVP at baseline and at 2 min after starting the trial (CVP-test). Weaning failure was defined as reintubation within 48 h of extubation. Comparisons between two parametric or non-parametric variables were performed with Student's t test or the Mann-Whitney U test, respectively. Multivariate logistic regression was performed to determine the predictive value for extubation failure of usual clinical variables and of CVP at 2 min after starting the SBT.

Results: One hundred and sixty-five patients were extubated after the SBT, 11 of whom were reintubated within 48 h.
Absolute CVP values at 2 min, and the change from baseline (dCVP), were significantly higher in patients with extubation failure than in those successfully weaned. dCVP was an early predictor of reintubation (OR 1.70 [1.31-2.19], p < 0.001).

Conclusions: An early rise in CVP after starting an SBT was associated with an increased risk of extubation failure. This might represent a warning signal not captured by usual SBT monitoring and could have relevant clinical implications.
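A per-unit odds ratio such as the dCVP estimate above compounds multiplicatively with the size of the change. The sketch below illustrates that arithmetic only; the abstract does not state the unit of dCVP (presumably mmHg), and the helper is ours, not part of the study.

```python
# The abstract reports OR 1.70 per one-unit rise in dCVP.
# A per-unit odds ratio compounds multiplicatively with the size of
# the rise: odds multiplier = OR ** delta. Illustrative only.
OR_PER_UNIT = 1.70

def odds_multiplier(delta_cvp: float, or_unit: float = OR_PER_UNIT) -> float:
    """Multiplicative change in the odds of extubation failure for a
    given rise in CVP, in the same units the regression model used."""
    return or_unit ** delta_cvp

print(f"3-unit rise: odds x {odds_multiplier(3):.2f}")
```

This is a change in *odds*, not in absolute risk; converting it to a risk difference would require the baseline reintubation rate.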
Background: Lung rest has been recommended during extracorporeal membrane oxygenation (ECMO) for severe acute respiratory distress syndrome (ARDS). Whether positive end-expiratory pressure (PEEP) confers lung protection during ECMO for severe ARDS is unclear. We compared the effects of three different PEEP levels whilst applying near-apnoeic ventilation in a model of severe ARDS treated with ECMO.

Methods: ARDS was induced in anaesthetised adult male pigs by repeated saline lavage and injurious ventilation for 1.5 h. After ECMO was commenced, the pigs received standardised near-apnoeic ventilation for 24 h to maintain similar driving pressures and were randomly assigned to PEEP of 0, 10, or 20 cm H2O (n = 7 per group). Respiratory and haemodynamic data were collected throughout the study. Histological injury was assessed by a pathologist masked to PEEP allocation. Lung oedema was estimated by the wet-to-dry weight ratio.

Results: All pigs developed severe ARDS. Oxygenation on ECMO improved with PEEP of 10 or 20 cm H2O, but not in pigs allocated to PEEP of 0 cm H2O. Haemodynamic collapse refractory to norepinephrine (n = 4) and early death (n = 3) occurred with PEEP of 20 cm H2O. The severity of lung injury was lowest with PEEP of 10 cm H2O in both dependent and non-dependent lung regions, compared with PEEP of 0 or 20 cm H2O. A higher wet-to-dry weight ratio, indicating worse lung injury, was observed with PEEP of 0 cm H2O. Histological assessment suggested that lung injury was minimised with PEEP of 10 cm H2O.

Conclusions: During near-apnoeic ventilation and ECMO in experimental severe ARDS, PEEP of 10 cm H2O minimised lung injury and improved gas exchange without compromising haemodynamic stability.