Introduction The aim of this study was to investigate whether in-hospital mortality was associated with the administered fraction of inspired oxygen (FiO2) and the achieved arterial partial pressure of oxygen (PaO2).
Objective To compare postoperative complications in patients undergoing major surgery who received non-filtered or filtered red blood cell transfusions. Design Prospective, randomised, double-blind trial. Setting 19 hospitals throughout the Netherlands (three university, 10 clinical, six general). Participants 1051 evaluable patients: 79 with ruptured aneurysm, 412 undergoing elective surgery for aneurysm, and 560 undergoing gastrointestinal surgery. Interventions The non-filtered products had the buffy coat removed and were plasma reduced. The filtered products had the buffy coat removed, were plasma reduced, and were filtered before storage to remove leucocytes. Main outcome measures Mortality and duration of stay in intensive care; secondary end points were occurrence of multi-organ failure, infections, and length of hospital stay. Results No significant differences were found in mortality (odds ratio for filtered v non-filtered 0.80, 95% confidence interval 0.53 to 1.21) or in mean stay in intensive care (−0.4 days, −1.6 to 0.6 days). In the filtered group the mean length of hospital stay was 2.4 days shorter (−4.8 to 0.0 days; P = 0.050) and the incidence of multi-organ failure was 30% lower (odds ratio 0.70, 0.49 to 1.00; P = 0.050). There were no differences in rates of infection (0.98, 0.73 to 1.32). Conclusion The use of filtered transfusions in some types of major surgery may reduce the length of hospital stay and the incidence of postoperative multi-organ failure.
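The odds ratios and 95% confidence intervals reported above follow the standard 2×2-table calculation. A minimal sketch, using purely hypothetical event counts (not the trial's actual data), shows how an odds ratio and its Wald confidence interval are derived:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% Wald confidence interval for a 2x2 table.

    a, b: events / non-events in the exposed (e.g. filtered) group
    c, d: events / non-events in the control (e.g. non-filtered) group
    """
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) from the four cell counts
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Hypothetical counts for illustration only:
or_, lo, hi = odds_ratio_ci(40, 480, 50, 481)
```

A confidence interval that includes 1.0, as in the trial's mortality result, is conventionally read as no statistically significant difference.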
The use of antegrade selective cerebral perfusion and deep hypothermic circulatory arrest during ascending aorta-hemiarch replacement resulted in acceptable hospital mortality and neurologic outcome. Reduced postoperative intubation time and better renal function preservation were observed in the antegrade selective cerebral perfusion group.
The highly significant correlation between cardiopulmonary bypass time (CPBT) category and the occurrence of undesirable postoperative events is demonstrated by the stepwise rise in odds ratios. This independent influence of CPBT on outcome reflects both problems encountered during revascularisation and the time-related effects of cardiopulmonary bypass on the body. In a predictive model, CPBT proved a good predictor of undesirable postoperative events.
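A predictive model of this kind is typically a logistic regression in which the log-odds of an event rise with bypass time. A minimal sketch with illustrative coefficients (not fitted to the study data) shows the monotone relationship between CPBT and predicted risk:

```python
import math

def event_probability(cpb_minutes, intercept=-3.0, beta=0.015):
    """Logistic model: log-odds of an undesirable postoperative event
    as a linear function of cardiopulmonary bypass time (minutes).
    Intercept and slope are illustrative placeholders only."""
    log_odds = intercept + beta * cpb_minutes
    return 1 / (1 + math.exp(-log_odds))

# Longer bypass time yields a higher predicted risk:
p60, p180 = event_probability(60), event_probability(180)
```

A positive slope on CPBT translates into rising odds ratios across bypass-time categories, which is exactly the pattern the abstract describes.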
Because oversedation is still common, and significant variability between and within critically ill patients makes empirical dosing difficult, the population pharmacokinetics and pharmacodynamics of propofol during long-term use were characterised, with particular focus on varying disease state as a determinant of effect. Twenty-six critically ill patients were evaluated for 0.7-9.5 days (median 1.9 days) using the Ramsay scale and the bispectral index as pharmacodynamic end points. NONMEM V was applied for population pharmacokinetic and pharmacodynamic modeling. Propofol pharmacokinetics was described by a two-compartment model, in which cardiac patients had a 38% lower clearance. Severity of illness, expressed as a Sequential Organ Failure Assessment (SOFA) score, particularly influenced the pharmacodynamics and, to a minor degree, the pharmacokinetics. Deeper levels of sedation were found with an increasing SOFA score. With severe illness, critically ill patients will need downward titration of propofol. In patients with cardiac failure, the propofol dosages should be reduced by 38%.
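The link between a 38% lower clearance and a 38% dose reduction follows directly from two-compartment kinetics: at steady state, plasma concentration equals infusion rate divided by clearance. A minimal simulation sketch, with illustrative parameter values rather than the study's NONMEM estimates, makes this concrete:

```python
def two_compartment_sim(dose_rate, cl, q, v1, v2, t_end, dt=0.01):
    """Euler simulation of a two-compartment model under constant-rate
    infusion. cl: elimination clearance, q: inter-compartmental
    clearance, v1/v2: central/peripheral volumes. All parameter
    values used below are illustrative, not the study's estimates."""
    a1 = a2 = 0.0  # drug amounts in central / peripheral compartments
    t = 0.0
    while t < t_end:
        c1, c2 = a1 / v1, a2 / v2
        a1 += dt * (dose_rate - cl * c1 - q * (c1 - c2))
        a2 += dt * (q * (c1 - c2))
        t += dt
    return a1 / v1  # central (plasma) concentration at t_end

# Same infusion rate; cardiac patients modelled with 38% lower clearance
# (cl * 0.62) reach a markedly higher steady-state concentration:
c_normal = two_compartment_sim(dose_rate=100, cl=2.0, q=1.5,
                               v1=10, v2=30, t_end=200)
c_cardiac = two_compartment_sim(dose_rate=100, cl=2.0 * 0.62, q=1.5,
                                v1=10, v2=30, t_end=200)
```

Because steady-state concentration scales as dose_rate/cl, restoring the target concentration in the low-clearance group requires reducing the infusion rate by the same 38%.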
Introduction Cardiac operations account for a large proportion of the blood transfusions given each year, leading to high costs and an increased risk to patient safety. It is therefore important to explore initiatives to reduce transfusion rates. This study aims to provide a benchmark for transfusion practice through inter-hospital comparison of transfusion rates, blood product use and costs for patients undergoing coronary artery bypass grafting (CABG), valve surgery, or combined CABG and valve surgery. Methods Between 2010 and 2013, patients from four Dutch hospitals undergoing CABG, valve surgery or combined CABG and valve surgery (n = 11,150) were included by means of a retrospective longitudinal study design. Results In CABG surgery the transfusion rate ranged between 43% and 54%, in valve surgery between 54% and 67%, and in combined CABG and valve surgery between 80% and 88%. With the exception of one hospital, the transfusion rate decreased significantly over time for all procedures. Hospitals differed significantly in the units of blood products given per patient and in the specific combinations of blood products transfused, such as red blood cells (RBCs) alone or RBCs combined with fresh frozen plasma (FFP) and platelets. Conclusion This study indicates that benchmarking blood product usage stimulates awareness of transfusion behaviour, which may lead to better patient safety and lower costs. Further studies are warranted to improve awareness of transfusion behaviour and to increase the standardisation of transfusion practice in cardiac surgery.
Several recent studies have shown differences in blood loss and allogeneic transfusion requirements between on-pump and off-pump coronary artery bypass grafting (CABG). Recently a new concept, the mini-extracorporeal circulation, was introduced to minimize the side effects of extracorporeal circulation; however, there are no data comparing the three techniques with special emphasis on blood loss and transfusion requirements. Two hundred and eighty-five patients undergoing first-time coronary artery bypass grafting were retrospectively matched for number of grafts, age and sex. Ninety-five patients underwent surgery with the off-pump CABG (OPCAB) technique, 97 patients with conventional CABG with cold cardioplegia (CCABG), and 93 patients with the mini-extracorporeal circuit with warm blood cardioplegia (MCABG). Blood loss in the CCABG group (mean 819 +/- 557 mL) and the OPCAB group (mean 870 +/- 768 mL) was significantly different from that in the MCABG group (mean 679 +/- 290 mL). The use of red blood cell units was significantly higher in the CCABG and OPCAB groups than in the MCABG group. On the day of operation the use of platelet concentrate was significantly higher in the CCABG group than in the MCABG group. As a consequence of improvements in several components of the mini heart-lung machine, significantly fewer blood products are needed in MCABG patients. The expected reduced need for transfusion when the pump was completely avoided could not be confirmed in this single retrospective cohort study.
Introduction Use of selective decontamination of the digestive tract (SDD) and selective oropharyngeal decontamination (SOD) in intensive care patients has been controversial for years. Through regular questionnaires we determined expectations concerning SDD (effectiveness) and experience with SDD and SOD (workload and patient friendliness), as perceived by nurses and physicians. Methods A survey was embedded in a group-randomized, controlled, cross-over multicenter study in the Netherlands in which, during three 6-month periods, SDD, SOD or standard care was used in random order. At the end of each study period, all nurses and physicians from participating intensive care units received study questionnaires. Results In all, 1024 (71%) of 1450 questionnaires were returned by nurses and 253 (82%) of 307 by physicians. Expectations that SDD improved patient outcome increased from 71% and 77% of respondents after the first two study periods to 82% at the end of the study (P = 0.004), with comparable trends among nurses and physicians. Nurses considered SDD to impose a higher workload (median 5.0, on a scale from 1 (low) to 10 (high)) than SOD (median 4.0) and standard care (median 2.0). Both SDD and SOD were considered less patient friendly than standard care (medians 4.0, 4.0 and 6.0, respectively). According to physicians, SDD had a higher workload (median 5.5) than SOD (median 5.0), which in turn was higher than standard care (median 2.5). Furthermore, physicians graded patient friendliness of standard care (median 8.0) higher than that of SDD and SOD (both median 6.0). Conclusions Although perceived effectiveness of SDD increased as the trial proceeded, both among physicians and nurses, SOD and SDD were, as compared to standard care, considered to increase workload and to reduce patient friendliness.
Therefore, education about the importance of oral care and about the effects of SDD and SOD on patient outcomes will be important when implementing these strategies. Trial registration: ISRCTN35176830.