Bacterial sepsis and severe COVID-19 share similar clinical manifestations and are both associated with dysregulation of the myeloid cell compartment. We previously reported an expanded CD14+ monocyte cell state, MS1, in patients with bacterial sepsis, and validated expansion of this cell subpopulation in 18 patients with sepsis using publicly available transcriptomics data. Here, using published scRNA-seq datasets, we show that the gene expression program associated with MS1 correlated with sepsis severity and was up-regulated in monocytes from patients with severe COVID-19. To examine the ontogeny and function of MS1 cells, we developed a cellular model for inducing CD14+ MS1 monocytes by treating human hematopoietic stem and progenitor cells (HSPCs) from healthy bone marrow donors in culture with plasma from patients with severe bacterial infection or SARS-CoV-2 infection. We demonstrated that plasma from patients with bacterial sepsis or COVID-19 induced myelopoiesis in HSPCs in vitro and expression of the MS1 gene program in monocytes and neutrophils that differentiated from these HSPCs. Furthermore, we found that plasma concentrations of IL-6, and to a lesser extent IL-10, correlated with increased myeloid cell output from HSPCs in vitro and enhanced expression of the MS1 gene program. We validated the requirement for these two cytokines to induce the MS1 gene program through CRISPR-Cas9 editing of their receptors in HSPCs. Using this cellular model system, we demonstrated that MS1 cells were broadly immunosuppressive and showed decreased responsiveness to stimulation with a synthetic RNA analog. Our in vitro study suggests a potential role for systemic cytokines in inducing myelopoiesis during severe bacterial infection or SARS-CoV-2 infection.
Cirrhotic cardiomyopathy causes variable degrees of systolic and diastolic dysfunction (DD) and conduction abnormalities. The primary aim of our study was to determine whether pre-transplant DD and a prolonged corrected QT interval (QTc) predict a composite of mortality, graft failure, and major cardiovascular events after liver transplantation. We also evaluated the reversibility of cirrhotic cardiomyopathy after transplantation. Adult patients who underwent liver transplantation at our institution from January 2007 to March 2009 were included. Data were obtained from an institutional registry, medical record review, and evaluation of echocardiographic images. Among 243 patients, 113 (46.5%) had grade 1 DD, 16 (6.6%) had grade 2 DD, and none had grade 3 DD. The mean pre-transplant QTc was 453 milliseconds. After a mean post-transplant follow-up of 5.2 years, 75 (31%) patients met the primary composite outcome. Cox regression analysis showed no significant association between DD and the composite outcome (P=.17). However, a longer QTc was independently associated with the composite outcome (HR 1.01, 95% confidence interval 1.00-1.02, P=.05). DD (P<.001) and left ventricular mass index (P=.001) worsened after transplantation. In conclusion, QTc prolongation appears to be associated with worse outcomes. Although DD did not affect outcomes, it worsened significantly after transplantation.
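An HR of 1.01 per unit of QTc can look negligible, but hazard ratios scale multiplicatively with the covariate increment. A minimal sketch of that arithmetic, assuming the reported HR is per 1 ms of QTc (the abstract reports QTc in milliseconds but does not state the unit of the HR explicitly):

```python
# Rescale a per-unit hazard ratio to a larger covariate increment.
# Assumption: the reported HR of 1.01 is per 1 ms of QTc.
def rescale_hr(hr_per_unit: float, increment: float) -> float:
    """Hazard ratios scale multiplicatively: HR_k = HR_1 ** k."""
    return hr_per_unit ** increment

hr_per_ms = 1.01
print(round(rescale_hr(hr_per_ms, 10), 3))  # per 10 ms: 1.105
print(round(rescale_hr(hr_per_ms, 50), 3))  # per 50 ms: 1.645
```

Under this assumption, a 50 ms longer QTc corresponds to roughly a 64% higher hazard, which is why a per-millisecond HR barely above 1.0 can still be clinically meaningful.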
Background: Limited data exist regarding ventilation in patients with class III obesity [body mass index (BMI) > 40 kg/m²] and acute respiratory distress syndrome (ARDS). The aim of the present study was to determine whether individualized titration of mechanical ventilation according to cardiopulmonary physiology reduces mortality in patients with class III obesity and ARDS. Methods: In this retrospective study, we enrolled adults admitted to the ICU from 2012 to 2017 who had class III obesity and ARDS and received mechanical ventilation for > 48 h. Enrolled patients were divided into two cohorts: one cohort (2012-2014) had ventilator settings determined by the ARDSnet table for lower positive end-expiratory pressure/higher inspiratory fraction of oxygen (standard protocol-based cohort); the other cohort (2015-2017) had ventilator settings determined by an individualized protocol established by a lung rescue team (lung rescue team cohort). The lung rescue team used lung recruitment maneuvers, esophageal manometry, and hemodynamic monitoring. Results: The standard protocol-based cohort included 70 patients (BMI = 49 ± 9 kg/m²), and the lung rescue team cohort included 50 patients (BMI = 54 ± 13 kg/m²). Compared with the lung rescue team cohort, patients in the standard protocol-based cohort had almost double the risk of dying at 28 days [31% versus 16%, P = 0.012; hazard ratio (HR) 0.32; 95% confidence interval (CI95%) 0.13-0.78] and at 3 months (41% versus 22%, P = 0.006; HR 0.35; CI95% 0.16-0.74), and this effect persisted at 6 months and 1 year (incidence of death unchanged: 41% versus 22%, P = 0.006; HR 0.35; CI95% 0.16-0.74). Conclusion: Individualized titration of mechanical ventilation by a lung rescue team was associated with decreased mortality compared with use of an ARDSnet table.
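The "almost double the risk" phrasing and the HR < 1 can both be sanity-checked from the reported numbers. A hedged sketch, assuming (as the direction of the mortality figures suggests) that HR 0.32 is for the lung rescue team cohort relative to the standard protocol-based cohort:

```python
# Crude 28-day risk ratio from the reported proportions, plus the
# inverse of the reported hazard ratio.
# Assumption: HR 0.32 is lung-rescue-team vs. standard cohort.
standard_mortality = 0.31
rescue_mortality = 0.16
crude_rr = standard_mortality / rescue_mortality
print(round(crude_rr, 2))  # 1.94 -> "almost double the risk"

hr_rescue_vs_standard = 0.32
print(round(1 / hr_rescue_vs_standard, 2))  # 3.12, standard vs. rescue
```

The crude risk ratio (~1.9) and the inverted hazard ratio (~3.1) differ because the HR accounts for time-to-event and follow-up, not just the raw proportions at a fixed time point.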
Anti-Galα1-3Gal antibodies (anti-αGal Ab) are a major barrier to clinical xenotransplantation, as they are believed to initiate both hyperacute and acute humoral rejection. Extracorporeal immunoadsorption (EIA) with αGal oligosaccharide columns temporarily depletes anti-αGal Ab, but their return is ultimately associated with graft destruction. We therefore assessed the ability of two immunotoxins (IT) and two monoclonal antibodies (mAb) to deplete B and/or plasma cells both in vitro and in vivo in baboons, and to observe the rate of return of anti-αGal Ab following EIA. The effects of the mouse anti-human IT anti-CD22-ricin A (αCD22-IT, directed against a B cell determinant) and anti-CD38-ricin A (αCD38-IT, B and plasma cell determinant), the mouse anti-human anti-CD38 mAb (αCD38 mAb), and the mouse/human chimeric anti-human anti-CD20 mAb (αCD20 mAb, Rituximab, B cell determinant) on B and plasma cell depletion and anti-αGal Ab production were assessed both in vitro and in vivo in baboons (n = 9) that had previously undergone splenectomy. For comparison, two baboons received nonmyeloablative whole body irradiation (WBI) (300 cGy), and one received myeloablative WBI (900 cGy). Depletion of B cells was monitored by flow cytometry of blood, bone marrow (BM), and lymph nodes (LN), staining with anti-CD20 and/or anti-CD22 mAbs, and by histology of LN. EIA was carried out after the therapy, and anti-αGal Ab levels were measured daily. In vitro, αCD22-IT inhibited protein synthesis in the human Daudi B cell line more effectively than αCD38-IT. Upon differentiation of B cells into plasma cells, however, less inhibition of protein synthesis after αCD22-IT treatment was observed.
Depleting CD20-positive cells in vitro from a baboon spleen cell population already depleted of granulocytes, monocytes, and T cells led to a relative enrichment of CD20-negative cells, that is, plasma cells, and consequently resulted in a significant increase in anti-αGal Ab production by the remaining cells, whereas depleting CD38-positive cells resulted in a significant decrease in anti-αGal Ab production. In vivo, WBI (300 or 900 cGy) resulted in 100% B cell depletion in blood and BM and > 80% depletion in LN, with substantial recovery of B cells after 21 days and only transient reduction in anti-αGal Ab after EIA. αCD22-IT depleted B cells by > 97% in blood and BM, and by 60% in LN, but a rebound of B cells was observed after 14 and 62 days in LN and blood, respectively. At 7 days, serum anti-αGal IgG and IgM Ab levels were reduced by a maximum of 40-45%, followed by a rebound to levels up to 12-fold baseline anti-αGal Ab by day 83 in one baboon. The results obtained with αCD38-IT were inconclusive. This may have been due, in part, to inadequate conjugation of the toxin. Cell coating was 100% with αCD38 mAb, but no changes in anti-αGal Ab production were obs...
This is the first study to report an efficient decrease of phagocytic function by depletion of macrophages with MLs in a large-animal model. Depletion of macrophages with MLs led to initial higher chimerism and prolonged the survival of circulating pig cells in baboons. Blockade of macrophage function with IVIg had a more modest effect. Cells of the RES, therefore, play a major role in clearing pPBPCs from the circulation in baboons. Depletion or blockade of the RES may contribute to achieving mixed hematopoietic chimerism and induction of tolerance to a discordant xenograft.
Although most patients presenting for liver transplantation have normal left ventricular function, some develop left ventricular failure after transplantation. The primary objective of our study was to determine the predictors of systolic heart failure (HF) occurring immediately after liver transplantation. Its etiology, prospects of recovery, and factors associated with nonrecovery were also studied. Liver transplantations performed at our institution from January 2006 to February 2015 were evaluated using prospectively collected institutional registries. Patients with an echocardiographically documented decline in ejection fraction to <45% within 6 months after liver transplantation were identified. Four controls were chosen per case, matched for age, gender, transplant year, and model for end-stage liver disease score. Conditional multivariable logistic regression was used for the primary analysis and nonparametric tests for comparisons between groups. In a cohort of 1284 adult patients, 45 cases and 180 controls were identified. Diastolic dysfunction (DD) was an independent predictor (OR 5.26, 95% CI 1.03-28.57, P = .04) of systolic HF in multivariable analysis. Stress-induced cardiomyopathy was the most common etiology. Left ventricular function recovered in 21 patients. Pretransplant DD decreased the chances of recovery (P = .05). In conclusion, patients with pretransplant DD need close post-transplant follow-up for timely identification of HF.
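The wide confidence interval on the odds ratio (1.03-28.57) admits a quick internal-consistency check: Wald CIs for an OR are symmetric on the log scale, so the geometric mean of the bounds should roughly recover the point estimate. A sketch using the reported values:

```python
import math

# Wald CIs for an odds ratio are symmetric in log(OR), so the
# geometric mean of the CI bounds approximates the point estimate.
lower, upper = 1.03, 28.57
geometric_mean = math.sqrt(lower * upper)
print(round(geometric_mean, 2))  # 5.42, close to the reported OR of 5.26
```

The small discrepancy (5.42 vs. 5.26) is consistent with rounding of the reported bounds; a lower bound barely above 1.0 with so wide an interval also explains why the P value sits right at the .04 threshold.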
With the increasing age of recipients undergoing orthotopic liver transplant (OLT), there is a need for better risk stratification among them. Our study aimed to identify predictors of poor outcome among OLT recipients ≥ 60 yr of age. All patients who underwent OLT at Cleveland Clinic from January 2004 to April 2010 were included. Baseline patient characteristics and post-OLT outcomes (mortality, graft failure, length of stay, and major post-OLT cardiovascular events) were obtained from a prospectively collected institutional registry. Among patients ≥ 60 yr of age, multivariate regression modeling was performed to identify independent predictors of poor outcome. Of the 738 patients included, 223 (30.2%) were ≥ 60 yr. Hepatic encephalopathy, platelet counts < 45,000/μL, total serum bilirubin > 3.5 mg/dL, and serum albumin < 2.65 mg/dL independently predicted poor short-term outcomes. The presence of pre-OLT coronary artery disease and arrhythmia were independent predictors of poor long-term outcomes. Cardiac causes represented the second most common cause of mortality in the elderly cohort. Nevertheless, this carefully selected cohort of older OLT recipients had outcomes comparable with those of the younger recipients. Thus, our results show the need for better pre-OLT evaluation and optimization, and for closer post-OLT surveillance, of cardiovascular disease among the elderly.