Acetaminophen‐induced acute liver failure (AALF) is associated with innate immunity activation, which contributes to the severity of hepatic injury and clinical outcome. A marked increase in hepatic macrophages (h‐mϕ) is observed in experimental models of AALF, but controversy exists regarding their role, implicating h‐mϕ in both aggravation and resolution of liver injury. The role of h‐mϕ in human AALF is virtually unexplored. We sought to investigate the role of chemokine (C‐C motif) ligand 2 (CCL2) in the recruitment of circulating monocytes to the inflamed liver and to determine how the h‐mϕ infiltrate and liver microenvironment may contribute to tissue repair versus inflammation in AALF. We evaluated circulating monocytes, their chemokine (C‐C motif) receptor 2 (CCR2) expression, and serum CCL2 levels in patients with AALF. Cell subsets and numbers of circulation‐derived (MAC387+) or resident proliferating (CD68/Ki67+) h‐mϕ in hepatic immune infiltrates were determined by immunohistochemistry. Inflammatory cytokine levels were determined in whole and laser microdissected liver tissue by proteome array. In AALF, circulating monocytes were depleted, with the lowest levels observed in patients with adverse outcomes. CCL2 levels were high in AALF serum and hepatic tissue, and circulating monocyte subsets expressed CCR2, suggesting CCL2‐dependent hepatic monocyte recruitment. Significant numbers of both MAC387+ and CD68+ h‐mϕ were found in AALF compared with control liver tissue with a high proportion expressing the proliferation marker Ki67. Levels of CCL2, CCL3, interleukin (IL)‐6, IL‐10, and transforming growth factor‐β1 were significantly elevated in AALF liver tissue relative to chronic liver disease controls. Conclusion: In AALF, the h‐mϕ population is expanded in areas of necrosis, both through proliferation of resident cells and CCL2‐dependent recruitment of circulating monocytes. 
The presence of h‐mϕ within an anti‐inflammatory/regenerative microenvironment indicates that they are implicated in resolution of inflammation/tissue repair processes during AALF. (HEPATOLOGY 2012)
Postoperative acute kidney injury (AKI) increases morbidity and mortality after liver transplantation (LT). Novel methods of assessing AKI, including cystatin C (CyC) and neutrophil gelatinase-associated lipocalin (NGAL), have been identified as potential markers of AKI. We compared the ability of standard renal markers (serum creatinine [sCr] and estimated glomerular filtration rate [eGFR]) and intensive therapy unit organ failure scores with CyC and NGAL to predict AKI within the first 48 hours after LT. Ninety-five patients (median age 50 [interquartile range = 41-59], 60% male) underwent LT (25% with acute liver failure). AKI was defined according to the Acute Kidney Injury Network criteria; severe AKI was classified as ≥ stage 2. NGAL (urine [u] and plasma [p]) and CyC concentrations, taken immediately after transplantation on admission to the Liver Intensive Care Unit, were compared with standard markers of renal function. Predictive ability was assessed using the area under the curve generated by receiver operating characteristic analysis (AUROC) and logistic regression. Day 0 sCr, uNGAL, pNGAL, CyC, and eGFR predicted AKI, as did SOFA (Sequential Organ Failure Assessment) and APACHE II (Acute Physiology and Chronic Health Evaluation II) scores. APACHE II and pNGAL were the most powerful predictors of severe AKI (APACHE II AUROC = 0.87 [0.77-0.97], P < 0.001; pNGAL AUROC = 0.87 [0.77-0.92], P < 0.001). In multivariate logistic regression, APACHE II (odds ratio = 1.64/point [95% confidence interval = 1.22-2.21], P = 0.001) and pNGAL (odds ratio = 1.01/ng/mL [95% confidence interval = 1.00-1.02], P = 0.002) retained independent significance. A "renal risk score" using APACHE II > 13 and pNGAL > 258 ng/mL was calculated, with a score of ≥ 1 having 100% sensitivity and 76% specificity for severe AKI. In conclusion, a combination of NGAL and APACHE II predicts AKI with high sensitivity and specificity after LT. Liver Transpl 16:1257-1266, 2010.
© 2010 AASLD. Received March 14, 2010; accepted July 15, 2010.
Renal dysfunction is common after liver transplantation (LT). The incidence of acute renal failure complicating the posttransplant period varies between 48% and 94%,1 and affects both short-term and long-term outcome. Mortality in those requiring renal replacement therapy (RRT) may be as high as 40% at 90 days, rising to 54% at 1 year.2 Although many risk factors for developing renal dysfunction posttransplant have been investigated, the greatest impact on outcome is in patients who develop de novo renal impairment, especially in those who require RRT.2 Patients with low glomerular filtration rates (GFR) at 1 month post-LT are also at risk of developing severe renal dysfunction at 5 years post-LT,3 and the need for dialysis in LT recipients has been reported to be as high as 18% at 5 years.
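The "renal risk score" described in the abstract above is a simple two-item rule: one point for APACHE II > 13 and one for plasma NGAL > 258 ng/mL, with a score of ≥ 1 flagging severe AKI (100% sensitivity, 76% specificity in this cohort). A minimal sketch of that rule, with hypothetical function names, follows; it illustrates the published thresholds only and is not a clinical tool.

```python
def renal_risk_score(apache_ii: float, p_ngal: float) -> int:
    """Two-item risk score: one point for APACHE II > 13,
    one point for plasma NGAL > 258 ng/mL (thresholds from the study)."""
    score = 0
    if apache_ii > 13:
        score += 1
    if p_ngal > 258:
        score += 1
    return score


def flags_severe_aki(apache_ii: float, p_ngal: float) -> bool:
    # In the study cohort, a score >= 1 had 100% sensitivity and
    # 76% specificity for severe (>= stage 2) AKI.
    return renal_risk_score(apache_ii, p_ngal) >= 1
```

Because the rule is an "either criterion" flag, a patient exceeding just one threshold is already classed as high risk.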
Patients of African origin who contract HDV less often have cirrhosis. Patients with HDV and detectable viral load have worse clinical outcomes. Patients with HDV genotype 5 less often develop hepatic decompensation. Patients with HDV genotype 5 seem to respond better to peg-IFN treatment.
HBeAg seroconversion marks an important spontaneous change and treatment end-point for HBeAg-positive patients and is a prerequisite for HBsAg loss or functional cure. In this retrospective analysis, we aimed to identify predictors of seroconversion using quantitative serum HBsAg and HBcrAg in HBeAg-positive patients treated with nucleos(t)ide analogues (NA). Data and samples from 118 HBeAg-positive adults (genotypes A-G) started on NA between January 2005 and September 2016 were retrospectively analysed at several time-points. The predictive power of on-treatment levels of HBsAg and HBcrAg was determined using receiver operating characteristic (ROC) analysis, with cut-off values determined by the maximized Youden's index. Overall, 36.4% of patients achieved HBeAg seroconversion after a median of 39 months of treatment. On-treatment kinetics of HBV DNA, HBsAg and HBcrAg differed between HBeAg seroconverters and non-seroconverters. A combination of HBsAg and HBcrAg had the greatest predictive value for HBeAg seroconversion: at 6 months, HBsAg of 3.9 log IU/mL and HBcrAg of 5.7 log U/mL had a sensitivity of 71.4%, specificity of 79.5%, positive predictive value (PPV) of 65.2% and negative predictive value (NPV) of 83.8%, with an AUROC of 0.769 (95% CI 0.668-0.869); at 12 months, HBsAg of 3.8 log IU/mL and HBcrAg of 5.5 log U/mL had a sensitivity of 73.7%, specificity of 79.5%, PPV of 63.6% and NPV of 86.1%, with an AUROC of 0.807 (95% CI 0.713-0.901). In conclusion, our results may be used to identify patients who are unlikely to achieve treatment end-points, which will be important as the future management of chronic hepatitis B moves toward therapies that offer functional cure.
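The biomarker cut-offs above were chosen by maximizing Youden's index (J = sensitivity + specificity − 1) over the ROC curve. A minimal sketch of that selection procedure, assuming lower marker values predict seroconversion (so "positive" means value ≤ cut-off), is shown below with illustrative data, not study data.

```python
def youden_cutoff(values, labels):
    """Return the cut-off (predict positive when value <= cut-off) that
    maximizes Youden's index J = sensitivity + specificity - 1.
    `labels` are 1 for the outcome of interest (e.g. seroconversion)."""
    best_j, best_cut = -1.0, None
    for cut in sorted(set(values)):
        tp = sum(1 for v, y in zip(values, labels) if v <= cut and y == 1)
        fn = sum(1 for v, y in zip(values, labels) if v > cut and y == 1)
        tn = sum(1 for v, y in zip(values, labels) if v > cut and y == 0)
        fp = sum(1 for v, y in zip(values, labels) if v <= cut and y == 0)
        sens = tp / (tp + fn) if (tp + fn) else 0.0
        spec = tn / (tn + fp) if (tn + fp) else 0.0
        j = sens + spec - 1
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut, best_j
```

Scanning every observed value as a candidate cut-off mirrors how ROC software picks the operating point; in practice this is usually done with a statistics package rather than by hand.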
Aim Megafires are increasing in intensity and frequency globally. The impacts of megafires on biodiversity can be severe, so conservation managers must be able to respond rapidly to quantify their impacts, initiate recovery efforts and consider conservation options within and beyond the burned extent. We outline a framework that can be used to guide conservation responses to megafires, using the 1.5 million hectare 2019/2020 megafires in Victoria, Australia, as a case study. Location Victoria, Australia. Methods Our framework uses a suite of decision support tools, including species attribute databases, ~4,200 species distribution models and a spatially explicit conservation action planning tool to quantify the potential effects of megafires on biodiversity, and identify species‐specific and landscape‐scale conservation actions that can assist recovery. Results Our approach identified 346 species in Victoria that had >40% of their modelled habitat affected by the megafire, including 45 threatened species, and 102 species with >40% of their modelled habitat affected by high severity fire. We then identified 21 candidate recovery actions that are expected to assist the recovery of biodiversity. For relevant landscape‐scale actions, we identified locations within and adjacent to the megafire extent that are expected to deliver cost‐effective conservation gains. Main conclusion The 2019/2020 megafires in south‐eastern Australia affected the habitat of many species and plant communities. Our framework identified a range of single‐species (e.g., supplementary feeding, translocation) and landscape‐scale actions (e.g., protection of refuges, invasive species management) that can help biodiversity recover from megafires. Conservation managers will be increasingly required to rapidly identify conservation actions that can help species recover from megafires, especially under a changing climate. 
Our approach brings together commonly used datasets (e.g., species distribution maps, trait databases, fire severity mapping) to help guide conservation responses and can be used to help biodiversity recover from future megafires across the world.
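The habitat screen in the megafire framework above amounts to overlaying each species' modelled distribution with the fire extent and flagging species whose affected fraction exceeds 40%. A toy sketch of that overlap calculation on gridded habitat cells follows; species names and cell IDs are hypothetical, and the real analysis used ~4,200 spatial distribution models rather than cell sets.

```python
def fraction_burned(habitat_cells: set, burned_cells: set) -> float:
    """Fraction of a species' modelled habitat grid cells that fall
    inside the fire extent."""
    if not habitat_cells:
        return 0.0
    return len(habitat_cells & burned_cells) / len(habitat_cells)


def flag_affected(species_habitats: dict, burned_cells: set,
                  threshold: float = 0.40) -> list:
    """List species whose burned-habitat fraction exceeds the threshold
    (the paper's screen used >40% of modelled habitat)."""
    return [species for species, cells in species_habitats.items()
            if fraction_burned(cells, burned_cells) > threshold]
```

The same screen can be rerun with a severity-filtered fire layer to reproduce the second count (species with >40% of habitat affected by high-severity fire).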
Reductions in serum levels of Gc globulin, a hepatically synthesized component of the extracellular actin scavenger system responsible for complexing circulating actin and attenuating intravascular microthrombus formation, are associated with poor outcome in acute liver failure. Clinically applicable assays of the important actin-free fraction (Af-Gc) have not been available until now. We measured actin-free Gc globulin levels with a novel, rapid assay in 61 cases of acute liver failure (ALF) and in 91 patients with cirrhosis (40 of whom were clinically unstable with extrahepatic organ dysfunction), and studied associations with liver dysfunction, extrahepatic organ dysfunction, indices of disseminated coagulation, and outcome. Reductions in Af-Gc levels mirrored hepatic dysfunction and organ dysfunction in both groups, and discriminated patients with poor prognosis from those with good prognosis in the ALF cohort. Levels were lowest in patients with ALF (10% of control values), but levels were also markedly reduced in both unstable (28%) and stable (44%) patients with cirrhosis. Associations with markers of disseminated intravascular coagulation were seen in both groups, most notably in the cirrhosis cohort, supporting a pathophysiological role for reduced Af-Gc in the evolution of organ dysfunction. In acetaminophen-induced ALF, Af-Gc identified patients with poor prognosis as well as did the Acute Physiology and Chronic Health Evaluation (APACHE II) score (area under the receiver operating characteristic curve, 0.7), and in cirrhosis, Af-Gc was an independent predictor of mortality by multifactorial analysis. In conclusion, the importance of Af-Gc reductions in the development of multiple organ dysfunction in ALF and cirrhosis is highlighted, probably resulting from reduced hepatic production and peripheral exhaustion of this arm of the extracellular actin scavenger system.
The carbon stability of fire-tolerant forests is often assumed but less frequently assessed, limiting the potential to anticipate threats to forest carbon posed by predicted increases in forest fire activity. Assessing the carbon stability of fire-tolerant forests requires multi-indicator approaches that recognize the myriad ways that fires influence the carbon balance, including combustion, deposition of pyrogenic material, and tree death, post-fire decomposition, recruitment, and growth. Five years after a large-scale wildfire in southeastern Australia, we assessed the impacts of low- and high-severity wildfire, with and without prescribed fire (≤10 yr before), on carbon stocks in multiple pools, and on carbon stability indicators (carbon stock percentages in live trees and in small trees, and carbon stocks in char and fuels) in fire-tolerant eucalypt forests. Relative to unburned forest, high-severity wildfire decreased short-term (five-year) carbon stability by significantly decreasing live tree carbon stocks and percentage stocks in live standing trees (reflecting elevated tree mortality), by increasing the percentage of live tree carbon in small trees (those vulnerable to the next fire), and by potentially increasing the probability of another fire through increased loads of elevated fine fuels. In contrast, low-severity wildfire enhanced carbon stability by having negligible effects on aboveground stocks and indicators, and by significantly increasing carbon stocks in char and, in particular, soils, indicating pyrogenic carbon accumulation. Overall, recent preceding prescribed fire did not markedly influence wildfire effects on short-term carbon stability at stand scales. Despite wide confidence intervals around mean stock differences, indicating uncertainty about the magnitude of fire effects in these natural forests, our assessment highlights the need for active management of carbon assets in fire-tolerant eucalypt forests under contemporary fire regimes.
Decreased live tree carbon and increased reliance on younger cohorts for carbon recovery after high-severity wildfire could increase vulnerabilities to imminent fires, leading to decisions about interventions to maintain the productivity of some stands. Our multi-indicator assessment also highlights the importance of considering all carbon pools, particularly pyrogenic reservoirs like soils, when evaluating the potential for prescribed fire regimes to mitigate the carbon costs of wildfires in fire-prone landscapes.