The field of liver transplantation has shifted considerably in the MELD era, with changes in allocation, immunosuppression, and liver failure etiologies, as well as improved supportive therapies. Our aim was to evaluate the predictive accuracy of the MELD score over time. The United Network for Organ Sharing provided de-identified data on 120 156 patients listed for liver transplant from 2002 to 2016. The ability of the MELD score to predict 90-day mortality was evaluated by a concordance (C-) statistic and corroborated with competing risk analysis. The MELD score's concordance with 90-day mortality declined from 0.80 in 2003 to 0.70 in 2015. While laboratory MELD scores at listing and at transplant climbed over that interval, the score at waitlist death remained steady near 35. Age at listing increased from 50 to 54 years. HCV-positive status at listing dropped from 33% to 17%. The concordance of MELD and mortality does not differ by age (>60 = 0.73, <60 = 0.74), but is lower in the diseases that are increasing most rapidly (alcoholic liver disease and non-alcoholic fatty liver disease) and higher in those that are declining, particularly HCV-positive disease (HCV positive = 0.77; HCV negative = 0.73). While MELD still predicts mortality, its accuracy has decreased; the changing etiology of liver disease may contribute.
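The C-statistic reported above is computed against a binary endpoint (90-day mortality), where it coincides with the area under the ROC curve. The following is a minimal sketch of that calculation; the MELD scores and outcomes are invented for illustration and are not UNOS data.

```python
# Minimal sketch: C-statistic for a binary 90-day mortality outcome,
# where it reduces to the AUROC. All values below are hypothetical.
from sklearn.metrics import roc_auc_score

meld_scores = [12, 25, 31, 9, 38, 28, 17, 40]  # hypothetical lab MELD at listing
died_90d    = [0,  1,  1,  0,  1,  0,  0,  1]  # hypothetical 90-day mortality (1 = died)

c_statistic = roc_auc_score(died_90d, meld_scores)
print(f"C-statistic: {c_statistic:.2f}")  # 0.5 = chance, 1.0 = perfect discrimination
```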
IMPORTANCE With continuing improvements in medical devices and more than a decade since the 2006 United Network for Organ Sharing (UNOS) allocation policy, it is pertinent to assess survival among patients on the heart transplantation waiting list, especially given the recently approved 2018 UNOS allocation policy. OBJECTIVES To assess survival outcomes among patients on the heart transplant waiting list during the past 3 decades and to examine the association of ventricular assist devices (VADs) and the 2006 UNOS allocation policy with survival. DESIGN, SETTING, AND PARTICIPANTS A retrospective cross-sectional study used the UNOS database to analyze 95 323 candidates wait-listed for heart transplantation between January 1, 1987, and December 29, 2017. Candidates for all types of combined transplants were excluded (n = 2087). Patients were followed up from the time of listing to death, transplantation, or removal from the list due to clinical improvement. Competing-risk, Kaplan-Meier, and multivariable Cox proportional hazards regression analyses were used. MAIN OUTCOMES AND MEASURES The analysis involved unadjusted and adjusted survival analyses in which the primary outcome was death on the waiting list. Because of changing waiting list preferences and policies during the study period, the intrinsic risk of death for wait-listed candidates was assessed by individually analyzing, comparing, and adjusting for several candidate risk factors. RESULTS In total, 95 323 candidates (72 915 men [76.5%]; mean [SD] age, 51.9 [12.0] years) were studied. In the setting of changes in listing preferences, 1-year survival on the waiting list increased from 34.1% in 1987-1990 to 67.8% in 2011-2017 (difference in proportions, 0.34; 95% CI, 0.32-0.36; P < .001). The 1-year waiting list survival for candidates with VADs increased from 10.2% in 1996-2000 to 70.0% in 2011-2017 (difference in proportions, 0.60; 95% CI, 0.58-0.62; P < .001). Similarly, in the setting of changing mechanical circulatory support indications, the 1-year waiting list survival for patients without VADs increased from 53.9% in 1996-2000 to 66.5% in 2011-2017 (difference in proportions, 0.13; 95% CI, 0.12-0.14; P < .001). In the decade before the 2006 UNOS allocation policy, the 1-year waiting list survival was 51.1%, while in the decade after it was 63.9% (difference in proportions, 0.13; 95% CI, 0.12-0.14; P < .001). In adjusted analysis, each time period after 1987-1990 showed a marked decrease in waiting list mortality. CONCLUSIONS AND RELEVANCE This study found temporally associated increases in heart transplant waiting list survival for all patient groups (with or without VADs, UNOS status 1 and status 2 candidates, and candidates with poor functional status).
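The era-to-era comparisons above are reported as differences in proportions with 95% CIs. A hedged sketch of that computation (a Wald interval for the difference between two 1-year survival proportions) follows; the counts are fabricated for illustration and are not the UNOS figures.

```python
# Wald 95% CI for the difference between two survival proportions.
# The counts are invented for illustration, not UNOS data.
from math import sqrt

def diff_in_proportions_ci(x1, n1, x2, n2, z=1.96):
    """Return (p2 - p1, lower, upper) for survivors x out of n in each era."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p2 - p1
    se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# Hypothetical counts: 1-year survivors / candidates listed in each era
diff, lo, hi = diff_in_proportions_ci(341, 1000, 678, 1000)
print(f"difference in proportions: {diff:.2f} (95% CI, {lo:.2f}-{hi:.2f})")
```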
Objectives: Acute-on-chronic liver failure (ACLF), while increasingly well-defined in adults, has been poorly characterized in pediatric patients beyond its poor prognosis. This study aimed to identify ACLF and evaluate its prognosis in the American pediatric population. Methods: Modified ACLF definitions (p-CLIF) were applied to 11,300 children listed for liver transplantation from March 2002 through 2017 in the Organ Procurement and Transplantation Network (OPTN) database. Results: Pediatric ACLF patients have greater mortality within 90 days of listing (46.6% by p-CLIF) than patients with other types of liver failure (<30%), including acute liver failure. They also have greater mortality within the first 30 and 90 days after transplantation than all other types of liver failure, but do not have increased mortality relative to other groups between 90 and 365 days after transplant. Although some ACLF listings also received 1B status, ACLF mortality at 90 days was greater than that of the general 1B population (50% vs 29.4%). Model for End-Stage Liver Disease/Pediatric End-Stage Liver Disease scores of ACLF patients are lower than those of 1B listings and do not predict waitlist or posttransplant death. A greater number of organ failures does correlate with increased mortality. Biliary atresia is the leading etiology of pediatric chronic liver disease, accounting for over 30% of chronic and 45% of ACLF listings, yet it is protective against mortality (hazard ratio [HR] = 0.142 for ACLF). Receiving exception approval is independently and similarly protective in ACLF (HR = 0.145). Conclusions: These findings pose a challenge for allocation decisions but indicate that greater attention to ACLF is needed, as scoring systems may not capture these children's risk of early death, which currently appears to be mitigated by exceptions. A multicenter, clinical, and preferably prospective study of ACLF is necessary to determine how to prioritize ACLF relative to other types of liver failure, given its relatively higher early mortality.
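Hazard ratios like those reported above (e.g., HR = 0.142 for biliary atresia) come from proportional hazards modeling. Below is an illustrative Cox fit assuming the lifelines library; the tiny dataset is fabricated for demonstration and does not reflect OPTN data.

```python
# Illustrative Cox proportional hazards fit; exp(coef) is the hazard ratio.
# The toy data are fabricated and do not reflect OPTN records.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "days_on_list":    [10, 45, 90, 30, 60, 90, 20, 80],
    "died":            [1,  1,  0,  1,  0,  0,  1,  1],   # 1 = waitlist death
    "biliary_atresia": [0,  0,  1,  0,  1,  1,  0,  1],   # hypothetical covariate
})

cph = CoxPHFitter()
cph.fit(df, duration_col="days_on_list", event_col="died")
print(cph.summary["exp(coef)"])  # hazard ratio for each covariate
```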
Neovascularization is an understudied aspect of calcific aortic valve disease (CAVD). Within diseased valves, cells along the periphery of neovessels stain for pericyte markers, but it is unclear whether valvular interstitial cells (VICs) can adopt a pericyte-like phenotype. This investigation examined the perivascular potential of VICs to regulate valve endothelial cell (VEC) organization and explored the role of Angiopoietin1-Tie2 signaling in this process. Porcine VECs and VICs were fluorescently tracked and co-cultured in Matrigel over 7 days. VICs regulated early VEC network organization in a ROCK-dependent manner, then guided later VEC network contraction through chemoattraction. Unlike vascular control cells, the valve cell cultures ultimately formed invasive spheroids with 3D angiogenic-like sprouts. VECs co-cultured with VICs displayed significantly more invasion than VECs alone, with VICs generally leading and wrapping around the VEC invasive sprouts. Lastly, Angiopoietin1-Tie2 signaling was found to regulate valve cell organization during VEC/VIC spheroid formation and invasion. VICs demonstrated pericyte-like behaviors toward VECs throughout sustained co-culture. The change from a vasculogenic network to an invasive sprouting spheroid suggests that both cell types undergo phenotypic changes during long-term culture in this model angiogenic environment. Valve cells organizing into spheroids and undergoing 3D invasion of Matrigel demonstrated several typical angiogenic-like phenotypes dependent on basal levels of Angiopoietin1-Tie2 signaling and ROCK activation. These results suggest that a sustained ectopic angiogenic environment during the early stages of valve disease promotes organized activity by both VECs and VICs, contributing to neovessel formation and the progression of CAVD.
Both ROCK and Rac1 inhibition interfered with key processes in vascular network formation by valve ECs. This is the first report of manipulation of valve EC vasculogenic organization in response to small-molecule inhibitors. Further study is warranted to understand this facet of valvular cell biology and pathology and how it differs from vascular biology.
There is a shortage of pediatric liver allografts available for transplantation, and yet more than 100 allografts are discarded each year. In 2006, the donor risk index (DRI) was developed using donor factors and cold ischemia time to predict liver graft failure.[1] However, the DRI has little clinical relevance because of its poor predictive capacity.[2-5] As a result, marginal allografts are accepted based on the surgeon's subjective assessment of the donor allograft at the time of donation. This assessment, coupled with liver biopsy results, has often resulted in allografts being accepted despite adverse laboratory values and demographic characteristics.[5,6] Conversely, allografts that might otherwise have been successfully transplanted were likely discarded.
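For context, the DRI takes the form of a Cox-style relative risk: the exponential of a weighted sum of donor covariates. The sketch below shows only that general form; the covariates and coefficients are placeholders and are not the published 2006 DRI weights.

```python
# General form of an exponential risk index such as the DRI.
# Coefficients and covariates here are placeholders, NOT the
# published 2006 DRI coefficients.
from math import exp

def risk_index(donor_age, cold_ischemia_hours, dcd_donor,
               beta_age=0.01, beta_cit=0.01, beta_dcd=0.2):
    """Cox-style relative risk: exp(sum of coefficient * covariate)."""
    return exp(beta_age * donor_age
               + beta_cit * cold_ischemia_hours
               + beta_dcd * int(dcd_donor))

print(f"DRI-style score: {risk_index(55, 8.0, True):.2f}")
```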