Identifying transplant recipients in whom immunological tolerance is established or is developing would allow an individually tailored approach to their posttransplantation management. In this study, we aimed to develop reliable and reproducible in vitro assays capable of detecting tolerance in renal transplant recipients. Several biomarkers and bioassays were screened on a training set that included 11 operationally tolerant renal transplant recipients, recipient groups following different immunosuppressive regimes, recipients undergoing chronic rejection, and healthy controls. Highly predictive assays were repeated on an independent test set that included 24 tolerant renal transplant recipients. Tolerant patients displayed an expansion of peripheral blood B and NK lymphocytes, fewer activated CD4+ T cells, a lack of donor-specific antibodies, donor-specific hyporesponsiveness of CD4+ T cells, and a high ratio of forkhead box P3 to α-1,2-mannosidase gene expression. Microarray analysis further revealed in tolerant recipients a bias toward differential expression of B cell-related genes and their associated molecular pathways. By combining these indices of tolerance as a cross-platform biomarker signature, we were able to identify tolerant recipients in both the training set and the test set. This study provides an immunological profile of the tolerant state that, with further validation, should inform and shape drug-weaning protocols in renal transplant recipients.
Long-term allograft survival generally requires lifelong immunosuppression (IS). Rarely, recipients display spontaneous "operational tolerance," with stable graft function in the absence of IS. The lack of biological markers of this phenomenon precludes identification of potentially tolerant patients in whom IS could be tapered and hinders the development of new tolerance-inducing strategies. The objective of this study was to identify minimally invasive blood biomarkers for operational tolerance and use these biomarkers to determine the frequency of this state in immunosuppressed patients with stable graft function. Blood gene expression profiles from 75 renal transplant patients across clinical cohorts (operational tolerance, acute and chronic rejection, and stable graft function on IS) and 16 healthy individuals were analyzed. A subset of samples was used for microarray analysis, in which a three-class comparison of the patient groups identified a "tolerant footprint" of 49 genes. These biomarkers were applied to the prediction of operational tolerance by microarray and real-time PCR in independent test groups. Thirty-three of the 49 genes correctly segregated the tolerance and chronic rejection phenotypes, with 99% and 86% specificity. The signature was shared by 1 of 12 stable patients on triple IS and 5 of 10 on low-dose steroid monotherapy. The gene signature suggests a pattern of reduced costimulatory signaling, immune quiescence, apoptosis, and memory T cell responses. This study identifies in the blood of kidney recipients a set of genes associated with operational tolerance that may have utility as a minimally invasive monitoring tool for guiding IS titration.
Further validation of this tool for safe IS minimization in prospective clinical trials is warranted.

kidney transplantation | microarray | tolerant | genomics | immunosuppression titration

Despite continuous improvement in renal allograft survival over the last decade, the half-life of renal allografts has increased only marginally because of the accrual of chronic graft nephropathy from drug-related nephrotoxicity and chronic rejection (1, 2). Patients facing life-long immunosuppression (IS) have an increased risk of infection and malignancy (3), whereas insufficient immunosuppressive drug exposure or interruption usually increases rejection risk (4). However, spontaneous and long-term graft acceptance is observed in a small number of patients after solid-organ transplantation (5, 6), years after total withdrawal of immunosuppressive drugs, confirming that a clinical state of operational tolerance to a mismatched graft, described as "a state of quiescence of the transplanted organ, functioning without a destructive immune response" (7), can indeed occur in humans. However, the frequency of this observation in the kidney transplant population is unknown and, currently, we cannot identify patients primed to develop this immune adaptation or monitor the stability of this state of "operational tolerance." The operationally tolerant kidney transpla...
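The prediction step described above (assigning an independent sample to the tolerant or chronic rejection class from its signature-gene expression) can be illustrated with a minimal nearest-centroid classifier. This is a hedged sketch only, not the published method: the function names and toy expression values below are invented for illustration.

```python
import math

# Illustrative sketch: a nearest-centroid rule assigns a sample to whichever
# class-average expression profile (tolerant vs chronic rejection) it is
# closest to over the signature genes. Not the study's actual algorithm.

def centroid(profiles):
    """Per-gene mean of a list of expression vectors."""
    n = len(profiles)
    return [sum(p[g] for p in profiles) / n for g in range(len(profiles[0]))]

def distance(a, b):
    """Euclidean distance between two expression vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(sample, tol_centroid, cr_centroid):
    """Label a sample by its nearest class centroid."""
    if distance(sample, tol_centroid) < distance(sample, cr_centroid):
        return "TOL"
    return "CR"

# Toy expression values over two signature genes (entirely hypothetical).
tol_profiles = [[1.0, 0.0], [1.2, 0.1]]
cr_profiles = [[0.0, 1.0], [0.1, 1.2]]
label = classify([0.9, 0.05], centroid(tol_profiles), centroid(cr_profiles))
print(label)  # TOL
```

In practice, microarray classifiers of this family (e.g. nearest shrunken centroids) also standardize each gene and shrink centroids toward the overall mean; the two-gene toy above only conveys the decision rule.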
Several transplant patients maintain stable kidney graft function in the absence of immunosuppression. Here we compared the characteristics of their peripheral B cells to those of patients with stable graft function under pharmacologic immunosuppression, patients with chronic rejection, and healthy volunteers. In patients with drug-free long-term graft function (DF) there was a significant increase in both the absolute number and the frequency of total B cells, particularly activated, memory, and early memory B cells. These increased B-cell numbers were associated with a significantly enriched transcriptional B-cell profile. Costimulatory/migratory molecules (B7-2/CD80, CD40, and CD62L) were upregulated on B cells, particularly on memory CD19(+)IgD(-)CD38(+/-)CD27(+) B cells, in these patients. Their purified B cells, however, responded normally to polyclonal stimulation and showed no cytokine polarization. This phenotype was associated with several specific characteristics: an inhibitory signal (decreased FcgammaRIIA/FcgammaRIIB ratio); a signal preventing hyperactive B-cell responses (an increase in BANK1, which negatively modulates CD40-mediated AKT activation); an increased number of B cells expressing CD1d and CD5; an increased BAFF-R/BAFF ratio that could explain why these patients have more peripheral B cells; and a specific autoantibody profile. Thus, our findings show that patients with DF have a particular blood B-cell phenotype that may contribute to the maintenance of long-term graft function.
Although cold ischemia time has been widely studied in renal transplantation, there is no consensus on its precise relationship with transplantation outcomes. To study this, we sampled data from 3839 adult recipients of a first heart-beating deceased donor kidney transplanted between 2000 and 2011 within the French observational multicentric prospective DIVAT cohort. A Cox model with a piecewise log-linear function was used to assess the relationship between cold ischemia time and death-censored graft survival or patient survival. There was a significant proportional increase in the risk of graft failure for each additional hour of cold ischemia time (hazard ratio, 1.013). As an example, a patient who received a kidney with a cold ischemia time of 30 h had a risk of graft failure nearly 40% higher than a patient with a cold ischemia time of 6 h. Moreover, we found that the risk of death also increased proportionally with each additional hour of cold ischemia time (hazard ratio, 1.018). Thus, every additional hour of cold ischemia time must be taken into account in order to improve graft and patient survival. These findings are of practical clinical interest, as cold ischemia time is one of the main modifiable pre-transplantation risk factors and can be minimized by improved management of the peri-transplantation period.
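Under the log-linear (proportional hazards) assumption used above, the per-hour hazard ratio compounds multiplicatively over each additional hour, which is how the 30 h vs 6 h comparison arises. A quick check of that arithmetic, as a sketch using only the hazard ratios quoted in the abstract:

```python
# Per-hour hazard ratios quoted in the abstract.
HR_GRAFT_FAILURE_PER_HOUR = 1.013  # death-censored graft failure
HR_DEATH_PER_HOUR = 1.018          # patient death

def relative_hazard(hr_per_hour, cit_hours, ref_hours):
    """Hazard at cit_hours of cold ischemia time relative to ref_hours,
    assuming a log-linear effect (HR compounds per additional hour)."""
    return hr_per_hour ** (cit_hours - ref_hours)

# The abstract's example: 30 h vs 6 h of cold ischemia time.
rr = relative_hazard(HR_GRAFT_FAILURE_PER_HOUR, 30, 6)
print(f"{rr:.2f}")  # 1.36 -> roughly 40% higher graft-failure risk
```

Note the study fit a piecewise log-linear function, so a single compounded ratio is only an approximation of the quoted example, not the full fitted model.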
Objective: To develop and validate an integrative system to predict long-term kidney allograft failure. Design: International cohort study. Setting: Three cohorts including kidney transplant recipients from 10 academic medical centres in Europe and the United States. Participants: Derivation cohort: 4000 consecutive kidney recipients prospectively recruited in four French centres between 2005 and 2014. Validation cohorts: 2129 kidney recipients from three centres in Europe and 1428 from three centres in North America, recruited between 2002 and 2014. Additional validation in three randomised controlled trials (NCT01079143, EudraCT 2007-003213-13, and NCT01873157). Main outcome measure: Allograft failure (return to dialysis or pre-emptive retransplantation); 32 candidate prognostic factors for kidney allograft survival were assessed. Results: Among the 7557 kidney transplant recipients included, 1067 (14.1%) allografts failed after a median post-transplant follow-up time of 7.12 (interquartile range 3.51-8.77) years. In the derivation cohort, eight functional, histological, and immunological prognostic factors were independently associated with allograft failure and were then combined into a risk prediction score (iBox). This score showed accurate calibration and discrimination (C index 0.81, 95% confidence interval 0.79 to 0.83). The performance of the iBox was also confirmed in the validation cohorts from Europe (C index 0.81, 0.78 to 0.84) and the US (0.80, 0.76 to 0.84). The iBox system showed accuracy when assessed at different times of evaluation post-transplant, was validated in different clinical scenarios including type of immunosuppressive regimen used and response to rejection therapy, and outperformed previous risk prediction scores as well as a risk score based solely on functional parameters including estimated glomerular filtration rate and proteinuria.
Finally, the accuracy of the iBox risk score in predicting long-term allograft loss was confirmed in the three randomised controlled trials. Conclusion: An integrative, accurate, and readily implementable risk prediction score for kidney allograft failure has been developed, which shows generalisability across centres worldwide and common clinical scenarios. The iBox risk prediction score may help to guide monitoring of patients and further improve the design and development of a valid and early surrogate endpoint for clinical trials. Trial registration: ClinicalTrials.gov NCT03474003.
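The C index reported for the iBox measures pairwise concordance: among comparable patient pairs, the fraction in which the higher predicted risk belongs to the patient whose graft failed first (0.5 is chance, 1.0 is perfect ranking). A minimal sketch of that statistic follows; it is illustrative only, and the toy scores stand in for an actual fitted model such as the iBox, whose factors and coefficients are in the published work.

```python
def c_index(times, events, scores):
    """Harrell-style concordance index. A pair (i, j) is comparable when
    patient i had an observed failure (events[i] == 1) before patient j's
    follow-up time; it is concordant when i's risk score is higher.
    Score ties count 0.5."""
    concordant = 0.0
    comparable = 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if events[i] and times[i] < times[j]:
                comparable += 1
                if scores[i] > scores[j]:
                    concordant += 1.0
                elif scores[i] == scores[j]:
                    concordant += 0.5
    return concordant / comparable

# Toy data (hypothetical): risk scores perfectly ordered with failure times.
times = [1.0, 2.0, 3.0, 4.0]   # years to failure or censoring
events = [1, 1, 1, 0]          # last patient censored
scores = [3.2, 2.1, 1.4, 0.5]  # higher score = higher predicted risk
print(c_index(times, events, scores))  # 1.0
```

Production survival libraries compute the same quantity with efficiency and tie-handling refinements; the quadratic loop above is only meant to make the definition concrete.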
These data show that whereas clinically tolerant recipients displayed normal levels of CD25hi CD4+ T cells and FOXP3 transcripts, chronic rejection was associated with a decrease in CD25hi CD4+ T cells and FOXP3 transcripts, suggesting that clinical "operational tolerance" may reflect a maintained state of natural tolerance that is lacking in patients with chronic rejection.
The relative contributions of immunologic and nonimmunologic events to long-term kidney allograft failure are difficult to assess. The development of HLA antibodies after transplantation is evidence of ongoing reactivity against the transplant, and several studies have suggested that the presence of HLA antibodies correlates with poor graft survival; however, these studies did not discriminate between donor-specific (DS) and non-donor-specific (NDS) antibodies. A total of 1229 recipients of a kidney graft, transplanted between 1972 and 2002, who underwent prospective annual screening for HLA antibodies over a 5-yr period with a combination of ELISA, complement-dependent cytotoxicity, and flow cytometry tests were investigated; in 543 of them, screening was complete from transplantation to the fifth year postgrafting. Correlations were established between the presence and specificity of the antibodies and clinical parameters. A total of 5.5% of the patients had DS antibodies, 11.3% had NDS antibodies, and 83% had no HLA antibodies after transplantation. NDS antibodies appeared earlier (1 to 5 yr posttransplantation) than DS antibodies (5 to 10 yr). In multivariate analysis, HLA-DR matching, pretransplantation immunization, and acute rejection were significantly associated with the development of both DS and NDS antibodies, and also with the development of DS versus NDS antibodies. The presence of either DS or NDS antibodies significantly correlated with lower graft survival, poorer transplant function, and proteinuria. Posttransplantation screening for HLA antibodies could thus be a useful tool for the follow-up of kidney transplant recipients and could allow immunosuppression to be tailored.