Plasma cadmium is associated with increased risk of long-term kidney graft failure.

Author(s):  
Camilo G. Sotomayor ◽  
Dion Groothof ◽  
Joppe J. Vodegel ◽  
Michele F. Eisenga ◽  
Tim J. Knobbe ◽  
...  
2009 ◽  
Vol 23 (5) ◽  
pp. 465-475 ◽  
Author(s):  
Mario Rotondi ◽  
Giuseppe Stefano Netti ◽  
Elena Lazzeri ◽  
Giovanni Stallone ◽  
Elisabetta Bertoni ◽  
...  

2021 ◽  
Author(s):  
Felix Poppelaars ◽  
Mariana Gaya da Costa ◽  
Bernardo Faria ◽  
Siawosh K. Eskandari ◽  
Jeffrey Damman ◽  
...  

Introduction Improvement of long-term outcomes in kidney transplantation remains one of the most pressing challenges, yet drug development is stagnating. Human genetics offers an opportunity for much-needed target validation in transplantation. Conflicting data exist on the effect of transforming growth factor beta 1 (TGF-beta1) on kidney transplant survival, since TGF-beta1 has both profibrotic and protective effects. We therefore assessed the impact of a recently discovered functional TGFB1 polymorphism on long-term kidney graft survival. Methods We performed an observational cohort study analyzing recipient and donor DNA in 1,271 kidney transplant pairs from the University Medical Center Groningen in The Netherlands and associated a low-producing TGFB1 polymorphism (rs1800472 C>T) with 5-, 10-, and 15-year death-censored kidney graft survival. Results Donor genotype frequencies of rs1800472 in TGFB1 differed significantly between patients with and without graft loss (P=0.042). Additionally, the low-producing TGFB1 polymorphism in the donor was associated with an increased risk of graft loss following kidney transplantation (HR 2.13 for the T allele; 95%-CI 1.16-3.90; P=0.015). The incidence of graft loss within 15 years of follow-up was 16.4% in the CC-genotype group and 28.9% in the CT-genotype group. After adjustment for transplant-related covariates, the association between the donor TGFB1 polymorphism and graft loss remained significant. In contrast, there was no association between the TGFB1 polymorphism in the recipient and graft loss. Conclusion Kidney allografts carrying a low-producing TGFB1 polymorphism have a higher risk of late graft loss. Our study adds to a growing body of evidence that TGF-beta1 is beneficial, rather than harmful, for kidney transplant survival.
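The genotype-frequency comparison reported in the abstract (CC vs CT by graft-loss status, P=0.042) is a standard 2x2 chi-square test. A minimal sketch follows; the counts below are invented for illustration and are not the study's data.

```python
# Hedged sketch: Pearson chi-square test of donor TGFB1 rs1800472 genotype
# (CC vs CT) against graft loss. All counts are hypothetical.

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    expected = [row1 * col1 / n, row1 * col2 / n,
                row2 * col1 / n, row2 * col2 / n]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical counts: rows = graft loss yes/no, columns = CC/CT genotype.
stat = chi_square_2x2(180, 30, 900, 70)
print(round(stat, 2))  # compare against the chi-square distribution, df = 1
```

In practice one would use `scipy.stats.chi2_contingency`, which also returns the p-value; the hand-rolled version above only shows where the statistic comes from.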


2015 ◽  
Author(s):  
Laurent Mesnard ◽  
Thangamani Muthukumar ◽  
Maren Burbach ◽  
Carol Li ◽  
Huimin Shang ◽  
...  

BACKGROUND: Kidney transplantation is the treatment of choice for most patients with end-stage renal disease, and existing data suggest that post-transplant graft function is a predictor of kidney graft failure. METHODS: Exome sequencing of DNA from kidney graft recipients and their donors was used to determine recipient and donor mismatches at the amino acid level. The number of mismatches likely to induce an immune response in the recipient was computationally estimated and designated the allogenomics mismatch score. The relationship between the allogenomics score and post-transplant kidney allograft function was examined using linear regression. RESULTS: A significant inverse correlation between the allogenomics mismatch score and kidney graft function at 36 months post-transplantation was observed in a discovery cohort of kidney recipient-donor pairs (r² ≥ 0.57, P<0.05, for the score vs. the level of serum creatinine or estimated glomerular filtration rate). This relationship was confirmed in an independent validation cohort of kidney recipient-donor pairs. We observed that the strength of the correlation increased with time post-transplantation. The inverse correlation remained after excluding HLA loci from the calculation of the score. Exome sequencing yielded allogenomics scores with stronger correlations with graft function than simulations of genotyping assays that measure common polymorphisms only. CONCLUSIONS: The allogenomics mismatch score, derived by exome sequencing of recipient-donor pairs, quantifies the histoincompatibility between organ donor and recipient that impacts long-term post-transplant graft function. By serving as a prognostic biomarker, the allogenomics mismatch score may help identify patients at risk for graft failure.
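The intuition behind an "allogenomics"-style score is countable: donor protein-altering variants the recipient lacks are candidate foreign peptides. A minimal sketch, not the authors' published algorithm; variant labels and eGFR values below are invented.

```python
# Hedged sketch of a mismatch score and its correlation with graft
# function. All variant sets and eGFR values are hypothetical.

def mismatch_score(donor_variants, recipient_variants):
    """Count donor amino-acid variants absent from the recipient,
    i.e. peptides the recipient's immune system could see as foreign."""
    return len(set(donor_variants) - set(recipient_variants))

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

pairs = [  # (donor variants, recipient variants, eGFR at 36 months)
    ({"A1", "B2", "C3"}, {"A1"}, 45.0),
    ({"A1", "B2"}, {"A1", "B2"}, 70.0),
    ({"A1", "B2", "C3", "D4"}, set(), 38.0),
    ({"A1"}, {"A1", "C3"}, 65.0),
]
scores = [mismatch_score(d, r) for d, r, _ in pairs]
egfr = [g for _, _, g in pairs]
print(scores, round(pearson_r(scores, egfr), 2))
```

A negative correlation in this toy data mirrors the abstract's finding: higher mismatch scores go with lower graft function.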


2012 ◽  
Vol 12 (6) ◽  
pp. 1618-1623 ◽  
Author(s):  
H. G. Otten ◽  
M. C. Verhaar ◽  
H. P. E. Borst ◽  
R. J. Hené ◽  
A. D. van Zuilen

2021 ◽  
Vol 8 ◽  
pp. 205435812098537
Author(s):  
Kyla L. Naylor ◽  
Gregory A. Knoll ◽  
Eric McArthur ◽  
Amit X. Garg ◽  
Ngan N. Lam ◽  
...  

Background: The frequency and outcomes of starting maintenance dialysis in the hospital as an inpatient in kidney transplant recipients with graft failure are poorly understood. Objective: To determine the frequency of inpatient dialysis starts in patients with kidney graft failure and to examine whether dialysis start status (hospital inpatient vs outpatient setting) is associated with all-cause mortality and kidney re-transplantation. Design: Population-based cohort study. Setting: We used linked administrative healthcare databases from Ontario, Canada. Patients: We included 1164 patients with kidney graft failure from 1994 to 2016. Measurements: All-cause mortality and kidney re-transplantation. Methods: The cumulative incidence function was used to calculate the cumulative incidence of all-cause mortality and kidney re-transplantation, accounting for competing risks. Subdistribution hazard ratios from the Fine and Gray model were used to examine the relationship between an inpatient dialysis start (vs an outpatient dialysis start [reference]) and the dependent variables (ie, mortality or re-transplantation). Results: More than half (55.8%) of patients with kidney graft failure initiated dialysis as an inpatient. Compared with outpatient dialysis starters, inpatient dialysis starters had a significantly higher cumulative incidence of mortality and a significantly lower incidence of kidney re-transplantation (P < .001). The 10-year cumulative incidence of mortality was 51.9% (95% confidence interval [CI]: 47.4, 56.9%) for inpatient starts and 35.3% (95% CI: 31.1, 40.1%) for outpatient starts. After adjusting for clinical characteristics, we found inpatient dialysis starters had a significantly increased hazard of mortality in the first year after graft failure (hazard ratio: 2.18 [95% CI: 1.43, 3.33]), but beyond 1 year there was no significant difference between groups.
Limitations: Possibility of residual confounding and inability to determine which inpatient dialysis starts were unavoidable. Conclusions: In this study, we found that most patients with kidney graft failure had inpatient dialysis starts, which were associated with an increased risk of mortality. Further research is needed to better understand the reasons for an inpatient dialysis start in this patient population.
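The cumulative incidence function used in this study accounts for competing risks: after graft failure, death and re-transplantation compete, so a naive 1 − Kaplan-Meier estimate would overstate each. A minimal stdlib sketch of the Aalen-Johansen form of the estimator follows; all event times and codes are invented (0 = censored, 1 = death, 2 = re-transplant).

```python
# Hedged sketch of the cumulative incidence function under competing
# risks (Aalen-Johansen estimator). Data below are hypothetical.

def cumulative_incidence(times, events, cause):
    """Cumulative incidence of `cause`, treating other events as competing.

    CIF(t) = sum over event times t_i <= t of S(t_i-) * d_cause(t_i) / n(t_i),
    where S is overall event-free survival and n is the risk set size.
    """
    data = sorted(zip(times, events))
    n = len(data)
    at_risk = n
    surv = 1.0   # overall event-free survival just before the current time
    cif = 0.0
    out = []
    i = 0
    while i < n:
        t = data[i][0]
        d_cause = d_any = censored = 0
        while i < n and data[i][0] == t:  # group ties at the same time
            ev = data[i][1]
            if ev == 0:
                censored += 1
            else:
                d_any += 1
                if ev == cause:
                    d_cause += 1
            i += 1
        cif += surv * d_cause / at_risk
        surv *= 1 - d_any / at_risk
        at_risk -= d_any + censored
        out.append((t, cif))
    return out

times = [1, 2, 2, 3, 4, 5, 6, 7]
events = [1, 2, 0, 1, 1, 0, 2, 1]
print(cumulative_incidence(times, events, cause=1))
```

The Fine and Gray subdistribution hazard model the authors fit for covariate effects is a regression extension of this quantity; packages such as R's `cmprsk` implement it.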


2012 ◽  
Vol 2012 ◽  
pp. 1-5 ◽  
Author(s):  
Anabela Malho Guedes ◽  
Jorge Malheiro ◽  
Isabel Fonseca ◽  
La Salete Martins ◽  
Sofia Pedroso ◽  
...  

Kidney graft survival has mainly been evaluated using an up to 10-year threshold. Instead, in this study our aim was to evaluate predictive variables that impact long-term kidney graft survival (≥10 years). We enrolled 892 patients in our analysis: 638 patients with a functioning graft at 10 years post-transplant (PT) and 254 patients with graft failure at 10 years PT (considering patient death with a functioning graft <10 years PT as graft failure). Between-group comparisons were done using the Mann-Whitney and chi-square tests. To determine independent predictive variables for long-term graft survival, a multivariate-adjusted logistic regression was performed. Significant predictors of long-term graft survival were lower 12-month PT creatinine, lower donor age, shorter time on dialysis, recipient positive CMV IgG, absence of AR episodes, 0 to 1 (versus 2) HLA-B mismatches, and recipient male gender. Our results show that an early KT, younger donor age, and optimal first-year graft function are of paramount importance for long-term graft survival. Measures that address these issues (careful donor selection, preemptive KT, and effective immunosuppressive protocols) are still warranted.


2021 ◽  
Vol 36 (Supplement_1) ◽  
Author(s):  
Diana Rodríguez Espinosa ◽  
Jose Jesus Broseta Monzo ◽  
Evelyn Hermida-Lama ◽  
Elena Cuadrado ◽  
Jimena Del Risco ◽  
...  

Abstract Background and Aims Early graft loss (EGL) is a devastating complication of kidney transplantation. Patients with EGL have a risk of mortality up to twelve times higher than patients who received grafts that survived beyond 30 days. Moreover, they may have become sensitized to antigens from the failed graft, and human leukocyte antigen antibodies (anti-HLA), identified on single antigen bead assays, may not be reliable until several weeks after transplantation. Thus, if rapid re-transplantation occurs, there is no certainty regarding the recipient's immunological status. Hence, there could be an increased immunological risk with a consequent threat to the new graft's survival. Method We performed a retrospective single-center observational study in re-transplanted patients with EGL (defined as graft loss before 30 days from transplant) between January 1977 and November 2019 at our center to analyze the outcomes of rapid re-transplantation (within 30 days of EGL) vs late re-transplantation (beyond those 30 days). Results There were 82 re-transplants after EGL. The median overall patient survival after re-transplantation was 32 years. Eight patients died within the first year. The causes of death were four septic shocks, one cardiogenic shock, one massive pulmonary thromboembolism, one myocardial infarction, and one unknown cause. When analyzed by period, death-censored graft survival was 89% at both one and five years after rapid re-transplantation. One graft was lost at eight days due to antibody-mediated rejection (AMR), while there was one death with a functioning graft three months after re-transplantation secondary to a pulmonary embolism. Seventy-three late re-transplants occurred. When analyzed by period, death-censored graft survival was 81% and 69% at one and five years after late re-transplantation, respectively. The median patient survival after late re-transplantation was 32 years.
There were fewer deaths after rapid re-transplantation than after late re-transplantation, but given the small number of cases in the former, this difference did not reach statistical significance (p = 0.3). There was no association between the timing of re-transplantation and an increased risk of graft failure (HR 0.30 [0.04 – 2.2]). While four rapid re-transplants did not share any incompatibilities between donors, four shared at least one HLA class I incompatibility, and one shared both an HLA class I and a class II incompatibility. There were no T-cell mediated rejections (TCMR) and only one AMR in the rapid re-transplantation group, whereas there were six TCMRs and fifteen AMRs in the late re-transplantation group (p = 0.03 and p = 0.4, respectively). Conclusion Rapid re-transplantation appears to be safe: it does not entail an increased rejection risk, nor does it diminish long-term graft survival, when compared with late re-transplantation.
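The death-censored graft survival figures quoted at one and five years come from a Kaplan-Meier estimator, where death with a functioning graft counts as censoring rather than as graft loss. A minimal sketch; follow-up times (years) and event flags below are invented.

```python
# Hedged sketch of the Kaplan-Meier estimator for death-censored graft
# survival. 1 = graft loss, 0 = censored (including death with a
# functioning graft). All data are hypothetical.

def kaplan_meier(times, events):
    """Return [(t, S(t))], stepping down at each observed graft loss."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        losses = censored = 0
        while i < len(data) and data[i][0] == t:  # group ties
            losses += data[i][1]
            censored += 1 - data[i][1]
            i += 1
        if losses:
            s *= 1 - losses / at_risk
            curve.append((t, s))
        at_risk -= losses + censored
    return curve

times = [0.5, 1.0, 2.0, 3.0, 4.0, 5.0, 5.0, 6.0]
events = [1, 0, 1, 0, 1, 0, 1, 0]
print(kaplan_meier(times, events))
```

Reading the 1-year and 5-year survival off such a curve is how figures like "89% at one and five years" are obtained.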


2018 ◽  
Vol 29 (9) ◽  
pp. 2279-2285 ◽  
Author(s):  
Elena G. Kamburova ◽  
Bram W. Wisse ◽  
Irma Joosten ◽  
Wil A. Allebes ◽  
Arnold van der Meer ◽  
...  

Background Complement-fixing antibodies against donor HLA are considered a contraindication for kidney transplant. A modification of the IgG single-antigen bead (SAB) assay allows detection of anti-HLA antibodies that bind C3d. Because early humoral graft rejection is considered to be complement mediated, this SAB-based technique may provide a valuable tool in the pretransplant risk stratification of kidney transplant recipients. Methods Previously, we established that pretransplant donor-specific anti-HLA antibodies (DSAs) are associated with increased risk for long-term graft failure in complement-dependent cytotoxicity crossmatch-negative transplants. In this study, we further characterized the DSA-positive serum samples using the C3d SAB assay. Results Among 567 pretransplant DSA-positive serum samples, 97 (17%) contained at least one C3d-fixing DSA, whereas 470 (83%) had non–C3d-fixing DSA. At 10 years after transplant, patients with C3d-fixing antibodies had a death-censored, covariate-adjusted graft survival of 60%, whereas patients with non–C3d-fixing DSA had a graft survival of 64% (hazard ratio, 1.02; 95% confidence interval, 0.70 to 1.48 for C3d-fixing DSA compared with non–C3d-fixing DSA; P=0.93). Patients without DSA had a 10-year graft survival of 78%. Conclusions The C3d-fixing ability of pretransplant DSA is not associated with increased risk for graft failure.


Nutrients ◽  
2021 ◽  
Vol 13 (1) ◽  
pp. 262
Author(s):  
Jose L. Flores-Guerrero ◽  
Maryse C. J. Osté ◽  
Paula B. Baraldi ◽  
Margery A. Connelly ◽  
Erwin Garcia ◽  
...  

Background. Due to the critical shortage of kidneys for transplantation, the identification of modifiable factors related to graft failure is highly desirable. The role of trimethylamine-N-oxide (TMAO) in graft failure remains undetermined. Here, we investigated the clinical utility of TMAO and its dietary determinants for graft failure prediction in renal transplant recipients (RTRs). Methods. We included 448 RTRs who participated in the TransplantLines Cohort Study. Cox proportional-hazards regression analyses were performed to study the association of plasma TMAO with graft failure. Net Benefit, a decision analysis method, was used to evaluate the clinical utility of TMAO and dietary information in the prediction of graft failure. Results. Among RTRs (age 52.7 ± 13.1 years; 53% male), the baseline median TMAO was 5.6 (3.0–10.2) µmol/L. In multivariable regression analysis, the most important dietary determinants of TMAO were egg intake (Std. β = 0.09 [95% CI: 0.01, 0.18]; p = 0.03), fiber intake (Std. β = −0.14 [95% CI: −0.22, −0.05]; p = 0.002), and fish and seafood intake (Std. β = 0.12 [95% CI: 0.03, 0.21]; p = 0.01). After a median follow-up of 5.3 (4.5–6.0) years, graft failure was observed in 58 subjects. TMAO was associated with an increased risk of graft failure, independent of age, sex, body mass index (BMI), blood pressure, lipids, albuminuria, and estimated glomerular filtration rate (eGFR) (hazard ratio per 1-SD increase of TMAO, 1.62 [95% CI: 1.22, 2.14]; p < 0.001). A TMAO- and diet-enhanced prediction model offered approximately double the Net Benefit of a previously reported, validated prediction model for future graft failure, allowing the detection of 21 RTRs per 100 RTRs tested with no false positives, versus 10 RTRs, respectively. Conclusions. A predictive model for graft failure, enriched with TMAO and its dietary determinants, yielded a higher Net Benefit compared with an already validated model. This study suggests that TMAO and its dietary determinants are associated with an increased risk of graft failure and that this association is clinically meaningful.
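Net Benefit, the decision-curve metric used above, weighs true positives against false positives at a chosen threshold probability pt: NB = TP/n − (FP/n) × pt/(1 − pt). A minimal sketch follows; the threshold of 0.2 is an assumption for illustration, and only the 21-vs-10 detection counts come from the abstract.

```python
# Hedged sketch of the Net Benefit calculation from decision curve
# analysis. Threshold probability and sample size are hypothetical.

def net_benefit(true_pos, false_pos, n, threshold):
    """Net benefit of a prediction model at a given threshold probability.

    False positives are discounted by the odds of the threshold, i.e. by
    how many unnecessary interventions a clinician would tolerate per
    true case found.
    """
    return true_pos / n - false_pos / n * threshold / (1 - threshold)

# Per 100 patients tested: the enhanced model flags 21 true graft
# failures with no false positives; the reference model flags 10.
nb_enhanced = net_benefit(21, 0, 100, 0.2)
nb_reference = net_benefit(10, 0, 100, 0.2)
print(nb_enhanced, nb_reference)
```

With no false positives, Net Benefit reduces to the true-positive rate, which is why the enhanced model's roughly doubled detection count translates directly into roughly double the Net Benefit.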


2021 ◽  
Author(s):  
Ahmed Alnajar

Background Social determinants of health (SDH) such as poverty, unequal access to health care, and lack of education have recently been found to contribute to health inequities. In the field of heart transplantation, it is known that patients of color have a higher early mortality risk than white patients, even after taking differences in comorbidities into account. However, the role of SDH in predicting disease patterns and transplant outcomes has not been determined. The purpose of this study was to assess long-term heart transplantation success based on patients' SDH. Methods A retrospective analysis of 34,584 adult heart transplant recipients from 2004 to 2021 was performed using the Organ Procurement and Transplantation Network database. Established and modern SDH indices—including the Agency for Healthcare Research and Quality Index, the Area Deprivation Index, the Social Vulnerability Index, the Social Deprivation Index, and the Crime Risk Index—were evaluated. A survival analysis using the Kaplan-Meier method assessed the indices' association with overall graft survival or failure. A temporal decomposition model was used to identify overlapping phases of graft survival. Multivariable clustered Cox regression, LASSO, and random forest models were used to determine the association of these indices with post-transplantation risk of graft failure. Results On average, patients falling into the lower quartile across different SDH indices had a 20% increased risk of graft failure. After accounting for recipient and donor biological factors and comorbidities, recipients' indices remained statistically significant. Individual sociodemographic variables were highly predictive of graft failure across different hazard intervals. Recipients' SDH indices were most predictive throughout the constant and late post-transplantation phases. On the other hand, donor SDH indices were less significant across all intervals.
Conclusion To improve survival outcomes, a collaborative effort to counsel and support disadvantaged transplant recipients should be considered.

