Factors Influencing Ethnic Disparities in Outcomes after Deceased Donor Kidney Transplantation

2022 ◽  
Vol 23 (1) ◽  
Author(s):  
Jasmin Divers ◽  
Sumit Mohan ◽  
W. Mark Brown ◽  
Stephen O. Pastan ◽  
Ajay K. Israni ◽  
...  

Abstract Background: African American (AA) recipients of deceased-donor (DD) kidney transplants (KT) have shorter allograft survival than recipients of other ethnic groups. Reasons for this disparity encompass complex interactions between donor and recipient characteristics. Methods: Outcomes from 3,872 AA and 19,719 European American (EA) DDs, each of whom had one kidney transplanted into an AA recipient and one into an EA recipient, were analyzed. Four donor/recipient pair groups (DRP) were studied: AA/AA, AA/EA, EA/AA, and EA/EA. Survival random forests and Cox proportional hazards models were fitted to rank and evaluate modifying effects of DRP on variables associated with allograft survival. These analyses sought to identify factors contributing to the observed disparities in transplant outcomes among AA and EA DDKT recipients. Results: Transplant era, discharge serum creatinine, delayed graft function, and DRP were among the top predictors of allograft survival and mortality among DDKT recipients. Interaction effects between DRP and both the kidney donor risk index (KDRI) and transplant era showed significant improvement in allograft survival over time in EA recipients. However, AA recipients appeared to have similar or poorer outcomes for DDKT performed after 2010 versus before 2001; allograft survival hazard ratios (95% CI) were 1.15 (0.74, 1.76) and 1.07 (0.80, 1.45) for AA/AA and EA/AA, compared to 0.62 (0.54, 0.71) and 0.50 (0.41, 0.62) for EA/EA and AA/EA DRP, respectively. Recipient mortality improved over time among all DRP except unemployed AA/AAs. Relative to DDKT performed pre-2001, employed AA/AAs had HR = 0.37 (0.20, 0.69) versus 0.59 (0.31, 1.11) for unemployed AA/AAs after 2010. Conclusion: Relative to DDKT performed before 2001, similar or worse overall death-censored allograft survival (DCAS) was observed among AA/AAs, while EA/EAs experienced considerable improvement regardless of employment status, KDRI, and estimated post-transplant survival (EPTS) score. 
AA recipients of an AA DDKT, especially if unemployed, had worse allograft survival and mortality and did not appear to benefit from advances in care over the past 20 years.


2020 ◽  
Vol 15 (2) ◽  
pp. 257-264 ◽  
Author(s):  
S. Ali Husain ◽  
Kristen L. King ◽  
Ibrahim Batal ◽  
Geoffrey K. Dube ◽  
Isaac E. Hall ◽  
...  

Background and objectives: Unfavorable histology on procurement biopsies is the most common reason for deceased donor kidney discard. We sought to assess the reproducibility of procurement biopsy findings. Design, setting, participants, & measurements: We compiled a continuous cohort of deceased donor kidneys transplanted at our institution from 1/1/2006 to 12/31/2016 that had at least one procurement biopsy performed, excluding cases with missing biopsy reports and those used in multiorgan transplants. Suboptimal histology was defined as the presence of advanced sclerosis in at least one biopsy compartment (glomeruli, tubules/interstitium, vessels). We calculated κ coefficients to assess agreement in optimal versus suboptimal classification between sequential biopsy reports for kidneys that underwent multiple procurement biopsies, and used time-to-event analysis to evaluate the association between first versus second biopsies and patient and allograft survival. Results: Of the 1011 kidneys included in our cohort, 606 (60%) had multiple procurement biopsies; 98% had their first biopsy performed at another organ procurement organization and their second performed locally. Categorical agreement was highest for vascular disease (κ=0.17), followed by interstitial fibrosis and tubular atrophy (κ=0.12) and glomerulosclerosis (κ=0.12). Overall histologic agreement (optimal versus suboptimal) was κ=0.15. First biopsy histology had no association with allograft survival in unadjusted or adjusted analyses. 
However, second biopsy optimal histology was associated with a higher probability of death-censored allograft survival, even after adjusting for donor and recipient factors (adjusted hazard ratio, 0.50; 95% confidence interval, 0.34 to 0.75; P=0.001). Conclusions: Deceased donor kidneys that underwent multiple procurement biopsies often displayed substantial differences in histologic categorization between sequential biopsies, and there was no association between first biopsy findings and post-transplant outcomes.
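The agreement statistics quoted above are Cohen's κ coefficients: the proportion of agreement between two raters in excess of what chance alone would produce. A minimal sketch of the calculation, using made-up sequential biopsy calls rather than the study's data:

```python
def cohen_kappa(a, b):
    """Cohen's kappa for two paired binary ratings (0 = optimal, 1 = suboptimal)."""
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    # Chance agreement from each rater's marginal category frequencies
    pa1 = sum(a) / n
    pb1 = sum(b) / n
    expected = pa1 * pb1 + (1 - pa1) * (1 - pb1)
    return (observed - expected) / (1 - expected)

# Hypothetical first/second biopsy calls for 10 kidneys (illustrative only)
first  = [0, 1, 0, 0, 1, 1, 0, 1, 0, 0]
second = [0, 0, 1, 0, 1, 0, 0, 0, 1, 0]
print(round(cohen_kappa(first, second), 2))
```

Values near 0, like the κ=0.12-0.17 reported above, indicate agreement barely better than chance; κ=1 would mean the two biopsy reports always matched.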


2019 ◽  
Vol 9 (1) ◽  
Author(s):  
Subagini Nagarajah ◽  
Shengqiang Xia ◽  
Marianne Rasmussen ◽  
Martin Tepel

Abstract β-1,4-mannosylglycoprotein 4-β-N-acetylglucosaminyltransferase (MGAT3) is a key molecule for the innate immune system. We tested the hypothesis that the intronic antisense long non-coding RNA MGAT3-AS1 can predict delayed allograft function after kidney transplantation. We prospectively assessed kidney function and MGAT3-AS1 in 129 incident deceased donor kidney transplant recipients before and after transplantation. MGAT3-AS1 levels were measured in mononuclear cells using qRT-PCR. Delayed graft function was defined by at least one dialysis session within 7 days of transplantation. Delayed graft function occurred in 22 out of 129 transplant recipients (17%). Median MGAT3-AS1 after transplantation was significantly lower in patients with delayed graft function compared to patients with immediate graft function (6.5 × 10−6, IQR 3.0 × 10−6 to 8.4 × 10−6; vs. 8.3 × 10−6, IQR 5.0 × 10−6 to 12.8 × 10−6; p < 0.05). The median preoperative MGAT3-AS1 was significantly lower in kidney recipients with delayed graft function (5.1 × 10−6, IQR, 2.4 × 10−6 to 6.8 × 10−6) compared to recipients with immediate graft function (8.9 × 10−6, IQR, 6.8 × 10−6 to 13.4 × 10−6; p < 0.05). Receiver-operating characteristic analysis showed that preoperative MGAT3-AS1 predicted delayed graft function (area under the curve, 0.83; 95% CI, 0.65 to 1.00; p < 0.01). We observed a positive predictive value of 0.57 and a negative predictive value of 0.95. The long non-coding RNA MGAT3-AS1 thus indicates short-term outcome in patients with deceased donor kidney transplantation.
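Predictive values like the PPV of 0.57 and NPV of 0.95 reported above depend on test sensitivity and specificity together with disease prevalence, via Bayes' rule. A small sketch of that relationship; the sensitivity and specificity plugged in below are illustrative assumptions, not figures from the study (which reports the predictive values directly):

```python
def predictive_values(sens, spec, prev):
    """Positive and negative predictive values from sensitivity,
    specificity, and prevalence, via Bayes' rule."""
    ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
    npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
    return ppv, npv

# Prevalence of delayed graft function in the cohort: 22 of 129 (17%)
# Sensitivity/specificity here are hypothetical placeholders.
ppv, npv = predictive_values(sens=0.80, spec=0.85, prev=22 / 129)
print(f"PPV={ppv:.2f}, NPV={npv:.2f}")
```

The high NPV at this low prevalence illustrates why the marker is better at ruling delayed graft function out than ruling it in.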


2019 ◽  
Vol 3 (Supplement_1) ◽  
pp. S682-S682
Author(s):  
Joanna M Blodgett ◽  
Kenneth Rockwood ◽  
Olga Theou

Abstract Positive advances in life expectancy, healthcare access and medical technology have been accompanied by an increased prevalence of chronic diseases and substantial population ageing. How frailty levels and the subsequent mortality risk of frailty have changed in recent decades is not well understood. We aimed to investigate how these factors changed over an 18-year period. Nine waves of the National Health and Nutrition Examination Survey (1999-2016) were harmonized to create a 46-item frailty index (FI) using self-reported and laboratory-based health deficits. Individuals aged 20+ were included in analyses (n=44086). Mortality was ascertained in December 2015. Weighted multilevel models estimated the effect of cohort on FI score in 10-year age-stratified groups. Cox proportional hazards models estimated whether the two- and four-year mortality risk of frailty changed across the 1999-2012 cohorts. Mean FI score was 0.11±0.10. In the five older age groups (>40 years), later cohorts had higher frailty levels than did earlier cohorts. For example, in people aged 80+, each subsequent cohort had an estimated 0.007 (95%CI: 0.005, 0.009) higher FI score. However, in those aged 20-29, later cohorts had lower frailty [β=-0.0009 (-0.0013, -0.0005)]. Hazard ratios and cohort-frailty interactions indicated no change in the two- or four-year lethality of FI score over time (e.g. two-year mortality: HR of 1.069 (1.055, 1.084) in 1999-2000 vs 1.061 (1.044, 1.077) in 2011-2012). Higher frailty levels in the most recent years among middle-aged and older adults, combined with unchanged frailty lethality, suggest that the degree of frailty may continue to increase.
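A deficit-accumulation frailty index like the 46-item FI above is conventionally computed as the proportion of measured health deficits that are present, with missing items excluded from the denominator. A minimal sketch with a hypothetical item panel (not NHANES data):

```python
def frailty_index(deficits):
    """Frailty index: proportion of measured health deficits present.
    Each item is coded 0 (absent) to 1 (fully present); None = not measured."""
    observed = [d for d in deficits if d is not None]
    if not observed:
        raise ValueError("no deficits measured")
    return sum(observed) / len(observed)

# Hypothetical 46-item panel: 5 deficits present, 40 absent, 1 missing
items = [1] * 5 + [0] * 40 + [None]
print(round(frailty_index(items), 2))  # 5 deficits / 45 measured items
```

On this scale the cohort mean of 0.11 reported above corresponds to roughly 5 of 46 deficits present in an average respondent.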


2019 ◽  
Vol 8 (11) ◽  
pp. 1899 ◽  
Author(s):  
Shadi Katou ◽  
Brigitta Globke ◽  
M. Haluk Morgul ◽  
Thomas Vogel ◽  
Benjamin Struecker ◽  
...  

The aim of this study was to analyze the value of urine α- and π-GST in monitoring and predicting kidney graft function following transplantation. In addition, urine samples from the corresponding organ donors were analyzed and compared with graft function after organ donation from brain-dead and living donors. Urine samples from brain-dead (n = 30) and living related (n = 50) donors and their corresponding recipients were analyzed before and after kidney transplantation, and urine α- and π-GST values were measured. Kidney recipients were grouped into patients with acute graft rejection (AGR), calcineurin inhibitor toxicity (CNI), and delayed graft function (DGF), and compared to those with unimpaired graft function. Urinary π-GST revealed significant differences in deceased kidney donor recipients with episodes of AGR or DGF at day one after transplantation (p = 0.0023 and p = 0.036, respectively). High π-GST values at postoperative day 1 (cutoff: >21.4 ng/mg urine creatinine (uCrea) or >18.3 ng/mg uCrea for AGR or DGF, respectively) distinguished between rejection and no rejection (sensitivity, 100%; specificity, 66.6%) as well as between DGF and normal-functioning grafts (sensitivity, 100%; specificity, 62.6%). In living donor recipients, urine levels of α- and π-GST were about 10 times lower than in deceased donor recipients. In deceased donors whose corresponding recipients had impaired graft function, urinary α- and π-GST were elevated. α-GST values >33.97 ng/mg uCrea were indicative of AGR with a sensitivity and specificity of 77.7% and 100%, respectively. In deceased donor kidney transplantation, evaluation of urinary α- and π-GST seems to predict different events that deteriorate graft function. To elucidate the potential advantages of such biomarkers, further analysis is warranted.


2020 ◽  
Vol 21 (1) ◽  
Author(s):  
Rao Chen ◽  
Haifeng Wang ◽  
Lei Song ◽  
Jianfei Hou ◽  
Jiawei Peng ◽  
...  

Abstract Background: Delayed graft function (DGF) is closely associated with the use of marginal donor kidneys and with deficits arising during transplantation and in recipients. We aimed to predict the incidence of DGF and evaluate its effect on graft survival. Methods: This retrospective study of kidney transplantation was conducted from January 1, 2018, to December 31, 2019, at the Second Xiangya Hospital of Central South University. We assigned recipients whose operations were performed in different years to training and validation cohorts and used data from the training cohort to analyze predictors of DGF. A nomogram was then constructed to predict the likelihood of DGF based on these predictors. Results: The incidence of DGF was 16.92%. Binary logistic regression analysis showed correlations between the incidence of DGF and cold ischemic time (CIT), warm ischemic time (WIT), terminal serum creatinine (Scr) concentration, duration of pretransplant dialysis, primary cause of donor death, and use of LifePort. The internal accuracy of the nomogram was 83.12%. One-year graft survival rates were 93.59% and 99.74% for the groups with and without DGF, respectively (P < 0.05). Conclusion: The nomogram established in this study showed good accuracy in predicting DGF after deceased donor kidney transplantation; additionally, DGF decreased one-year graft survival.
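A nomogram built from a binary logistic regression is, under the hood, a linear combination of predictors passed through the logistic function. A sketch of that structure using the predictor set named above; the intercept and coefficients here are invented for illustration and are NOT the study's published model:

```python
import math

def dgf_probability(cit_h, wit_min, scr_umol, dialysis_mo, lifeport):
    """Hypothetical logistic model in the spirit of the study's nomogram.
    All coefficients are illustrative placeholders, not fitted values."""
    logit = (-4.0
             + 0.10 * cit_h        # cold ischemic time, hours
             + 0.05 * wit_min      # warm ischemic time, minutes
             + 0.004 * scr_umol    # terminal donor serum creatinine, umol/L
             + 0.02 * dialysis_mo  # pretransplant dialysis, months
             - 0.50 * lifeport)    # 1 if LifePort machine preservation used
    return 1 / (1 + math.exp(-logit))

p = dgf_probability(cit_h=12, wit_min=5, scr_umol=150, dialysis_mo=24, lifeport=1)
print(f"predicted DGF risk: {p:.2f}")
```

A nomogram simply renders each term of such a model as a points scale, so clinicians can sum points on paper instead of evaluating the equation.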


2016 ◽  
Vol 157 (24) ◽  
pp. 946-955
Author(s):  
Gergely Zádori ◽  
Vera Tarjányi ◽  
Réka P. Szabó ◽  
Lajos Zsom ◽  
Roland Fedor ◽  
...  

Introduction: To ease organ shortage, many transplant centres have developed different donor scoring systems; however, a general consensus among clinicians on the use of these systems still does not exist. Aim: The aim of the authors was to analyse the effect of expanded criteria donor status, deceased donor score, and kidney donor risk index on postoperative kidney function and graft survival. Method: The characteristics of 138 kidney transplantations and 205 donors were analysed in a retrospective study of a five-year period. Results: There was a trend towards rejecting donors in higher risk groups; 22.7% of standard criteria donors belonged to the high-risk group of the deceased donor score. Graft function was worse in high-risk patients. High-risk donors could be further stratified using the deceased donor score, and patients with the highest risk had worse graft function and survival. Conclusions: With these scoring systems, grafts with favourable outcomes can be selected more precisely. Orv. Hetil., 2016, 157(24), 946–955.

