Employment status at transplant influences ethnic disparities in outcomes after deceased donor kidney transplantation

2022 · Vol 23 (1)
Author(s): Jasmin Divers, Sumit Mohan, W. Mark Brown, Stephen O. Pastan, Ajay K. Israni, ...

Abstract Background: African American (AA) recipients of deceased-donor (DD) kidney transplants (KT) have shorter allograft survival than recipients of other ethnic groups. Reasons for this disparity encompass complex interactions between donor and recipient characteristics. Methods: Outcomes from 3,872 AA and 19,719 European American (EA) DDs who had one kidney transplanted into an AA recipient and one into an EA recipient were analyzed. Four donor/recipient pair groups (DRP) were studied: AA/AA, AA/EA, EA/AA, and EA/EA. Survival random forests and Cox proportional hazards models were fitted to rank variables associated with allograft survival and to evaluate the modifying effects of DRP on them. These analyses sought to identify factors contributing to the observed disparities in transplant outcomes between AA and EA DDKT recipients. Results: Transplant era, discharge serum creatinine, delayed graft function, and DRP were among the top predictors of allograft survival and mortality among DDKT recipients. Interaction effects of DRP with the kidney donor risk index (KDRI) and transplant era showed significant improvement in allograft survival over time in EA recipients. However, AA recipients appeared to have similar or poorer outcomes for DDKT performed after 2010 versus before 2001; allograft survival hazard ratios (95% CI) were 1.15 (0.74, 1.76) and 1.07 (0.80, 1.45) for AA/AA and EA/AA, compared to 0.62 (0.54, 0.71) and 0.50 (0.41, 0.62) for EA/EA and AA/EA DRP, respectively. Recipient mortality improved over time among all DRP except unemployed AA/AAs. Relative to DDKT performed pre-2001, employed AA/AAs had HR = 0.37 (0.20, 0.69) versus 0.59 (0.31, 1.11) for unemployed AA/AAs after 2010. Conclusion: Relative to DDKT performed before 2001, similar or worse overall death-censored allograft survival (DCAS) was observed among AA/AAs, while EA/EAs experienced considerable improvement regardless of employment status, KDRI, and estimated post-transplant survival (EPTS) score. AA recipients of an AA DDKT, especially if unemployed, had worse allograft survival and mortality and did not appear to benefit from advances in care over the past 20 years.
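The Cox model described above interacts donor/recipient pair group (DRP) with transplant era and KDRI. Below is a minimal sketch of how such a model might be fit in Python with the lifelines library; the file and column names (ddkt_outcomes.csv, drp, era, kdri, time_to_loss, graft_failed) are hypothetical placeholders, not the study's actual data or variables.

```python
# Minimal sketch of a Cox model with DRP x era interaction terms, using
# lifelines. All file and column names are hypothetical placeholders.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("ddkt_outcomes.csv")  # hypothetical analysis file

# One-hot encode DRP (e.g. "AA/AA") and transplant era (a categorical
# label such as "pre2001" / "2001-2010" / "post2010").
X = pd.get_dummies(df[["drp", "era"]], drop_first=True, dtype=float)

# Form the DRP x era interaction terms the abstract refers to.
for drp_col in [c for c in X.columns if c.startswith("drp_")]:
    for era_col in [c for c in X.columns if c.startswith("era_")]:
        X[f"{drp_col}:{era_col}"] = X[drp_col] * X[era_col]

X["kdri"] = df["kdri"]
X["time_to_loss"] = df["time_to_loss"]    # follow-up time
X["graft_failed"] = df["graft_failed"]    # 1 = allograft loss event

cph = CoxPHFitter()
cph.fit(X, duration_col="time_to_loss", event_col="graft_failed")
cph.print_summary()  # hazard ratios with 95% CIs, as reported above
```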


2019 · Vol 3 (Supplement_1) · p. S682
Author(s): Joanna M Blodgett, Kenneth Rockwood, Olga Theou

Abstract Advances in life expectancy, healthcare access, and medical technology have been accompanied by an increased prevalence of chronic diseases and substantial population ageing. How frailty levels and the mortality risk associated with frailty have changed in recent decades is not well understood. We aimed to investigate how these factors changed over an 18-year period. Nine waves of the National Health and Nutrition Examination Survey (1999-2016) were harmonized to create a 46-item frailty index (FI) using self-reported and laboratory-based health deficits. Individuals aged 20+ were included in analyses (n = 44,086). Mortality was ascertained in December 2015. Weighted multilevel models estimated the effect of cohort on FI score in 10-year age-stratified groups. Cox proportional hazards models estimated whether the two- or four-year mortality risk associated with frailty changed across the 1999-2012 cohorts. Mean FI score was 0.11±0.10. In the five older age groups (>40 years), later cohorts had higher frailty levels than earlier cohorts did. For example, in people aged 80+, each subsequent cohort had an estimated 0.007 (95% CI: 0.005, 0.009) higher FI score. However, in those aged 20-29, later cohorts had lower frailty [β = -0.0009 (-0.0013, -0.0005)]. Hazard ratios and cohort-frailty interactions indicated no change in the two- or four-year lethality of FI score over time (e.g. two-year mortality: HR of 1.069 (1.055, 1.084) in 1999-2000 vs 1.061 (1.044, 1.077) in 2011-2012). Higher frailty levels in the most recent years among middle-aged and older adults, combined with unchanged frailty lethality, suggest that the degree of frailty may continue to increase.
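A deficit-accumulation frailty index like the 46-item FI above is simply the proportion of measured health deficits present in an individual. The sketch below illustrates that computation under hypothetical names (deficit_1 … deficit_46, nhanes_harmonized.csv, a completeness threshold of 30 items); the actual NHANES harmonization is not reproduced here.

```python
# Minimal sketch of a deficit-accumulation frailty index (FI): each of the
# 46 deficits is coded 0 (absent) to 1 (present), and the FI is the mean
# across non-missing items. All names and thresholds are hypothetical.
import numpy as np
import pandas as pd

DEFICIT_COLS = [f"deficit_{i}" for i in range(1, 47)]  # 46 items

def frailty_index(row: pd.Series, min_items: int = 30) -> float:
    """Mean of non-missing deficit scores; NaN if too few items observed."""
    observed = row[DEFICIT_COLS].dropna()
    if len(observed) < min_items:  # hypothetical completeness threshold
        return np.nan
    return observed.sum() / len(observed)

df = pd.read_csv("nhanes_harmonized.csv")  # hypothetical file
df["fi"] = df.apply(frailty_index, axis=1)
print(df["fi"].describe())  # e.g. a mean near 0.11, as reported above
```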


2016 · Vol 157 (24) · pp. 946-955
Author(s): Gergely Zádori, Vera Tarjányi, Réka P. Szabó, Lajos Zsom, Roland Fedor, ...

Introduction: To ease the organ shortage, many transplant centres have developed donor scoring systems; however, a general consensus among clinicians on the use of these systems still does not exist. Aim: The aim of the authors was to analyse the effect of expanded criteria donor status, deceased donor score, and kidney donor risk index on postoperative kidney function and graft survival. Method: Retrospective analysis of the characteristics of 138 kidney transplantations and 205 donors over a five-year period. Results: There was a trend towards rejecting donors in higher risk groups; 22.7% of standard criteria donors belonged to the high-risk group of the deceased donor score. Graft function was worse in high-risk patients. High-risk donors can be further stratified using the deceased donor score. Patients with the highest risk had worse graft function and survival. Conclusions: With the use of these scoring systems, grafts with favourable outcomes can be selected more precisely. Orv. Hetil., 2016, 157(24), 946–955.
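Donor scoring systems of this kind assign points for individual risk factors and bin the total into risk groups. The sketch below is purely illustrative: the factors echo those commonly used in expanded criteria donor definitions and the deceased donor score, but the point values and cutoffs are hypothetical, not the published coefficients.

```python
# Illustrative donor scoring sketch. Factors mirror those used by common
# donor scores (age, hypertension, creatinine, cerebrovascular death), but
# the point values and risk-group cutoffs here are hypothetical.
from dataclasses import dataclass

@dataclass
class Donor:
    age: int
    hypertension: bool
    creatinine_mg_dl: float
    death_by_cva: bool  # cerebrovascular cause of death

def donor_score(d: Donor) -> int:
    points = 0
    points += 2 if d.age >= 50 else 0
    points += 1 if d.hypertension else 0
    points += 1 if d.creatinine_mg_dl > 1.5 else 0
    points += 1 if d.death_by_cva else 0
    return points

def risk_group(score: int) -> str:
    # Hypothetical cutoffs splitting donors into low/medium/high risk.
    return "low" if score <= 1 else "medium" if score <= 3 else "high"

d = Donor(age=56, hypertension=True, creatinine_mg_dl=1.8, death_by_cva=False)
print(donor_score(d), risk_group(donor_score(d)))  # -> 4 high
```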


2020 · Vol 35 (Supplement_3)
Author(s): Eleanor Murray, Robert Pearson, Peter Thomson, Marc Clancy, John Asher

Abstract Background and Aims: The UK NHSBT kidney matching scheme changed in September 2019, aiming to better match graft and patient survival through stratification of donors and recipients into risk quartiles. We present data on two years of transplants, aiming to highlight discrepancies between our unit and the model on which the scheme is based, and the potential implications of its introduction for service provision. Method: We reviewed all deceased donor transplants in our centre in 2015 and 2016. Recipients and donors were re-classified into the risk index quartiles, and endpoint data included inpatient days in the first year, 1-year eGFR, survival, imaging, and infection episodes. Comparisons were made with NHSBT literature. Results: 196 deceased donor transplants were performed. The distribution of D1-4 kidneys to R1-4 recipients in our cohort did not reflect those presented in the allocation scheme models, with our population skewed toward the higher-risk R4 category (73.4%), including 55 D4R4 (83% of D4 kidneys), see Figure 1. 2.0% had an age difference between donor and recipient of >25 years, and 12.8% of 15-25 years, compared with the NHSBT proposed targets of 8% and 20% respectively. Within the R4 group, recipients receiving a D4 graft had a higher rate of DGF (41.7% vs 23.2% for D1-D3 grafts, p=0.009), a longer index admission (median 11 vs 8 days, p=0.038), and more readmission days within the first post-operative year (median 18 vs 11 days, p=0.005) – Figure 2. D4 grafts demonstrated lower mean eGFR at one year (35.7 vs 54.8 ml/min, p<0.001), Figure 3. R4 recipients experienced graft loss more frequently (HR 3.4 vs R1-3; 95% CI 0.8-13.9, p=0.12). One-year survival in the R4 cohort was 97.8% (four deaths), and 93.8% at 4 years; the R1-3 cohort had 100% survival to 4 years; there was no significant impact on R4 patient survival with D4 kidneys vs D1-D3. Day ward attendances, bacteraemia, and CT imaging events did not differ by R or D category; D4 was associated with higher rates of transplant ultrasound (5.6 vs 4.25 in R1-3, p=0.009), and R4 with higher rates of urinary tract infection (3.6 vs 1.5 in R1-3, p=0.03). Conclusion: Firstly, our transplant population is weighted toward higher-risk R4 recipients; secondly, the intended principles of the allocation scheme are already largely being observed. Thirdly, our data do suggest that increasing R4D4 transplants will have a significant impact on transplant centres, with the resource burden falling primarily within the first year. However, despite poorer graft function, patient survival appears to be equivalent, and improved matching may in the longer term reduce the need for re-implantation, as the scheme intends.
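The scheme's stratification can be sketched as assigning donors and recipients to quartiles of a risk index and then comparing outcomes such as DGF across the resulting groups. The Python sketch below uses hypothetical file and column names, and cohort-based quartiles (pd.qcut) stand in for the scheme's actual fixed cutpoints.

```python
# Sketch of re-classifying donors and recipients into risk quartiles
# (D1-D4, R1-R4) and comparing DGF rates across them. File and column
# names are hypothetical; pd.qcut replaces the scheme's real cutpoints.
import pandas as pd
from scipy.stats import chi2_contingency

df = pd.read_csv("transplants_2015_2016.csv")  # hypothetical file

# Quartiles of precomputed donor and recipient risk indices.
df["d_group"] = pd.qcut(df["donor_risk_index"], 4, labels=["D1", "D2", "D3", "D4"])
df["r_group"] = pd.qcut(df["recipient_risk_index"], 4, labels=["R1", "R2", "R3", "R4"])

# DGF rate in R4 recipients, split D4 vs D1-D3 as in the abstract.
r4 = df[df["r_group"] == "R4"]
table = pd.crosstab(r4["d_group"] == "D4", r4["dgf"])
chi2, p, _, _ = chi2_contingency(table)
print(table, f"p = {p:.3f}", sep="\n")
```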


Author(s): Kristen A. Berg, Jarrod E. Dalton, Douglas D. Gunzler, Claudia J. Coulton, Darcy A. Freedman, ...

2021 · Vol 10 (7) · p. 1401
Author(s): You-Ting Lin, Wei-Lun Huang, Hung-Pin Wu, Man-Ping Chang, Ching-Chu Chen

Heart failure (HF) is a common presentation in patients with type 2 diabetes mellitus (T2DM). Previous studies revealed that the HbA1c level is significantly associated with HF. However, little is known about the association between HbA1c variability and HF. We aimed to evaluate the association of the mean and variability of HbA1c with HF in patients with T2DM. Using Diabetes Share Care Program data, patients with T2DM who had a mean HbA1c (HbA1c-Mean) and HbA1c variability (tertiles of HbA1c-SD and HbA1c-adjSD) within 12–24 months during 2001–2008 were included. The cutoffs of HbA1c-Mean were set at <7%, 7–7.9%, and ≥8%. Hazard ratios (HRs) for HF during 2008–2018 were estimated using Cox proportional hazards models. A total of 3,824 patients were included, of whom 315 developed HF during the observation period of 11.72 years. The associated risk of HF increased with tertiles of HbA1c variability and cutoffs of HbA1c-Mean. In mutually adjusted models, HbA1c-Mean showed a consistent dose-response association with HF, while the association of HbA1c variability with HF disappeared. Among patients with HbA1c-Mean <7%, the associated risk of HF in patients with HbA1c variability in tertile 3 was comparable to that in patients with HbA1c-Mean ≥8%. In conclusion, mean HbA1c was an independent predictor of HF and was not explained by HbA1c variability. In addition to the absolute HbA1c level, targeting the stability of HbA1c in patients with good glycemic control is also important with respect to the development of HF in patients with T2DM.
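The exposure measures above (HbA1c-Mean, HbA1c-SD, HbA1c-adjSD) are per-patient summaries of repeated HbA1c measurements; the adjusted SD is commonly computed as SD / √(n/(n−1)) to account for differing numbers of visits. The sketch below assumes a hypothetical long-format file (hba1c_visits.csv) with patient_id and hba1c columns.

```python
# Sketch of per-patient HbA1c summaries: mean, SD, and SD adjusted for the
# number of measurements (adjSD = SD / sqrt(n / (n - 1))), then the exposure
# categories used in the study. File and column names are hypothetical.
import numpy as np
import pandas as pd

visits = pd.read_csv("hba1c_visits.csv")  # one row per HbA1c measurement

summary = visits.groupby("patient_id")["hba1c"].agg(
    hba1c_mean="mean", hba1c_sd="std", n="count"
)
summary["hba1c_adjsd"] = summary["hba1c_sd"] / np.sqrt(summary["n"] / (summary["n"] - 1))

# Mean cutoffs (<7%, 7-7.9%, >=8%) and variability tertiles, as above.
summary["mean_group"] = pd.cut(
    summary["hba1c_mean"], bins=[0, 7, 8, np.inf], right=False,
    labels=["<7%", "7-7.9%", ">=8%"],
)
summary["sd_tertile"] = pd.qcut(summary["hba1c_adjsd"], 3, labels=["T1", "T2", "T3"])
print(summary.head())
```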


2014 · Vol 2014 · pp. 1-9
Author(s): Diego Guerrieri, Luis Re, Jorgelina Petroni, Nella Ambrosi, Roxana E. Pilotti, ...

Background. Delayed graft function (DGF) remains an important problem after kidney transplantation and reduces long-term survival of the transplanted organ. The aim of the present study was to determine whether the development of DGF was associated with a specific pattern of inflammatory gene expression in expanded-criteria deceased-donor kidney transplantation. We also explored correlations between DGF risk factors and the profile that was found. Methods. Seven days after kidney transplant, a cDNA microarray was performed on graft biopsies from patients with and without DGF. Data were confirmed by real-time PCR. Correlations were performed between inflammatory gene expression and clinical risk factors. Results. Of a total of 84 genes analyzed, 58 were upregulated while only 1 was downregulated in patients with DGF compared with those without DGF (P=0.01). The most relevant gene fold changes observed were in IFNA1, IL-10, IL-1F7, IL-1R1, HMOX-1, and TGF-β. The results were confirmed for IFNA1, IL-1R1, HMOX-1, and TGF-β. A correlation was observed between TGF-β, donor age, and preablation creatinine, but not body mass index (BMI). TGF-β also showed an association with recipient age, while IFNA1 correlated with recipient BMI. Furthermore, TGF-β, IFNA1, and HMOX-1 correlated with several posttransplant kidney function markers, such as diuresis, ultrasound Doppler findings, and glycemia. Conclusions. Overall, the present study shows that DGF is associated with inflammatory markers, which are correlated with donor and recipient DGF risk factors.
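Real-time PCR confirmation of microarray hits typically uses the 2^−ΔΔCt relative-quantification method, and the risk-factor correlations can be computed with a rank correlation. The sketch below illustrates both; all Ct values and arrays are made-up placeholders, not the study's measurements.

```python
# Sketch of the 2^-ddCt fold-change calculation commonly used to confirm
# microarray results by real-time PCR, plus a Spearman correlation between
# expression and a clinical risk factor. All values are placeholders.
import numpy as np
from scipy.stats import spearmanr

def fold_change(ct_gene, ct_ref, ct_gene_ctrl, ct_ref_ctrl):
    """Relative expression via the 2^-ddCt method (Livak & Schmittgen)."""
    d_ct_sample = ct_gene - ct_ref            # normalize to reference gene
    d_ct_control = ct_gene_ctrl - ct_ref_ctrl
    return 2.0 ** -(d_ct_sample - d_ct_control)

# Placeholder Ct values for one DGF biopsy vs a no-DGF control (~3.2-fold).
print(fold_change(ct_gene=24.1, ct_ref=18.0, ct_gene_ctrl=26.0, ct_ref_ctrl=18.2))

# Placeholder arrays: TGF-beta expression vs donor age across biopsies.
tgfb = np.array([1.2, 2.5, 3.1, 1.8, 4.0, 2.2])
donor_age = np.array([34, 52, 61, 45, 67, 49])
rho, p = spearmanr(tgfb, donor_age)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```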

