Osmoregulation Performance and Kidney Transplant Outcome

2019 · Vol 30 (7) · pp. 1282-1293
Author(s): Manal Mazloum, Jordan Jouffroy, François Brazier, Christophe Legendre, Antoine Neuraz, et al.

Background: Kidney transplant recipients have an impaired ability to dilute urine but seldom develop baseline hyponatremia before ESRD. Although hyponatremia is a risk factor for adverse events in CKD and in kidney transplant recipients, it remains unclear whether subtler alterations in osmoregulation performance are associated with outcome.
Methods: We studied a single-center prospective cohort of 1258 kidney transplant recipients who underwent a water-loading test 3 months after transplant to determine osmoregulation performance. Measured GFR (mGFR) was performed at the same visit. A group of 164 healthy candidates for kidney donation served as controls. We further evaluated the association of osmoregulation performance with transplantation outcomes and subsequent kidney function.
Results: Unlike controls, most kidney transplant recipients failed to maintain plasma sodium during water loading (plasma sodium slope of −0.6±0.4 mmol/L per hour in transplant recipients versus −0.12±0.3 mmol/L per hour in controls; P<0.001). A steeper plasma sodium reduction during the test was independently associated with the composite outcome of all-cause mortality and allograft loss (hazard ratio [HR], 1.73 per 1 mmol/L per hour decrease in plasma sodium; 95% confidence interval [95% CI], 1.23 to 2.45; P=0.002) and with allograft loss alone (HR, 2.04 per 1 mmol/L per hour decrease in plasma sodium; 95% CI, 1.19 to 3.51; P=0.01). The association remained significant in a prespecified sensitivity analysis excluding patients with hyperglycemia. In addition, a steeper plasma sodium slope 3 months after transplantation independently correlated with lower mGFR at 12 months (β=1.93; 95% CI, 0.46 to 3.41; P=0.01).
Conclusions: Reduced osmoregulation performance occurs frequently in kidney transplant recipients and is an independent predictor of renal outcome.
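The study's osmoregulation measure is the slope of plasma sodium over time during water loading. The paper does not publish its exact computation, so the function and the serial measurements below are an illustrative assumption: a minimal least-squares sketch of how such a slope could be derived.

```python
# Hypothetical sketch: osmoregulation performance summarized as the slope of
# plasma sodium (mmol/L) against time (hours) during a water-loading test.
# The data points are invented for illustration, not taken from the study.

def sodium_slope(hours, sodium):
    """Ordinary least-squares slope of plasma sodium vs. time (mmol/L per hour)."""
    n = len(hours)
    mean_t = sum(hours) / n
    mean_na = sum(sodium) / n
    cov = sum((t - mean_t) * (na - mean_na) for t, na in zip(hours, sodium))
    var = sum((t - mean_t) ** 2 for t in hours)
    return cov / var

# Example: plasma sodium drifting from 140 to 138 mmol/L over a 3-hour test
slope = sodium_slope([0, 1, 2, 3], [140.0, 139.4, 138.7, 138.0])
# slope ≈ -0.67 mmol/L per hour, in the range reported for recipients
```

A steeper (more negative) slope on this scale corresponds to a higher hazard in the reported Cox models, since the HRs are expressed per 1 mmol/L per hour decrease.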

2019 · Vol 21 (2)
Author(s): Hillary Ndemera, Busisiwe R. Bhengu

Kidney transplantation is the cornerstone of renal replacement therapy in patients with end-stage renal failure. Despite improvements in short-term outcomes of renal transplantation, kidney allograft loss remains a major challenge. The aim of the study was to assess factors influencing the durability of transplanted kidneys among transplant recipients in South Africa. A descriptive cross-sectional study design was used. Random sampling was used to select 171 participants. Data were collected through structured face-to-face interviews developed from an in-depth review of the relevant literature, coded, and entered into SPSS software, version 24; the entered data were analysed using descriptive and inferential statistics. The results revealed that the average durability of transplanted kidneys was 9.07 years among the selected kidney transplant recipients in South Africa. Factors associated with the durability of transplanted kidneys included age, the sewerage system and strict immunosuppressive adherence (all P < .001), followed by the mode of transport (P = .001) and support system (P = .004). Other variables, including demographics, the healthcare system, medication and lifestyle-modification engagement, were not associated with the durability of transplanted kidneys. Understanding the factors influencing the durability of transplanted kidneys among kidney transplant recipients in South Africa is crucial: the study revealed associated factors and gaps that may contribute to kidney allograft loss, and it provides an opportunity for nephrology professionals to introduce specific interventions that promote prolonged graft durability. It is recommended that a specific intervention model be developed that targets South African kidney recipients, taking into account the significant variables in this study and the socio-economic status of the country.


Circulation · 2020 · Vol 142 (Suppl_3)
Author(s): Ryan S Cousins, Billy Mullinax, Lehman Godwin, Adam J Mitchell

Introduction: Screening for coronary artery disease in patients being considered for kidney transplant is common practice to stratify morbidity and mortality risk, but the optimal strategy, and its impact on outcomes, remains unclear. Here we test the hypothesis that myocardial perfusion imaging (MPI) abnormalities, left ventricular ejection fraction (LVEF), or coronary artery calcium (CAC) score are associated with all-cause mortality in potential kidney transplant recipients at Emory University Hospital (EUH).
Methods: In a retrospective chart review, we assessed the relationship between patient demographics, single-photon emission MPI results, and CAC scoring and post-evaluation outcomes at 5 years in consecutive patients referred for pre-transplant stress testing at EUH in 2015. Mann-Whitney U and chi-square tests assessed between-group differences in continuous and categorical variables, respectively. Multivariate analysis was performed using logistic regression models.
Results: During the study period, 589 patients (mean age 54 years, SEM 0.512; 58% male; 65% African American) underwent MPI, and 424 also underwent CAC scoring. Overall, 90 patients (15%) had an abnormal MPI (defined as any fixed or reversible defect) and 54 (9%) died during follow-up. Age (mean 53.2 years, SEM 0.533 vs. 57.7 years, SEM 1.73; p=0.008), previous coronary artery bypass graft (CABG) (2.06% vs. 7.41%, p=0.017), and post-evaluation myocardial infarction (MI) (4.11% vs. 18.5%, p<0.001) were associated with all-cause mortality. Age (p=0.032) and post-evaluation MI (p<0.001) remained significant in multivariate analysis. MPI abnormalities, LVEF, and CAC score were not associated with all-cause mortality.
Conclusions: Age and post-evaluation MI are associated with increased mortality in potential kidney transplant recipients referred for stress testing at EUH. We found no association between MPI abnormalities, LVEF, or CAC score and all-cause mortality.
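The categorical comparisons above (e.g. prior CABG in non-survivors vs. survivors) use Pearson's chi-square test of independence. As a minimal sketch, with invented counts rather than the study's data, the 2x2 statistic can be computed as:

```python
# Illustrative sketch of the Pearson chi-square statistic for a 2x2 table.
# Counts below are hypothetical, not taken from the EUH cohort.

def chi_square_2x2(table):
    """Chi-square statistic: sum of (observed - expected)^2 / expected."""
    row_totals = [sum(r) for r in table]
    col_totals = [sum(c) for c in zip(*table)]
    total = sum(row_totals)
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row_totals[i] * col_totals[j] / total
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

# Rows: exposure (e.g. prior CABG yes/no); columns: died / survived
stat = chi_square_2x2([[4, 8], [50, 527]])
# stat ≈ 8.59, above the 3.84 threshold for p < 0.05 with 1 degree of freedom
```

With small expected cell counts, as can happen for rare exposures like prior CABG, a continuity correction or Fisher's exact test would ordinarily be preferred; the abstract does not state which variant was used.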


2011 · Vol 35 (1) · pp. 17-23
Author(s): Sylvia E. Rosas, Peter P. Reese, Yonghong Huan, Cataldo Doria, Philip T. Cochetti, et al.

2020 · Vol 24 (12) · pp. 1177-1183
Author(s): Shufei Zeng, Torsten Slowinski, Wolfgang Pommer, Ahmed A. Hasan, Mohamed M. S. Gaballa, et al.

Background: Sclerostin is a hormone contributing to bone-vascular wall cross talk and has been implicated in cardiovascular events and mortality in patients with chronic kidney disease (CKD). We analyzed the relationship between sclerostin and mortality in renal transplant recipients.
Methods: 600 stable renal transplant recipients (367 men, 233 women) were followed for all-cause mortality for 3 years. Blood and urine samples for analysis and clinical data were collected at study entry. We performed Kaplan–Meier survival analysis and Cox regression models considering confounding factors such as age, eGFR, cold ischemia time, HbA1c, phosphate, calcium, and albumin. Optimal cut-off values for the Cox regression model were calculated based on ROC analysis.
Results: Sixty-five patients died during the observation period. Nonsurvivors (n = 65; sclerostin 57.31 ± 30.28 pmol/L) had higher plasma sclerostin levels than survivors (n = 535; sclerostin 47.52 ± 24.87 pmol/L) (p = 0.0036). The Kaplan–Meier curve showed that baseline plasma sclerostin concentrations were associated with all-cause mortality in stable kidney transplant recipients (p = 0.0085, log-rank test). After multiple Cox regression analysis, plasma sclerostin remained an independent predictor of all-cause mortality (hazard ratio, 1.011; 95% CI 1.002–1.020; p = 0.0137).
Conclusions: Baseline plasma sclerostin is an independent risk factor for all-cause mortality in patients after kidney transplantation.
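The optimal cut-off derived from ROC analysis is often chosen by maximizing Youden's J (sensitivity + specificity − 1); the abstract does not state the criterion used, so that choice, the function, and the data below are assumptions for illustration only.

```python
# Hypothetical sketch of picking a biomarker cut-off from ROC analysis via
# Youden's J. Values and outcomes below are invented, not the study's data.

def youden_cutoff(values, events):
    """Return the threshold (value >= cut positive) maximizing
    sensitivity + specificity - 1 over all observed candidate cuts.

    values: biomarker levels (e.g. plasma sclerostin, pmol/L)
    events: 1 if the outcome (death) occurred, else 0
    """
    positives = sum(events)
    negatives = len(events) - positives
    best_j, best_cut = -1.0, None
    for cut in sorted(set(values)):
        tp = sum(1 for v, e in zip(values, events) if v >= cut and e == 1)
        fp = sum(1 for v, e in zip(values, events) if v >= cut and e == 0)
        j = tp / positives + (negatives - fp) / negatives - 1
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut

# Illustrative pattern: higher sclerostin among non-survivors
cut = youden_cutoff([30, 40, 45, 50, 60, 70, 80], [0, 0, 0, 0, 1, 1, 1])
# best separation in this toy data is at 60 pmol/L
```

Dichotomizing at such a cut-off is what allows the Kaplan–Meier/log-rank comparison reported above, while the Cox HR of 1.011 per pmol/L treats sclerostin continuously.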


2020 · Vol 35 (8) · pp. 1436-1443
Author(s): Augustine W Kang, Andrew G Bostom, Hongseok Kim, Charles B Eaton, Reginald Gohh, et al.

Background: Insufficient physical activity (PA) may increase the risk of all-cause mortality and cardiovascular disease (CVD) morbidity and mortality among kidney transplant recipients (KTRs), but limited research is available. We examine the relationship between PA and the development of CVD events, CVD death and all-cause mortality among KTRs.
Methods: A total of 3050 KTRs enrolled in an international homocysteine-lowering randomized controlled trial were examined (38% female; mean age 51.8 ± 9.4 years; 75% white; 20% with prevalent CVD). PA was measured at baseline using a modified Yale Physical Activity Survey and divided into tertiles (T1, T2 and T3) from lowest to highest PA. Kaplan–Meier survival curves were used to graph the risk of events; Cox proportional hazards regression models examined the association of baseline PA levels with CVD events (e.g. stroke, myocardial infarction), CVD mortality and all-cause mortality over time.
Results: Participants were followed for up to 2500 days (mean 3.7 ± 1.6 years). The cohort experienced 426 CVD events and 357 deaths. Fully adjusted models revealed that, compared with the lowest tertile of PA, the highest tertile experienced a significantly lower risk of CVD events {hazard ratio [HR] 0.76 [95% confidence interval (CI) 0.59–0.98]}, CVD mortality [HR 0.58 (95% CI 0.35–0.96)] and all-cause mortality [HR 0.76 (95% CI 0.59–0.98)]. Results were similar in unadjusted models.
Conclusions: Higher PA was associated with a reduced risk of CVD events and all-cause mortality among KTRs. These associations, observed in a large international sample even when controlling for traditional CVD risk factors, indicate the potential importance of PA in reducing CVD and death among KTRs.
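The Kaplan–Meier curves used to graph event risk by PA tertile rest on the product-limit estimator, which drops the survival probability at each observed event time while treating censored follow-up as still at risk until censoring. A toy sketch, with invented follow-up data rather than the trial's:

```python
# Minimal Kaplan-Meier (product-limit) estimator sketch. Times and event
# flags are illustrative only; one such curve would be drawn per PA tertile.

def kaplan_meier(times, events):
    """Return [(time, survival)] stepping down at each observed event time.

    times: follow-up in days; events: 1 = event observed, 0 = censored.
    Ties of events and censorings at the same time count both as at risk
    at that time (events conventionally precede censoring).
    """
    at_risk = len(times)
    surv = 1.0
    curve = []
    for t in sorted(set(times)):
        deaths = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        if deaths:
            surv *= 1 - deaths / at_risk
            curve.append((t, surv))
        at_risk -= sum(1 for ti in times if ti == t)
    return curve

curve = kaplan_meier([100, 200, 200, 400, 500], [1, 1, 0, 1, 0])
# survival steps: ~0.8 at day 100, ~0.6 at day 200, ~0.3 at day 400
```

The Cox models reported above then compare tertiles on the hazard scale, adjusting for covariates the nonparametric curves cannot.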

