Association of Circulating Trimethylamine N-Oxide and Its Dietary Determinants with the Risk of Kidney Graft Failure: Results of the TransplantLines Cohort Study

Nutrients ◽  
2021 ◽  
Vol 13 (1) ◽  
pp. 262
Author(s):  
Jose L. Flores-Guerrero ◽  
Maryse C. J. Osté ◽  
Paula B. Baraldi ◽  
Margery A. Connelly ◽  
Erwin Garcia ◽  
...  

Background. Due to the critical shortage of kidneys for transplantation, the identification of modifiable factors related to graft failure is highly desirable. The role of trimethylamine-N-oxide (TMAO) in graft failure remains undetermined. Here, we investigated the clinical utility of TMAO and its dietary determinants for graft failure prediction in renal transplant recipients (RTRs). Methods. We included 448 RTRs who participated in the TransplantLines Cohort Study. Cox proportional-hazards regression analyses were performed to study the association of plasma TMAO with graft failure. Net Benefit, a decision-analysis method, was used to evaluate the clinical utility of TMAO and dietary information in the prediction of graft failure. Results. Among RTRs (age 52.7 ± 13.1 years; 53% males), the baseline median TMAO was 5.6 (3.0–10.2) µmol/L. In multivariable regression analysis, the most important dietary determinants of TMAO were egg intake (Std. β = 0.09 [95% CI: 0.01, 0.18]; p = 0.03), fiber intake (Std. β = −0.14 [95% CI: −0.22, −0.05]; p = 0.002), and fish and seafood intake (Std. β = 0.12 [95% CI: 0.03, 0.21]; p = 0.01). After a median follow-up of 5.3 (4.5–6.0) years, graft failure was observed in 58 subjects. TMAO was associated with an increased risk of graft failure, independent of age, sex, body mass index (BMI), blood pressure, lipids, albuminuria, and estimated glomerular filtration rate (eGFR) (hazard ratio per 1-SD increase in TMAO, 1.62; 95% confidence interval (CI): 1.22–2.14; p < 0.001). A prediction model enhanced with TMAO and dietary information offered approximately double the Net Benefit of a previously reported, validated prediction model for future graft failure, allowing the detection of 21 RTRs per 100 RTRs tested with no false positives, versus 10 RTRs for the reference model. Conclusions.
A predictive model for graft failure, enriched with TMAO and its dietary determinants, yielded a higher Net Benefit compared with an already validated model. This study suggests that TMAO and its dietary determinants are associated with an increased risk of graft failure and that this association is clinically meaningful.
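The Net Benefit comparison above has a simple closed form (decision curve analysis in the style of Vickers and Elkin). The sketch below is illustrative only: the risk threshold is an assumed value, not one reported in the abstract, and the per-100-RTR figures are taken from the text above.

```python
def net_benefit(tp, fp, n, threshold):
    """Net Benefit of a prediction model at a given risk threshold.

    NB = TP/n - (FP/n) * pt/(1 - pt): net true positives per person
    tested, with false positives weighted by the odds of the
    threshold probability pt.
    """
    weight = threshold / (1.0 - threshold)
    return tp / n - (fp / n) * weight

# Figures quoted in the abstract: per 100 RTRs tested, the enriched
# model detects 21 future graft failures with no false positives,
# versus 10 for the reference model. The threshold of 0.2 is an
# assumption for illustration.
nb_enriched = net_benefit(tp=21, fp=0, n=100, threshold=0.2)   # 0.21
nb_reference = net_benefit(tp=10, fp=0, n=100, threshold=0.2)  # 0.10
```

With no false positives the Net Benefit reduces to the true-positive rate per person tested, which is how "approximately double the Net Benefit" follows from 21 versus 10 detections per 100.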

2021 ◽  
Vol 36 (Supplement_1) ◽  
Author(s):  
Jose L Flores-Guerrero ◽  
Maryse C.J. Osté ◽  
Paula B Baraldi ◽  
Margery A Connelly ◽  
Erwin Garcia ◽  
...  

Abstract Background and Aims Due to the critical shortage of kidneys for transplantation, the identification of modifiable factors related to graft failure is highly desirable. The role of trimethylamine-N-oxide (TMAO) in graft failure remains undetermined. Here we aimed to investigate the clinical utility of TMAO and its dietary determinants for graft failure prediction in renal transplant recipients (RTRs). Method We included 448 RTRs who participated in the TransplantLines Cohort Study. Cox proportional-hazards regression analyses were performed to study the association of plasma TMAO with graft failure. Net Benefit, a decision-analysis method, was used to evaluate the clinical utility of TMAO and dietary information in the prediction of graft failure. Results Among RTRs (age 52.7 ± 13.1 years; 53% males), baseline median TMAO was 5.6 (3.0–10.2) µmol/L. In multivariable regression analysis, the most important dietary determinants of TMAO were egg intake (Std. β = 0.10 [95% CI: 0.01, 0.19]; P = 0.03), fiber intake (Std. β = −0.13 [95% CI: −0.22, −0.05]; P = 0.002), and fish and seafood intake (Std. β = 0.11 [95% CI: 0.02, 0.20]; P = 0.01). After a median follow-up of 5.3 (4.5–6.0) years, graft failure was observed in 58 subjects. TMAO was associated with an increased risk of graft failure, independent of age, sex, BMI, blood pressure, lipids, albuminuria, and eGFR (hazard ratio per 1-SD increase in TMAO, 1.62; 95% confidence interval (CI): 1.22–2.14; P < 0.001). A prediction model enhanced with TMAO and dietary information offered approximately double the Net Benefit of a previously reported, validated prediction model for future graft failure, allowing the detection of 21 RTRs per 100 RTRs tested with no false positives, versus 10 RTRs for the reference model. Conclusion A predictive model for graft failure, enriched with TMAO and its dietary determinants, yielded a higher Net Benefit compared with an already validated model.
This study suggests that TMAO and its dietary determinants are associated with an increased risk of graft failure and that this association is clinically meaningful.


2021 ◽  
Vol 8 ◽  
pp. 205435812098537
Author(s):  
Kyla L. Naylor ◽  
Gregory A. Knoll ◽  
Eric McArthur ◽  
Amit X. Garg ◽  
Ngan N. Lam ◽  
...  

Background: The frequency and outcomes of starting maintenance dialysis as a hospital inpatient among kidney transplant recipients with graft failure are poorly understood. Objective: To determine the frequency of inpatient dialysis starts in patients with kidney graft failure and to examine whether dialysis start status (hospital inpatient vs outpatient setting) is associated with all-cause mortality and kidney re-transplantation. Design: Population-based cohort study. Setting: We used linked administrative healthcare databases from Ontario, Canada. Patients: We included 1164 patients with kidney graft failure from 1994 to 2016. Measurements: All-cause mortality and kidney re-transplantation. Methods: The cumulative incidence function was used to calculate the cumulative incidence of all-cause mortality and kidney re-transplantation, accounting for competing risks. Subdistribution hazard ratios from the Fine and Gray model were used to examine the relationship between inpatient dialysis start (vs outpatient dialysis start [reference]) and the dependent variables (ie, mortality or re-transplantation). Results: More than half (55.8%) of patients with kidney graft failure initiated dialysis as an inpatient. Compared with outpatient dialysis starters, inpatient dialysis starters had a significantly higher cumulative incidence of mortality and a significantly lower cumulative incidence of kidney re-transplantation (P < .001). The 10-year cumulative incidence of mortality was 51.9% (95% confidence interval [CI]: 47.4, 56.9%) for inpatient starters and 35.3% (95% CI: 31.1, 40.1%) for outpatient starters. After adjusting for clinical characteristics, inpatient dialysis starters had a significantly increased hazard of mortality in the first year after graft failure (hazard ratio: 2.18 [95% CI: 1.43, 3.33]), but beyond 1 year there was no significant difference between groups.
Limitations: Possibility of residual confounding and inability to determine which inpatient dialysis starts were unavoidable. Conclusions: In this study, most patients with kidney graft failure initiated dialysis as inpatients, and an inpatient start was associated with an increased risk of mortality. Further research is needed to better understand the reasons for an inpatient dialysis start in this patient population.
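Fine and Gray subdistribution-hazards regression itself requires specialised software, but the nonparametric cumulative incidence function it builds on can be sketched directly. The following Aalen–Johansen-style estimator is a minimal illustration on made-up toy data, not the study's, assuming event code 0 = censored and positive codes for the competing event types (e.g. death vs re-transplantation).

```python
def cumulative_incidence(times, events, event_of_interest=1):
    """Nonparametric cumulative incidence function (CIF) for one
    event type in the presence of competing risks.

    times  : observation times
    events : 0 = censored; 1, 2, ... = competing event codes
    Returns a list of (time, CIF) pairs: at each event time,
    CIF increments by S(t-) * d_interest / n_at_risk, where S(t-)
    is the overall event-free survival just before t.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0   # overall event-free survival just before t
    cum = 0.0
    out = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d_int = d_all = censored = 0
        # Pool all observations tied at time t.
        while i < len(data) and data[i][0] == t:
            code = data[i][1]
            if code == 0:
                censored += 1
            else:
                d_all += 1
                if code == event_of_interest:
                    d_int += 1
            i += 1
        cum += surv * d_int / n_at_risk
        surv *= 1.0 - d_all / n_at_risk
        out.append((t, cum))
        n_at_risk -= d_all + censored
    return out

# Toy data: code 1 = death, 2 = re-transplantation, 0 = censored.
cif_death = cumulative_incidence([1, 2, 3, 4], [1, 2, 1, 0],
                                 event_of_interest=1)
```

Unlike a naive 1 − Kaplan–Meier for death alone, this estimator removes re-transplanted patients from the risk set without treating them as censored, which is the point of the competing-risks analysis described above.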


2020 ◽  
Vol 35 (Supplement_3) ◽  
Author(s):  
Rianne M. Douwes ◽  
Joanna Sophia Jacoline Vinke ◽  
António W Gomes-Neto ◽  
Hans Blokzijl ◽  
Stefan P Berger ◽  
...  

Abstract Background and Aims Use of proton-pump inhibitors (PPIs) is common practice in renal transplant recipients (RTRs). Emerging data suggest several adverse effects of PPI use, including the development of iron deficiency (ID). Although this association has been shown for PPIs as a class, specific analyses of the risk of ID associated with individual types of PPIs have not been performed. Method We used data from the TransplantLines Biobank and Cohort Study, an ongoing prospective cohort study among all types of solid organ transplant recipients. For the current study, we used data from stable RTRs with a functioning graft for more than 1 year post transplantation (n = 795). We excluded RTRs who used any form of iron supplementation (n = 54) or EPO-stimulating agents (n = 24), resulting in 728 RTRs eligible for analyses. Use of PPIs was subdivided by type, i.e. omeprazole, esomeprazole, pantoprazole, and rabeprazole. ID was defined as TSAT < 20% and ferritin < 300 µg/L. Logistic regression analysis was used to assess the associations between PPIs and ID. Results We included 728 RTRs (age 56 ± 13 years, 61% males), with a mean eGFR of 53 ± 18 ml/min/1.73 m2, a median [interquartile range] ferritin level of 96 (44–191) µg/L, and a mean TSAT of 24 ± 10%. PPIs were used by 504 (69%) of the included RTRs, of whom 398 (79%), 55 (11%), 49 (10%), and 2 (0.4%) used omeprazole, pantoprazole, esomeprazole, and rabeprazole, respectively. Use of PPIs was strongly associated with ID (OR, 2.20; 95% CI 1.48–3.28; P < 0.001), independent of adjustment for age, sex, BMI, eGFR, hs-CRP, smoking, alcohol use, and use of calcineurin inhibitors, prednisolone, antiplatelet drugs, and antihypertensives.
When subdividing the PPIs by type, both omeprazole (OR, 1.98; 95% CI 1.39–2.83; P < 0.001) and esomeprazole (OR, 2.11; 95% CI 1.09–4.07; P = 0.03) were independently associated with ID, whereas pantoprazole was not (OR, 0.89; 95% CI 0.47–1.70; P = 0.73). Conclusion Omeprazole and esomeprazole, but not pantoprazole, are associated with an increased risk of ID. Our results are in line with previous reports that pantoprazole has the lowest potency and produces the smallest increase in intragastric pH, thereby possibly interfering less with the reduction of ferric to ferrous iron and, consequently, with iron absorption. Future studies are warranted to confirm our present findings.
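The odds ratios above come from multivariable logistic regression. As a minimal, unadjusted illustration of how an OR and its Wald confidence interval are formed, consider a 2×2 exposure-by-outcome table; the counts below are hypothetical, not the study's.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Wald confidence interval from a
    2x2 table:
      a = exposed cases,   b = exposed non-cases,
      c = unexposed cases, d = unexposed non-cases.
    The CI is computed on the log-odds scale and exponentiated.
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: ID in 20 of 100 PPI users vs 10 of 100
# non-users.
or_ppi, lo, hi = odds_ratio_ci(20, 80, 10, 90)  # OR = 2.25
```

Multivariable logistic regression generalises this: each adjusted OR in the abstract is the exponentiated coefficient of a PPI indicator in a model that also includes the listed covariates.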


Author(s):  
Camilo G. Sotomayor ◽  
Dion Groothof ◽  
Joppe J. Vodegel ◽  
Michele F. Eisenga ◽  
Tim J. Knobbe ◽  
...  

2020 ◽  
Vol 2020 ◽  
pp. 1-13 ◽  
Author(s):  
Xiangtong Liu ◽  
Zhiwei Li ◽  
Jingbo Zhang ◽  
Shuo Chen ◽  
Lixin Tao ◽  
...  

Background. Sleep duration is associated with type 2 diabetes (T2D). However, few T2D risk scores include sleep duration. We aimed to develop T2D risk scores containing sleep duration and to estimate the additive value of sleep duration. Methods. We used data from 43,404 adults without T2D in the Beijing Health Management Cohort Study. The participants were surveyed approximately every 2 years from 2007/2008 to 2014/2015. Sleep duration was calculated from the self-reported usual times of going to bed and waking up at baseline. Logistic regression was employed to construct the risk scores. Integrated discrimination improvement (IDI) and net reclassification improvement (NRI) were used to estimate the additional value of sleep duration. Results. After a median follow-up of 6.8 years, we recorded 2623 (6.04%) new cases of T2D. Shorter sleep durations (both 6–8 h/night and <6 h/night) were associated with an increased risk of T2D (odds ratio [OR] = 1.43, 95% confidence interval [CI] = 1.30–1.59; OR = 1.98, 95% CI = 1.63–2.41, respectively) compared with a sleep duration of >8 h/night in the adjusted model. Seven variables, including age, education, waist-hip ratio, body mass index, parental history of diabetes, fasting plasma glucose, and sleep duration, were selected to form the comprehensive score; the C-index was 0.74 (95% CI: 0.71–0.76) for the test set. The IDI and NRI values for sleep duration were 0.017 (95% CI: 0.012–0.022) and 0.619 (95% CI: 0.518–0.695), respectively, suggesting good improvement in the predictive ability of the comprehensive nomogram. The decision curves showed that women and individuals older than 50 years had greater net benefit. Conclusions. The performance of the T2D risk scores developed in this study could be improved by including estimated sleep duration, particularly in women and individuals older than 50 years.
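The IDI reported above summarises how much adding a predictor widens the gap between the average predicted risks of cases and non-cases. A minimal sketch of the IDI computation, on made-up predicted probabilities rather than the study's data:

```python
def idi(p_old, p_new, y):
    """Integrated Discrimination Improvement.

    p_old, p_new : predicted event probabilities from the baseline
                   and the extended model for the same subjects
    y            : observed outcomes (1 = event, 0 = no event)
    IDI = (mean risk gain among events)
          - (mean risk gain among non-events).
    """
    ev_old = [p for p, yi in zip(p_old, y) if yi == 1]
    ev_new = [p for p, yi in zip(p_new, y) if yi == 1]
    ne_old = [p for p, yi in zip(p_old, y) if yi == 0]
    ne_new = [p for p, yi in zip(p_new, y) if yi == 0]
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(ev_new) - mean(ev_old)) - (mean(ne_new) - mean(ne_old))

# Toy example: the extended model raises predicted risk for the two
# cases and lowers it for the two non-cases, so the IDI is positive.
example_idi = idi(p_old=[0.2, 0.3, 0.1, 0.2],
                  p_new=[0.4, 0.5, 0.05, 0.1],
                  y=[1, 1, 0, 0])
```

A positive IDI, as in the abstract's 0.017 for sleep duration, means the extended model assigns higher average risk to those who develop T2D and lower average risk to those who do not.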


2017 ◽  
Vol 31 (2) ◽  
pp. 220-229 ◽  
Author(s):  
Kim L. W. Bunthof ◽  
Carmen M. Verhoeks ◽  
Jan A. J. G. van den Brand ◽  
Luuk B. Hilbrands

2009 ◽  
Vol 23 (5) ◽  
pp. 465-475 ◽  
Author(s):  
Mario Rotondi ◽  
Giuseppe Stefano Netti ◽  
Elena Lazzeri ◽  
Giovanni Stallone ◽  
Elisabetta Bertoni ◽  
...  

2020 ◽  
Vol 10 (1) ◽  
Author(s):  
Vatsa Dave ◽  
Kevan R. Polkinghorne ◽  
Khai Gene Leong ◽  
John Kanellis ◽  
William R. Mulley

Abstract The evidence supporting an initial mycophenolate mofetil (MMF) dose of 2 g daily in tacrolimus-treated renal transplant recipients is limited. In a non-contemporaneous single-centre cohort study, we compared the incidence of leukopaenia, rejection, and graft dysfunction in patients initiated on MMF 1.5 g and 2 g daily. Baseline characteristics and tacrolimus trough levels were similar between MMF groups. MMF doses became equivalent between groups by 12 months post-transplant, driven by dose reductions in the 2 g group. Leukopaenia occurred in 42.4% of patients by 12 months post-transplant. MMF 2 g was associated with a 1.80-fold increased risk of leukopaenia compared to 1.5 g. Rejection occurred in 44.8% of patients by 12 months post-transplantation. MMF 2 g was associated with half the risk of rejection relative to MMF 1.5 g. Over the first 7 years post-transplantation there was no difference in renal function between groups. Additionally, the development of leukopaenia or rejection did not result in reduced renal function at 7 years post-transplant. Leukopaenia was not associated with an increased incidence of serious infections or rejection. This study demonstrates that the initial MMF dose has implications for the incidence of leukopaenia and rejection. Since neither dose produced superior long-term graft function, clinical equipoise remains regarding the optimal initial mycophenolate dose in tacrolimus-treated renal transplant recipients.


2012 ◽  
Vol 12 (6) ◽  
pp. 1618-1623 ◽  
Author(s):  
H. G. Otten ◽  
M. C. Verhaar ◽  
H. P. E. Borst ◽  
R. J. Hené ◽  
A. D. van Zuilen
