Low Graft Function and Ongoing Hyperparathyroidism Are Closely Related to Post-Transplantation Osteoporosis

2013 · Vol 45 (4) · pp. 1562-1566
Author(s): E. Tutal, M.E. Uyar, T. Colak, Z. Bal, B.G. Demirci, ...
2019 · Vol 41 (2) · pp. 284-287
Author(s): Pedro Guilherme Coelho Hannun, Luis Gustavo Modelli de Andrade

Abstract Introduction: The prediction of post-transplantation outcomes is clinically important but involves several problems. Current prediction models based on standard statistics are very complex, difficult to validate, and do not provide accurate predictions. Machine learning, a statistical technique that allows a computer to make future predictions from previous experience, is beginning to be used to address these issues. In the field of kidney transplantation, computational forecasting has been reported for the prediction of chronic allograft rejection, delayed graft function, and graft survival. This paper describes machine learning principles, outlines the steps needed to make a prediction, and briefly analyses its most recent applications in the literature. Discussion: There is compelling evidence that machine learning approaches based on donor and recipient data provide better prognoses of graft outcomes than traditional analysis. The immediate expectation for this prediction modelling technique is that it will support better clinical decisions based on dynamic and local practice data and optimise organ allocation as well as post-transplantation care management. Despite the promising results, there is not yet a substantial number of studies establishing the feasibility of its application in a clinical setting. Conclusion: The way we store data in electronic health records will change radically in the coming years, and machine learning will become part of the daily clinical routine, whether to predict clinical outcomes or to suggest diagnoses based on institutional experience.
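As an illustration of the kind of modelling this abstract describes, the sketch below fits a plain logistic regression by gradient descent to predict delayed graft function from a few donor/recipient features. Everything here is assumed for the example: the features, the generative rule, and the cohort are synthetic and carry no clinical meaning.

```python
import math
import random

random.seed(0)

# Synthetic, illustrative cohort: each record holds three invented
# donor/recipient features and a binary label for delayed graft function (DGF).
def make_record():
    donor_age = random.uniform(20, 70)      # years
    cold_ischemia = random.uniform(5, 30)   # hours
    recipient_bmi = random.uniform(18, 40)  # kg/m^2
    # Invented generative rule: older donors and longer cold ischemia raise risk.
    logit = (0.08 * (donor_age - 45)
             + 0.2 * (cold_ischemia - 15)
             + 0.02 * (recipient_bmi - 25) - 0.3)
    label = 1 if random.random() < 1 / (1 + math.exp(-logit)) else 0
    return [donor_age, cold_ischemia, recipient_bmi], label

records = [make_record() for _ in range(1000)]

# Standardise features so a single learning rate suits all of them.
n, k = len(records), 3
means = [sum(r[0][i] for r in records) / n for i in range(k)]
stds = [(sum((r[0][i] - means[i]) ** 2 for r in records) / n) ** 0.5 for i in range(k)]
X = [[(r[0][i] - means[i]) / stds[i] for i in range(k)] for r in records]
y = [r[1] for r in records]

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

# Plain logistic regression fitted by full-batch gradient descent.
w, b, lr = [0.0] * k, 0.0, 1.0
for _ in range(300):
    gw, gb = [0.0] * k, 0.0
    for xi, yi in zip(X, y):
        err = sigmoid(sum(wi * v for wi, v in zip(w, xi)) + b) - yi
        for i in range(k):
            gw[i] += err * xi[i]
        gb += err
    w = [wi - lr * gi / n for wi, gi in zip(w, gw)]
    b -= lr * gb / n

preds = [1 if sigmoid(sum(wi * v for wi, v in zip(w, xi)) + b) >= 0.5 else 0 for xi in X]
accuracy = sum(p == t for p, t in zip(preds, y)) / n
print(f"training accuracy: {accuracy:.2f}")
```

In practice the studies the abstract surveys use richer feature sets and stronger learners (random forests, gradient boosting), but the train-then-predict loop is the same.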


Author(s): Antonia Margarete Schuster, N. Miesgang, L. Steines, C. Bach, B. Banas, ...

Abstract The B cell activating factor BAFF has gained importance in the context of kidney transplantation because of its role in B cell survival. Studies have shown that BAFF correlates with an increased incidence of antibody-mediated rejection and with the development of donor-specific antibodies (DSA). In this study, we analyzed a defined cohort of kidney transplant recipients who were treated with standardized immunosuppressive regimens according to their immunological risk profile. The aim was to add BAFF as an awareness marker in the post-transplantation course, taking each patient's individual immunological risk profile into account. Included patients were transplanted between 2016 and 2018. Baseline data, graft function, rejection episodes, signs of microvascular infiltration, and DSA kinetics were recorded over 3 years. BAFF levels were determined 14 days, 3 months, and 12 months post transplantation. Although no difference in graft function was observed, medium-risk patients showed a clear dynamic in their BAFF levels, with low levels shortly after transplantation and an increase of 123% over the course of 1 year. Patients with high BAFF values were more susceptible to rejection, especially antibody-mediated rejection, and displayed intensified microvascular inflammation; the combination of high BAFF and DSA put patients at particular risk. The changing BAFF kinetics of the medium-risk group, together with the increased occurrence of rejection at high BAFF values, support the use of BAFF as an awareness factor. To compensate for the changing immunological risk, a switch from a weaker induction therapy to an intensified maintenance therapy is required.


2021 · Vol 135 (23) · pp. 2607-2618
Author(s): Laurie Bruzzese, Gwénaël Lumet, Donato Vairo, Claire Guiol, Régis Guieu, ...

Abstract Ischaemia–reperfusion injury (IRI) is a major cause of acute kidney injury (AKI) and chronic kidney disease, involving cellular damage and renal dysfunction. AKI is a major complication of particular concern after cardiac surgery and, to a lesser degree, after organ transplantation in the immediate post-transplantation period, where it leads to delayed graft function. Because effective therapies are still unavailable, several recent studies have explored the potential benefit of hypoxic preconditioning (HPC) for IRI. HPC refers to the acquisition of increased organ tolerance to subsequent ischaemic or severe hypoxic injury, and experimental evidence suggests a potential benefit. There are three experimental forms of HPC, which, for clarity, we name as follows: physical HPC, HPC via treated-cell administration, and stabilised hypoxia-inducible factor (HIF)-1α HPC (mimicked HPC). The purpose of this review is to present the latest developments in the literature on HPC in the context of renal IRI in pre-clinical models. The data we compiled suggest that preconditional activation of hypoxia pathways protects against renal IRI, and that HPC could therefore be used in the treatment of renal IRI in transplantation.


2019 · Vol 13 (6) · pp. 1068-1076
Author(s): Nuria Montero, Maria Quero, Emma Arcos, Jordi Comas, Inés Rama, ...

Abstract Background: Obese kidney allograft recipients have worse outcomes in kidney transplantation (KT). However, there is a lack of information on the effect of body mass index (BMI) variation after KT. The objective of this study was to evaluate the effects of body weight changes in obese kidney transplant recipients. Methods: We used data from the Catalan Renal Registry, which included KT recipients from 1990 to 2011 (n = 5607). The annual change in post-transplantation BMI was calculated. The main outcome variables were delayed graft function (DGF), estimated glomerular filtration rate (eGFR), and patient and graft survival. Results: Obesity was observed in 609 patients (10.9%) at the time of transplantation. The incidence of DGF was significantly higher in obese patients (40.4% versus 28.3%; P < 0.001). Baseline obesity was significantly associated with worse short- and long-term graft survival (P < 0.05) and worse graft function during follow-up (P < 0.005). BMI variations in obese patients did not improve eGFR or graft or patient survival. Conclusions: In obese patients, decreasing body weight after KT improves neither short-term graft outcomes nor long-term renal function.
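The BMI bookkeeping described in the methods reduces to two small formulas, shown below for a hypothetical recipient (all numbers invented for illustration; the obesity cut-off of BMI ≥ 30 kg/m² is the standard WHO threshold).

```python
def bmi(weight_kg, height_m):
    """Body-mass index in kg/m^2."""
    return weight_kg / height_m ** 2

def annual_bmi_change(bmi_start, bmi_end, years):
    """Average change in BMI per year of follow-up."""
    return (bmi_end - bmi_start) / years

# Hypothetical obese recipient: 95 kg at transplant, 88 kg three years later, 1.70 m tall.
b0 = bmi(95, 1.70)                    # ~32.9, above the obesity threshold of 30
b3 = bmi(88, 1.70)                    # ~30.4
delta = annual_bmi_change(b0, b3, 3)  # negative: weight loss after KT
print(f"baseline BMI {b0:.1f}, year-3 BMI {b3:.1f}, change {delta:+.2f} per year")
```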


2020 · Vol 35 (Supplement_3)
Author(s): Vladimir Hanzal, Janka Slatinska, Petra Hruba, Ondrej Viklicky

Abstract Background and Aims: Cytomegalovirus (CMV) disease and infection negatively influence the outcome of kidney transplantation. The aim of this retrospective study was to analyze risk factors for CMV disease and its influence on kidney graft function and survival. Method: 1050 patients underwent kidney transplantation from January 2014 to December 2018 and received calcineurin inhibitor, mycophenolate mofetil and steroid-based immunosuppression. Recipients with PRA > 20% received rATG as induction, while the others received basiliximab. 825 of the 1050 patients (78.6%) received CMV prophylaxis (D+/R-, n=173; R+, n=652). Patients were followed for up to 71 months (median 38 months). Results: CMV tissue-invasive disease occurred in 49 of 1050 patients (4.7%) and CMV infection in 87 patients (8.3%). CMV disease, but not CMV infection, had a significant negative influence on graft survival at 5 years post transplantation (p=0.0029). Patients with CMV disease also had significantly worse graft function at 4 years post transplantation (p<0.0001). CMV disease occurred in 31 of 173 patients (17.9%) in the D+/R- group vs. 18 of 652 patients (2.8%) in the R+ group. The incidence of CMV infection was 30/173 (17.3%) in the D+/R- group vs. 57/652 (8.7%) with induction therapy. CMV prophylaxis was shortened in 82 patients (9.9%). Leukopenia (≤ 2.0 × 10⁹/L) was observed in 97 (11.7%) of the patients who received CMV prophylaxis. Shortening of prophylaxis significantly increased the risk of both CMV infection (20.7% vs. 7.2%, p<0.0001) and CMV disease (8.5% vs. 4.2%, p=0.04). The most significant risk factors for CMV disease in univariable analysis were CMV mismatch (OR 11, 95% CI: 5.9-20.4; p<0.0001), delayed graft function (OR 2.8, 95% CI: 1.6-5.1; p<0.0001), cadaveric donor (OR 6, 95% CI: 1.5-25.1; p=0.00013) and shortening of CMV prophylaxis (OR 2.1, 95% CI: 0.91-4.86; p=0.08).
Multivariable analysis identified DGF (OR 2.29, 95% CI: 1.2-4.3; p=0.01) and CMV mismatch (OR 10.8, 95% CI: 5.7-20.6; p<0.0001) as independent significant predictors of CMV disease, in a model adjusted for donor type, prophylaxis shortening and leukopenia. Conclusion: CMV mismatch is the main independent predictor of CMV disease after kidney transplantation in multivariable analysis.
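Odds ratios like those quoted above can be illustrated with the standard 2×2-table formula and a Woolf/Wald confidence interval. The sketch below applies it to the CMV-mismatch counts reported in the abstract; note that this crude estimate (~7.7) is not expected to reproduce the reported univariable OR of 11 exactly, since the published figure presumably comes from the authors' own model.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Wald (Woolf) 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Counts from the abstract: CMV disease in 31/173 D+/R- vs. 18/652 R+ patients.
or_, lo, hi = odds_ratio_ci(31, 173 - 31, 18, 652 - 18)
print(f"crude OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```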


2015
Author(s): Laurent Mesnard, Thangamani Muthukumar, Maren Burbach, Carol Li, Huimin Shang, ...

BACKGROUND: Kidney transplantation is the treatment of choice for most patients with end-stage renal disease, and existing data suggest that post-transplant graft function is a predictor of kidney graft failure. METHODS: Exome sequencing of DNA from kidney graft recipients and their donors was used to determine recipient–donor mismatches at the amino acid level. The number of mismatches that are more likely to induce an immune response in the recipient was estimated computationally and designated the allogenomics mismatch score. The relationship between the allogenomics score and post-transplant kidney allograft function was examined using linear regression. RESULTS: A significant inverse correlation between the allogenomics mismatch score and kidney graft function at 36 months post transplantation was observed in a discovery cohort of kidney recipient–donor pairs (r² ≥ 0.57, P < 0.05, for the score vs. serum creatinine level or estimated glomerular filtration rate). This relationship was confirmed in an independent validation cohort of kidney recipient–donor pairs. The strength of the correlation increased with time post-transplantation, and the inverse correlation remained after excluding HLA loci from the calculation of the score. Exome sequencing yielded allogenomics scores that correlated more strongly with graft function than simulated genotyping assays measuring only common polymorphisms. CONCLUSIONS: The allogenomics mismatch score, derived by exome sequencing of recipient–donor pairs, quantifies the histoincompatibility between organ donor and recipient that affects long-term post-transplant graft function. As a prognostic biomarker, it may help identify patients at risk of graft failure.
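The scoring idea can be sketched as counting donor amino-acid variants the recipient lacks, then regressing graft function on that count. The variant identifiers and eGFR values below are hypothetical, and this is not the authors' actual pipeline; it is a minimal illustration of the score-versus-function regression the abstract describes.

```python
# Hypothetical variant sets: the score counts donor protein variants that the
# recipient does not carry and could therefore recognise as foreign.
def allogenomics_score(donor_variants, recipient_variants):
    """Number of donor amino-acid variants absent from the recipient."""
    return len(set(donor_variants) - set(recipient_variants))

# Hypothetical pairs: (donor variants, recipient variants, eGFR at 36 months).
pairs = [
    ({"A1", "B2", "C3", "D4"}, {"A1"}, 38.0),
    ({"A1", "B2", "C3"}, {"A1", "B2"}, 55.0),
    ({"A1", "B2"}, {"A1", "B2"}, 72.0),
    ({"A1", "B2", "C3", "D4", "E5"}, {"A1"}, 31.0),
    ({"A1"}, {"A1"}, 80.0),
]

scores = [allogenomics_score(d, r) for d, r, _ in pairs]
egfr = [g for _, _, g in pairs]

# Ordinary least squares for eGFR = a + b * score, plus the Pearson correlation.
n = len(scores)
mx, my = sum(scores) / n, sum(egfr) / n
sxy = sum((x - mx) * (y - my) for x, y in zip(scores, egfr))
sxx = sum((x - mx) ** 2 for x in scores)
syy = sum((y - my) ** 2 for y in egfr)
b = sxy / sxx          # slope: eGFR change per additional mismatch
a = my - b * mx        # intercept
r = sxy / (sxx * syy) ** 0.5
print(f"slope = {b:.1f} mL/min per mismatch, r^2 = {r ** 2:.2f}")
```

With these invented numbers the slope is negative, mirroring the inverse correlation the study reports: more mismatches, lower graft function.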

