THE RELATIONSHIP BETWEEN URIC ACID LEVELS AND GRAFT FUNCTION IN RENAL TRANSPLANT PATIENTS WHO DISCONTINUED STEROID THERAPY

Author(s):  
Hulya Colak ◽  
Sibel Ersan ◽  
Mehmet Tanrisev ◽  
Banu Yilmaz ◽  
Orcun Ural ◽  
...  

Introduction: High uric acid (UA) levels are commonly encountered in kidney transplant recipients and can be associated with allograft dysfunction. Our study aims to examine the relationship between UA levels and graft function in patients discontinuing steroids. Methods: In this single-center, retrospective study, 56 patients were included from among 678 renal transplant (RT) patients transplanted from living donors between 1999 and 2020. Causes of steroid discontinuation, creatinine levels concurrent with uric acid levels before and after steroid discontinuation (mean 3.9 ± 2.1 years), acute rejection numbers, demographics, durations of dialysis and transplantation, medications (use of immunosuppressive and antihypertensive agents), laboratory data, human leukocyte antigen (HLA) mismatch numbers, blood pressure (BP), body mass index, and late acute rejection (LAR) numbers (after 3 months post-transplantation) were all recorded. Results: Creatinine and uric acid levels increased after steroid discontinuation, and there was a significant relationship between them (p<0.001). A statistically significant association was found between increased creatinine levels after steroid discontinuation (and hence graft survival) and higher HLA mismatch numbers: 39 patients (69.6%) had a mismatch of ≥2 and 17 patients (30.4%) had a mismatch of <2 (p=0.049). No significant relationship was found between LAR numbers before and after steroid discontinuation and creatinine levels after steroid discontinuation. In conclusion, according to the model obtained from multivariate linear analysis, hyperuricemia and HLA mismatch number (p=0.048 and p=0.044, respectively) are independent predictors of graft dysfunction in patients discontinuing steroids. Accordingly, these negative effects on long-term graft survival should be kept in mind in patients who are planned to continue on steroid-sparing regimens.
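The multivariate linear analysis mentioned in the conclusion can be illustrated with a minimal Python sketch. The DataFrame, the illustrative values and the column names (`creatinine_post`, `uric_acid_post`, `hla_mismatch`) are hypothetical placeholders, not the study's data or variable names.

```python
# Minimal sketch: multivariable linear model of post-discontinuation creatinine
# on uric acid and HLA mismatch number, using a hypothetical toy dataset.
import pandas as pd
import statsmodels.api as sm

df = pd.DataFrame({
    "creatinine_post": [1.1, 1.4, 1.8, 1.2, 2.1, 1.6],  # mg/dL (illustrative)
    "uric_acid_post":  [5.9, 7.2, 8.4, 6.1, 9.0, 7.8],  # mg/dL (illustrative)
    "hla_mismatch":    [1, 2, 3, 1, 4, 2],               # number of mismatches
})

X = sm.add_constant(df[["uric_acid_post", "hla_mismatch"]])
model = sm.OLS(df["creatinine_post"], X).fit()

# Coefficient p-values indicate whether each covariate behaves as an
# independent predictor of creatinine in this toy data.
print(model.summary())
```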

2021 ◽  
Vol 36 (Supplement_1) ◽  
Author(s):  
Clara Pardinhas ◽  
Rita Leal ◽  
Francisco Caramelo ◽  
Teofilo Yan ◽  
Carolina Figueiredo ◽  
...  

Abstract Background and Aims As kidney transplants grow in absolute numbers, so do patients with failed allografts and thus potential candidates for re-transplantation. Re-transplantation is challenging due to immunological barriers, surgical difficulties and clinical complexities, but it has been shown that successful second transplantation improves life expectancy over dialysis. It is important to evaluate re-transplantation outcomes since 20% of patients on the waiting list are waiting for a second graft. Our aim was to compare major clinical outcomes, such as acute rejection and graft and patient survival, between patients receiving a first or a second kidney transplant. Method We performed a retrospective study that included 1552 patients submitted to a first (N=1443, 93%) or a second kidney transplant (N=109, 7%) between January 2008 and December 2018. Patients with more than 2 grafts or a multi-organ transplant were excluded. Demographic, clinical and histocompatibility characteristics of both groups were registered from our unit database and compared. Delayed graft function was defined as the need for dialysis in the first week post-transplant. All acute rejection episodes were biopsy proven, according to Banff 2017 criteria. Follow-up ended on 1 June 2020 for functioning grafts or at graft failure (including death with a functioning graft). Results Recipients of a second graft were significantly younger (43±12 vs 50±13 years old, p<0.001) and there were significantly fewer expanded-criteria donors in the second transplant group (31.5% vs 57.5%, p<0.001). The waiting time for a second graft was longer (63±50 vs 48±29 months, p=0.011). HLA mismatch was similar for both groups, but PRA was significantly higher for second KT patients (21.6±25% versus 3±9%; p<0.001). All patients submitted to a second KT had thymoglobulin as induction therapy compared to 16% of the first KT group (p<0.001). We found no difference in primary dysfunction or delayed graft function between groups. Acute rejection was significantly more frequent in second kidney transplant recipients (19% vs 5%, p<0.001), comprising 10 acute cellular rejections, 7 antibody-mediated rejections and 3 borderline changes. For the majority of patients (85%), acute rejection occurred in the first year post-transplant. Death-censored graft failure occurred in 236 (16.4%) patients with a first kidney transplant and 25 (23%) patients with a second graft (p=0.08). Survival analysis showed similar graft survival for both groups (log-rank p=0.392). We found no difference in patient mortality at follow-up between groups. Conclusion Although second-graft patients presented more episodes of biopsy-proven acute rejection, especially in the first year post-transplant, we found no differences in death-censored graft survival or patient mortality for patients with a second kidney transplant. Second transplants should be offered to patients whenever feasible.
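The death-censored graft survival comparison (log-rank p=0.392) can be sketched in Python with the lifelines package; the DataFrame below, its column names and the toy follow-up times are hypothetical, not the cohort's data.

```python
# Minimal sketch: Kaplan-Meier curves and a log-rank test for death-censored
# graft survival, first vs. second kidney transplant, on a toy dataset.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.DataFrame({
    "months":   [12, 60, 84, 30, 120, 45, 72, 18],
    "failed":   [0, 1, 0, 0, 1, 0, 1, 0],   # 1 = graft failure (death censored)
    "graft_no": [1, 1, 1, 1, 2, 2, 2, 2],   # 1 = first graft, 2 = second graft
})

first = df[df["graft_no"] == 1]
second = df[df["graft_no"] == 2]

km = KaplanMeierFitter()
km.fit(first["months"], event_observed=first["failed"], label="first graft")
print(km.survival_function_)

# Log-rank comparison of the two survival curves.
result = logrank_test(first["months"], second["months"],
                      event_observed_A=first["failed"],
                      event_observed_B=second["failed"])
print(result.p_value)
```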


2019 ◽  
Vol 13 (6) ◽  
pp. 1068-1076 ◽  
Author(s):  
Nuria Montero ◽  
Maria Quero ◽  
Emma Arcos ◽  
Jordi Comas ◽  
Inés Rama ◽  
...  

Abstract Background Obese kidney allograft recipients have worse outcomes after kidney transplantation (KT). However, there is a lack of information regarding the effect of body mass index (BMI) variation after KT. The objective of this study was to evaluate the effects of body weight changes in obese kidney transplant recipients. Methods In this study we used data from the Catalan Renal Registry, which included KT recipients from 1990 to 2011 (n = 5607). The annual change in post-transplantation BMI was calculated. The main outcome variables were delayed graft function (DGF), estimated glomerular filtration rate (eGFR) and patient and graft survival. Results Obesity was observed in 609 patients (10.9%) at the time of transplantation. The incidence of DGF was significantly higher in obese patients (40.4% versus 28.3%; P < 0.001). Baseline obesity was significantly associated with worse short- and long-term graft survival (P < 0.05) and worse graft function during follow-up (P < 0.005). BMI variations in obese patients did not improve eGFR or graft or patient survival. Conclusions Our conclusion is that in obese patients, decreasing body weight after KT improves neither short-term graft outcomes nor long-term renal function.


Medicina ◽  
2018 ◽  
Vol 54 (5) ◽  
pp. 66
Author(s):  
Aureliusz Kolonko ◽  
Beata Bzoma ◽  
Piotr Giza ◽  
Beata Styrc ◽  
Michał Sobolewski ◽  
...  

Background: Panel-reactive antibodies measured with the complement-dependent cytotoxicity test (PRA-CDC) are still a standard method for monitoring the degree of immunization in kidney transplant candidates on active waiting lists in some countries, including Poland. The aim of this study was to analyze the effect of the peak and the last pre-transplant PRA titers on the percentage of positive cross-matches and the rate of early acute rejection episodes. Material and methods: The retrospective analysis included 528 patients from two transplant centers. All patients were divided into three groups, depending on their peak and last pre-transplant PRA titers. There were 437 (82.8%) patients with peak PRA <20% (non-sensitized group, non-ST) and 91 (17.2%) patients with peak PRA >20%. Among the latter group, 38 had a maintained PRA level >20% at the time of transplantation (sensitized patients, ST), whereas 53 had a pre-transplant PRA ≤20% (previously sensitized patients, prev-ST). Results: The percentages of positive cross-matches were 76.9% in the ST group and 53.7% in the prev-ST group versus 18.4% in the non-ST group (both p < 0.001). The acute rejection rates were 18.9%, 17.6% and 6.8%, respectively (p < 0.001 for ST or prev-ST versus non-ST). In a multiple logistic regression analysis, the drop in pre-transplant PRA titer did not decrease the risk of early acute rejection [OR = 1.09 (95% CI: 0.31–3.85)]. The occurrences of primary graft non-function and delayed graft function were similar in all study groups. Conclusions: Previously immunized kidney transplant candidates, even with a substantial decrease in pre-transplant PRA-CDC levels, remain at high immunological risk compared with non-immunized patients and should receive lymphocyte-depleting induction therapy.
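A calculation of the kind behind the reported OR = 1.09 (95% CI 0.31–3.85) can be sketched as a multiple logistic regression in Python; the toy outcome, the `pra_drop` coding and the `recipient_age` covariate are hypothetical placeholders, not the authors' model specification.

```python
# Minimal sketch: logistic regression of early acute rejection on a
# pre-transplant PRA-drop indicator, adjusted for one toy covariate.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.DataFrame({
    "rejection":     [0, 1, 0, 0, 1, 0, 1, 0, 0, 1],
    "pra_drop":      [0, 1, 0, 1, 0, 0, 1, 0, 1, 1],  # 1 = peak PRA >20%, last <=20%
    "recipient_age": [45, 52, 38, 60, 41, 55, 47, 33, 66, 50],
})

X = sm.add_constant(df[["pra_drop", "recipient_age"]])
fit = sm.Logit(df["rejection"], X).fit(disp=0)

# Odds ratios and 95% confidence intervals from the fitted coefficients.
or_table = np.exp(pd.concat([fit.params, fit.conf_int()], axis=1))
or_table.columns = ["OR", "2.5%", "97.5%"]
print(or_table)
```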


2019 ◽  
Vol 9 (1) ◽  
Author(s):  
Kipyo Kim ◽  
Haena Moon ◽  
Yu Ho Lee ◽  
Jung-Woo Seo ◽  
Yang Gyun Kim ◽  
...  

Abstract Recent studies indicate that urinary mitochondrial DNA (mtDNA) is predictive of ischemic acute kidney injury (AKI) and is related to delayed graft function (DGF) in renal transplantation. Nevertheless, the clinical implications and prognostic value of urinary mtDNA in kidney transplantation remain undetermined. Here, we aimed to evaluate the associations between cell-free mtDNA and clinical parameters, including pathological findings in allograft biopsy and post-transplant renal function. A total of 85 renal transplant recipients were enrolled, and blood and urine samples were collected at a median of 17 days after transplantation. Cell-free nuclear and mitochondrial DNA levels were measured by quantitative polymerase chain reaction for the LPL and ND1 genes. Urinary cell-free mtDNA levels were significantly higher in patients with DGF (P < 0.001) and in cases of deceased-donor transplantation (P < 0.001). Subjects with acute rejection showed higher urinary mtDNA levels than those without abnormalities (P = 0.043). In addition, allograft function at 9 and 12 months post-transplantation differed significantly between mtDNA tertile groups independently of the presence of DGF or acute rejection, with significantly better graft outcomes in the lowest tertile group. Urinary cell-free mtDNA levels during the early post-transplant period are significantly associated with DGF, acute rejection in graft biopsy, and short-term post-transplant renal function.
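The tertile analysis of urinary cell-free mtDNA can be illustrated with a short Python sketch; the copy numbers, eGFR values and column names are invented for illustration, and the Kruskal–Wallis test stands in for whichever group comparison the authors actually used.

```python
# Minimal sketch: split patients into urinary mtDNA tertiles and compare
# 12-month allograft function (eGFR) across the tertile groups.
import pandas as pd
from scipy import stats

df = pd.DataFrame({
    "urine_mtdna": [120, 450, 80, 900, 300, 60, 1500, 220, 700],  # copies/uL (toy)
    "egfr_12m":    [72, 55, 78, 40, 60, 80, 35, 65, 48],          # mL/min/1.73 m2 (toy)
})

df["tertile"] = pd.qcut(df["urine_mtdna"], 3, labels=["low", "mid", "high"])

# Non-parametric comparison of eGFR across the three tertile groups.
groups = [g["egfr_12m"].values for _, g in df.groupby("tertile")]
print(stats.kruskal(*groups))
print(df.groupby("tertile")["egfr_12m"].median())
```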


Author(s):  
Antonia Margarete Schuster ◽  
N. Miesgang ◽  
L. Steines ◽  
C. Bach ◽  
B. Banas ◽  
...  

Abstract The B cell activating factor BAFF has gained importance in the context of kidney transplantation due to its role in B cell survival. Studies have shown that BAFF correlates with an increased incidence of antibody-mediated rejection and the development of donor-specific antibodies (DSA). In this study, we analyzed a defined cohort of kidney transplant recipients who were treated with standardized immunosuppressive regimens according to their immunological risk profile. The aim was to add BAFF as an awareness marker for the post-transplant course, taking each patient's individual immunological risk profile into account. Included patients were transplanted between 2016 and 2018. Baseline data, graft function, the occurrence of rejection episodes, signs of microvascular infiltration, and DSA kinetics were recorded over 3 years. BAFF levels were determined at 14 days and at 3 and 12 months post-transplantation. Although no difference in graft function could be observed, medium-risk patients showed a clear dynamic in their BAFF levels, with low levels shortly after transplantation and an increase of 123% over the course of 1 year. Patients with high BAFF values were more susceptible to rejection, especially antibody-mediated rejection, and displayed intensified microvascular inflammation; the combination of high BAFF and DSA placed patients at particular risk. The changing BAFF kinetics of the medium-risk group, as well as the increased occurrence of rejection at high BAFF values, allows BAFF to be seen as an awareness factor. To compensate for the changing immunological risk, a switch from a weaker induction therapy to an intensified maintenance therapy is required.


2002 ◽  
Vol 87 (02) ◽  
pp. 194-198 ◽  
Author(s):  
Torsten Slowinski ◽  
Ingeborg Hauser ◽  
Birgit Vetter ◽  
Lutz Fritsche ◽  
Daniela Bachert ◽  
...  

Summary We analysed whether the factor V Leiden mutation – the most common hereditary predisposing factor for venous thrombosis – is associated with early and long-term graft dysfunction after kidney transplantation in 394 Caucasian kidney transplant recipients. The presence of the factor V Leiden mutation was identified by allele-specific PCR. The prevalence of the factor V Leiden mutation was compared with that in 32,216 unselected neonates. The prevalence of the factor V Leiden mutation (GA genotype) was similar in the 394 kidney transplant recipients and the 32,216 neonates. The frequencies of known factors predicting long-term graft function were similar in patients with the GA genotype and those with the normal factor V gene (GG genotype). The GA genotype was associated with the occurrence of no primary graft function (risk: 2.87; 95% confidence interval: 1.01-8.26; p < 0.05), the number of dialysis sessions required after transplantation until graft function was established in patients with no primary graft function (7.5 ± 2.06 dialyses in GA patients vs. 4.2 ± 0.36 dialyses in GG patients; p < 0.05), and the risk for at least one acute rejection episode (risk: 3.83; 95% confidence interval: 1.38-10.59; p < 0.02). The slope of 1/creatinine per year was significantly lower in patients with the GA genotype (GA patients: −0.0204 ± 0.008 dl/mg per year; GG patients: 0.0104 ± 0.004 dl/mg per year; p < 0.02). The annual increase in the daily protein excretion rate was higher in patients with the GA genotype (GA patients: 38.5 ± 16.6 mg/24 h per year; GG patients: 4.9 ± 4.4 mg/24 h per year; p < 0.02). Our study showed that the factor V Leiden mutation is associated with the occurrence of delayed graft function, acute rejection episodes and chronic graft dysfunction after kidney transplantation.
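The slope of 1/creatinine per year used above as a marker of chronic graft dysfunction can be computed per patient with a simple linear fit; the trajectories below are invented and the group comparison (Mann–Whitney U) is one possible choice, not necessarily the authors'.

```python
# Minimal sketch: per-patient slope of 1/creatinine (dL/mg) versus time in
# years, compared between GA and GG factor V genotype groups (toy data).
import numpy as np
from scipy import stats

def reciprocal_creatinine_slope(years, creatinine_mg_dl):
    """Slope of a linear fit of 1/creatinine against follow-up time."""
    slope, _intercept = np.polyfit(years, 1.0 / np.asarray(creatinine_mg_dl), 1)
    return slope

# One (years, creatinine) trajectory per patient.
ga_patients = [([0, 1, 2, 3], [1.3, 1.5, 1.8, 2.1]),
               ([0, 1, 2],    [1.2, 1.4, 1.7])]
gg_patients = [([0, 1, 2, 3], [1.40, 1.35, 1.30, 1.25]),
               ([0, 1, 2],    [1.10, 1.05, 1.00])]

ga_slopes = [reciprocal_creatinine_slope(t, cr) for t, cr in ga_patients]
gg_slopes = [reciprocal_creatinine_slope(t, cr) for t, cr in gg_patients]

# Negative slopes correspond to declining graft function, as reported for GA.
print(np.mean(ga_slopes), np.mean(gg_slopes))
print(stats.mannwhitneyu(ga_slopes, gg_slopes))
```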


F1000Research ◽  
2016 ◽  
Vol 5 ◽  
pp. 2893 ◽  
Author(s):  
Rossana Rosa ◽  
Jose F. Suarez ◽  
Marco A. Lorio ◽  
Michele I. Morris ◽  
Lilian M. Abbo ◽  
...  

Background: Antiretroviral therapy (ART) poses challenging drug-drug interactions with immunosuppressant agents in transplant recipients. We aimed to determine the impact of specific antiretroviral regimens on clinical outcomes of HIV+ kidney transplant recipients. Methods: A single-center, retrospective cohort study was conducted at a large academic center. Subjects included 58 HIV− to HIV+ adult, first-time kidney transplant patients. The main intervention was the ART regimen used after transplantation. The main outcomes assessed at one and three years were patient survival, death-censored graft survival, and biopsy-proven acute rejection; we also assessed serious infections within the first six months post-transplant. Results: Patient and graft survival at three years were both 90% for the entire cohort. Patients receiving protease inhibitor (PI)-containing regimens had lower patient survival at one and three years than patients receiving PI-sparing regimens: 85% vs. 100% (p=0.06) and 82% vs. 100% (p=0.03), respectively. Patients who received PI-containing regimens had twelve times higher odds of death at 3 years compared to patients who were not exposed to PIs (odds ratio, 12.05; 95% confidence interval, 1.31-1602; p=0.02). Three-year death-censored graft survival was lower in patients receiving PI-containing vs. PI-sparing regimens (82% vs. 100%, p=0.03). Patients receiving integrase strand transfer inhibitor-containing regimens had higher 3-year graft survival. There were no differences in the incidence of acute rejection by ART regimen. Individuals receiving PIs had a higher incidence of serious infections compared to those on PI-sparing regimens (39% vs. 8%, p=0.01). Conclusions: PI-containing ART regimens are associated with adverse outcomes in HIV+ kidney transplant recipients.
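The twelve-fold odds of death attributed to PI-containing regimens comes from an odds-ratio calculation that can be sketched from a 2×2 table; the counts below are invented for illustration and do not reproduce the cohort's numbers.

```python
# Minimal sketch: odds ratio and exact p-value from a hypothetical 2x2 table
# of death by ART regimen (PI-containing vs. PI-sparing).
import numpy as np
from scipy.stats import fisher_exact

#                  died  survived
table = np.array([[  6,      27],   # PI-containing regimen (toy counts)
                  [  1,      24]])  # PI-sparing regimen (toy counts)

# Sample odds ratio: (6*24)/(27*1) ~= 5.3 for these toy counts.
result = fisher_exact(table)
print(result)
```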


2021 ◽  
Vol 10 (16) ◽  
pp. 3635
Author(s):  
Florian Terrec ◽  
Johan Noble ◽  
Hamza Naciri-Bennani ◽  
Paolo Malvezzi ◽  
Bénédicte Janbon ◽  
...  

Background: In many centers, a protocol kidney biopsy (PKB) is performed at 3 months post-transplantation (M3) without a demonstrated benefit on death-censored graft survival (DCGS). In this study, we compared DCGS between kidney transplant recipients who underwent a PKB and those without such a biopsy, while accounting for the obvious indication bias. Methods: In this retrospective, single-center study conducted between 2007 and 2013, we compared DCGS with respect to the availability and features of a PKB. We built a propensity score (PS) to account for the likelihood of a PKB indication and adjusted the DCGS analysis for PKB availability and the PS. Results: A total of 615 patients were included: 333 had a PKB and 282 did not. In a bivariate Kaplan–Meier survival analysis adjusting for the availability of a PKB and for the PS, a PKB was associated with better 5-year DCGS independently of the PS (p < 0.001). Among the PKB+ patients, 87 recipients (26%) had an interstitial fibrosis/tubular atrophy (IF/TA) score > 0. Patients with an IF/TA score of 3 had the worst survival. A total of 144 patients (44%) showed cv lesions. Patients with cv2 and cv3 lesions had the worst 5-year DCGS. Conclusions: An M3 PKB was associated with improved graft survival independently of potential confounders. These results could be explained by the early treatment of subclinical immunological events or by better management of the immunosuppressive regimen.
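The propensity-score adjustment described in the methods can be sketched in two steps in Python: model the probability of undergoing a PKB from baseline covariates, then include that score alongside PKB availability in a survival model. The DataFrame, the covariates (`age`, `donor_ecd`) and the Cox model choice are hypothetical, not the authors' exact specification.

```python
# Minimal sketch: propensity score for receiving a protocol biopsy (PKB),
# then a PS-adjusted Cox model for death-censored graft survival (toy data).
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "pkb":        [1, 0, 1, 1, 0, 0, 1, 0],   # 1 = protocol biopsy at M3
    "age":        [45, 60, 38, 52, 47, 66, 41, 58],
    "donor_ecd":  [0, 1, 1, 0, 0, 1, 0, 1],   # expanded-criteria donor
    "months":     [60, 24, 72, 48, 36, 12, 84, 30],
    "graft_loss": [0, 1, 0, 1, 1, 1, 0, 0],   # death-censored graft failure
})

# Step 1: propensity score for undergoing a PKB.
ps_model = LogisticRegression().fit(df[["age", "donor_ecd"]], df["pkb"])
df["ps"] = ps_model.predict_proba(df[["age", "donor_ecd"]])[:, 1]

# Step 2: graft survival adjusted for PKB availability and the propensity score.
cph = CoxPHFitter()
cph.fit(df[["months", "graft_loss", "pkb", "ps"]],
        duration_col="months", event_col="graft_loss")
cph.print_summary()
```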


Author(s):  
Simon Ville ◽  
Marine Lorent ◽  
Clarisse Kerleau ◽  
Anders Asberg ◽  
Christophe Legendre ◽  
...  

Background: The recognition that metabolism and immune function are regulated by an endogenous molecular clock generating circadian rhythms suggests that the magnitude of ischemia-reperfusion injury, and of the subsequent inflammation, in kidney transplantation could be affected by the time of day. Methods: Accordingly, we evaluated 5026 first kidney transplant recipients from deceased heart-beating donors. In a cause-specific multivariable analysis, we compared delayed graft function (DGF) and graft survival according to the time of kidney clamping and declamping. Participants were divided into clamping between midnight and noon (AM clamping group, 65%) or clamping between noon and midnight (PM clamping group, 35%), and similarly into AM or PM declamping (25%/75%). Results: DGF occurred in 550 participants (27%) with AM clamping and 339 (34%) with PM clamping (adjusted OR = 0.81, 95% CI: 0.67 to 0.98, p = 0.03). No significant association of clamping time with overall death-censored graft survival was observed (HR = 0.92, 95% CI: 0.77 to 1.10, p = 0.37). No significant association of declamping time with DGF or graft survival was observed. Conclusions: Clamping between midnight and noon was associated with a lower incidence of DGF, whilst declamping time was not associated with kidney graft outcomes.
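The AM/PM grouping and the adjusted odds ratio for DGF can be sketched as follows; the clamp timestamps, the `donor_age` covariate and the single-covariate adjustment are illustrative assumptions, not the study's full cause-specific model.

```python
# Minimal sketch: bin kidney clamping times into AM (midnight-noon) and PM
# (noon-midnight) groups, then fit an adjusted logistic model for DGF.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.DataFrame({
    "clamp_time": pd.to_datetime([
        "2020-01-03 04:20", "2020-02-11 14:45", "2020-03-08 09:10",
        "2020-04-19 22:30", "2020-05-02 11:55", "2020-06-27 16:05",
        "2020-07-15 02:40", "2020-08-30 19:20",
    ]),
    "dgf":       [1, 1, 0, 1, 0, 1, 0, 0],   # delayed graft function (toy)
    "donor_age": [52, 64, 47, 70, 55, 61, 43, 66],
})

# PM clamping = noon to midnight; AM clamping = midnight to noon.
df["pm_clamp"] = (df["clamp_time"].dt.hour >= 12).astype(int)

X = sm.add_constant(df[["pm_clamp", "donor_age"]])
fit = sm.Logit(df["dgf"], X).fit(disp=0)

# Odds ratio for PM vs. AM clamping; a value above 1 mirrors the higher DGF
# incidence reported for PM clamping in the abstract.
print(np.exp(fit.params))
```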


2021 ◽  
Vol 36 (Supplement_1) ◽  
Author(s):  
Ahmed Abo omar ◽  
Gamal Saadi

Abstract Background and Aims Transplantation is the first successful modality of renal replacement therapy (RRT) for irreversible chronic kidney disease (CKD, stage 5). Identifying additional factors associated with poor long-term prognosis after transplantation may provide clues regarding the pathophysiological mechanisms involved in allograft failure and identify high-risk patients who may benefit from additional monitoring or interventions. Successful kidney transplantation results in a substantial decrease in β2-microglobulin (β2M) levels, but a delayed decrease or increasing levels after transplantation may serve as a marker of acute rejection or inflammation. Several reports show that elevated sCD30 levels, both pre- and post-transplantation, are associated with a poor prognosis for long-term kidney graft survival. These studies found that higher CD30 levels in allograft recipients are a good predictor of impending acute rejection. The aim of this work was to study the prognostic outcomes of the transplanted kidney using CD30 and β2-microglobulin. Method A prospective study was conducted in the nephrology units of the internal medicine departments at Tanta and Kasr El Ainy universities over 1 year. Twenty patients undergoing a primary kidney transplant participated in this study. CD30 and β2M were measured at day −1, 2 weeks and 3 months, with clinical follow-up after 1 year to assess graft survival. Results At day −1, the CD30 level was higher in the rejection group than in the other patients; 2 weeks post-transplantation, the CD30 level was again higher in the rejection group; and at 3 months post-transplantation, the CD30 level remained higher in the rejection group, with these differences being statistically highly significant (p values: 0.003, 0.005 and 0.002, respectively). Successful transplantation led to a significant decrease in serum CD30 at 2 weeks post-transplant (P1 < 0.005) and at 3 months post-transplant (P1 < 0.001), whereas in the rejection group the significant decrease in CD30 occurred only at 2 weeks post-transplant (P1 < 0.005), and at 3 months serum CD30 began to rise again (P1 = 0.157). At day −1, the β2-microglobulin level was higher in the rejection group than in the other patients, with a statistically significant difference (p = 0.01); 2 weeks post-transplantation, the β2-microglobulin level was higher in the rejection group but the difference was not statistically significant (p = 0.18); and at 3 months post-transplantation, the β2-microglobulin level was again higher in the rejection group but not statistically significantly (p = 0.18). Successful transplantation led to a significant decrease in serum β2-microglobulin at 2 weeks post-transplant (P1 < 0.002) and at 3 months post-transplant (P1 < 0.001), whereas in the rejection group the significant decrease in β2-microglobulin occurred only at 3 months post-transplant (P1 < 0.005), with no significant decrease at 2 weeks (P1 = 0.15). Conclusion High pre-transplantation CD30 and β2M levels are associated with poor outcomes. Failure of CD30 and β2M to decrease after transplantation is also associated with poor outcome or infection. Successful transplantation leads to a significant decrease in serum CD30 and β2M, which can be used as predictors of graft survival with better sensitivity and specificity than serum creatinine.

