Does rural follow-up of renal allografts give impaired graft survival?

2000 ◽  
Vol 13 (S1) ◽  
pp. S92-S94 ◽  
Author(s):  
A.R. Pontin ◽  
M.D. Pascoe ◽  
J.F. Botha ◽  
V. Tandon ◽  
D. Kahn

2021 ◽  
Vol 36 (Supplement_1) ◽  
Author(s):  
Clara Pardinhas ◽  
Rita Leal ◽  
Francisco Caramelo ◽  
Teofilo Yan ◽  
Carolina Figueiredo ◽  
...  

Abstract Background and Aims As kidney transplants grow in absolute numbers, so do patients with failed allografts, who are thus potential candidates for re-transplantation. Re-transplantation is challenging due to immunological barriers, surgical difficulties and clinical complexities, but successful second transplantation has been proven to improve life expectancy over dialysis. It is important to evaluate re-transplantation outcomes, since 20% of patients on the waiting list are waiting for a second graft. Our aim was to compare major clinical outcomes, such as acute rejection and graft and patient survival, between patients receiving a first or a second kidney transplant. Method We performed a retrospective study that included 1552 patients who underwent a first (N=1443, 93%) or a second kidney transplant (N=109, 7%) between January 2008 and December 2018. Patients with more than 2 grafts or a multi-organ transplant were excluded. Demographic, clinical and histocompatibility characteristics of both groups were retrieved from our unit database and compared. Delayed graft function was defined as the need for dialysis in the first week post-transplant. All acute rejection episodes were biopsy proven, according to Banff 2017 criteria. Follow-up ended on 1 June 2020 for functioning grafts, or at graft failure (including death with a functioning graft). Results Recipients of a second graft were significantly younger (43 ± 12 vs 50 ± 13 years old, p<0.001) and there were significantly fewer expanded-criteria donors in the second transplant group (31.5% vs 57.5%, p<0.001). The waiting time for a second graft was longer (63 ± 50 vs 48 ± 29 months, p=0.011). HLA mismatch was similar for both groups, but PRA was significantly higher for second KT patients (21.6 ± 25% versus 3 ± 9%; p<0.001). All patients undergoing a second KT had thymoglobulin as induction therapy, compared to 16% of the first KT group (p<0.001). 
We found no difference in primary dysfunction or delayed graft function between groups. Acute rejection was significantly more frequent in second kidney transplant recipients (19% vs 5%, p<0.001), comprising 10 acute cellular rejections, 7 antibody-mediated rejections and 3 borderline changes. In the majority of patients (85%), acute rejection occurred in the first year post-transplant. Death-censored graft failure occurred in 236 (16.4%) patients with a first kidney transplant and 25 (23%) patients with a second graft, p=0.08. Survival analysis showed similar graft survival for both groups (log-rank p=0.392). We found no difference in patient mortality at follow-up between the groups. Conclusion Although second-graft patients presented more episodes of biopsy-proven acute rejection, especially in the first year post-transplant, we found no differences in death-censored graft survival or patient mortality for patients with a second kidney transplant. Second transplants should be offered to patients whenever feasible.
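The graft-survival comparison above ("log-rank p=0.392") rests on a two-sample log-rank test over Kaplan-Meier curves. A minimal stdlib-only sketch of that test follows; the per-patient follow-up data here are made up for illustration, not taken from the cohort:

```python
import math

def logrank(group1, group2):
    """Two-sample log-rank test on (time, event) pairs, where event=1
    means graft failure and event=0 means censored.
    Returns (chi-square statistic, p-value) for 1 degree of freedom."""
    # Distinct times at which at least one failure occurred, in order.
    event_times = sorted({t for t, e in group1 + group2 if e == 1})
    obs1 = exp1 = var = 0.0
    for t in event_times:
        n1 = sum(1 for ti, _ in group1 if ti >= t)   # at risk, group 1
        n2 = sum(1 for ti, _ in group2 if ti >= t)   # at risk, group 2
        d1 = sum(1 for ti, ei in group1 if ti == t and ei == 1)
        d2 = sum(1 for ti, ei in group2 if ti == t and ei == 1)
        n, d = n1 + n2, d1 + d2
        obs1 += d1
        exp1 += d * n1 / n                           # expected failures
        if n > 1:                                    # hypergeometric variance
            var += d * (n1 / n) * (n2 / n) * (n - d) / (n - 1)
    chi2 = (obs1 - exp1) ** 2 / var
    p = math.erfc(math.sqrt(chi2 / 2))               # chi-square survival, 1 df
    return chi2, p

# Hypothetical follow-up data in months: (time, event).
first_graft = [(5, 1), (10, 1), (15, 0)]
second_graft = [(2, 1), (4, 1), (20, 0)]
chi2, p = logrank(first_graft, second_graft)
```

A real analysis would of course feed in one (time, event) pair per patient, censored at 1 June 2020 as the abstract describes.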


2014 ◽  
Vol 86 (4) ◽  
pp. 325 ◽  
Author(s):  
Saverio Forte ◽  
Pasquale Martino ◽  
Silvano Palazzo ◽  
Matteo Matera ◽  
Floriana Giangrande ◽  
...  

Introduction: The intrarenal resistance index (RI) is a calculated parameter for assessing the status of the graft during follow-up ultrasound of the transplanted kidney. The predictive value of the RI, including how it varies with time, remains unclear. Materials and Methods: We retrospectively investigated the correlation between the RI and graft survival (GS) and overall survival (OS) after transplantation. We evaluated 268 patients transplanted between 2003 and 2011; the mean follow-up was 73 months (12-136). The RI was evaluated at 8 days, 6 months, 1 year and 3 years. ROC analysis was used to calculate the predictive value of the RI, and Kaplan-Meier curves were used to evaluate GS and OS. Results: ROC analysis against GS identified an RI of 0.75 as the cut-off. All patients were stratified according to the RI at 8 days (RI ≤ 0.75: 212 vs RI > 0.75: 56), at 6 months (RI ≤ 0.75: 237 vs RI > 0.75: 31), at 1 year (RI ≤ 0.75: 229 vs RI > 0.75: 39) and at 3 years (RI ≤ 0.75: 224 vs RI > 0.75: 44). The RI showed statistically significant differences in GS between the two groups, in favor of those with an RI ≤ 0.75, only at 8 days and at 6 months (p = 0.0078 at 8 days and p = 0.02 at 6 months). By contrast, the RI estimated at 1 year and at 3 years did not correlate with GS. The same RI cut-off was also tested against OS after transplantation, and we observed no correlation between the RI and OS. Conclusions: The RI proved to be a good prognostic factor for graft survival when evaluated in the first months of follow-up after transplantation. This parameter does not, however, appear to correlate with the OS of the transplanted subject.
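The ROC-derived cut-off (RI = 0.75) amounts to picking the threshold that best separates failed from surviving grafts. The abstract does not name the selection criterion; a common choice is Youden's J (sensitivity + specificity − 1), sketched here with made-up RI values and outcomes:

```python
def youden_cutoff(scores, labels):
    """Choose the threshold maximizing Youden's J = sens + spec - 1.
    `scores` are e.g. RI values; `labels` are 1 for graft failure, 0 otherwise.
    Returns (best threshold, best J)."""
    best_t, best_j = None, -1.0
    for t in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s > t and y == 1)
        fn = sum(1 for s, y in zip(scores, labels) if s <= t and y == 1)
        tn = sum(1 for s, y in zip(scores, labels) if s <= t and y == 0)
        fp = sum(1 for s, y in zip(scores, labels) if s > t and y == 0)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

# Hypothetical: five grafts with RI at day 8 and failure labels.
ri_values = [0.60, 0.65, 0.70, 0.80, 0.85]
failed = [0, 0, 0, 1, 1]
cutoff, j = youden_cutoff(ri_values, failed)
```

On this toy data the optimal split falls at 0.70; the study's 0.75 reflects its own cohort's ROC curve.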


2017 ◽  
Vol 45 (5) ◽  
pp. 1095-1101 ◽  
Author(s):  
Bum-Sik Lee ◽  
Seong-Il Bin ◽  
Jong-Min Kim ◽  
Won-Kyeong Kim ◽  
Jun Weon Choi

Background: Clinical outcomes after meniscal allograft transplantation (MAT) in arthritic knees are unclear, and objective estimates of graft survival according to articular cartilage status have not been performed. Hypothesis: MAT should provide clinical benefits in knees with high-grade cartilage damage, but graft survivorship in those knees should be inferior to that in knees with low-grade chondral degeneration after MAT. Study Design: Cohort study; Level of evidence, 3. Methods: The records of 222 consecutive patients who underwent primary MAT were reviewed to compare clinical outcomes and graft survivorship. The patients were grouped according to the degree and location of articular cartilage degeneration: low-grade chondral lesions (International Cartilage Repair Society [ICRS] grade ≤2) on both the femoral and tibial sides (ideal indication), high-grade lesions (ICRS grade 3 or 4) on either the femoral or tibial side (relative indication), and high-grade lesions on both sides (salvage indication). Kaplan-Meier survival analysis with the log-rank test was performed to compare clinical survival rates and graft survival rates between the groups. A Lysholm score of <65 was considered a clinical failure, and graft failure was defined as a meniscal tear or meniscectomy of greater than one-third of the allograft, objectively evaluated by magnetic resonance imaging (MRI) and second-look arthroscopic surgery. Results: The mean (±SD) Lysholm score significantly improved from 63.1 ± 15.1 preoperatively to 85.1 ± 14.3 at the latest follow-up of a mean 44.6 ± 19.7 months (P < .001). However, the postoperative scores were not significantly different between the 3 groups (85.7 ± 14.2 for ideal indication, 84.7 ± 17.0 for relative indication, and 84.7 ± 14.2 for salvage indication; P = .877). 
On MRI at the latest follow-up of a mean 23.0 ± 19.9 months and second-look arthroscopic surgery at a mean 19.3 ± 20.7 months, there were 25 (11.3%) failed MAT procedures (4 medial, 21 lateral); of these, 5 lateral MAT procedures (2.3%) went on to allograft removal. Clinical survival rates were not significantly different between the groups (P = .256). However, on objective evaluation, the estimated cumulative graft survival rate at 5 years in the salvage indication group (62.2% [95% CI, 41.6-82.8]) was significantly lower than that in the other 2 groups (ideal indication: 93.8% [95% CI, 88.5-99.1]; relative indication: 90.9% [95% CI, 81.1-100.0]) (P = .006). Conclusion: Our findings showed that MAT was an effective symptomatic treatment in knees with advanced bipolar chondral lesions. However, better graft survival can be expected when the articular cartilage is intact or chondral damage is limited to a unipolar lesion. MAT should be considered before the progression of chondral damage to a bipolar lesion for better graft survivorship and should be performed cautiously in arthritic knees.
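The 5-year cumulative graft survival figures quoted above come from Kaplan-Meier estimates. A minimal product-limit estimator over (time, event) pairs, run on made-up data rather than the study's records, looks like:

```python
def km_survival(data, horizon):
    """Kaplan-Meier product-limit estimate of the survival probability
    at time `horizon`, from (time, event) pairs (event=1: graft failure,
    event=0: censored)."""
    s = 1.0
    # Walk through failure times up to the horizon, in order.
    for t in sorted({ti for ti, ei in data if ei == 1 and ti <= horizon}):
        n = sum(1 for ti, _ in data if ti >= t)              # still at risk
        d = sum(1 for ti, ei in data if ti == t and ei == 1)  # failures at t
        s *= 1 - d / n
    return s

# Hypothetical follow-up in years: two failures, two censored grafts.
grafts = [(1, 1), (2, 0), (3, 1), (4, 0)]
five_year = km_survival(grafts, 5)  # survival estimate at 5 years
```

Confidence intervals such as the 62.2% [41.6-82.8] reported above would additionally require a variance estimate (e.g. Greenwood's formula), which this sketch omits.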


2020 ◽  
Vol 35 (Supplement_3) ◽  
Author(s):  
Inge Derad ◽  
Johanna Busch ◽  
Martin Nitschke ◽  
Malte Ziemann

Abstract Background and Aims Posttransplant kidney survival depends on several risk factors. Careful immunogenetic matching and the absence of HLA donor-specific antibodies (DSA) seem to determine the longevity of the transplant. Method Screening for donor-specific HLA antibodies in our posttransplant outpatients was implemented in 2010 (every 6 months for DSA-free patients for two years, then yearly; every 3 months for DSA-positive patients for two years, then twice a year). At the same time a treatment protocol was implemented that omitted reduction of immunosuppressive drugs in case of newly detected DSA and, most importantly, prevented steroid withdrawal in this case. The present single-center study reports the long-term survival and kidney function of patients undergoing HLA screening after transplantation between 2010 and 2016, with follow-up until 2018. Using Kaplan-Meier analysis, patients without HLA antibodies (no HLA-ab), with HLA antibodies but without DSA (NDSA), and with donor-specific HLA antibodies (DSA) were compared by log-rank testing. Results A full dataset was obtained from 318 patients. The mean overall survival (patients and organ function) did not differ between the three groups, p=0.318: no HLA-ab 7.2 years (95% confidence interval 6.7; 7.6), NDSA 6.6 (5.9; 7.2), DSA 6.8 (6.1; 7.5), overall 7.0 (6.6; 7.3); events are given in Table 1. Whereas mean patient survival did not differ between the groups (p=0.715), mean death-censored graft survival differed significantly, p=0.008, with reduced transplant survival in the patients with HLA antibodies but without donor-specific antibodies: no HLA-ab 8.0 years (95% confidence interval 7.7; 8.3), NDSA 7.0 (6.4; 7.6), DSA 7.6 (7.1; 8.2), overall 7.7 (7.4; 8.0); numbers are given in Table 1. Conclusion In conclusion, the presence of HLA antibodies was associated with reduced transplant survival. 
Notably, patients with HLA antibodies but without DSA had worse graft survival than patients with DSA, who underwent HLA screening with a personalised immunosuppressive regimen. The immunosuppressive regimens of the groups, as well as other known risk factors for graft survival, have to be analysed further. The results of these multivariate analyses must be awaited to determine whether the risk of graft loss conferred by HLA antibodies is independent of other factors.
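The screening cadence described in the Method maps directly onto a small lookup; the function name and signature below are my own, but the intervals are those stated in the abstract:

```python
def screening_interval_months(dsa_positive: bool, years_since_transplant: float) -> int:
    """Screening cadence from the abstract: DSA-free patients every
    6 months for two years, then yearly; DSA-positive patients every
    3 months for two years, then twice a year."""
    if dsa_positive:
        return 3 if years_since_transplant < 2 else 6
    return 6 if years_since_transplant < 2 else 12

# E.g. a DSA-positive patient 3 years out is screened twice a year.
interval = screening_interval_months(True, 3)  # 6 months
```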


2019 ◽  
Vol 47 (12) ◽  
pp. 2919-2926 ◽  
Author(s):  
Jakob Ackermann ◽  
Gergo Merkely ◽  
Alexandre Barbieri Mestriner ◽  
Nehal Shah ◽  
Andreas H. Gomoll

Background: Assays to quantitate the quality of autologous chondrocyte implants have recently become available. However, the correlation of the assay score with radiological and clinical outcomes has not been established. Purpose/Hypothesis: The purpose was to assess the influence of cell identity (chondrocyte/synoviocyte gene expression ratio) and viability on patient-reported outcome measures, graft survival, and repair tissue quality. It was hypothesized that greater cell product quality as assessed through an identity assay and cell viability is associated with superior outcomes after autologous chondrocyte implantation (ACI) for symptomatic cartilage defects. Study Design: Cohort study; Level of evidence, 3. Methods: Seventy-nine patients with a minimum follow-up of 2 years were included in this study. Of these, 67 patients were available for imaging assessment utilizing the Magnetic Resonance Observation of Cartilage Repair Tissue (MOCART) scoring system. Patients were assigned to groups either below or above the cohort's mean based on their individual cell identity score and viability percentage. Results: Patients were predominantly female (57.7%) with a mean age of 30.0 ± 9.3 years. No differences were seen in the Knee injury and Osteoarthritis Outcome Score, Lysholm, Tegner, or International Knee Documentation Committee Subjective Knee Evaluation Form between the viability and cell identity groups at a final follow-up of 3.8 ± 1.4 years after ACI (P > .05). In a subset of patients, the mean MOCART score was 68.3 ± 15.6 at an average magnetic resonance imaging follow-up of 17.7 ± 9.56 months. Low cell identity was significantly associated with the degree of defect filling (P = .025), integration of the border zone (P = .01), effusion (P = .024), and ACI graft failure (P = .002). 
Patients with above-average cell identity scores had a significantly higher survival rate at 5-year follow-up compared with patients with below-average scores (95.8% vs 64.7%; P = .013). Cell viability did not influence MOCART subscales or graft failure (all P > .05). Cell viability and identity showed no significant correlation with each other (r = −0.045; P = .694). Conclusion: Cell identity was significantly correlated with structural repair quality and graft survival after second-generation ACI for symptomatic chondral lesions in the knee. While improved imaging outcomes and higher graft survivorship were associated with a higher individual cell identity score, indicating a higher chondrocyte/synoviocyte gene expression ratio in the final cell product, clinical outcome did not correlate with the identity score.
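The viability-identity comparison above (r = −0.045) is a Pearson correlation coefficient. A minimal stdlib sketch, shown on made-up vectors rather than the study's assay data:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length
    sequences, as used for the viability vs cell-identity comparison."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical paired scores: perfectly linear data gives r = 1.0;
# near-zero r, as in the study, would indicate no linear relationship.
r = pearson_r([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])
```

The reported P = .694 would come from testing this r against zero, which this sketch omits.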


2020 ◽  
Vol 5 (4) ◽  
pp. 2473011420S0038
Author(s):  
Gregory F. Pereira ◽  
John Steele ◽  
Amanda N. Fletcher ◽  
Samuel B. Adams ◽  
Ryan B. Clement

Category: Ankle. Introduction/Purpose: The term osteochondral lesion of the talus (OLT) refers to any pathology of the talar articular cartilage and corresponding subchondral bone. In general, OLTs can pose a formidable treatment challenge to the orthopaedic surgeon due to the poor intrinsic ability of cartilage to heal as well as the tenuous vascular supply of the talus. Although many treatment options exist, including microfracture, retrograde drilling, autologous chondrocyte implantation (ACI), and the osteochondral autograft transfer system (OATS), these options may be inadequate to treat large cartilage lesions. Osteochondral allografts have demonstrated promise as the primary treatment for OLTs with substantial cartilage and bone involvement. To our knowledge, this is the first systematic review of outcomes after fresh osteochondral allograft transplantation for OLTs. Methods: PubMed, the Cochrane Central Register of Controlled Trials, EMBASE, and Medline were searched following PRISMA guidelines. Studies that evaluated outcomes in adult patients after fresh osteochondral allograft transplantation for chondral defects of the talus were included. Operative results, according to standardized scoring systems such as the AOFAS Ankle/Hindfoot scale and the Visual Analog Scale, were compared across the studies. The methodological quality of the included studies was assessed using the Coleman methodology score. Results: There were a total of 12 eligible studies reporting on 191 patients with OLTs, with an average follow-up of 56.8 months (range 6-240). The mean age was 37.5 (range 17-74) years and the overall graft survival rate was 86.6%. The AOFAS Ankle/Hindfoot score was obtained pre- and postoperatively in 6 of the 12 studies and showed significant improvements in each (P<0.05). 
Similarly, the VAS pain score was evaluated in 5 of the 12 studies and showed significant decreases (P<0.05) from pre- to postoperatively, with an aggregate mean preoperative VAS score of 7.3 and an aggregate postoperative value of 2.6. The reported short-term complication rate was 0%. The overall failure rate was 13.4%, and 21.6% of patients had subsequent procedures. Conclusion: The treatment of osteochondral lesions of the talus remains a challenge for orthopaedic surgeons. From this systematic review, one can conclude that osteochondral allograft transplantation for osteochondral lesions of the talus results in predictably favorable outcomes, with an impressive graft survival rate and high satisfaction rates at intermediate follow-up. [Table: see text]


2014 ◽  
Vol 2014 ◽  
pp. 1-9 ◽  
Author(s):  
Mirjam Tielen ◽  
Job van Exel ◽  
Mirjam Laging ◽  
Denise K. Beck ◽  
Roshni Khemai ◽  
...  

Background. Nonadherence to medication is a common problem after kidney transplantation. The aim of this study was to explore attitudes towards medication, adherence, and the relationship with clinical outcomes. Method. Kidney recipients participated in a Q-methodological study 6 weeks after transplantation. As a measure of medication adherence, respondents completed the Basel Assessment of Adherence to Immunosuppressive Medications Scale (BAASIS©-interview). Moreover, the intrapatient variability in the pharmacokinetics of tacrolimus was calculated, which measures the stability of drug intake. Data on graft survival were retrieved from patient records up to 2 years after transplantation. Results. 113 renal transplant recipients (19–75 years old) participated in the study. Results revealed three attitudes towards medication adherence: attitude 1, "confident and accurate"; attitude 2, "concerned and vigilant"; and attitude 3, "appearance oriented and assertive." We found an association of attitudes with intrapatient variability in the pharmacokinetics of tacrolimus, but not with self-reported nonadherence or graft survival. However, self-reported nonadherence immediately after transplantation was associated with lower two-year graft survival. Conclusion. These preliminary findings suggest that nonadherence shortly after kidney transplantation may be a risk factor for lower graft survival in the years to follow. The attitudes to medication were not a risk factor.
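Intrapatient variability in tacrolimus pharmacokinetics is commonly computed as the coefficient of variation of repeated trough concentrations; the abstract does not state its exact formula, so the CV% below is an assumption, and the trough values are invented for illustration:

```python
def tacrolimus_cv_percent(trough_levels):
    """Intrapatient variability sketch: coefficient of variation (%)
    of repeated tacrolimus trough concentrations (ng/mL). CV% is a
    common operationalization; the study's exact formula is not given."""
    n = len(trough_levels)
    mean = sum(trough_levels) / n
    # Sample standard deviation (n - 1 denominator).
    sd = (sum((x - mean) ** 2 for x in trough_levels) / (n - 1)) ** 0.5
    return 100 * sd / mean

# Hypothetical trough levels across clinic visits: erratic intake
# shows up as a higher CV%.
cv = tacrolimus_cv_percent([5.0, 7.0, 9.0])
```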


Author(s):  
Michał Ciszek ◽  
Krzysztof Mucha ◽  
Bartosz Foroncewicz ◽  
Dorota Żochowska ◽  
Maciej Kosieradzki ◽  
...  
