C4d Deposition in Acute Rejection: An Independent Long-Term Prognostic Factor

2002 ◽ Vol 13 (1) ◽ pp. 234-241
Author(s): Andrew M. Herzenberg ◽ John S. Gill ◽ Ognjenka Djurdjev ◽ Alex B. Magil

ABSTRACT. Peritubular capillary deposition of C4d has been demonstrated to be associated with both acute humoral and vascular rejection and with increased graft loss. Whether it is an independent predictor of long-term graft survival rates is uncertain. The biopsies (n = 126) from all patients (n = 93) with a tissue diagnosis of acute rejection that were performed between July 1, 1995, and December 31, 1997, were classified according to Cooperative Clinical Trials in Transplantation (CCTT) criteria. Fresh frozen tissue was immunostained for C4d. There were 58 patients with CCTT type I (interstitial) rejection and 35 with CCTT type II (vascular) rejection. For 34 patients, at least one biopsy exhibited peritubular C4d deposition (C4d+ group). The C4d+ group had proportionately more female patients (P = 0.003), more patients with high (>30%) panel-reactive antibody levels (P = 0.024), more patients with resistance to conventional antirejection therapy (P = 0.010), and fewer patients with postrejection hypertension (P = 0.021), and it exhibited a greater rate of graft loss (38% versus 7%, P = 0.001). Peritubular C4d deposition was associated with significantly lower graft survival rates in both the CCTT type I rejection group (P = 0.003) and the CCTT type II rejection group (P = 0.003). Multivariate analyses demonstrated that peritubular C4d deposition (P = 0.0002), donor age (P = 0.0002), cold ischemic time (P = 0.0211), and HLA matches (P = 0.0460) were significant independent determinants of graft survival rates. Peritubular C4d deposition is a significant predictor of graft survival rates, independent of histologic rejection type and a variety of clinical prognostic factors.
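
The multivariate analysis described above can be illustrated with a Cox proportional-hazards model. The sketch below is a minimal, hypothetical reconstruction in Python using the lifelines library; the file name and every column name (graft_survival_months, graft_lost, c4d_positive, and so on) are assumptions for illustration, not the study's actual data or code.

```python
# Hedged sketch: a Cox proportional-hazards model testing whether C4d
# deposition predicts graft survival independently of clinical covariates,
# in the spirit of the multivariate analysis above. Column names and the
# CSV path are illustrative assumptions.
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical layout: one row per patient.
# graft_survival_months: follow-up time; graft_lost: 1 = graft loss event;
# c4d_positive: 1 if any biopsy showed peritubular C4d deposition.
df = pd.read_csv("rejection_cohort.csv")  # assumed file

cph = CoxPHFitter()
cph.fit(
    df[["graft_survival_months", "graft_lost",
        "c4d_positive", "donor_age", "cold_ischemia_hours", "hla_matches"]],
    duration_col="graft_survival_months",
    event_col="graft_lost",
)
cph.print_summary()  # hazard ratios and p-values per covariate
```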

2021 ◽ Vol 36 (Supplement_1)
Author(s): Sophie Coche ◽ Ben Sprangers ◽ Steven Van Laecke ◽ Laurent Weekers ◽ Vicky De Meyer ◽ ...

Abstract Background and Aims Recurrence of anti-glomerular basement membrane (anti-GBM) glomerulonephritis in the kidney graft is a rare event, described in limited case reports and registry analyses. The aim of this study was to evaluate, in a large cohort of patients with detailed data collection and long follow-up, the risk of recurrence of anti-GBM disease and of graft loss caused by recurrence, the risk factors associated with clinical recurrence, and the long-term patient and graft survival. Method Multicenter retrospective study. Inclusion criteria: patients with anti-GBM glomerulonephritis who received a kidney transplant between 1977 and 2015. Exclusion criteria: systemic vasculitis (except ANCA-associated vasculitis), lupus erythematosus, and cryoglobulinemia. Clinical recurrence was defined as the reappearance of signs of glomerulonephritis along with histological signs of proliferative glomerulonephritis and linear IgG staining on kidney biopsy, with or without anti-GBM antibodies. Results Fifty-three patients were included. Clinical recurrence in a first kidney transplant occurred in only one patient, five years after transplantation (a prevalence rate of 1.9%), in the context of cessation of immunosuppressive drugs; the graft was lost due to recurrence. Histological recurrence (linear IgG staining on kidney biopsy without histologic signs of proliferative glomerulonephritis) was observed in four patients, in the context of cellular rejection. Two of these patients lost their kidney graft from severe acute rejection; the others fully recovered. Patient survival was 100%, 94%, and 89% at 5, 10, and 15 years, respectively. Death-censored first graft survival rates were 88%, 83%, and 79% at 5, 10, and 15 years, respectively. Conclusion The recurrence rate of anti-GBM glomerulonephritis after transplantation is very low, but recurrence is associated with graft loss. Long-term patient and graft survival rates are excellent.


Author(s): Lucas Souto NACIF ◽ Rafael Soares PINHEIRO ◽ Rafael Antônio de Arruda PÉCORA ◽ Liliana DUCATTI ◽ Vinicius ROCHA-SANTOS ◽ ...

Introduction: Late acute rejection leads to worse patient and graft survival after liver transplantation. Aim: To analyze the results published in recent years by leading transplant centers on late acute rejection and to update its clinical manifestations, diagnosis, and treatment in liver transplantation. Method: A systematic literature review was performed in the Medline-PubMed database using headings related to late acute rejection, covering articles published until November 2013. Demographics, immunosuppression, rejection, infection, and graft and patient survival rates were analyzed. Results: Late acute rejection in liver transplantation showed poor results, mainly regarding patient and graft survival. Almost all of the included cohort studies were retrospective and descriptive. The incidence of late acute rejection varied from 7% to 40% across studies. Late acute rejection was a cause of graft loss and was associated with worse patient and graft survival after liver transplantation. It has been variably defined and may lead to chronic rejection, which carries a worse prognosis. Late acute rejection occurs during a period in which the goal is to maintain lower immunosuppression after liver transplantation. Conclusion: The current literature shows the importance of late acute rejection. The real benefit lies in early diagnosis and adequate treatment from onset through late follow-up after liver transplantation.


2018 ◽ Vol 45 (1) ◽ pp. 17-22
Author(s): Heidi J. Reich ◽ Jon A. Kobashigawa ◽ Tamar Aintablian ◽ Danny Ramzy ◽ Michelle M. Kittleson ◽ ...

Using older donor hearts in cardiac transplantation may lead to inferior outcomes: older donors have more comorbidities that reduce graft quality, including coronary artery disease, hypertension, diabetes mellitus, and dyslipidemia. Shorter cold ischemic times might overcome the detrimental effect of older donor age. We examined the combined effect of donor allograft age and cold ischemic time on the long-term outcomes of heart transplant recipients. From 1994 through 2010, surgeons at our hospital performed 745 heart transplantations. We retrospectively classified these cases by donor age of <50 years (younger) or ≥50 years (older), then by cold ischemic time of <120 min (short), 120 to 240 min (intermediate), or >240 min (long). Endpoints included recipient and graft survival, and freedom from cardiac allograft vasculopathy, nonfatal major adverse cardiac events, and rejection. For intermediate ischemic times, the 5-year recipient survival rate was lower when donors were older (70% vs 82.6%; P=0.02); a similar trend appeared for long ischemic times (69.8% vs 87.6%; P=0.09). For short ischemic times, we found no difference in 5-year recipient or graft survival rates (80% older vs 85.6% younger; P=0.79), in freedom from nonfatal major adverse cardiac events (83.3% vs 91.5%; P=0.46), or in freedom from cardiac allograft vasculopathy (50% vs 70.6%; P=0.66). Rejection rates were mostly similar. Long-term graft survival in heart transplant recipients of older donor allografts may improve when cold ischemic times are shorter.
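
The stratified survival comparisons reported above (donor age within each cold-ischemic-time band) can be sketched with Kaplan-Meier estimates and a log-rank test. This is a hypothetical illustration with lifelines; the file and column names are assumptions, not the study's dataset.

```python
# Hedged sketch: stratify recipients by donor age and cold ischemic time,
# then compare survival curves within one stratum with a log-rank test,
# mirroring the comparisons reported above.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("heart_tx_cohort.csv")  # assumed file

df["donor_old"] = df["donor_age"] >= 50
df["cit_band"] = pd.cut(df["cold_ischemia_min"],
                        bins=[0, 120, 240, float("inf")],
                        labels=["short", "intermediate", "long"])

stratum = df[df["cit_band"] == "intermediate"]
old = stratum[stratum["donor_old"]]
young = stratum[~stratum["donor_old"]]

result = logrank_test(old["time_months"], young["time_months"],
                      event_observed_A=old["died"],
                      event_observed_B=young["died"])
print(result.p_value)

kmf = KaplanMeierFitter()
kmf.fit(old["time_months"], old["died"], label="donor >= 50")
print(kmf.survival_function_at_times(60))  # 5-year survival estimate
```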


2018 ◽ Vol 13 (5) ◽ pp. 763-771
Author(s): Napat Leeaphorn ◽ Jeremy Ryan A. Pena ◽ Natanong Thamcharoen ◽ Eliyahu V. Khankin ◽ Martha Pavlakis ◽ ...

Background and objectives: Recent evidence suggests that HLA epitope mismatching at the HLA-DQ loci is associated with the development of anti-DQ donor-specific antibodies and adverse graft outcomes. However, the clinical significance of broad antigen HLA-DQ mismatching for graft outcomes is not well examined. Design, setting, participants, & measurements: Using United Network for Organ Sharing/Organ Procurement and Transplantation Network (UNOS/OPTN) data, patients with primary kidney transplants performed between 2005 and 2014 were included. Patients were classified as having either zero HLA-DQ mismatches or one or two HLA-DQ mismatches. Primary outcomes were death-censored graft survival and incidence of acute rejection. Results: A total of 93,782 patients were included. Of these, 22,730 (24%) and 71,052 (76%) received zero and one or two HLA-DQ mismatched kidneys, respectively. After adjusting for variables including HLA-ABDR, HLA-DQ mismatching was associated with a higher risk of graft loss in living donor kidney recipients, with an adjusted hazard ratio (HR) of 1.18 (95% confidence interval [95% CI], 1.07 to 1.30; P<0.01), but not in deceased donor kidney recipients (HR, 1.05; 95% CI, 0.98 to 1.12; P=0.18) (P value for interaction <0.01). When taking cold ischemic time into account, HLA-DQ mismatching was associated with a higher risk of graft loss in deceased donor kidney recipients with cold ischemic time ≤17 hours (HR, 1.12; 95% CI, 1.02 to 1.27; P=0.002), but not in those with cold ischemic time >17 hours (HR, 0.97; 95% CI, 0.88 to 1.06; P=0.49) (P value for interaction <0.01). Recipients of one or two HLA-DQ mismatched kidneys had a higher incidence of acute rejection at 1 year, with adjusted odds ratios of 1.13 (95% CI, 1.03 to 1.23; P<0.01) in deceased donor and 1.14 (95% CI, 1.03 to 1.27; P=0.02) in living donor kidney transplant recipients. Specific donor-DQ mismatches appeared to be associated with the risk of acute rejection and graft failure, whereas others did not. Conclusions: HLA-DQ mismatching is associated with lower graft survival independent of HLA-ABDR in living donor kidney transplants and in deceased donor kidney transplants with cold ischemic time ≤17 hours, and with a higher 1-year risk of acute rejection in living and deceased donor kidney transplants.
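
The "P value for interaction" analyses above can be illustrated by adding a product term to a Cox model. A minimal sketch, assuming hypothetical column names and omitting the study's full adjustment set (HLA-ABDR and other covariates):

```python
# Hedged sketch: testing whether the effect of HLA-DQ mismatch on graft loss
# differs by donor type, via an interaction term in a Cox model. All names
# are assumptions, not the UNOS/OPTN variable names.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("unos_cohort.csv")  # assumed extract of registry data
df["dq_mismatch"] = (df["dq_mm_count"] > 0).astype(int)     # 0 vs 1-2 mismatches
df["living_donor"] = (df["donor_type"] == "living").astype(int)
df["dq_x_living"] = df["dq_mismatch"] * df["living_donor"]  # interaction term

cph = CoxPHFitter()
cph.fit(df[["time_to_graft_loss", "graft_lost",
            "dq_mismatch", "living_donor", "dq_x_living"]],
        duration_col="time_to_graft_loss", event_col="graft_lost")
cph.print_summary()  # the dq_x_living row carries the interaction p-value
```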


Author(s): Rafique Umer Harvitkar ◽ Abhijit Joshi

Abstract Introduction Laparoscopic fundoplication (LF) has almost completely replaced the open procedure performed for gastroesophageal reflux disease (GERD) and hiatus hernia (HH). Several studies have suggested that the long-term results of surgery for GERD are better than those of medical management. In this retrospective study, we outline our experience with LF over 10 years and analyze the factors that would help us in better patient selection, thereby positively affecting the outcomes of surgery. Patients and Methods In this retrospective study, we identified 27 patients (14 females and 13 males) operated upon by a single surgeon from 2010 to 2020 at our institution. Of these, 25 patients (12 females and 13 males) had GERD with type I HH and 2 (both females) had type II HH without GERD. The age range was 24 to 75 years. All patients had undergone oesophago-gastro-duodenoscopy (OGD). A total of 25 patients had various degrees of esophagitis; two patients had no esophagitis. These patients were analyzed for age, sex, symptoms, preoperative evaluation, the exact procedure performed (Nissen's vs. Toupet's vs. cruroplasty + gastropexy), morbidity/mortality, and functional outcomes. They were also reviewed for length of stay, length of procedure, complications, and recurrent symptoms on follow-up. Symptoms were assessed objectively with a score for six classical GERD symptoms preoperatively and on follow-up at 1, 4, and 6 weeks postsurgery. Further evaluation was performed after 6 months and then annually for 2 years. Results 14 females (53%) and 13 males (48%) with a diagnosis of GERD (with type I HH) or type II HH were operated upon. The mean age was 46 years (range: 24-75 years) and the mean body mass index (BMI) was 27 (range: 18-32). The duration of preoperative symptoms ranged from 6 months to 2 years. The average operating time dropped from 130 minutes for the first 12 cases to 90 minutes for the last 15 cases. The mean hospital stay was 3 days (range: 2-4 days). In the immediate postoperative period, 72% (n = 18) of the patients reported improvement in GERD symptoms, while 2 (8%) patients described heartburn (grade I, mild, daily) and 1 (4%) patient described bloating (grade I, daily). A total of 5 patients (20%) reported mild dysphagia to solids in the first 2 postoperative weeks. These symptoms settled after 2 to 5 weeks of postoperative proton-pump inhibitor (PPI) therapy and adjustment of the consistency of oral feeds. There were no conversions to open surgery, and we observed no perioperative mortality. No patient in this series underwent redo surgery. Conclusion LF is a safe and highly effective procedure for patients with symptoms of GERD, and it gives long-term relief from those symptoms. Stringent selection criteria are necessary to optimize the results of surgery. Experience is associated with a significant reduction in operating time.


2021 ◽ Vol 10 (15) ◽ pp. 3237
Author(s): Lukas Johannes Lehner ◽ Robert Öllinger ◽ Brigitta Globke ◽ Marcel G. Naik ◽ Klemens Budde ◽ ...

(1) Background: Simultaneous pancreas–kidney transplantation (SPKT) is a standard therapeutic option for patients with type I diabetes mellitus and kidney failure. Early pancreas allograft failure is a complication potentially associated with worse outcomes. (2) Methods: We performed a landmark analysis to assess the impact of early pancreas graft loss within 3 months on mortality and kidney graft survival over 10 years. This retrospective single-center study included 114 adult patients who underwent SPKT between 2005 and 2018. (3) Results: The pancreas graft survival rate was 85.1% at 3 months. The main causes of early pancreas graft loss were thrombosis (6.1%), necrosis (2.6%), and pancreatitis (2.6%). Compared with patients whose pancreas was still functioning at 3 months, early pancreas graft loss was not associated with reduced patient survival (p = 0.168) or with major adverse cerebral or cardiovascular events over 10 years (p = 0.741). Moreover, kidney graft function (p = 0.494) and survival (p = 0.461) were not significantly influenced by early pancreas graft loss. (4) Conclusion: In this study, using the landmark analysis technique, early pancreas graft loss within 3 months did not significantly impact patient or kidney graft survival over 10 years.
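
The landmark technique used here conditions on 3-month status: only patients still alive at the landmark enter the comparison, and the follow-up clock restarts there, which avoids immortal-time bias. A minimal sketch, assuming hypothetical column names:

```python
# Hedged sketch of a landmark analysis: exposure (early pancreas graft loss)
# is fixed at the 3-month landmark, and survival is compared from that point
# onward. File and column names are illustrative assumptions.
import pandas as pd
from lifelines.statistics import logrank_test

LANDMARK = 3.0  # months

df = pd.read_csv("spkt_cohort.csv")  # assumed file
# Landmark condition: only patients alive and under follow-up at 3 months.
df = df[df["followup_months"] > LANDMARK].copy()
df["time_from_landmark"] = df["followup_months"] - LANDMARK  # clock restarts

early = df[df["early_pancreas_loss"] == 1]   # pancreas lost within 3 months
intact = df[df["early_pancreas_loss"] == 0]  # functioning pancreas at 3 months

res = logrank_test(early["time_from_landmark"], intact["time_from_landmark"],
                   event_observed_A=early["died"],
                   event_observed_B=intact["died"])
print(res.p_value)
```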


Neurosurgery ◽ 2017 ◽ Vol 81 (1) ◽ pp. 29-44
Author(s): Jörg Klekamp

Abstract BACKGROUND: The clinical significance of pathologies of the spinal dura is often unclear and their management controversial. OBJECTIVE: To classify spinal dural pathologies analogously to vascular aneurysms and to present their symptoms and surgical results. METHODS: Among 1519 patients with spinal space-occupying lesions, 66 patients demonstrated dural pathologies. Neuroradiological and surgical features were reviewed and clinical data analyzed. RESULTS: Three types were distinguished: saccular dural diverticula (type I, n = 28) caused by defects of both dural layers, dissections between dural layers (type II, n = 29) due to defects of the inner layer, and dural ectasias (type III, n = 9) related to structural changes of the dura. For all types, symptoms consisted of local pain followed by signs of radiculopathy or myelopathy; in addition, one patient with dural ectasia presented with a low-pressure syndrome, and 10 patients with dural dissections showed spinal cord herniation. Type I and type II pathologies required occlusion of their dural defects via extradural (type I) or intradural (type II) approaches. For type III pathologies of the dural sac, no surgery was recommended. Favorable results were obtained in all 14 patients with type I pathologies and in 13 of 15 patients with type II pathologies who underwent surgery. CONCLUSION: The majority of dural pathologies involving root sleeves remain asymptomatic, while those of the dural sac commonly lead to pain and neurological symptoms. Type I and type II pathologies were treated with good long-term results by occluding their dural defects, while ectasias of the dural sac (type III) were managed conservatively.


1983 ◽ Vol 4 ◽ pp. 271-276
Author(s): R. A. Sommerfeld ◽ H. Gubler

Analyses of several years of data show that acoustic emission activity is greater in unstable snowpacks than in stable snowpacks. Two types of signals have been identified: type I, discrete spikes, and type II, a long-term elevation of the noise level. The type I signals are thought to originate from macroscopic cracks. The type II signals may originate from differential movement on shearing surfaces, but this is less certain. Increased noise levels of both types correlate well with slope instability in cases where the slope stability is known. In some climates the limited range of signal detection might be a significant problem. A foam-mounted geophone set into the snow near active layers appears to be the best sensor available at present.
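
The two signal types lend themselves to a simple detection scheme: type I events as short-lived excursions far above the background, type II as a sustained rise of the windowed noise floor. The Python sketch below is purely illustrative; the thresholds, window length, and synthetic trace are assumptions, not the instrumentation or processing of the original study.

```python
# Hedged sketch: separate spike-like (type I) from sustained (type II)
# acoustic emission signatures in a geophone trace. Parameters are assumed.
import numpy as np

def classify_emissions(trace, fs, spike_sigma=8.0, floor_ratio=2.0, win_s=10):
    """trace: 1-D geophone signal; fs: sampling rate in Hz.

    Returns (number of type I spike samples, number of type II elevated windows).
    """
    baseline = np.median(np.abs(trace))
    # Type I: isolated excursions far above background (macroscopic cracks).
    spikes = np.abs(trace) > spike_sigma * baseline
    # Type II: sustained elevation of RMS noise over win_s-second windows
    # (possibly differential movement on shearing surfaces).
    win = int(win_s * fs)
    n = len(trace) // win
    rms = np.sqrt(np.mean(trace[:n * win].reshape(n, win) ** 2, axis=1))
    elevated = rms > floor_ratio * np.median(rms)
    return int(spikes.sum()), int(elevated.sum())

# Synthetic check: quiet background noise with two injected crack-like spikes.
fs = 100
rng = np.random.default_rng(0)
trace = rng.normal(0.0, 1e-3, 60 * fs)
trace[1000] += 0.05
trace[3000] += 0.04
print(classify_emissions(trace, fs))  # expect (2, 0)
```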


2021 ◽ Vol 10 (22) ◽ pp. 5308
Author(s): Renana Yemini ◽ Ruth Rahamimov ◽ Ronen Ghinea ◽ Eytan Mor

With scarce organ supply, careful selection of suitable elderly candidates for transplant is needed, as is auditing of long-term outcomes after transplant. We conducted an observational cohort study of our patients >60 years old with long follow-up. (1) Patients and Methods: We used our database to study the results after transplant for 593 patients >60 years old who underwent a transplant between 2000 and 2017. The outcome was compared between live donor (LD; n = 257) recipients, an old-to-old (OTO, n = 215) group using an extended-criteria donor (ECD) kidney, and a young-to-old (YTO, n = 123) group using a standard-criteria donor. The Kaplan–Meier method was used to calculate patient and graft survival, and Cox regression analysis was used to identify risk factors associated with death. (2) Results: The 5- and 10-year patient survival rates were significantly better in the LD group (92.7% and 66.9%) than in the OTO group (73.3% and 42.8%) and the YTO group (70.9% and 40.6%) (p < 0.0001). The 5- and 10-year graft survival rates were 90.3% and 68.5% (LD), 61.7% and 30.9% (OTO), and 64.1% and 39.9% (YTO), respectively (p < 0.0001 between the LD group and the two DD groups). There was no difference in outcome between patients in their 60s and those in their 70s. Factors associated with mortality included age (HR 1.060), diabetes mellitus (HR 1.773), ischemic heart disease (HR 1.510), and deceased- versus live-donor transplant (HR 2.865). (3) Conclusions: Our 17 years of experience seem to justify the rationale of an old-to-old allocation policy in the elderly population. Live-donor transplant should be encouraged whenever possible. Each individual decision on elderly candidates for transplant should be based on the patient's comorbidity and predicted life expectancy.
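
The survival methodology described (Kaplan–Meier curves for the three allocation groups plus Cox regression for mortality risk factors) could be reproduced along these lines with lifelines; the file and column names are illustrative assumptions, not the study's data.

```python
# Hedged sketch: three-way survival comparison (LD vs OTO vs YTO) with a
# multivariate log-rank test, plus a Cox model for mortality risk factors.
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import multivariate_logrank_test

df = pd.read_csv("elderly_tx_cohort.csv")  # assumed; group in {LD, OTO, YTO}

res = multivariate_logrank_test(df["months"], df["group"], df["died"])
print(res.p_value)  # joint test across the three allocation groups

cph = CoxPHFitter()
cph.fit(df[["months", "died", "age", "diabetes", "ihd", "deceased_donor"]],
        duration_col="months", event_col="died")
cph.print_summary()  # hazard ratios akin to those reported above
```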


2021 ◽ Vol 16 (1)
Author(s): Chun-Yu Lin ◽ Tao-Hsin Tung ◽ Meng-Yu Wu ◽ Chi-Nan Tseng ◽ Feng-Chun Tsai

Abstract Background The DeBakey classification divides Stanford acute type A aortic dissection (ATAAD) into DeBakey type I (D1) and type II (D2) according to the extent of acute aortic dissection (AAD). This retrospective study aimed to compare the early and late outcomes of D1-AAD and D2-AAD through a propensity score-matched analysis. Methods Between January 2009 and April 2020, 599 consecutive patients underwent ATAAD repair at our institution and were dichotomized into D1 (n = 543; 90.7%) and D2 (n = 56; 9.3%) groups. Propensity scoring was performed with a 1:1 ratio, resulting in a matched cohort of 56 patients per group. Clinical features, postoperative complications, 5-year cumulative survival, and freedom-from-reoperation rates were compared. Results In the overall cohort, the D1 group had a lower rate of preoperative shock and more aortic arch replacements with longer cardiopulmonary bypass times. The D1 group had a higher in-hospital mortality rate than the D2 group in both the overall (15.8% vs 5.4%; P = 0.036) and matched (19.6% vs 5.4%; P = 0.022) cohorts. For patients who survived to discharge, the D1 and D2 groups demonstrated similar 5-year survival rates in the overall (77.0% vs 85.2%; P = 0.378) and matched (79.1% vs 85.2%; P = 0.425) cohorts. The 5-year freedom-from-reoperation rates for the D1 and D2 groups were 80.0% and 97.1% in the overall cohort (P = 0.011) and 93.6% and 97.1% in the matched cohort (P = 0.474), respectively. Conclusions Patients with D1-AAD had a higher risk of in-hospital mortality than those with D2-AAD. However, for patients who survived to discharge, the 5-year survival rates were comparable between the groups.
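
The 1:1 propensity score matching described above pairs each D2 patient with the D1 patient closest in estimated probability of group membership. A minimal greedy-matching sketch, with an assumed covariate list and file name rather than the study's actual variables:

```python
# Hedged sketch of 1:1 propensity-score matching: fit a logistic model for
# group membership, then greedily pair each D2 patient with the nearest-score
# unmatched D1 patient. Covariates and file name are assumptions.
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("ataad_cohort.csv")  # assumed; is_d2 = 1 for DeBakey type II
covs = ["age", "male", "preop_shock", "hypertension", "marfan"]  # assumed

ps = LogisticRegression(max_iter=1000).fit(df[covs], df["is_d2"])
df["score"] = ps.predict_proba(df[covs])[:, 1]  # propensity score

d2 = df[df["is_d2"] == 1]
d1 = df[df["is_d2"] == 0].copy()
pairs = []
for _, row in d2.iterrows():                       # greedy nearest neighbour
    j = (d1["score"] - row["score"]).abs().idxmin()
    pairs.append((row.name, j))
    d1 = d1.drop(j)                                # match without replacement

matched_idx = [i for pair in pairs for i in pair]
matched = df.loc[matched_idx]                      # 56 + 56 patients, matched 1:1
```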

