A case report on lipofuscin deposition in a graft biopsy two years after kidney transplantation: an insignificant bystander or a pathogenic benefactor?

2019 ◽  
Vol 20 (1) ◽  
Author(s):  
Vivian W. Y. Leung ◽  
Sarah-Jeanne Pilon ◽  
Pierre O. Fiset ◽  
Shaifali Sandal

Abstract Background Lipofuscin deposition is a characteristic manifestation of aging. There is very limited literature in humans and in animals describing these deposits in native kidneys. Overall, it is thought to be non-pathogenic, and successful transplants from a donor with lipofuscin deposits have been reported. We present the case of a patient who underwent a kidney transplant in whom a for-cause biopsy post-transplantation incidentally revealed lipofuscin deposition. Case presentation A 48-year-old man with a past medical history of diabetes, hypertension, coronary artery disease, and an ischemic and then a hemorrhagic cerebrovascular accident underwent a successful kidney transplant. His donor was an expanded criteria donor with no major past medical history. The post-transplant course was complicated by delayed graft function requiring one dialysis treatment for hyperkalemia. After that he had an uneventful course and achieved a baseline creatinine of 1.2 mg/dL, with no proteinuria. At a routine 19-month follow-up he was noted to have proteinuria and an antibody against the major-histocompatibility-complex class I-related chain A. A graft biopsy revealed acute antibody-mediated rejection and striking lipofuscin deposition. He was subsequently treated with an antibody-mediated rejection protocol that included high-dose steroids, rituximab, plasmapheresis, and intravenous immunoglobulin, but responded poorly to this regimen. A 6-month follow-up biopsy continued to show lipofuscin deposition, with similar microvascular injury scores, and 12 months later his creatinine remained stable but his proteinuria had worsened. The patient was struggling with recurrent infectious episodes requiring hospitalizations, and thus no further diagnostic or therapeutic interventions were pursued. Conclusions Lipofuscin deposition has been reported in solid organ transplants, but its significance and cause are not well understood. 
Several physiologic and some pathologic causes of these deposits have been reported, including age, diabetes, medications, and a genetic syndrome. We propose that immunologic causes such as rejection, in the presence of other risk factors, could potentiate the oxidative stress leading to excessive lipofuscin deposition in kidney transplants. In the case of our patient, we conclude that these deposits were likely recipient-derived, and postulate that the cumulative burden of inflammation from rejection and underlying medical conditions led to increased lipofuscin deposition. We speculate that they are an innocent bystander.

2021 ◽  
Vol 8 (1) ◽  
pp. e001036
Author(s):  
Sandra Lindstedt ◽  
Edgar Grins ◽  
Hillevi Larsson ◽  
Johan Nilsson ◽  
Hamid Akbarshahi ◽  
...  

There have been a few reports of successful lung transplantation (LTx) in patients with SARS-CoV-2-induced acute respiratory distress syndrome (ARDS); however, all had rather short follow-up. Here we present a 62-year-old man without prior lung disease. Following SARS-CoV-2-induced ARDS and 6 months of extracorporeal membrane oxygenation, he underwent LTx. Three months post-transplantation he developed acute hypoxia requiring emergency intubation. Chest imaging showed acute rejection, and de novo DQ8-DSA was discovered. He was treated with high-dose corticosteroids and plasmapheresis and was extubated 4 days later, yet the de novo DQ8-DSA persisted; further sessions of plasmapheresis and rituximab left its levels unchanged. Nine months post-transplantation the patient died of respiratory failure. We herein discuss the decision to transplant, the transplantation itself, and the postoperative course with severe antibody-mediated rejection. In addition, we evaluated the histological changes of the explanted lungs and compared these with end-stage idiopathic pulmonary fibrosis tissue, where both similarities and differences were seen. In light of this case, close monitoring of DSA should be considered, and our experience further supports offering LTx only to very carefully selected patients.


2020 ◽  
Vol 35 (Supplement_3) ◽  
Author(s):  
Alessandra Palmisano ◽  
Eleonora Salsi ◽  
Paride Fenaroli ◽  
Anna Maria Degli Antoni ◽  
Ilaria Gandolfini ◽  
...  

Abstract Background and Aims ESBL-producing and carbapenem-resistant (CR) Enterobacteriaceae are a common cause of severe infection, morbidity and mortality in kidney transplant recipients (KTR). Few studies have investigated the risk factors for ESBL-producing/CR Enterobacteriaceae colonization and infection in this group of patients, the effect of colonization and infection on the KTR's renal graft function, or the use of hospital resources. Method Retrospective follow-up study on a consecutive series of patients undergoing kidney transplantation at Parma University Hospital (Italy) between January 2016 and December 2018. We performed a multivariable-adjusted analysis of the predictive factors associated with MDR infection/colonization via general linear models for prevalence and risk ratios. Renal function (eGFR) decline was compared by mixed-effects random-coefficients models, and use of hospital resources by negative binomial regression. Results We enrolled 180 KTR (mean recipient age: 52.4 years [SD 12.4]; 65% male; mean donor age: 54.6 years [SD 15.6]) and followed them up for 2 years post-transplantation. Cumulative prevalence of colonization 3 months post-transplantation and cumulative incidence of infection were 26.1% and 9.4% for ESBL, and 4.4% and 1.6% for CR. ESBL colonization was associated with hemodialysis vs peritoneal dialysis (93% vs 70% non-colonized; adjusted RR 0.21 [95% CI: 0.06 to 0.98]), dialysis vintage (mean months: 65 vs 42; adjusted RR for vintage above the median 2.17 [95% CI: 1.32 to 3.55]) and retention of the ureteral stent for more than one month after transplant (28% vs 12%; adjusted RR 2.09 [95% CI: 1.27 to 3.44]); ESBL infection was associated with retention of the ureteral stent (47% vs 13%; adjusted RR 4.89 [95% CI 2.11 to 11.35]), whereas CR colonization was associated with surgical complications during the transplant admission (50% vs 15%; adjusted RR 4.61 [95% CI 1.28 to 16.66]). 
Two patients (both with CR) died over the study follow-up, whereas none of the patients lost their graft. CR infection was associated with lower baseline (3 months post-transplantation) eGFR compared to the other groups (-28.4 mL/min/1.73 m2 [95% CI: -50.5 to -6.3]); a numerically more rapid decline of eGFR (up to -5 mL/min/year), albeit not statistically significant, was observed in patients with CR colonization compared to non-colonized patients at 2 years of follow-up. In comparison with non-colonized patients, adjusted mean days of carbapenem treatment in ESBL/CR colonized/infected patients were 5.7 vs 0.7 (P=0.003); length of hospital stay 5.8 vs 1.0 days (P=0.055); days on intravenous outpatient therapy for drug-resistant infection 20.7 vs 0.1 (P=0.008). Conclusions The study shows that ESBL and CR colonization and infection in KTR are statistically associated with longer hemodialysis vintage, urological procedures, and surgical complications. They increase the use of hospital resources and may jeopardize transplant outcomes.
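As an illustration of the risk-ratio estimates quoted above, the following minimal Python sketch computes an unadjusted risk ratio with a log-method 95% CI from a 2×2 table. The counts are hypothetical, chosen only so that the group risks match the quoted 47% vs 13% for ESBL infection with vs without prolonged ureteral stent retention; the study's own estimates came from adjusted general linear models, so its numbers differ.

```python
import math

def risk_ratio(a, b, c, d):
    """Unadjusted risk ratio with a log-method 95% CI for a 2x2 table:
    a = events and b = non-events in the exposed group,
    c = events and d = non-events in the unexposed group."""
    r_exposed = a / (a + b)
    r_unexposed = c / (c + d)
    rr = r_exposed / r_unexposed
    # Standard error of log(RR); exponentiate the log-scale limits
    se = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
    lower = math.exp(math.log(rr) - 1.96 * se)
    upper = math.exp(math.log(rr) + 1.96 * se)
    return rr, lower, upper

# Hypothetical counts (NOT the study's raw data), chosen only so that the
# group risks match the quoted 47% (8/17) vs 13% (9/70) for ESBL infection
# with vs without retention of the ureteral stent.
rr, lower, upper = risk_ratio(8, 9, 9, 61)
print(f"RR {rr:.2f} (95% CI {lower:.2f} to {upper:.2f})")
```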


Author(s):  
Antonia Margarete Schuster ◽  
N. Miesgang ◽  
L. Steines ◽  
C. Bach ◽  
B. Banas ◽  
...  

Abstract The B cell-activating factor BAFF has gained importance in the context of kidney transplantation due to its role in B cell survival. Studies have shown that BAFF correlates with an increased incidence of antibody-mediated rejection and the development of donor-specific antibodies (DSA). In this study, we analyzed a defined cohort of kidney transplant recipients who were treated with standardized immunosuppressive regimens according to their immunological risk profile. The aim was to add BAFF as an awareness marker in the post-transplant course, taking the patient's individual immunological risk profile into account. Included patients were transplanted between 2016 and 2018. Baseline data, graft function, the occurrence of rejection episodes, signs of microvascular infiltration, and DSA kinetics were recorded over 3 years. BAFF levels were determined 14 days and 3 and 12 months post-transplantation. Although no difference in graft function could be observed, medium-risk patients showed a clear dynamic in their BAFF levels, with low levels shortly after transplantation and an increase of 123% over the course of 1 year. Patients with high BAFF values were more susceptible to rejection, especially antibody-mediated rejection, and displayed intensified microvascular inflammation; the combination of high BAFF and DSA put patients at particular risk. The changing BAFF kinetics of the medium-risk group, as well as the increased occurrence of rejection at high BAFF values, enable BAFF to serve as an awareness marker. To compensate for the changing immunological risk, a switch from a weaker induction therapy to an intensified maintenance therapy is required.


2013 ◽  
Vol 70 (9) ◽  
pp. 848-853 ◽  
Author(s):  
Ljiljana Ignjatovic ◽  
Rajko Hrvacevic ◽  
Dragan Jovanovic ◽  
Zoran Kovacevic ◽  
Neven Vavic ◽  
...  

Background/Aim. A tremendous breakthrough in solid organ transplantation was made with the introduction of calcineurin inhibitors (CNI). At the same time, they are potentially nephrotoxic drugs that influence the onset and progression of renal graft failure. The aim of this study was to evaluate the outcome of conversion from a CNI-based immunosuppressive protocol to sirolimus (SRL) in recipients with a graft in chronic kidney disease (CKD) grade III and proteinuria below 500 mg/day. Methods. In the period 2003-2011, 24 patients (6 female and 18 male), mean age 41 ± 12.2 years, on triple immunosuppressive therapy (steroids, an antiproliferative drug [mycophenolate mofetil (MMF) or azathioprine (AZA)], and CNI) were switched from CNI to SRL and followed up for 76 ± 13 months. Nine patients (group I) had early post-transplant conversion after 4 ± 3 months and 15 patients (group II) late conversion after 46 ± 29 months. During regular outpatient visits we followed graft function through serum creatinine and glomerular filtration rate (GFR), proteinuria, lipidemia, and side effects. Results. Thirty days after conversion, GFR, proteinuria and lipidemia were insignificantly increased in all patients. In the first two post-conversion months all patients had at least one urinary or respiratory infection, and 10 patients reactivated cytomegalovirus (CMV) infection or disease; they were successfully treated with standard therapy. After 21 ± 11 months, 15 patients from both groups discontinued SRL therapy due to reconversion to CNI (10 patients), double immunosuppressive therapy (3 patients), return to hemodialysis (1 patient), or death (1 patient). Nine patients were still on SRL therapy. By the end of the follow-up they had significantly improved GFR (from 53.2 ± 12.7 to 69 ± 15 mL/min), while the increases in proteinuria (from 265 ± 239 to 530.6 ± 416.7 mg/day) and lipidemia (cholesterol from 4.71 ± 0.98 to 5.61 ± 1.6 mmol/L and triglycerides from 2.04 ± 1.18 to 2.1 ± 0.72 mmol/L) were not significant. They were stable during the whole follow-up period. Ten patients were reconverted from SRL to CNI due to an abrupt increase in proteinuria (from 298 ± 232 to 1639 ± 1641 mg/day in 7 patients), rapid growth of multiple ovarian cysts (2 patients), or operative treatment of a persistent hematoma (1 patient). Thirty days after reconversion they were stable, with an insignificant decrease in GFR (from 56.10 ± 28.09 to 47 ± 21 mL/min) and significantly improved proteinuria (from 1639 ± 1641 to 529 ± 688 mg/day). By the end of the follow-up these patients showed a nonsignificant increase in serum creatinine (from 172 ± 88 to 202 ± 91 µmol/L), decrease in GFR (from 56.10 ± 28.09 to 47 ± 21 mL/min), and increase in proteinuria (from 528.9 ± 688 to 850 ± 1083 mg/day). Conclusion. In this small descriptive study, conversion from CNI to SRL was followed by an increased incidence of infections and a consecutive 25-50% dose reduction in the second antiproliferative agent (AZA, MMF), with a possible influence on the development of glomerulopathy in some patients, which was the major reason for discontinuation of SRL therapy in 7 (29%) patients. Nine (37.5%) patients experienced the greatest benefit of the CNI to SRL conversion, without serious post-conversion complications.


2021 ◽  
Vol 36 (Supplement_1) ◽  
Author(s):  
Clara Pardinhas ◽  
Rita Leal ◽  
Francisco Caramelo ◽  
Teofilo Yan ◽  
Carolina Figueiredo ◽  
...  

Abstract Background and Aims As kidney transplants are growing in absolute numbers, so are patients with failed allografts, who are thus potential candidates for re-transplantation. Re-transplantation is challenging due to immunological barriers, surgical difficulties and clinical complexities, but it has been proven that a successful second transplantation improves life expectancy over dialysis. It is important to evaluate re-transplantation outcomes, since 20% of patients on the waiting list are waiting for a second graft. Our aim was to compare major clinical outcomes, such as acute rejection and graft and patient survival, between patients receiving a first or a second kidney transplant. Method We performed a retrospective study that included 1552 patients who received a first (N=1443, 93%) or a second kidney transplant (N=109, 7%) between January 2008 and December 2018. Patients with more than 2 grafts or multi-organ transplants were excluded. Demographic, clinical and histocompatibility characteristics of both groups were registered from our unit database and compared. Delayed graft function was defined as the need for dialysis in the first week post-transplant. All acute rejection episodes were biopsy-proven, according to Banff 2017 criteria. Follow-up ended on 1 June 2020 for functioning grafts, or at graft failure (including death with a functioning graft). Results Recipients of a second graft were significantly younger (43 ± 12 vs 50 ± 13 years, p<0.001) and there were significantly fewer expanded-criteria donors in the second transplant group (31.5% vs 57.5%, p<0.001). The waiting time for a second graft was longer (63 ± 50 vs 48 ± 29 months, p=0.011). HLA mismatch was similar for both groups but PRA was significantly higher for second KT patients (21.6 ± 25% versus 3 ± 9%; p<0.001). All patients undergoing a second KT received thymoglobulin as induction therapy, compared to 16% of the first KT group (p<0.001). 
We found no difference in primary dysfunction or delayed graft function between groups. Acute rejection was significantly more frequent in second kidney transplant recipients (19% vs 5%, p<0.001): 10 acute cellular rejections, 7 antibody-mediated rejections, and 3 borderline changes. For the majority of patients (85%), acute rejection occurred in the first year post-transplant. Death-censored graft failure occurred in 236 (16.4%) patients with a first kidney transplant and 25 (23%) patients with a second graft (p=0.08). Survival analysis showed similar graft survival for both groups (log-rank p=0.392). We found no difference in patient mortality at follow-up between the groups. Conclusion Although second graft patients presented more episodes of biopsy-proven acute rejection, especially in the first year post-transplant, we found no differences in death-censored graft survival or patient mortality for patients with a second kidney transplant. Second transplants should be offered to patients whenever feasible.
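As a rough, unadjusted check on the death-censored graft failure comparison above (236/1443 first grafts vs 25/109 second grafts, reported p=0.08), a pooled two-proportion z-test can be sketched with the Python standard library. The abstract does not state which test the authors used, so this is only an approximation of their analysis.

```python
import math

def two_proportion_p(x1, n1, x2, n2):
    """Two-sided p-value for a pooled two-proportion z-test."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = abs(p1 - p2) / se
    return math.erfc(z / math.sqrt(2))  # 2 * P(Z > |z|) for a standard normal

# Death-censored graft failure: 236/1443 first grafts vs 25/109 second grafts
p = two_proportion_p(236, 1443, 25, 109)
print(f"p = {p:.2f}")  # → p = 0.08, close to the reported p=0.08
```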


2020 ◽  
Vol 4 (1) ◽  
pp. 1-4
Author(s):  
Bernd Ludwig ◽  
Johanna Schneider ◽  
Daniela Föll ◽  
Qian Zhou

Abstract Background Antibody-mediated rejection (AMR) in cardiac transplantation may manifest early within the first weeks after transplantation, but also late, months to years after transplantation, with presentations ranging from mild heart failure to cardiogenic shock. While patients with early cardiac AMR are less affected and seem to have survival rates comparable to transplant recipients without AMR, late cardiac AMR is frequently associated with graft dysfunction, fulminant forms of cardiac allograft vasculopathy, and a high mortality rate. Nevertheless, AMR of cardiac allografts remains difficult to diagnose and to treat. Case summary We report the case of a 47-year-old male patient with late AMR of the cardiac allograft 3 years after heart transplantation. Antibody-mediated rejection was confirmed by endomyocardial biopsy and the presence of donor-specific antibodies (DSA). The patient was treated with high-dose prednisolone, plasmapheresis, intravenous gamma globulin, rituximab, immunoadsorption, and bortezomib. Under this treatment regimen, left ventricular ejection fraction and pro-B-type natriuretic peptide recovered, and the patient improved to New York Heart Association Class I. Currently, 3 years after the diagnosis of cardiac AMR, graft function continues to be nearly normal, and there is no evidence of transplant vasculopathy. Discussion This case illustrates that AMR can occur at any time after transplantation. Although graft function fully recovered after treatment in our patient, the level of DSA remained high, suggesting that DSA may not be a reliable parameter to determine the intensity and duration of therapy.


Blood ◽  
2006 ◽  
Vol 108 (11) ◽  
pp. 2891-2891 ◽  
Author(s):  
Luznik Leo ◽  
Chen R. Allen ◽  
Kaup Michele ◽  
Bright C. Emilie ◽  
Bolanos-Meade Javier ◽  
...  

Abstract Prolonged pharmacologic immunosuppression is a major obstacle to early immunologic recovery after allogeneic BMT. Based on our results in animal models, we studied whether properly timed high-dose Cy after HLA-matched related and unrelated BMT is an effective strategy for limiting GVHD; we hypothesized that avoiding prolonged immunosuppression would speed immune recovery and reconstitution of regulatory T cells (Tregs), thereby decreasing post-transplant complications. We report results on 46 consecutive patients (median age 41, range 1–64) with high-risk hematologic malignancies (20 AML, 12 ALL, 6 NHL, 3 HD, 2 MM, 2 CML, 1 CMMoL); 28 received related and 18 unrelated unmanipulated HLA-matched BM (median of 2.2 × 10^8 MNC per kg) after conditioning with busulfan on days -7 to -3 and Cy (50 mg/kg/day) on days -2 and -1, followed by Cy (50 mg/kg/day) on days +3 and +4 as the sole GVHD prophylaxis. All patients had advanced disease (20 in advanced remission, with the rest having refractory disease), and the median follow-up is 13 (range 6–24) months. All but two patients had sustained engraftment. The cumulative incidences of acute grade II–IV and grade III–IV GVHD were 41% and 9%, respectively. All patients with GVHD responded fully to standard therapy (steroids ± tacrolimus) or therapy per BMT CTN 0302, and all except 2 patients were rapidly weaned from all immunosuppressive agents. Of the thirty-six patients alive after day 100, only 1 of the 23 patients who received HLA-matched related allografts, and 3 of the 13 patients who received unrelated allografts, developed chronic GVHD. Twenty-six (56%) patients are alive, of whom 21 (45%) are in complete remission. There were no deaths secondary to infection or GVHD. CMV reactivation was detected in 11 of 36 (31%) patients, of whom 9 had GVHD. There was no CMV infection. Median (± SEM) CD4+ T cell counts were 99 ± 16/mL and 209 ± 49/mL on days 60 (n=23) and 180 (n=8), respectively. 
Corresponding values for CD8+ T cells were 248 ± 132/mL and 228 ± 161/mL on days 60 and 180, respectively. Patients with grade II–IV GVHD had significantly fewer peripheral blood (PB) CD4+Foxp3+ T cells compared to patients with grade 0–I GVHD (p<0.05). Development of grade II–IV GVHD correlated negatively with the expression of Foxp3 (p<0.05) and was associated with relatively higher expression of interferon-γ mRNA (p=0.08) in PB, suggesting higher effector function in the absence of Tregs in patients with grade II–IV GVHD. No differences in IL-10 mRNA expression between patients with or without GVHD were found, while significantly higher expression of interleukin-2 mRNA was detected in patients with grade II–IV GVHD (p<0.025). These results indicate that high-dose post-transplantation Cy is effective as a single-agent strategy for limiting acute and chronic GVHD after myeloablative HLA-matched related and unrelated allografting; this approach also limits the need for prolonged immunosuppression, resulting in favorable immune reconstitution with few opportunistic infections in this unfavorable group of patients. Longer follow-up and larger numbers of patients are needed to assess the impact of this strategy on survival.


2017 ◽  
Vol 2017 ◽  
pp. 1-9 ◽  
Author(s):  
Nils Lachmann ◽  
Michael Duerr ◽  
Constanze Schönemann ◽  
Axel Pruß ◽  
Klemens Budde ◽  
...  

Over the past years, we have stepwise modified our immunosuppressive treatment regimen for patients with antibody-mediated rejection (ABMR). Here, we describe three consecutive groups treated with different regimens. From 2005 until 2008, we treated all patients with biopsy-proven ABMR with rituximab (500 mg), low-dose (30 g) intravenous immunoglobulins (IVIG), and plasmapheresis (PPH, 6x) (group RLP, n=12). Between 2009 and June 2010, patients received bortezomib (1.3 mg/m2, 4x) together with low-dose IVIG and PPH (group BLP, n=11). In July 2010, we increased the IVIG dose and treated all subsequent patients with bortezomib, high-dose IVIG (1.5 g/kg), and PPH (group BHP, n=11). Graft survival at three years after treatment was 73% in group BHP, as compared to 45% in group BLP and 25% in group RLP. At six months after treatment, median serum creatinine was 2.1 mg/dL, 2.9 mg/dL, and 4.2 mg/dL in groups BHP, BLP, and RLP, respectively (p=0.02). Following treatment, a significant decrease in donor-specific HLA antibody (DSA) mean fluorescence intensity, from 8467 ± 6876 to 5221 ± 4711 (p=0.01), was observed in group BHP, but not in the other groups. Our results indicate that graft survival, graft function, and DSA levels could be improved along with the stepwise modifications to our treatment regimen, that is, the introduction of bortezomib and high-dose IVIG treatment.


2019 ◽  
Vol 13 (1) ◽  
pp. 108-115
Author(s):  
Nael Husain Zaer

Background: Drug-resistant epilepsy is defined as failure of adequate trials of two tolerated, appropriately chosen and used antiepileptic drug schedules to achieve sustained seizure freedom. Up to 30% of patients referred to clinics with a diagnosis of pharmaco-resistant epilepsy may have been misdiagnosed, and many can be helped by optimizing their treatment. Pseudoresistance, in which seizures persist because the underlying disorder has not been adequately or appropriately treated, must be ruled out or corrected before drug treatment can be considered to have failed. Objectives: The objectives of this study were to determine the causes of drug failure in patients with epilepsy and to differentiate between drug-resistant epilepsy and pseudoresistant epilepsy. Type of the study: This is a retrospective study. Method: It was conducted in Baghdad governorate at the epilepsy clinic in the neurosciences hospital during the period from the 1st of February through July 2013. Two hundred patients with refractory epilepsy were involved. These patients attended the epilepsy clinic during 2011 and 2012. The data were collected from the patients' files, including age, gender, weight, history of presenting illness, type of seizure, drugs used, duration of disease, EEG and imaging findings, compliance, and follow-up. Results: Drug-resistant epilepsy had a prevalence of 24% (128/527), as the total number of patients with epilepsy attending the hospital during the same period was 527. The mean age of patients with refractory epilepsy was 25 years. Males were 56.5% (113/200) and urban residents were 70.5% (141/200). The study revealed that 64% (128/200) of refractory epilepsy was attributed to drug resistance, while the remaining proportion, 36% (72/200), was pseudoresistance. The main cause of pseudoresistance was poor compliance, 36.1% (26/72). The most common type of seizure in the sampled patients was generalized tonic-clonic seizures, in 51.5% (103/200). Compliance was found to be statistically associated with abnormal EEG findings, past medical history (hypertension, cardiac diseases, encephalitis, diabetes mellitus, and any significant history), and quality of follow-up. Follow-up was found to be statistically associated with family history, past medical history (encephalitis and hypertension), and patient compliance. Conclusion: A considerable number of patients diagnosed as cases of drug-resistant epilepsy had another explanation for drug failure. The study recommends the application of the consensus definition of drug-resistant epilepsy and periodic evaluation of patients with drug-resistant epilepsy to exclude pseudoresistance.
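The headline proportions in this abstract follow directly from its reported counts; the short sketch below re-derives that arithmetic (note that 128/527 comes to 24.3%, reported as 24%). The labels are paraphrases, not the study's wording.

```python
# Re-deriving the abstract's headline percentages from its reported counts.
figures = {
    "drug-resistant epilepsy among all epilepsy patients": (128, 527),
    "drug resistance among refractory patients": (128, 200),
    "pseudoresistance among refractory patients": (72, 200),
    "poor compliance among pseudoresistant patients": (26, 72),
    "generalized tonic-clonic seizures": (103, 200),
}
for label, (num, den) in figures.items():
    print(f"{label}: {num}/{den} = {100 * num / den:.1f}%")
```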


2021 ◽  
Vol 8 (Supplement_1) ◽  
pp. S213-S214
Author(s):  
Petros Svoronos ◽  
Prakhar Vijayvargiya ◽  
Pradeep Vaitla ◽  
James j Wynn ◽  
Elena Beam ◽  
...  

Abstract Background Based on expert opinion, solid organ transplant recipients from donors with bacteremia are treated with 7-14 days of pre-emptive antibiotic therapy (PAT). However, studies addressing the necessity, optimal duration, and outcomes of such therapy in kidney transplant recipients (KTR) are lacking. Methods We retrospectively reviewed all kidney transplants performed at our institution from 01/01/2015-01/01/2021 to identify those cases where matched deceased donors had positive blood cultures. Bacteremia was defined per CDC criteria. We analyzed the rate of infection in the KTR with the same organism identified in the donor blood culture within 30 days of transplantation. Results A total of 56 KTRs with donor-positive blood cultures were identified. Demographic data are summarized in Table 1. Twenty of 56 cases (35.8%) had bacteremia and 36 (64.2%) had organisms classified as common commensals. The most common organisms in the bacteremia group were Gram-negative bacteria (12/20) and Staphylococcus aureus (6/20). The most common commensals were coagulase-negative staphylococci (26/36) (Table 2). All KTR received preoperative antibiotics at the time of transplantation, primarily cefazolin (15/20), and the vast majority received post-transplant TMP/SMX prophylaxis for Pneumocystis jirovecii (19/20). PAT was administered in 70% (14/20) of bacteremia cases for a median of 8.5 days (IQR 7-14), while six cases were left untreated (Table 2). In contrast, the majority of cases with common commensals were not treated (75%, 27/36). Among the treated cases (9/36), the median duration of therapy was 7 days (IQR 5-14). No cases of infection with the same organism identified in the donor blood culture were reported in KTR within 30 days of transplantation. Conclusion KTR from donors with bacteremia who were treated received a median of 8.5 days of PAT, with no instances of breakthrough infection. In contrast, the majority of donor blood cultures with organisms classified as common commensals were not treated, and recipients did well. Future studies are needed to assess whether perioperative antibiotics coupled with post-transplant TMP/SMX prophylaxis are sufficient in select cases of transplantation from donors with bacteremia. Disclosures All Authors: No reported disclosures

