Comparing right- and left-sided injection-drug related infective endocarditis

2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Allan Clarelin ◽  
Magnus Rasmussen ◽  
Lars Olaison ◽  
Sigurdur Ragnarsson

The aim of the study was to compare background characteristics, microbiology, and outcome of patients with right-sided and left-sided intravenous drug use (IDU)-associated infective endocarditis (IE). A nationwide retrospective study using the Swedish Registry on Infective Endocarditis between 2008 and 2019 was conducted. A total of 586 people with IDU-IE were identified and divided into left-sided (n = 204) and right-sided (n = 382) IE. Descriptive statistics, Cox regression, and Kaplan–Meier survival estimates were used. The mean age of patients in the left-sided group was 46 years compared with 35 years in the right-sided group (p < 0.001). Left-sided IE had a higher proportion of females. Staphylococcus aureus was the causative pathogen in 48% of cases in the left-sided group compared with 88% in the right-sided group. Unadjusted and adjusted long-term survival was better in right-sided than in left-sided IE. Independent predictors of long-term mortality were increasing age, end-stage renal disease, nosocomial infection, brain emboli, and left-sided IE. Left-sided IE was common in people with IDU, but the proportion of females with left-sided IE was low. S. aureus was nearly twice as common in right-sided as in left-sided IE, and the long-term prognosis of right-sided IDU-associated IE was better than that of left-sided IE even though few patients underwent surgery.
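The survival comparison above rests on Kaplan–Meier estimation. As a sketch only (the event times below are illustrative, not study data), the product-limit estimator behind such curves can be written in plain Python:

```python
from collections import Counter

def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator.

    times  -- follow-up time for each subject
    events -- 1 if death was observed, 0 if the subject was censored
    Returns a list of (time, survival probability) steps.
    """
    deaths = Counter(t for t, e in zip(times, events) if e == 1)
    n_at_risk = len(times)
    surv = 1.0
    curve = []
    for t in sorted(set(times)):
        d = deaths.get(t, 0)
        if d:
            # multiply in the conditional survival at this event time
            surv *= 1 - d / n_at_risk
            curve.append((t, surv))
        # everyone whose follow-up ends at t leaves the risk set
        n_at_risk -= sum(1 for x in times if x == t)
    return curve

# illustrative data: 5 subjects, deaths at t=2 and t=4, others censored
curve = kaplan_meier([1, 2, 3, 4, 5], [0, 1, 0, 1, 0])
# -> [(2, 0.75), (4, 0.375)]
```

In practice a library such as lifelines or R's survival package would be used; the hand-rolled version just makes the estimator's step structure explicit.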

Nutrients ◽  
2018 ◽  
Vol 10 (8) ◽  
pp. 1035 ◽  
Author(s):  
Chieh-Li Yen ◽  
Kun-Hua Tu ◽  
Ming-Shyan Lin ◽  
Su-Wei Chang ◽  
Pei-Chun Fan ◽  
...  

Background: A beneficial effect of a ketoanalogue-supplemented low-protein diet (sLPD) in postponing dialysis has been demonstrated in numerous previous studies. However, evidence regarding its effect on long-term survival is limited. Our study assessed the long-term outcomes of patients on an sLPD after commencing dialysis. Methods: This retrospective study examined patients with new-onset end-stage renal disease on permanent dialysis between 2001 and 2013, extracted from Taiwan’s National Health Insurance Research Database. Patients who received more than 3 months of sLPD treatment in the year preceding the start of dialysis were identified. The outcomes studied were all-cause mortality, infection rate, and major cardiac and cerebrovascular events (MACCEs). Results: After propensity score matching, the sLPD group (n = 2607) showed a lower risk of all-cause mortality (23.1% vs. 27.6%, hazard ratio (HR) 0.77, 95% confidence interval (CI) 0.70–0.84), MACCEs (19.2% vs. 21.5%, HR 0.86, 95% CI 0.78–0.94), and infection-related death (9.9% vs. 12.5%, HR 0.76, 95% CI 0.67–0.87) than the non-sLPD group did. Conclusion: We found that sLPD treatment might be safe without long-term negative consequences after dialysis treatment.
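The comparison above hinges on propensity score matching. As an illustration only (this is not the study's actual matching procedure, and the ids and scores are hypothetical), a greedy 1:1 nearest-neighbour match on precomputed propensity scores can be sketched as:

```python
def nearest_neighbor_match(treated, controls, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on propensity scores.

    treated, controls -- dicts mapping subject id -> propensity score
    caliper           -- maximum allowed score difference for a match
    Returns a list of (treated_id, control_id) pairs.
    """
    available = dict(controls)
    pairs = []
    # match highest-score treated subjects first (a common convention,
    # since they have the fewest plausible control matches)
    for tid, ps in sorted(treated.items(), key=lambda kv: -kv[1]):
        if not available:
            break
        cid = min(available, key=lambda c: abs(available[c] - ps))
        if abs(available[cid] - ps) <= caliper:
            pairs.append((tid, cid))
            del available[cid]  # match without replacement
    return pairs

# hypothetical scores: two treated subjects, three candidate controls
pairs = nearest_neighbor_match(
    {"t1": 0.62, "t2": 0.35},
    {"c1": 0.60, "c2": 0.37, "c3": 0.90},
)
# -> [("t1", "c1"), ("t2", "c2")]
```

Real analyses typically estimate the scores with logistic regression and check covariate balance after matching; the sketch covers only the pairing step.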


Circulation ◽  
2015 ◽  
Vol 132 (suppl_3) ◽  
Author(s):  
Mahmoud Diab ◽  
Christoph Sponholz ◽  
Michael Bauer ◽  
Andreas Kortgen ◽  
Philipp Scheffel ◽  
...  

Background: Infective endocarditis (IE) is a dangerous disease with high mortality (20-40%). A leading cause of death is multi-organ failure (MODS), with liver dysfunction (LD) as a major contributor. Data on LD in IE patients are scarce. We assessed the impact of preoperative and newly occurring LD on in-hospital mortality and long-term survival in IE patients. Methods: We retrospectively reviewed our database for surgery of left-sided endocarditis between 1/07 and 4/13. We used the hepatic Sepsis-related Organ Failure Assessment (hSOFA) score to assess the degree of LD. We performed chi-square tests, Cox regression, and multivariate analyses. Results: The 308 patients had a mean age of 62 ± 13.9 years. Preoperative LD (hSOFA > 0, bilirubin > 32 μmol/L) was present in a quarter (n=81) of patients and was associated with severely elevated in-hospital mortality (51.9% vs. 14.6% without preoperative LD, p<0.001). Newly occurring postoperative LD developed in another quarter (n=57 of 227 patients without preoperative LD) and was associated with elevated in-hospital mortality (24.6% vs. 11.2%, p<0.001). Kaplan–Meier 5-year survival was significantly better in patients without LD (51% vs. 19.9%, p<0.01). Survival curves were practically identical after the perioperative phase was over (Fig.). Quality of life in survivors was also the same. Cox regression analysis revealed preoperative LD as an independent predictor of long-term survival (adjusted hazard ratio 1.695, 95% confidence interval 1.160-2.477, p=0.009), and duration of cardiopulmonary bypass (CPB) and S. aureus infection as independent predictors of newly occurring postoperative LD. Conclusions: LD in patients with endocarditis is a significant independent risk factor for in-hospital mortality. A considerable fraction of patients develop LD perioperatively, which is associated with cardiopulmonary bypass duration and S. aureus infection. However, after surviving surgery, prognosis no longer seems to be predicted by LD.
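Hazard ratios like the adjusted HR reported above are estimated on the log scale, so the standard error and Wald statistic can be recovered from the confidence interval by simple arithmetic. A minimal sketch (the function name is ours, not the study's):

```python
import math

def hr_ci_to_z(hr, lo, hi):
    """Recover the log-scale standard error and Wald z-statistic
    from a hazard ratio and its 95% confidence interval.

    The 95% CI spans log(hr) +/- 1.96 * SE, so the SE is the
    log-scale CI width divided by 2 * 1.96.
    """
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
    z = math.log(hr) / se
    return se, z

# the abstract's adjusted HR for preoperative liver dysfunction
se, z = hr_ci_to_z(1.695, 1.160, 2.477)
```

Here z comes out near 2.7, consistent in magnitude with the reported p = 0.009 (two-sided p for z = 2.7 is roughly 0.007, and the reported figures are rounded).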


2021 ◽  
pp. postgradmedj-2019-137292
Author(s):  
Feng-You Kuo ◽  
Wei-Chun Huang ◽  
Pei-Ling Tang ◽  
Chin-Chang Cheng ◽  
Cheng-Hung Chiang ◽  
...  

Background: Statin use has been associated with a reduced risk of cardiovascular events and mortality. In patients with end-stage renal disease (ESRD), however, the protective effects of statins are controversial. This study evaluated the impact of chronic statin use on clinical outcomes of patients with acute myocardial infarction (AMI) and ESRD. Methods: We enrolled 8056 patients with ESRD who were initially diagnosed with and admitted for a first AMI, from Taiwan’s National Health Insurance Research Database. Of these, 2134 patients received statin therapy. We randomly selected controls (non-statin users) matched to the study group by age, sex, hypertension, diabetes mellitus (DM), peripheral vascular disease (PVD), heart failure (HF), cerebrovascular accident (CVA), and chronic obstructive pulmonary disease. We compared the effect of statin use on all-cause death among patients with AMI and ESRD. Results: Statin use was associated with a significantly higher survival rate in patients with AMI and ESRD compared with non-statin users. After adjusting for comorbidities, male patients and patients with DM, PVD, HF, and CVA had lower long-term survival (all p<0.001). Patients who underwent percutaneous coronary intervention (p<0.001) or received ACE inhibitors/angiotensin II receptor blockers (p<0.001), β-blockers (p<0.001), or statin therapy (p=0.007) had better long-term survival. Patients with AMI and ESRD on statin therapy exhibited a significantly lower risk of mortality than non-statin users (p<0.0001). Conclusion: Among patients with ESRD and AMI, statin therapy was associated with reduced all-cause mortality.


2020 ◽  
Vol 13 (1) ◽  
pp. 25-29 ◽  
Author(s):  
Iisa Lindström ◽  
Sara Protto ◽  
Niina Khan ◽  
Jussi Hernesniemi ◽  
Niko Sillanpää ◽  
...  

Background: Masseter area (MA), a surrogate for sarcopenia, appears to be useful when estimating postoperative survival, but there is a lack of consensus regarding the potential predictive value of sarcopenia in acute ischemic stroke (AIS) patients. We hypothesized that MA and masseter density (MD) evaluated from pre-interventional CT angiography scans predict post-interventional survival in patients undergoing mechanical thrombectomy (MT). Materials and methods: We included 312 patients treated with MT for acute occlusions of the internal carotid artery (ICA) or the M1 segment of the middle cerebral artery (M1-MCA) between 2013 and 2018. Median follow-up was 27.4 months (range 0–70.4). Binary logistic (alive at 3 months, OR <1) and Cox regression analyses were used to study the effect of MA and MD averages (MAavg and MDavg) on survival. Results: In Kaplan–Meier analysis, both MDavg and MAavg showed a significant inverse relationship with mortality (MDavg P<0.001, MAavg P=0.002). Long-term mortality was 19.6% (n=61) and 3-month mortality 12.2% (n=38). In multivariable logistic regression analysis at 3 months, MDavg (OR 0.61 per 1-SD increase, 95% CI 0.41 to 0.92, P=0.018) and MAavg (OR 0.57 per 1-SD increase, 95% CI 0.35 to 0.91, P=0.019) were independent predictors of lower mortality. In Cox regression analysis, MDavg and MAavg were not associated with long-term survival. Conclusions: In acute ischemic stroke patients, MDavg and MAavg are independent predictors of 3-month survival after MT of the ICA or M1-MCA. A 1-SD increase in MDavg or MAavg was associated with a 39%–43% decrease in the probability of death during the first 3 months after MT.
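The 39%–43% figures in the conclusion follow directly from the per-SD odds ratios: an OR below 1 maps to a percent decrease of (1 − OR) × 100. A one-line sketch of that arithmetic:

```python
def pct_decrease(or_per_sd):
    """Percent decrease in the odds of death implied by an
    odds ratio per 1-SD increase in the predictor."""
    return round((1 - or_per_sd) * 100)

# the reported per-1-SD odds ratios for MDavg and MAavg
decreases = [pct_decrease(o) for o in (0.61, 0.57)]
# -> [39, 43], matching the abstract's 39%-43% range
```

Note the usual caveat: an odds ratio approximates a relative decrease in the probability of death only when the outcome is fairly uncommon, which roughly holds here (3-month mortality 12.2%).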


1996 ◽  
Vol 16 (1_suppl) ◽  
pp. 505-509 ◽  
Author(s):  
Timothy E. Bunchman

The proper treatment of an infant with end-stage renal disease depends upon a number of factors, including parental willingness to take on the task, experience of the health-care team, local and regional resources, and society's willingness to accept this support as a standard of care. Whereas the ability to keep infants alive on peritoneal dialysis (PD) is obtainable, it is not without physical, financial, and emotional cost. In order for a family to agree to take on such a task, an understanding of the risks and long-term prognosis should be offered. This “informed consent” is difficult to obtain in such a highly charged situation, when emotions often dictate choice independently of logic. Long-term outcome of infants on PD has improved over time, yet is still fraught with complications. Options of treatment or nontreatment are explored.


2014 ◽  
Vol 34 (1) ◽  
pp. 85-94 ◽  
Author(s):  
Yao-Peng Hsieh ◽  
Chia-Chu Chang ◽  
Yao-Ko Wen ◽  
Ping-Fang Chiu ◽  
Yu Yang

Objective: Peritoneal dialysis (PD) has become more prevalent as a treatment modality for end-stage renal disease, and peritonitis remains one of its most devastating complications. The aim of the present investigation was to examine the frequency and predictors of peritonitis and the impact of peritonitis on clinical outcomes. Methods: Our retrospective observational cohort study enrolled 391 patients who had been treated with continuous ambulatory PD (CAPD) for at least 90 days. Relevant demographic, biochemical, and clinical data were collected for an analysis of CAPD-associated peritonitis, technique failure, drop-out from PD, and patient mortality. Results: The peritonitis rate was 0.196 episodes per patient-year. Older age (>65 years) was the only identified risk factor associated with peritonitis. A multivariate Cox regression model demonstrated that technique failure occurred more often in patients experiencing peritonitis than in those free of peritonitis (p < 0.001). Kaplan–Meier analysis revealed that the group experiencing peritonitis tended to survive longer than the group that was peritonitis-free (p = 0.11). After multivariate adjustment, the survival advantage reached significance (hazard ratio: 0.64; 95% confidence interval: 0.46 to 0.89; p = 0.006). Compared with the peritonitis-free group, the group experiencing peritonitis also had more drop-out from PD (p = 0.03). Conclusions: The peritonitis rate was relatively low in the present investigation. Elderly patients were at higher risk of peritonitis episodes. Peritonitis independently predicted technique failure, in agreement with other reports. However, contrary to previous studies, all-cause mortality was better in patients experiencing peritonitis than in those free of peritonitis. The underlying mechanisms of this presumptive “peritonitis paradox” remain to be clarified.
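A peritonitis rate of "episodes per patient-year" is total episodes divided by total follow-up summed over all patients. A minimal sketch (the episode and follow-up totals below are hypothetical, chosen only to reproduce the reported rate):

```python
def peritonitis_rate(episodes, patient_years):
    """Peritonitis incidence as episodes per patient-year:
    total episodes / total years of CAPD follow-up across the cohort."""
    return episodes / patient_years

# hypothetical totals: 98 episodes over 500 patient-years of follow-up
rate = peritonitis_rate(98, 500)
# -> 0.196, matching the reported rate
```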


Medicina ◽  
2019 ◽  
Vol 56 (1) ◽  
pp. 2
Author(s):  
Anna Szarnecka-Sojda ◽  
Wojciech Jacheć ◽  
Maciej Polewczyk ◽  
Agnieszka Łętek ◽  
Jarosław Miszczuk ◽  
...  

Background and Objectives: An increase in the incidence of end-stage renal disease (ESRD) is associated with the need for wider use of vascular access. Although the arteriovenous (A-V) fistula is the preferred form of vascular access, for various reasons permanent catheters are implanted in many patients. Materials and Methods: A retrospective analysis of clinical data was carried out in 398 patients (204 women) who in 2010–2016 underwent permanent dialysis catheter implantation, either as first vascular access or following A-V fistula dysfunction. The factors influencing the risk of complications related to vascular access and mortality were evaluated, and the group of patients with catheter implantation after A-V fistula dysfunction was compared with patients with first-time catheter implantation. Results: The population of 398 people with ESRD, with a mean age of 68.73 ± 13.26 years, had a total of 495 permanent catheters implanted. In 129 (32.6%) patients, catheters were implanted after dysfunction of a previously formed dialysis fistula. An upward trend was recorded in the number of permanent catheters implanted relative to A-V fistulas. Ninety-two infectious complications (23.1%) occurred in the study population, affecting 65 patients (16.3%). Multivariate analysis showed that permanent catheters were more often used as the first vascular access option in elderly patients and cancer patients. Mortality over the mean follow-up of 1.38 ± 1.17 years (range 0.0–6.70 years) amounted to 50%. Older age and atherosclerosis were the main risk factors for mortality. Patients with a dialysis fistula formed before catheter implantation had a longer lifetime compared with the group in which the catheter was the first access. Conclusion: The use of permanent catheters for dialysis therapy is associated with a relatively high incidence of complications and low long-term survival. The main factors determining long-term survival were age and atherosclerosis. A better prognosis was demonstrated in patients who had used an A-V fistula as the first vascular access option.


Blood ◽  
2010 ◽  
Vol 116 (21) ◽  
pp. 3453-3453 ◽  
Author(s):  
Stuart L Goldberg ◽  
Marc Elmann ◽  
Mark Kaminetzky ◽  
Eriene-Heidi Sidhom ◽  
Anthony R Mato ◽  
...  

Abstract 3453 Individuals undergoing allogeneic transplantation receive multiple red blood cell transfusions, both as part of the transplant procedure and as part of the pre-transplant care of the underlying disease. Therefore, these patients may be at risk for complications of transfusional iron overload. Several studies have noted that individuals entering transplant with baseline elevated serum ferritin values have decreased overall survival and higher rates of disease relapse. Whether the iron is a direct contributor to inferior outcomes or a marker of more advanced disease (thereby requiring greater transfusions) is unclear. Little is known about the incidence and consequences of iron overload among long-term survivors of allogeneic transplantation. Methods: Using Kaplan–Meier and Cox regression analyses, we performed a single-center, retrospective cohort study of consecutive allogeneic transplants performed at Hackensack University Medical Center from January 2002 through June 30, 2009 to determine the association between serum ferritin (measured approximately 1 yr post allogeneic transplant) and overall survival. Results: During the study time frame, 637 allogeneic transplants (Donor Lymphocyte Infusion procedures excluded) were performed at our center and 342 (54%) survived ≥ one year. Among 1-year survivors, 240 (70%) had post-transplant serum ferritin values available for review, including 132 (55%) allogeneic sibling, 68 (28%) matched unrelated, and 40 (17%) mismatched unrelated donor transplants. The median post-transplant ferritin value among 1-year survivors of allogeneic transplant was 628 ng/ml (95% CI 17, 5010), with 93 (39%) above 1000 ng/ml and 40 (17%) above 2500 ng/ml. The median post-transplant ferritin levels varied by underlying hematologic disease (aplastic anemia = 1147, acute leukemia = 1067, MDS = 944, CLL = 297, CML = 219, lymphoma = 123, multiple myeloma = 90).
The Kaplan–Meier projected 5-year survival rate was 76% for the cohort that had survived one year and had available ferritin values. Fifty late deaths have occurred; causes of late death were disease relapse (n=37, 74%), GVHD (n=7, 14%), infection (n=4, 8%), cardiac (n=1, 2%), and second malignancy (n=1, 2%). The 1-year post-transplant serum ferritin value was a significant predictor of long-term survival. Using a cut-off ferritin value of 1000 ng/ml, the 5-year projected survivals were 85% (95% CI 75%–91%) and 64% (95% CI 52%–73%) for the low and high ferritin cohorts respectively (Figure, log-rank p<0.001), with a hazard ratio of 3.5 (95% CI 2–6.4, p<0.001). Similarly, a serum ferritin value >2500 ng/ml was associated with inferior survival (HR 2.97, p<0.001). Underlying hematologic disease also correlated with 5-year projected survival: 70%, 83%, and 89% for the acute leukemia/MDS, lymphoma/myeloma/CLL, and aplastic anemia/CML groupings, respectively (log-rank p<0.01 for leukemia/MDS vs. other groupings). Patients receiving bone marrow grafts did better than those receiving peripheral blood stem cells (HR = 2.2; p = 0.03). Age, gender, donor type (sibling, matched unrelated, mismatched unrelated), and intensity of regimen (ablative vs. non-myeloablative) were not predictive of inferior survival in univariate analysis. In the multivariate Cox regression analysis, elevated post-transplant ferritin >1000 ng/ml (HR 3.3, 95% CI 1.6–6.1; p<0.001) and diagnosis of acute leukemia/MDS (HR 4.5, 95% CI 1.1–18.7; p=0.04) remained independent predictors of inferior survival, even when adjusted for age, gender, type of graft, donor type, and intensity of conditioning regimen. Relapse deaths (25% vs. 9%; p<0.001) and GVHD deaths (6% vs. 0.6%; p=0.03) were more common in the high ferritin cohort.
Conclusions: Among patients who have survived one year following allogeneic transplantation, a post-transplant serum ferritin value greater than 1000 ng/ml is a predictor of inferior long-term outcomes. To our knowledge, this is the first report on the importance of late monitoring of serum ferritin, but it is in agreement with prior studies suggesting that a pre-transplant ferritin value is a predictor of outcomes. Prospective studies attempting to modify outcomes by reducing post-transplant iron overload states are needed. Disclosures: No relevant conflicts of interest to declare.


2002 ◽  
Vol 66 (6) ◽  
pp. 595-595 ◽  
Author(s):  
Jiro Aoki ◽  
Yuji Ikari ◽  
Hiroyoshi Nakajima ◽  
Tokuichiro Sugimoto ◽  
Kazuhiro Hara

2021 ◽  
Vol 2021 ◽  
pp. 1-6
Author(s):  
Jialing Zhang ◽  
Xiangxue Lu ◽  
Shixiang Wang ◽  
Han Li

Background. The neutrophil-to-lymphocyte ratio (NLR) and platelet-to-lymphocyte ratio (PLR) are markers of systemic inflammation. Although NLR has emerged as a risk factor for poor survival in end-stage renal disease (ESRD) patients, the relationship between PLR and mortality is still unknown. We aimed to explore the interaction of NLR and PLR in predicting mortality in hemodialysis (HD) patients. Methods. We enrolled 360 HD patients for a 71-month follow-up. The endpoints were all-cause and cardiovascular (CV) mortality. Pearson correlation analysis was conducted to evaluate the relationships between clinical factors and NLR or PLR. Kaplan–Meier curves and Cox proportional hazards analysis were used to assess the prognostic value of NLR and PLR. Results. NLR was positively correlated with neutrophil count and negatively correlated with lymphocyte count, hemoglobin, and serum albumin. PLR was positively correlated with neutrophil and platelet counts and negatively correlated with lymphocyte count and hemoglobin. In multivariate Cox regression, a higher NLR level was independently associated with all-cause mortality (OR 2.011, 95% CI 1.082-3.74, p = 0.027), while a higher PLR level might predict CV mortality (OR 2.768, 95% CI 1.147-6.677, p = 0.023) in HD patients. Conclusion. NLR and PLR are inexpensive and reliable biomarkers for predicting all-cause and CV mortality in HD patients.
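Both ratios above are simple quotients of complete-blood-count values. A minimal sketch (the counts in the example are illustrative, not study data):

```python
def nlr(neutrophils, lymphocytes):
    """Neutrophil-to-lymphocyte ratio from absolute counts
    (same units for both, e.g. 10^9 cells/L)."""
    return neutrophils / lymphocytes

def plr(platelets, lymphocytes):
    """Platelet-to-lymphocyte ratio from absolute counts
    (same units for both, e.g. 10^9 cells/L)."""
    return platelets / lymphocytes

# illustrative CBC values: neutrophils 4.2, lymphocytes 1.4,
# platelets 210 (all in 10^9/L)
r1 = nlr(4.2, 1.4)   # ~3.0
r2 = plr(210, 1.4)   # ~150.0
```

Because both ratios share the lymphocyte count in the denominator, they tend to be correlated, which is one reason studies such as this one examine them jointly rather than in isolation.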

