2290. Carbapenem vs. Piperacillin–tazobactam Definitive Therapy for Patients with Bloodstream Infections Due to Ceftriaxone Not Susceptible Escherichia coli or Klebsiella species

2019 ◽  
Vol 6 (Supplement_2) ◽  
pp. S785-S785
Author(s):  
Shawn H MacVane ◽  
Amira A Bhalodi ◽  
Kyle Spafford ◽  
Romney Humphries ◽  
Niels Oppermann

Abstract Background Definitive therapy with piperacillin–tazobactam (TZP) for ceftriaxone (CRO)-resistant E. coli or K. pneumoniae bloodstream infections (BSI) has been shown to be inferior to carbapenem therapy in a randomized trial. Methods The Premier US database was queried for hospitalized patients with monomicrobial E. coli or Klebsiella spp BSI that were not susceptible (NS) to CRO between June 2015 and May 2018. Adults with index positive blood culture(s) drawn within the first 2 hospital days who were treated with active antibiotic therapy that continued for ≥3 consecutive days were included. We defined antibiotics administered on or prior to Day 3 as empirical therapy and all subsequent days as definitive therapy. Outcomes among patients who received definitive therapy with a carbapenem vs. TZP were evaluated. Results There were 954 patients (mean age, 67.6 years; 52.4% women) who met selection criteria and received active empirical therapy. 729/954 received carbapenem definitive therapy and 38/954 received TZP definitive therapy. Median Charlson Comorbidity Index scores were similar between the carbapenem and TZP definitive therapy groups (6 vs. 5, P = 0.78). Crude 14-day in-hospital mortality for CRO-NS BSI due to E. coli or Klebsiella spp. was 4.4%. Definitive therapy with TZP (6/38; 15.8%) was associated with an increased likelihood of 14-day mortality relative to that of a carbapenem (22/729; 3.0%; P < 0.0001). The increased 14-day mortality observation was consistent in a multivariate Cox proportional hazards model (adjusted hazard ratio, 5.70; 95% CI, 2.09 to 13.23; P = 0.002). Of patients who received carbapenem definitive therapy, 14-day mortality was 2.7% (19/693) if a carbapenem was part of empirical therapy and 8.3% (3/36; P = 0.06) if empirical therapy did not include a carbapenem. Median post-blood culture length of stay (7 vs. 6 days, P = 0.65) and hospital costs ($13,886 vs. $13,559, P = 0.62) were similar between the carbapenem and TZP definitive therapy groups. Conclusion In this large US database, definitive therapy with TZP was associated with an increased likelihood of 14-day mortality relative to that of definitive carbapenem therapy in patients with CRO-NS BSI due to E. coli or Klebsiella spp. These findings support recent clinical evidence in favor of definitive carbapenem therapy for CRO-NS BSI due to E. coli or Klebsiella spp. Disclosures All authors: No reported disclosures.
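The crude mortality comparison above can be reproduced directly from the reported counts. A minimal sketch (the counts are from the abstract; the crude risk ratio is illustrative arithmetic and is distinct from the adjusted hazard ratio of 5.70 the authors report):

```python
# Crude 14-day mortality comparison from the reported counts.
def crude_risk(deaths: int, total: int) -> float:
    """Proportion of patients who died within 14 days."""
    return deaths / total

tzp_risk = crude_risk(6, 38)       # TZP definitive therapy
carb_risk = crude_risk(22, 729)    # carbapenem definitive therapy
risk_ratio = tzp_risk / carb_risk  # unadjusted ratio of the two risks

print(f"TZP 14-day mortality:        {tzp_risk:.1%}")   # 15.8%
print(f"Carbapenem 14-day mortality: {carb_risk:.1%}")  # 3.0%
print(f"Crude risk ratio:            {risk_ratio:.2f}")
```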

Author(s):  
I Karaiskos ◽  
G L Daikos ◽  
A Gkoufa ◽  
G Adamis ◽  
A Stefos ◽  
...  

Abstract Background Infections caused by KPC-producing Klebsiella pneumoniae (Kp) are associated with high mortality. Therefore, new treatment options are urgently required. Objectives To assess the outcomes and predictors of mortality in patients with KPC- or OXA-48-Kp infections treated with ceftazidime/avibactam, with an emphasis on KPC-Kp bloodstream infections (BSIs). Methods A multicentre prospective observational study was conducted between January 2018 and March 2019. Patients with KPC- or OXA-48-Kp infections treated with ceftazidime/avibactam were included in the analysis. The subgroup of patients with KPC-Kp BSIs treated with ceftazidime/avibactam was matched by propensity score with a cohort of patients whose KPC-Kp BSIs had been treated with agents other than ceftazidime/avibactam with in vitro activity. Results One hundred and forty-seven patients were identified; 140 were infected with KPC producers and 7 with OXA-48 producers. For targeted therapy, 68 (46.3%) patients received monotherapy with ceftazidime/avibactam and 79 (53.7%) patients received ceftazidime/avibactam in combination with at least one other active agent. The 14- and 28-day mortality rates were 9% and 20%, respectively. The 28-day mortality among the 71 patients with KPC-Kp BSIs treated with ceftazidime/avibactam was significantly lower than that observed in the 71 matched patients whose KPC-Kp BSIs had been treated with agents other than ceftazidime/avibactam (18.3% versus 40.8%; P = 0.005). In the Cox proportional hazards model, ultimately fatal disease, rapidly fatal disease and Charlson comorbidity index ≥2 were independent predictors of death, whereas treatment with ceftazidime/avibactam-containing regimens was the only independent predictor of survival. Conclusions Ceftazidime/avibactam appears to be an effective treatment against serious infections caused by KPC-Kp.
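The Methods above match treated patients to comparators by propensity score. One common implementation of that technique is greedy 1:1 nearest-neighbour matching within a caliper; the sketch below assumes that approach, and the scores and 0.05 caliper are invented for illustration (the paper does not report its matching algorithm):

```python
def greedy_match(treated, controls, caliper=0.05):
    """Pair each treated propensity score with the closest unused control
    score, accepting the pair only if the scores differ by <= caliper."""
    available = dict(enumerate(controls))   # control index -> score
    pairs = []
    for t_idx, t_score in enumerate(treated):
        if not available:
            break
        c_idx, c_score = min(available.items(), key=lambda kv: abs(kv[1] - t_score))
        if abs(c_score - t_score) <= caliper:
            pairs.append((t_idx, c_idx))
            del available[c_idx]            # 1:1 matching without replacement
    return pairs

treated_scores = [0.31, 0.62, 0.45]         # hypothetical propensity scores
control_scores = [0.30, 0.50, 0.60, 0.90]
print(greedy_match(treated_scores, control_scores))  # [(0, 0), (1, 2), (2, 1)]
```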


2020 ◽  
Vol 38 (6_suppl) ◽  
pp. 293-293
Author(s):  
Alex Z. Wang ◽  
Luke P. O'Connor ◽  
Nitin Yerram ◽  
Johnathan Zeng ◽  
Sherif Mehralivand ◽  
...  

293 Background: Active surveillance (AS) is now considered a well-accepted alternative to definitive therapy for low- and favorable intermediate-risk prostate cancer. Few studies have incorporated the use of multi-parametric MRI (mpMRI) into the treatment paradigm. In this study we investigate imaging findings that are predictive of a patient dropping off AS. Methods: Our institutional database was queried for all patients who met criteria for active surveillance from 11/2003 to 5/2017. Criteria for inclusion included ≥ 2 mpMRIs, ≥ 2 prostate biopsies, and a diagnosis of Gleason Grade group (GG) 1 or higher. Patients were excluded if they received any other therapy for the treatment of their prostate cancer such as radiation, chemotherapy, focal therapy, or immunologic therapy. Patient demographics, mpMRI, biopsy and most recent follow-up data were recorded. Factors associated with AS progression, including PSA density (PSAD), PSA, lesion size, and PI-RADS category, were evaluated in a Cox proportional hazards model. Results: A total of 212 patients were analyzed during the study time interval. 88 patients were dropped from AS during this time; among them, the median time before removal was 4.70 years (range, 0.7-10.5). On univariable analysis, PI-RADS category (HR, 1.302 for every increase of 1 unit in the PI-RADS category; 95% CI, 1.046-1.62, p = 0.01) and PSAD (HR, 4.98 for every increase of 0.001 ng/mL/cc; 95% CI, 2.127-11.66; p < 0.001) were found to be associated with being removed from AS. On multivariable analysis, both PI-RADS category (HR, 1.281 for every increase of 1 unit in the PI-RADS category; 95% CI, 1.025-1.6; p = 0.003) and PSAD (HR, 4.188 for every increase of 0.001 ng/mL/cc; 95% CI, 1.640-10.7; p < 0.001) remained associated with being removed from AS. Conclusions: PI-RADS category and PSAD predict the risk of a patient dropping off active surveillance. Patients meeting these criteria should be considered high risk in any current AS protocol. [Table: see text]


Crisis ◽  
2018 ◽  
Vol 39 (1) ◽  
pp. 27-36 ◽  
Author(s):  
Kuan-Ying Lee ◽  
Chung-Yi Li ◽  
Kun-Chia Chang ◽  
Tsung-Hsueh Lu ◽  
Ying-Yeh Chen

Abstract. Background: We investigated the age at exposure to parental suicide and the risk of subsequent suicide completion in young people. The impact of parental and offspring sex was also examined. Method: Using a cohort study design, we linked Taiwan's Birth Registry (1978–1997) with Taiwan's Death Registry (1985–2009) and identified 40,249 children who had experienced maternal suicide (n = 14,431), paternal suicide (n = 26,887), or the suicide of both parents (n = 281). Each exposed child was matched to 10 children of the same sex and birth year whose parents were still alive. This yielded a total of 398,081 children for our non-exposed cohort. A Cox proportional hazards model was used to compare the suicide risk of the exposed and non-exposed groups. Results: Compared with the non-exposed group, offspring who were exposed to parental suicide were 3.91 times (95% confidence interval [CI] = 3.10–4.92) more likely to die by suicide after adjusting for baseline characteristics. The risk of suicide seemed to be lower in older male offspring (HR = 3.94, 95% CI = 2.57–6.06), but higher in older female offspring (HR = 5.30, 95% CI = 3.05–9.22). Stratified analyses based on parental sex revealed similar patterns as the combined analysis. Limitations: As only register-based data were used, we were not able to explore the impact of variables not contained in the data set, such as the role of mental illness. Conclusion: Our findings suggest a prominent elevation in the risk of suicide among offspring who lost their parents to suicide. The risk elevation differed according to the sex of the afflicted offspring as well as to their age at exposure.


2020 ◽  
Vol 132 (4) ◽  
pp. 998-1005 ◽  
Author(s):  
Haihui Jiang ◽  
Yong Cui ◽  
Xiang Liu ◽  
Xiaohui Ren ◽  
Mingxiao Li ◽  
...  

OBJECTIVE The aim of this study was to investigate the relationship between extent of resection (EOR) and survival in terms of clinical, molecular, and radiological factors in high-grade astrocytoma (HGA). METHODS Clinical and radiological data from 585 cases of molecularly defined HGA were reviewed. In each case, the EOR was evaluated twice: once according to contrast-enhanced T1-weighted images (CE-T1WI) and once according to fluid attenuated inversion recovery (FLAIR) images. The ratio of the volume of the region of abnormality in FLAIR images to that in CE-T1WI (VFLAIR/VCE-T1WI) was calculated and a receiver operating characteristic curve was used to determine the optimal cutoff value for that ratio. Univariate and multivariate analyses were performed to identify the prognostic value of each factor. RESULTS Both the EOR evaluated from CE-T1WI and the EOR evaluated from FLAIR could divide the whole cohort into 4 subgroups with different survival outcomes (p < 0.001). Cases were stratified into 2 subtypes based on VFLAIR/VCE-T1WI with a cutoff of 10: a proliferation-dominant subtype and a diffusion-dominant subtype. Kaplan-Meier analysis showed a significant survival advantage for the proliferation-dominant subtype (p < 0.0001). The prognostic implication was further confirmed in the Cox proportional hazards model (HR 1.105, 95% CI 1.078–1.134, p < 0.0001). The survival of patients with proliferation-dominant HGA was significantly prolonged in association with extensive resection of the FLAIR abnormality region beyond the contrast-enhancing tumor (p = 0.03), while no survival benefit was observed in association with extensive resection in the diffusion-dominant subtype (p = 0.86). CONCLUSIONS VFLAIR/VCE-T1WI is an important classifier that could divide HGA into 2 subtypes with distinct invasive features. Patients with proliferation-dominant HGA can benefit from extensive resection of the FLAIR abnormality region, which provides the theoretical basis for a personalized resection strategy.


2020 ◽  
Vol 32 (2) ◽  
pp. 160-167 ◽  
Author(s):  
Alessandro Siccoli ◽  
Victor E. Staartjes ◽  
Marlies P. de Wispelaere ◽  
Marc L. Schröder

OBJECTIVE While it has been established that lumbar discectomy should only be performed after a certain waiting period unless neurological deficits are present, little is known about the association of late surgery with outcome. Using data from a prospective registry, the authors aimed to quantify the association of time to surgery (TTS) with leg pain outcome after lumbar discectomy and to identify a maximum TTS cutoff anchored to the minimum clinically important difference (MCID). METHODS TTS was defined as the time from the onset of leg pain caused by radiculopathy to the time of surgery in weeks. MCID was defined as a minimum 30% reduction in the numeric rating scale score for leg pain from baseline to 12 months. A Cox proportional hazards model was utilized to quantify the association of TTS with MCID. Maximum TTS cutoffs were derived both quantitatively, anchored to the area under the curve (AUC), and qualitatively, based on cutoff-specific MCID rates. RESULTS From a prospective registry, 372 patients who had undergone first-time tubular microdiscectomy were identified; 308 of these patients (83%) obtained an MCID. Attaining an MCID was associated with a shorter TTS (HR 0.718, 95% CI 0.546–0.945, p = 0.018). Effect size was preserved after adjustment for potential confounders. The optimal maximum TTS was estimated at 23.5 weeks based on the AUC, while the cutoff-specific method suggested 24 weeks. Discectomy after this cutoff starts to yield MCID rates under 80%. The 24-week cutoff also coincided with the time point after which the specificity for MCID first drops below 50% and after which the negative predictive value for nonattainment of MCID first surpasses 20%. CONCLUSIONS The study findings suggest that late lumbar discectomy is linked with poorer patient-reported outcomes and that, in accordance with the literature, a maximum TTS of 6 months should be aimed for.
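The study derives its maximum-TTS cutoff by anchoring to the AUC and to cutoff-specific MCID rates. A closely related textbook approach is to scan candidate cutoffs and maximise Youden's J (sensitivity + specificity - 1); the sketch below uses that simpler criterion, not the authors' exact method, and the toy weeks/MCID data are invented:

```python
def youden_cutoff(values, outcomes):
    """Scan candidate cutoffs; treat value <= cutoff as 'early surgery'.
    outcomes[i] is True when patient i attained the MCID.
    Returns the cutoff maximising Youden's J = sensitivity + specificity - 1."""
    best_cut, best_j = None, float("-inf")
    for cut in sorted(set(values)):
        tp = sum(1 for v, o in zip(values, outcomes) if v <= cut and o)
        fn = sum(1 for v, o in zip(values, outcomes) if v > cut and o)
        tn = sum(1 for v, o in zip(values, outcomes) if v > cut and not o)
        fp = sum(1 for v, o in zip(values, outcomes) if v <= cut and not o)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1
        if j > best_j:
            best_cut, best_j = cut, j
    return best_cut, best_j

weeks = [4, 8, 12, 20, 26, 30, 40]                    # hypothetical TTS values
mcid  = [True, True, True, True, False, True, False]  # hypothetical outcomes
cut, j = youden_cutoff(weeks, mcid)
print(cut, round(j, 2))
```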


2020 ◽  
Vol 30 (Supplement_5) ◽  
Author(s):  
M V Tancredi ◽  
S Sakabe ◽  
C S B Domingues ◽  
G F M Pereira ◽  
E A Waldman

Abstract Background To estimate the median survival time of AIDS patients, with and without tuberculosis (TB), in a cohort in Sao Paulo, Brazil, and to investigate survival predictors. Methods Retrospective cohort study of AIDS patients above 12 years old, registered in the Ministry of Health AIDS surveillance system between 2003-2007 and followed until 2014. Survival analysis used the Kaplan-Meier method and a Cox proportional hazards model to estimate hazard ratios (HR) with respective 95% confidence intervals (95% CI). Results 35,515 patients were included, of whom 4,581 (12.9%) were co-infected with TB. Among the latter, the probability of survival 12 years after AIDS diagnosis was 95.2%, 82.9%, and 21.9%, respectively, for patients receiving at least one third-line ARV (HAART2), those receiving triple therapy (HAART1), and those not on ARV. In the same period, the probability of survival for patients without TB, in the same order of therapeutic regimens, was 95.2%, 90.5%, and 40.9%, respectively. The main factors associated with survival, adjusted for the year of diagnosis, were: living in the city of Sao Paulo (HR = 1.16; 95%CI 1.01-1.32); living away from the capital city (HR = 1.43; 95%CI 1.25-1.62) or on the coast (HR = 1.49; 95%CI 1.21-1.82); having TB (HR = 1.70; 95%CI 1.49-1.87); being above 49 years old (HR = 1.35; 95%CI 1.18-1.54); being black (HR = 1.27; 95%CI 1.12-1.45); IV drug use (HR = 1.73; 95%CI 1.49-2.02); CD4+ below 200 cells/mm³ at AIDS diagnosis (HR = 2.31; 95%CI 1.97-2.72); viral load above 500 copies at AIDS diagnosis (HR = 1.99; 95%CI 1.72-2.30); the HAART1 scheme (HR = 1.94; 95%CI 1.47-2.55); and no ARV (HR = 8.22; 95%CI 2.95-22.87). Conclusions A large proportion of patients did not receive ARVs or were diagnosed late with AIDS, especially those with TB, whose survival was shorter. Survival is heterogeneous across the state, being lower in regions with higher TB rates. The results point to the need for specific strategies for patients with TB-HIV co-infection. 
Key messages Tuberculosis is the main cause of death among HIV-infected people, being responsible for one third of deaths in this group and causing a great impact on the survival of this population. The Brazilian policy of universal access to ARV and treatment for TB has increased the survival of AIDS-TB from 22% to 95% and in patients without TB from 50% to 95% up to 12 years after diagnosis.
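The survival probabilities quoted above come from the Kaplan-Meier method named in the Methods. A minimal product-limit estimator sketch; the follow-up times and event flags below are invented, not patient data from the study:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator.
    times: follow-up durations; events[i] True = death, False = censored.
    Returns [(time, survival probability)] at each time with >= 1 death."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0
    curve = []
    for t in sorted(set(times)):
        deaths = sum(1 for tt, e in data if tt == t and e)
        n_at_t = sum(1 for tt, _ in data if tt == t)
        if deaths:
            surv *= (at_risk - deaths) / at_risk   # product-limit step
            curve.append((t, surv))
        at_risk -= n_at_t                          # deaths and censored both leave
    return curve

times  = [2, 3, 3, 5, 8, 10, 12]                       # months of follow-up
events = [True, True, False, True, False, True, False]  # True = death
curve = kaplan_meier(times, events)
for t, s in curve:
    print(f"month {t}: S(t) = {s:.3f}")
```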


Risks ◽  
2021 ◽  
Vol 9 (6) ◽  
pp. 103
Author(s):  
Morne Joubert ◽  
Tanja Verster ◽  
Helgard Raubenheimer ◽  
Willem D. Schutte

Survival analysis is one of the techniques that could be used to predict loss given default (LGD) for regulatory capital (Basel) purposes. When using survival analysis to model LGD, a proposed methodology is the default weighted survival analysis (DWSA) method. This paper is aimed at adapting the DWSA method (used to model Basel LGD) to estimate the LGD for International Financial Reporting Standard (IFRS) 9 impairment requirements. The DWSA methodology allows for over-recoveries, default weighting and negative cashflows. For IFRS 9, this methodology should be adapted, as the estimated LGD is an input to the expected credit losses (ECL). Our proposed IFRS 9 LGD methodology makes use of survival analysis to estimate the LGD. The Cox proportional hazards model allows for a baseline survival curve to be adjusted to produce survival curves for different segments of the portfolio. The forward-looking LGD values are adjusted for different macro-economic scenarios and the ECL is calculated for each scenario. These ECL values are probability weighted to produce a final ECL estimate. We illustrate our proposed IFRS 9 LGD methodology and ECL estimation on a dataset from a retail portfolio of a South African bank.
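The final step described above, probability-weighting the per-scenario ECL values into one estimate, is simple arithmetic. A sketch with three invented macro-economic scenarios (the probabilities and ECL amounts are illustrative, not from the paper):

```python
# Scenario probabilities must sum to 1; each "ecl" is that scenario's
# expected credit loss for the portfolio (hypothetical amounts).
scenarios = {
    "baseline": {"prob": 0.60, "ecl": 1_200_000.0},
    "downturn": {"prob": 0.30, "ecl": 2_500_000.0},
    "upturn":   {"prob": 0.10, "ecl":   800_000.0},
}

assert abs(sum(s["prob"] for s in scenarios.values()) - 1.0) < 1e-9

weighted_ecl = sum(s["prob"] * s["ecl"] for s in scenarios.values())
print(f"Probability-weighted ECL: {weighted_ecl:,.2f}")
```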


2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Maryam Farhadian ◽  
Sahar Dehdar Karsidani ◽  
Azadeh Mozayanimonfared ◽  
Hossein Mahjub

Abstract Background Due to the limited number of studies with long term follow-up of patients undergoing Percutaneous Coronary Intervention (PCI), we investigated the occurrence of Major Adverse Cardiac and Cerebrovascular Events (MACCE) during 10 years of follow-up after coronary angioplasty using Random Survival Forest (RSF) and Cox proportional hazards models. Methods The current retrospective cohort study was performed on 220 patients (69 women and 151 men) undergoing coronary angioplasty from March 2009 to March 2012 in Farchshian Medical Center in Hamadan city, Iran. Survival time (month) as the response variable was considered from the date of angioplasty to the main endpoint or the end of the follow-up period (September 2019). To identify the factors influencing the occurrence of MACCE, the performance of Cox and RSF models were investigated in terms of C index, Integrated Brier Score (IBS) and prediction error criteria. Results Ninety-six patients (43.7%) experienced MACCE by the end of the follow-up period, and the median survival time was estimated to be 98 months. Survival decreased from 99% during the first year to 39% at 10 years' follow-up. By applying the Cox model, the predictors were identified as follows: age (HR = 1.03, 95% CI 1.01–1.05), diabetes (HR = 2.17, 95% CI 1.29–3.66), smoking (HR = 2.41, 95% CI 1.46–3.98), and stent length (HR = 1.74, 95% CI 1.11–2.75). The predictive performance was slightly better by the RSF model (IBS of 0.124 vs. 0.135, C index of 0.648 vs. 0.626 and out-of-bag error rate of 0.352 vs. 0.374 for RSF). In addition to age, diabetes, smoking, and stent length, RSF also included coronary artery disease (acute or chronic) and hyperlipidemia as the most important variables. Conclusion Machine-learning prediction models such as RSF showed better performance than the Cox proportional hazards model for the prediction of MACCE during long-term follow-up after PCI.
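The Cox and RSF models above are compared on the C index: the proportion of usable patient pairs in which the model assigns the higher risk to the patient who fails earlier. A minimal Harrell-style implementation; the toy times, event flags, and risk scores are invented:

```python
def c_index(times, events, risk_scores):
    """Harrell's concordance index for right-censored data.
    A pair is usable only if the shorter follow-up time ended in an event."""
    concordant = ties = usable = 0
    n = len(times)
    for i in range(n):
        for j in range(i + 1, n):
            a, b = (i, j) if times[i] < times[j] else (j, i)  # a fails first
            if times[a] == times[b] or not events[a]:
                continue  # censored earlier time: ordering unknowable
            usable += 1
            if risk_scores[a] > risk_scores[b]:
                concordant += 1
            elif risk_scores[a] == risk_scores[b]:
                ties += 1
    return (concordant + 0.5 * ties) / usable

times  = [5, 10, 12, 20, 30]               # months to event or censoring
events = [True, True, False, True, False]  # True = MACCE occurred
risk   = [0.9, 0.7, 0.6, 0.95, 0.1]        # model-predicted risk scores
score = c_index(times, events, risk)
print(score)  # 0.75
```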


Author(s):  
Yuko Yamaguchi ◽  
Marta Zampino ◽  
Toshiko Tanaka ◽  
Stefania Bandinelli ◽  
Yusuke Osawa ◽  
...  

Abstract Background Anemia is common in older adults and associated with greater morbidity and mortality. The causes of anemia in older adults have not been completely characterized. Although elevated circulating growth and differentiation factor 15 (GDF-15) has been associated with anemia in older adults, it is not known whether elevated GDF-15 predicts the development of anemia. Methods We examined the relationship between plasma GDF-15 concentrations at baseline in 708 non-anemic adults, aged 60 years and older, and incident anemia during 15 years of follow-up among participants in the Invecchiare in Chianti (InCHIANTI) Study. Results During follow-up, 179 (25.3%) participants developed anemia. The proportion of participants who developed anemia from the lowest to highest quartile of plasma GDF-15 was 12.9%, 20.1%, 21.2%, and 45.8%, respectively. Adults in the highest quartile of plasma GDF-15 had an increased risk of developing anemia (hazard ratio 1.15, 95% confidence interval 1.09, 1.21, P < .0001) compared to those in the lower three quartiles in a multivariable Cox proportional hazards model adjusting for age, sex, serum iron, soluble transferrin receptor, ferritin, vitamin B12, congestive heart failure, diabetes mellitus, and cancer. Conclusions Circulating GDF-15 is an independent predictor of the development of anemia in older adults.


2021 ◽  
Vol 7 (1) ◽  
pp. e000896
Author(s):  
Taro Takeuchi ◽  
Yuri Kitamura ◽  
Soya Ishizuka ◽  
Sachiko Yamada ◽  
Hiroshi Aono ◽  
...  

Objectives To compare the mortality of Japanese athletes in the 1964 Tokyo Olympic Games with that of the Japanese population, and to elucidate factors associated with their mortality. Methods We obtained from the Japan Sport Association study subjects' biographical information, information on lifestyles and medical data. Missing data were obtained from online databases. The standardised mortality ratio (SMR) was calculated to compare athletes' mortality with the Japanese population. A Cox proportional hazards model was applied to estimate the HR for each category of body mass index (BMI), smoking history and handgrip strength. This analysis was limited to male athletes due to the small number of female athletes. Results Among 342 (283 men, 59 women) athletes, deaths were confirmed for 70 (64 men, 6 women) athletes between September 1964 and December 2017. Total person-years were 15,974.8, and the SMR was 0.64 (95% CI 0.50 to 0.81). Multivariate analysis was performed on 181 male athletes. Mortality was significantly higher for BMI ≥ 25 kg/m2 than for 21–23 kg/m2 (HR: 3.03, 95% CI 1.01 to 9.07). We found no statistically significant associations between smoking history and mortality; the HRs (95% CIs) for occasional and daily smokers were 0.82 (0.26 to 2.57) and 1.30 (0.55 to 3.03), respectively, compared with never smokers. We also found no statistically significant associations between handgrip strength and mortality (P for trend: 0.51). Conclusion Japanese athletes in the 1964 Tokyo Olympic Games lived longer than the Japanese population. BMI ≥ 25 kg/m2 was associated with higher mortality, but smoking history and handgrip strength were not associated with mortality.
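The SMR reported above is observed deaths divided by the deaths expected if the cohort had died at general-population rates. The 70 observed deaths are from the abstract; the expected count below is a hypothetical figure chosen only to be consistent with the reported SMR of 0.64:

```python
observed_deaths = 70      # from the abstract
expected_deaths = 109.4   # hypothetical: back-calculated so that SMR ~= 0.64

smr = observed_deaths / expected_deaths
print(f"SMR = {smr:.2f}")  # < 1: fewer deaths than expected at population rates
```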

