Corticosteroid Use Following the Onset of Invasive Aspergillosis is Associated with Increased Mortality: A Propensity Score-Matched Study

2017 ◽  
Vol 4 (suppl_1) ◽  
pp. S55-S55
Author(s):  
Michael Abers ◽  
Jatin Vyas

Abstract Background The safety of corticosteroid use (CSU) during active infection is controversial. In the invasive aspergillosis (IA) literature, CSU is typically defined using the time period prior to IA onset. Clinicians caring for patients with IA are unable to control prior CSU. The more clinically relevant question is whether CSU after IA onset is harmful. Methods Patients hospitalized at our institution from 2004 to 2014 with IA were retrospectively identified. CSU, defined as the average daily prednisone-equivalent dose during the 7-day period following IA onset, was calculated for each patient. A CSU cut-off of 7.5 mg was used to assign patients to treatment (>7.5 mg) or control (<7.5 mg, including no CSU) groups. A propensity score (PS) was generated to predict group assignment. Nearest-neighbor matching was performed with a caliper width of 0.2. A Cox proportional hazards model was used to assess survival 6 weeks after IA onset. Results PS matching generated 61 matched pairs (122 patients). Baseline characteristics did not differ significantly between groups (Table). CSU was associated with increased mortality (PS-adjusted hazard ratio [HR] 2.91, 95% CI 1.32–6.40). In the CSU group, a trend towards lower mortality was noted if the corticosteroid dose was tapered to 7.5 mg/day (HR 0.68, 95% CI 0.46–1.02). Conclusion CSU after IA onset is associated with increased mortality. In IA patients with CSU, efforts to reduce the corticosteroid dose may be beneficial. Disclosures All authors: No reported disclosures.
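The matching procedure described above (a logistic propensity model, then greedy 1:1 nearest-neighbor matching within a caliper) can be sketched in plain NumPy. This is an illustrative implementation on synthetic data, not the study's code; it reads the 0.2 caliper as 0.2 standard deviations of the logit of the propensity score, one common convention.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic cohort: covariates X, binary "treatment" t (e.g. steroid use).
# All data here are simulated for illustration only.
n = 400
X = rng.normal(size=(n, 3))
logits = 0.8 * X[:, 0] - 0.5 * X[:, 1]
t = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)

def fit_logistic(X, y, iters=25):
    """Fit a logistic regression (the propensity model) by Newton's method."""
    Xb = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-Xb @ w))
        grad = Xb.T @ (y - p)
        H = Xb.T @ (Xb * (p * (1 - p))[:, None])
        w += np.linalg.solve(H, grad)
    return w

w = fit_logistic(X, t)
ps = 1 / (1 + np.exp(-np.column_stack([np.ones(n), X]) @ w))

# Greedy 1:1 nearest-neighbor matching on the logit of the PS,
# with the caliper taken as 0.2 SD of the logit (an assumed convention).
logit_ps = np.log(ps / (1 - ps))
caliper = 0.2 * logit_ps.std()
treated = np.where(t == 1)[0]
controls = set(np.where(t == 0)[0])
pairs = []
for i in treated:
    if not controls:
        break
    j = min(controls, key=lambda c: abs(logit_ps[i] - logit_ps[c]))
    if abs(logit_ps[i] - logit_ps[j]) <= caliper:
        pairs.append((i, j))
        controls.remove(j)  # match without replacement

print(f"{len(pairs)} matched pairs")
```

Each matched pair consists of one treated and one control subject whose propensity logits differ by no more than the caliper; unmatched treated subjects are simply dropped, as in the study's 61-pair matched cohort.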

2020 ◽  
Vol 2020 ◽  
pp. 1-8
Author(s):  
Ya-Hsu Yang ◽  
Chih-Chiang Chiu ◽  
Hao-Wei Teng ◽  
Chun-Teng Huang ◽  
Chun-Yu Liu ◽  
...  

Background. Late-onset depression (LOD) often occurs in the context of vascular disease and may be associated with risk of dementia. Aspirin is widely used to reduce the risk of cardiovascular disease and stroke. However, its role in patients with LOD and the risk of dementia remains inconclusive. Materials and Methods. A population-based study was conducted using data from the National Health Insurance of Taiwan during 1996–2009. Patients fulfilling diagnostic criteria for LOD, with or without subsequent dementia (incident dementia), were identified, among whom users of aspirin (75 mg daily for at least 6 months) were distinguished from non-users. The time-dependent Cox proportional hazards model was applied for multivariate analyses. Propensity scores with the one-to-one nearest-neighbor matching model were used to select matching patients. The cumulative incidence of incident dementia after diagnosis of LOD was calculated by the Kaplan–Meier method. Results. A total of 6,028 (13.4%) and 40,411 (86.6%) patients were classified as with and without a diagnosis of LOD, respectively; among those with LOD, 2,424 (41.9%) were aspirin users. Patients with LOD had more comorbidities such as cardiovascular diseases, diabetes, and hypertension compared with those without LOD. Among patients with LOD, aspirin users had a lower incidence of subsequent incident dementia than non-users (hazard ratio = 0.734, 95% CI 0.641–0.841, p<0.001). After matching aspirin users with non-users by the propensity score-matching method, the cumulative incidence of incident dementia was significantly lower in aspirin users among LOD patients (p=0.022). Conclusions. Aspirin may be associated with a lower risk of incident dementia in patients with LOD. This beneficial effect of aspirin in LOD patients needs validation in prospective clinical trials, and our results should be interpreted with caution.
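The cumulative-incidence comparison above rests on the Kaplan–Meier estimator, which can be computed directly from right-censored data. A minimal sketch on toy data (illustrative, not the registry data):

```python
import numpy as np

# Toy right-censored data: follow-up time in months; event=1 means the
# outcome (e.g. incident dementia) was observed, 0 means censored.
time = np.array([3, 5, 5, 8, 10, 12, 12, 15, 18, 20], float)
event = np.array([1, 1, 0, 1, 0, 1, 1, 0, 1, 0])

def kaplan_meier(time, event):
    """Return the distinct event times and the KM survival estimate S(t)."""
    order = np.argsort(time)
    time, event = time[order], event[order]
    uniq = np.unique(time[event == 1])
    surv, s = [], 1.0
    for t in uniq:
        at_risk = np.sum(time >= t)                  # still under observation
        deaths = np.sum((time == t) & (event == 1))  # events at this time
        s *= 1 - deaths / at_risk
        surv.append(s)
    return uniq, np.array(surv)

times, surv = kaplan_meier(time, event)
# Cumulative incidence is 1 - S(t); here, at the last observed event time:
print(f"cumulative incidence at t={times[-1]:.0f}: {1 - surv[-1]:.3f}")
```

Censored subjects leave the risk set without contributing an event, which is exactly why a naive event proportion would understate incidence relative to this estimator.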


2016 ◽  
Vol 44 (1) ◽  
pp. 71-80 ◽  
Author(s):  
Hyo Jin Kim ◽  
Hajeong Lee ◽  
Dong Ki Kim ◽  
Kook-Hwan Oh ◽  
Yon Su Kim ◽  
...  

Background: Vascular access (VA) is essential for hemodialysis (HD) patients, and its dysfunction is a major complication. However, little is known about outcomes in patients with recurrent VA dysfunction. We explored the influence of recurrent VA dysfunction on cardiovascular (CV) events, death and VA abandonment. Methods: This is a single-center, retrospective study conducted in patients who underwent VA surgery between 2009 and 2014. VA dysfunction was defined as VA stenosis or thrombosis requiring intervention after the first successful cannulation. Patients with ≥2 interventions within 180 days were categorized as having recurrent VA dysfunction. Outcomes were analyzed using the Cox proportional hazards model before and after propensity score matching. Results: Of 766 patients (age 59.6 ± 14.3 years, 59.7% male), 10.1% were in the recurrent VA dysfunction group. Most baseline parameters after matching were similar between the recurrent and non-recurrent groups. A total of 213 propensity score-matched patients were followed for 28.7 ± 15.8 months, during which 46 (21.6%), 30 (14.1%) and 14 (6.6%) patients had de novo CV outcomes, died and abandoned VA, respectively. After adjustment, recurrent VA dysfunction remained an independent risk factor for CV events (adjusted hazard ratio [aHR] 2.71; 95% CI 1.48-4.98; p = 0.001). Moreover, recurrent VA dysfunction predicted composite all-cause mortality (ACM)/CV events (aHR 1.99; 95% CI 1.21-3.28; p = 0.007). Conclusions: Recurrent VA dysfunction was a novel independent risk factor for CV and composite ACM/CV events in HD patients, but not for VA abandonment. Patients with recurrent VA dysfunction should be carefully monitored not only for VA patency but also for CV events.


2020 ◽  
Vol 9 (9) ◽  
pp. 3012
Author(s):  
Shu-Yu Tai ◽  
Jiun-Shiuan He ◽  
Chun-Tung Kuo ◽  
Ichiro Kawachi

Although a disparity has been noted in the prevalence and outcomes of chronic disease between rural and urban areas, studies on diabetes-related complications are lacking. The purpose of this study was to examine the association between urbanization and the occurrence of diabetes-related complications using Taiwan's nationwide diabetes mellitus database. In total, 380,474 patients with newly diagnosed type 2 diabetes between 2000 and 2008 were included and followed up until 2013 or death; after propensity score matching, 31,310 pairs were included for analysis. Occurrences of seven diabetes-related complications of interest were identified. A Cox proportional hazards model was used to determine the time-to-event hazard ratio (HR) among urban, suburban and rural groups. We found that the HRs for all cardiovascular events during the five-year follow-up were 1.04 (95% confidence interval (CI) 1.00–1.07) and 1.15 (95% CI 1.12–1.19) for suburban and rural areas, respectively, relative to urban areas. Patients in suburban and rural areas had a greater likelihood of congestive heart failure, stroke, and end-stage renal disease than those in urban areas. Moreover, patients in rural areas had a higher likelihood of ischemic heart disease, blindness, and ulcer than those in urban areas. Our empirical findings provide evidence of potential urban–rural disparities in diabetes-related complications in Taiwan.


2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Masaru Ejima ◽  
Tsukasa Okamoto ◽  
Takafumi Suzuki ◽  
Tatsuhiko Anzai ◽  
Kunihiko Takahashi ◽  
...  

Abstract Background Fibrotic hypersensitivity pneumonitis (HP) is a chronic interstitial lung disease caused by allergic responses to repeated exposures to a causative antigen. Therapeutic evidence of the use of corticosteroids to treat fibrotic HP remains lacking, although corticosteroids are recognized as a major treatment option. The purpose of this study was to evaluate the efficacy of corticosteroid treatment in patients with fibrotic HP in a propensity score-matched cohort. Methods A retrospective review of the medical records from 2005 to 2019 in a single center was conducted, and 144 patients with fibrotic HP were identified. Semiquantitative scores for lung abnormalities on HRCT were evaluated. Patients who received (PDN group) and did not receive (non-PDN group) corticosteroid treatment were matched using a propensity score method. Survival rates, serial changes in pulmonary function and annual changes in HRCT scores were compared in the matched cohort. Results In the matched analysis, 30 individuals in the PDN group were matched with 30 individuals in the non-PDN group, the majority of whom had ILD without extensive fibrosis. The survival rate was significantly better in the PDN group (P = 0.032 for the stratified Cox proportional hazards model; HR, 0.250). The absolute changes in FVC at 6, 12, and 24 months from baseline were significantly better in the PDN group. Fewer patients in the PDN group experienced annual deterioration, as reflected in the HRCT score, due to ground-glass attenuation, consolidation, reticulation, traction bronchiectasis and honeycombing. Conclusion We demonstrated that corticosteroids improved survival and slowed fibrotic progression in a matched cohort, the majority of whom had ILD without extensive fibrosis. Fibrotic HP with less severe fibrosis may benefit from corticosteroid treatment. We propose that the early initiation of corticosteroids should be considered for fibrotic HP when worsening fibrosis is observed.


Crisis ◽  
2018 ◽  
Vol 39 (1) ◽  
pp. 27-36 ◽  
Author(s):  
Kuan-Ying Lee ◽  
Chung-Yi Li ◽  
Kun-Chia Chang ◽  
Tsung-Hsueh Lu ◽  
Ying-Yeh Chen

Abstract. Background: We investigated the age at exposure to parental suicide and the risk of subsequent suicide completion in young people. The impact of parental and offspring sex was also examined. Method: Using a cohort study design, we linked Taiwan's Birth Registry (1978–1997) with Taiwan's Death Registry (1985–2009) and identified 40,249 children who had experienced maternal suicide (n = 14,431), paternal suicide (n = 26,887), or the suicide of both parents (n = 281). Each exposed child was matched to 10 children of the same sex and birth year whose parents were still alive. This yielded a total of 398,081 children for our non-exposed cohort. A Cox proportional hazards model was used to compare the suicide risk of the exposed and non-exposed groups. Results: Compared with the non-exposed group, offspring who were exposed to parental suicide were 3.91 times (95% confidence interval [CI] = 3.10–4.92) more likely to die by suicide after adjusting for baseline characteristics. The risk of suicide seemed to be lower in older male offspring (HR = 3.94, 95% CI = 2.57–6.06), but higher in older female offspring (HR = 5.30, 95% CI = 3.05–9.22). Stratified analyses based on parental sex revealed similar patterns as the combined analysis. Limitations: As only register-based data were used, we were not able to explore the impact of variables not contained in the data set, such as the role of mental illness. Conclusion: Our findings suggest a prominent elevation in the risk of suicide among offspring who lost their parents to suicide. The risk elevation differed according to the sex of the afflicted offspring as well as to their age at exposure.


2020 ◽  
Vol 132 (4) ◽  
pp. 998-1005 ◽  
Author(s):  
Haihui Jiang ◽  
Yong Cui ◽  
Xiang Liu ◽  
Xiaohui Ren ◽  
Mingxiao Li ◽  
...  

OBJECTIVE The aim of this study was to investigate the relationship between extent of resection (EOR) and survival in terms of clinical, molecular, and radiological factors in high-grade astrocytoma (HGA). METHODS Clinical and radiological data from 585 cases of molecularly defined HGA were reviewed. In each case, the EOR was evaluated twice: once according to contrast-enhanced T1-weighted images (CE-T1WI) and once according to fluid-attenuated inversion recovery (FLAIR) images. The ratio of the volume of the region of abnormality in FLAIR images to that in CE-T1WI (VFLAIR/VCE-T1WI) was calculated, and a receiver operating characteristic curve was used to determine the optimal cutoff value for that ratio. Univariate and multivariate analyses were performed to identify the prognostic value of each factor. RESULTS Both the EOR evaluated from CE-T1WI and the EOR evaluated from FLAIR could divide the whole cohort into 4 subgroups with different survival outcomes (p < 0.001). Cases were stratified into 2 subtypes based on VFLAIR/VCE-T1WI with a cutoff of 10: a proliferation-dominant subtype and a diffusion-dominant subtype. Kaplan-Meier analysis showed a significant survival advantage for the proliferation-dominant subtype (p < 0.0001). The prognostic implication was further confirmed in the Cox proportional hazards model (HR 1.105, 95% CI 1.078–1.134, p < 0.0001). The survival of patients with proliferation-dominant HGA was significantly prolonged in association with extensive resection of the FLAIR abnormality region beyond the contrast-enhancing tumor (p = 0.03), while no survival benefit was observed in association with extensive resection in the diffusion-dominant subtype (p = 0.86). CONCLUSIONS VFLAIR/VCE-T1WI is an important classifier that could divide HGA into 2 subtypes with distinct invasive features. Patients with proliferation-dominant HGA can benefit from extensive resection of the FLAIR abnormality region, which provides the theoretical basis for a personalized resection strategy.
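The cutoff-selection step described in the methods, choosing a threshold for a continuous ratio from a receiver operating characteristic curve, is commonly done by maximizing Youden's J (sensitivity + specificity − 1). A hedged sketch on simulated marker values; the distributions and names are assumptions, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative: a continuous marker (e.g. a volume ratio) in two groups;
# 100 "negative" and 100 "positive" cases with overlapping distributions.
marker = np.concatenate([rng.normal(8, 2, 100), rng.normal(14, 3, 100)])
label = np.concatenate([np.zeros(100), np.ones(100)])  # 1 = positive class

def youden_cutoff(marker, label):
    """Scan every observed value as a candidate cutoff; keep the one
    maximizing Youden's J = sensitivity + specificity - 1."""
    best_j, best_cut = -1.0, None
    for c in np.unique(marker):
        pred = marker >= c
        sens = np.mean(pred[label == 1])   # true-positive rate at this cutoff
        spec = np.mean(~pred[label == 0])  # true-negative rate at this cutoff
        j = sens + spec - 1
        if j > best_j:
            best_j, best_cut = j, c
    return best_cut, best_j

cut, j = youden_cutoff(marker, label)
print(f"optimal cutoff ~ {cut:.1f} (Youden J = {j:.2f})")
```

The selected cutoff falls between the two group means, which is the same logic that yields a single threshold (such as the ratio cutoff of 10 above) for splitting a cohort into two subtypes.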


Risks ◽  
2021 ◽  
Vol 9 (6) ◽  
pp. 103
Author(s):  
Morne Joubert ◽  
Tanja Verster ◽  
Helgard Raubenheimer ◽  
Willem D. Schutte

Survival analysis is one of the techniques that can be used to predict loss given default (LGD) for regulatory capital (Basel) purposes. When using survival analysis to model LGD, a proposed methodology is the default weighted survival analysis (DWSA) method. This paper is aimed at adapting the DWSA method (used to model Basel LGD) to estimate the LGD for International Financial Reporting Standard (IFRS) 9 impairment requirements. The DWSA methodology allows for over-recoveries, default weighting and negative cashflows. For IFRS 9, this methodology should be adapted, as the estimated LGD is an input into the expected credit losses (ECL). Our proposed IFRS 9 LGD methodology makes use of survival analysis to estimate the LGD. The Cox proportional hazards model allows a baseline survival curve to be adjusted to produce survival curves for different segments of the portfolio. The forward-looking LGD values are adjusted for different macro-economic scenarios, and the ECL is calculated for each scenario. These ECL values are probability weighted to produce a final ECL estimate. We illustrate our proposed IFRS 9 LGD methodology and ECL estimation on a dataset from a retail portfolio of a South African bank.
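The final step described above, probability-weighting scenario-level ECL values into a single estimate, reduces to a weighted sum of EAD × PD × LGD across macro-economic scenarios. A minimal numeric sketch; all figures are made up for illustration and are not from the paper:

```python
# Illustrative single-exposure ECL calculation (assumed inputs throughout).
ead = 1_000_000.0   # exposure at default
pd_12m = 0.04       # 12-month probability of default

# Scenario-specific forward-looking LGDs (as would come from adjusted
# survival curves) with probability weights that must sum to 1.
scenarios = {
    "base":     (0.45, 0.6),   # (LGD, weight)
    "upside":   (0.35, 0.2),
    "downside": (0.60, 0.2),
}

# ECL under each scenario, then the probability-weighted final estimate.
ecl_per_scenario = {name: ead * pd_12m * lgd
                    for name, (lgd, w) in scenarios.items()}
ecl = sum(ead * pd_12m * lgd * w for lgd, w in scenarios.values())
print(ecl_per_scenario)
print(f"probability-weighted ECL = {ecl:,.0f}")
```

With these assumed numbers the weighted ECL sits between the upside and downside scenario values, which is the intended effect of the weighting: one point estimate that reflects the full scenario distribution.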


2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Maryam Farhadian ◽  
Sahar Dehdar Karsidani ◽  
Azadeh Mozayanimonfared ◽  
Hossein Mahjub

Abstract Background Due to the limited number of studies with long term follow-up of patients undergoing Percutaneous Coronary Intervention (PCI), we investigated the occurrence of Major Adverse Cardiac and Cerebrovascular Events (MACCE) during 10 years of follow-up after coronary angioplasty using Random Survival Forest (RSF) and Cox proportional hazards models. Methods The current retrospective cohort study was performed on 220 patients (69 women and 151 men) undergoing coronary angioplasty from March 2009 to March 2012 in Farchshian Medical Center in Hamadan city, Iran. Survival time (month) as the response variable was considered from the date of angioplasty to the main endpoint or the end of the follow-up period (September 2019). To identify the factors influencing the occurrence of MACCE, the performance of Cox and RSF models were investigated in terms of C index, Integrated Brier Score (IBS) and prediction error criteria. Results Ninety-six patients (43.7%) experienced MACCE by the end of the follow-up period, and the median survival time was estimated to be 98 months. Survival decreased from 99% during the first year to 39% at 10 years' follow-up. By applying the Cox model, the predictors were identified as follows: age (HR = 1.03, 95% CI 1.01–1.05), diabetes (HR = 2.17, 95% CI 1.29–3.66), smoking (HR = 2.41, 95% CI 1.46–3.98), and stent length (HR = 1.74, 95% CI 1.11–2.75). The predictive performance was slightly better by the RSF model (IBS of 0.124 vs. 0.135, C index of 0.648 vs. 0.626 and out-of-bag error rate of 0.352 vs. 0.374 for RSF). In addition to age, diabetes, smoking, and stent length, RSF also included coronary artery disease (acute or chronic) and hyperlipidemia as the most important variables. Conclusion Machine-learning prediction models such as RSF showed better performance than the Cox proportional hazards model for the prediction of MACCE during long-term follow-up after PCI.
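The C index used above to compare the Cox and RSF models is Harrell's concordance index. A small self-contained implementation; as one common convention (an assumption here), a pair is treated as comparable when the earlier of the two times is an observed event, and risk ties count as half-concordant:

```python
import numpy as np

def c_index(time, event, risk):
    """Harrell's concordance index: among comparable pairs, the fraction
    where the subject with the higher predicted risk failed earlier."""
    n = len(time)
    conc, comp = 0.0, 0
    for i in range(n):
        for j in range(n):
            # comparable: i had an observed event before j's observed time
            if event[i] == 1 and time[i] < time[j]:
                comp += 1
                if risk[i] > risk[j]:
                    conc += 1          # concordant pair
                elif risk[i] == risk[j]:
                    conc += 0.5        # tied risks count as half
    return conc / comp

# Toy check: a perfectly informative risk score gives C = 1,
# a reversed score gives C = 0, and a constant score gives C = 0.5.
time = np.array([2., 4., 6., 8., 10.])
event = np.array([1, 1, 0, 1, 0])
risk = np.array([5., 4., 3., 2., 1.])  # higher risk -> earlier event
print(c_index(time, event, risk))
```

A C index of 0.648 vs 0.626, as reported for RSF vs Cox above, therefore means the RSF risk ordering agrees with the observed event ordering slightly more often across comparable pairs.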


Author(s):  
Yuko Yamaguchi ◽  
Marta Zampino ◽  
Toshiko Tanaka ◽  
Stefania Bandinelli ◽  
Yusuke Osawa ◽  
...  

Abstract Background Anemia is common in older adults and associated with greater morbidity and mortality. The causes of anemia in older adults have not been completely characterized. Although elevated circulating growth and differentiation factor 15 (GDF-15) has been associated with anemia in older adults, it is not known whether elevated GDF-15 predicts the development of anemia. Methods We examined the relationship between plasma GDF-15 concentrations at baseline in 708 non-anemic adults, aged 60 years and older, with incident anemia during 15 years of follow-up among participants in the Invecchiare in Chianti (InCHIANTI) Study. Results During follow-up, 179 (25.3%) participants developed anemia. The proportion of participants who developed anemia from the lowest to highest quartile of plasma GDF-15 was 12.9%, 20.1%, 21.2%, and 45.8%, respectively. Adults in the highest quartile of plasma GDF-15 had an increased risk of developing anemia (Hazard Ratio 1.15, 95% Confidence Interval 1.09–1.21, P < .0001) compared to those in the lower three quartiles in a multivariable Cox proportional hazards model adjusting for age, sex, serum iron, soluble transferrin receptor, ferritin, vitamin B12, congestive heart failure, diabetes mellitus, and cancer. Conclusions Circulating GDF-15 is an independent predictor for the development of anemia in older adults.
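The quartile analysis above (incidence of a binary outcome across quartiles of a continuous biomarker, then top quartile vs the rest) can be sketched as follows. The data are simulated and the variable names are assumptions, not InCHIANTI data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated biomarker for 708 subjects (the cohort size above, but the
# values and outcome model are invented for illustration).
n = 708
biomarker = rng.lognormal(mean=0.0, sigma=0.5, size=n)
# Build in a higher outcome probability for the top quartile.
high = biomarker > np.quantile(biomarker, 0.75)
outcome = (rng.random(n) < 0.1 + 0.3 * high).astype(int)

# Assign quartile labels 1-4 from the three interior quantile cut points.
cuts = np.quantile(biomarker, [0.25, 0.5, 0.75])
quartile = np.searchsorted(cuts, biomarker) + 1

for q in range(1, 5):
    print(f"Q{q}: incidence {outcome[quartile == q].mean():.1%}")

# Crude risk ratio, top quartile vs the lower three combined.
rr = outcome[quartile == 4].mean() / outcome[quartile < 4].mean()
print(f"Q4 vs Q1-Q3 risk ratio ~ {rr:.2f}")
```

This crude contrast is what the study then refines with a multivariable Cox model, which adjusts the quartile comparison for iron indices and comorbidities.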


2020 ◽  
Vol 41 (Supplement_2) ◽  
Author(s):  
A Funada ◽  
Y Goto ◽  
T Maeda ◽  
H Okada ◽  
M Takamura

Abstract Background/Introduction Shockable rhythm after cardiac arrest is highly expected after early initiation of bystander cardiopulmonary resuscitation (CPR) owing to increased coronary perfusion. However, the relationship between bystander CPR and initial shockable rhythm in patients with out-of-hospital cardiac arrest (OHCA) remains unclear. We hypothesized that chest-compression-only CPR (CC-CPR) before emergency medical service (EMS) arrival has an equivalent effect on the likelihood of initial shockable rhythm to standard CPR (chest compression plus rescue breathing [S-CPR]). Purpose We aimed to examine the rate of initial shockable rhythm and 1-month outcomes in patients who received bystander CPR after OHCA. Methods The study included 59,688 patients (age ≥18 years) who received bystander CPR after an OHCA with a presumed cardiac origin witnessed by a layperson in a prospectively recorded Japanese nationwide Utstein-style database from 2013 to 2017. Patients who received public-access defibrillation before arrival of the EMS personnel were excluded. The patients were divided into CC-CPR (n=51,520) and S-CPR (n=8168) groups according to the type of bystander CPR received. The primary end point was initial shockable rhythm recorded by the EMS personnel just after arrival at the site. The secondary end point was the 1-month outcomes (survival and neurologically intact survival) after OHCA. In the statistical analyses, a Cox proportional hazards model was applied to reflect the different bystander CPR durations before/after propensity score (PS) matching. Results The crude rate of initial shockable rhythm in the CC-CPR group (21.3%, 10,946/51,520) was significantly higher than that in the S-CPR group (17.6%, 1441/8168, p<0.0001) before PS matching. However, no significant difference in the rate of initial shockable rhythm was found between the 2 groups after PS matching (18.3% [1493/8168] vs 17.6% [1441/8168], p=0.30).
In the Cox proportional hazards model, CC-CPR was more negatively associated with initial shockable rhythm before PS matching (unadjusted hazard ratio [HR], 0.97; 95% confidence interval [CI], 0.94–0.99; p=0.012; adjusted HR, 0.92; 95% CI, 0.89–0.94; p<0.0001) than S-CPR. After PS matching, however, no significant difference was found between the 2 groups (adjusted HR of CC-CPR compared with S-CPR, 0.97; 95% CI, 0.94–1.00; p=0.09). No significant differences were found between CC-CPR and S-CPR in the 1-month outcomes after PS matching, respectively: survival, 8.5% and 10.1% (adjusted odds ratio, 0.89; 95% CI, 0.79–1.00; p=0.07); cerebral performance category 1 or 2, 5.5% and 6.9% (adjusted odds ratio, 0.86; 95% CI, 0.74–1.00; p=0.052). Conclusions Compared with S-CPR, CC-CPR before EMS arrival had an equivalent multivariable-adjusted association with the likelihood of initial shockable rhythm in patients with OHCA due to presumed cardiac causes witnessed by a layperson. Funding Acknowledgement Type of funding source: None

