High-dose influenza vaccination and mortality among predominantly male, white, senior veterans, United States, 2012/13 to 2014/15

2020 ◽  
Vol 25 (19) ◽  
Author(s):  
Yinong Young-Xu ◽  
Julia Thornton Snider ◽  
Salaheddin M Mahmud ◽  
Ellyn M Russo ◽  
Robertus Van Aalst ◽  
...  

Introduction It is unclear whether high-dose influenza vaccine (HD) is more effective at reducing mortality among seniors. Aim This study aimed to evaluate the relative vaccine effectiveness (rVE) of HD. Methods We linked electronic medical record databases in the Veterans Health Administration (VHA) and Medicare administrative files to examine the rVE of HD vs standard-dose influenza vaccines (SD) in preventing influenza/pneumonia-associated and cardiorespiratory mortality among VHA-enrolled veterans 65 years or older during the 2012/13, 2013/14 and 2014/15 influenza seasons. A multivariable Cox proportional hazards model was performed on matched recipients of HD vs SD, based on vaccination time, location, age, sex, ethnicity and VHA priority level. Results Among 569,552 person-seasons of observation, 207,574 (36%) were HD recipients and 361,978 (64%) were SD recipients, predominantly male (99%) and white (82%). Pooling findings from all three seasons, the adjusted rVE estimate of HD vs SD during the high influenza periods was 42% (95% confidence interval (CI): 24–59) against influenza/pneumonia-associated mortality and 27% (95% CI: 23–32) against cardiorespiratory mortality. Residual confounding was evident in both early and late influenza periods despite matching and multivariable adjustment. Excluding individuals with high 1-year predicted mortality at baseline reduced the residual confounding and yielded rVE of 36% (95% CI: 10–62) and 25% (95% CI: 12–38) against influenza/pneumonia-associated and cardiorespiratory mortality, respectively. These were confirmed by results from two-stage residual inclusion estimations. Discussion The HD was associated with a lower risk of influenza/pneumonia-associated and cardiorespiratory death in men during the high influenza period.
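The rVE figures above are derived from the fitted Cox hazard ratio as rVE = (1 − HR) × 100%. A minimal sketch of that conversion, with a hazard ratio and confidence bounds chosen to reproduce the reported 42% (24–59) estimate rather than taken from the study's model output:

```python
# Relative vaccine effectiveness (rVE) from a Cox hazard ratio.
# rVE = (1 - HR) x 100%; illustrative numbers, not the study's data.

def relative_ve(hazard_ratio: float) -> float:
    """Convert a hazard ratio (HD vs SD) into rVE in percent."""
    return (1.0 - hazard_ratio) * 100.0

def rve_with_ci(hr: float, hr_lower: float, hr_upper: float):
    """CI bounds invert: the upper HR bound gives the lower rVE bound."""
    return relative_ve(hr), (relative_ve(hr_upper), relative_ve(hr_lower))

# An HR of 0.58 (95% CI 0.41-0.76) corresponds to rVE 42% (24-59).
rve, (lo, hi) = rve_with_ci(0.58, 0.41, 0.76)
```

Note how the interval flips: the larger (less protective) hazard ratio bound maps to the smaller rVE bound.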

2018 ◽  
Vol 5 (suppl_1) ◽  
pp. S292-S293
Author(s):  
Yinong Young-Xu ◽  
Ellyn Russo ◽  
Nabin Neupane ◽  
Melissa Lewis ◽  
Yuliya Halchenko

Abstract Background Despite the widespread availability of several injectable inactivated influenza vaccines (IIV), including the trivalent standard-dose (IIV3-SD) and high-dose (IIV3-HD), and the quadrivalent (IIV4), the US Advisory Committee on Immunization Practices does not currently recommend one over another. The objective of this study was to assess the relative vaccine effectiveness (rVE) of IIV3-HD and IIV4 vs. IIV3-SD. Methods rVE was estimated from a retrospective cohort study of Veterans aged 65 years and older who received an IIV during the 2014–2015 influenza season. Veterans Health Administration (VHA) electronic medical records were linked with Centers for Medicare and Medicaid Services administrative claims to capture the study outcomes of hospitalizations and baseline characteristics. The inverse probability of treatment weight (IPTW) method was used to adjust for potential confounding due to measured factors associated with IIV3-SD, IIV3-HD, or IIV4 vaccination. The probability was estimated based on patient sociodemographic characteristics, comorbidities, pre-influenza season hospitalizations, prior season influenza vaccination, and use of immunosuppressive medication. Results Our study population included 782,346 VHA patients vaccinated during the 2014–2015 season. Of these, 10,543 (1%) received IIV4, 59,536 (8%) received IIV3-HD and 712,267 (91%) received IIV3-SD; 11,626 (1.5%) were female and 588,324 (76%) were non-Hispanic white. Compared with those who received the IIV3-SD vaccine, the IPTW-adjusted rVE for IIV3-HD was 7% (95% CI, 9%–21%) against all-cause, 15% (95% CI, 10%–17%) against cardiorespiratory-associated, and 13% (95% CI, 8%–17%) against influenza/pneumonia-associated hospitalization. For those who received IIV4, the IPTW-adjusted rVE was 4% (95% CI, 1%–4%), 1% (95% CI, −2%–5%), and 0% (95% CI, −9%–8%), respectively. 
Conclusion IIV3-HD is more effective than, and IIV4 is as effective as, IIV3-SD vaccination in preventing influenza/pneumonia-associated, cardiorespiratory, and all-cause hospitalizations. Additional studies that employ methods to control for unmeasured confounding are warranted as the use of IIV4 expands. Disclosures All authors: No reported disclosures.
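The IPTW adjustment described above weights each patient by the inverse of the estimated probability of the vaccine actually received, so that the weighted groups resemble each other on the listed covariates. A minimal sketch with hypothetical propensity scores (a real analysis would estimate them from those covariates, e.g. via logistic regression):

```python
# Sketch of inverse probability of treatment weighting (IPTW).
# Propensity scores below are made up for illustration.

def iptw_weight(treated: bool, propensity: float) -> float:
    """Weight a subject by the inverse probability of the
    treatment they actually received."""
    return 1.0 / propensity if treated else 1.0 / (1.0 - propensity)

# Toy cohort: (received_high_dose, estimated probability of high dose)
cohort = [(True, 0.8), (True, 0.4), (False, 0.5), (False, 0.2)]
weights = [iptw_weight(t, p) for t, p in cohort]
```

Subjects who received a treatment they were unlikely to receive get large weights, which is why extreme propensity scores are often truncated in practice.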


2012 ◽  
Vol 30 (15_suppl) ◽  
pp. 4617-4617
Author(s):  
James Lin Chen ◽  
Kimryn Rathmell ◽  
David F. McDermott ◽  
Walter Michael Stadler

4617 Background: The oral mTOR inhibitor everolimus affects tumor growth by targeting cellular metabolic proliferation pathways and modestly delays RCC progression. We hypothesized that circulating microRNAs, which have been associated with renal cancer and inflammation, may serve as predictive biomarkers to help better define a population more sensitive to treatment. Methods: Plasma from mRCC pts refractory to VEGF inhibition was obtained prior to treatment with standard-dose everolimus as part of a clinical trial examining FDG-PET as a potential predictive biomarker. As we were specifically interested in tumor response to drug, only pts who died, remained on trial, or had radiographic progression by RECIST criteria were profiled. Pts who were unable to tolerate drug were excluded. MicroRNAs were extracted and profiled without pre-amplification using Exiqon LNA PCR panels. Crossing point (Cp) values within 5 of the negative control were removed. To be analyzed further, a microRNA had to be present in >90% of samples and vary from the mean by at least p > 0.10. Cox proportional hazards and Kaplan-Meier analyses were performed. Results: 28 patients had available plasma and met criteria for profiling. Pt characteristics included: 20 (71%) clear cell histology, median age 57.7 (43–76), median number of prior systemic therapies 2 (1–3). 103 microRNAs were expressed in at least 90% of all samples. Mir-21 and mir-378 were independently correlated with PFS (FDR: 0.02 and 0.06, respectively). Low circulating plasma mir-21 and mir-378 levels were associated with median PFS prolongations of 370d vs. 101d (p = 0.027) and 368d vs. 106d (p = 0.001), respectively. Analysis of the clear cell cohort for mir-21 and mir-378 also demonstrated significant median PFS differences of 350d vs. 173d (p = 0.045) and 345d vs. 147d (p = 0.004). 
Conclusions: Elevated levels of circulating mir-21 and mir-378 have been associated with systemic inflammatory states, and in our study are correlated with decreased PFS in mRCC pts undergoing everolimus therapy. Further prospective studies will be required to validate these exploratory results for their potential role as prognostic or predictive biomarkers.
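The Kaplan-Meier analyses reported here estimate PFS as a product over observed event times. A minimal product-limit sketch on made-up (time, event) pairs, not the trial's data:

```python
# Minimal Kaplan-Meier (product-limit) estimator; illustrative data only.
# Each subject: (time, event) with event=1 for progression/death, 0 for censoring.

def kaplan_meier(data):
    """Return [(time, survival_probability)] at each distinct event time."""
    data = sorted(data)
    n_at_risk = len(data)
    surv, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = at_t = 0
        while i < len(data) and data[i][0] == t:
            at_t += 1
            deaths += data[i][1]
            i += 1
        if deaths:
            # multiply by the conditional probability of surviving past t
            surv *= (n_at_risk - deaths) / n_at_risk
            curve.append((t, surv))
        n_at_risk -= at_t
    return curve

curve = kaplan_meier([(101, 1), (150, 0), (370, 1), (400, 0)])
```

Censored subjects (event = 0) leave the risk set without contributing a drop in the curve, which is how the estimator handles incomplete follow-up.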


2018 ◽  
Vol 36 (6_suppl) ◽  
pp. 572-572
Author(s):  
Deepak Kilari ◽  
Parameswaran Hari ◽  
Muna Qayed ◽  
Raphael Fraser ◽  
Omar Davila ◽  
...  

572 Background: Single-center observational studies have established the use of single or tandem SCT to salvage relapsed GCT, but randomized trials are lacking. We analyzed outcomes and prognostic factors in 2,395 male SCT recipients for relapsed GCT between 1990 and 2015. Methods: Recipients of single or tandem SCT reported to the Center for International Blood and Marrow Transplant Research were identified. Outcomes were compared by SCT year: 1990-94 (N = 288), 1995-99 (N = 351), 2000-04 (N = 376), 2005-09 (N = 509) and 2010-15 (N = 871). A recent subset (n = 267, 2000-2015) with detailed disease- and transplant-related data was further analyzed with a multivariate (MVA) Cox proportional hazards model. Results: Median age at SCT was 31 (11-76) years and 49% received SCT within 12 months of diagnosis, consistent with early relapse/primary refractory GCT. 26% had primary extragonadal GCT; 1,167 (49%) had intent to tandem transplant (TT). Median follow-up was 51 (3-313) months. Day 100 non-relapse mortality was statistically similar at 8% in 1990-94 (vs. 4% in 2010-15), but 3-year progression-free survival (PFS) improved from 24 (18-31)% in 1990-94 to 47 (43-50)% in 2010-15 (p < 0.0001), and 3-year overall survival (OS) from 35 (29-40)% to 54 (50-57)% (p < 0.0001). Compared with single-SCT recipients, TT recipients were younger (median 31 [16-62] vs 34 [13-76] years), had a lower Hematopoietic Cell Transplantation-Comorbidity Index, and were more likely to undergo SCT after 1 line of chemotherapy (28% vs 9%) and within 1 year of diagnosis (51% vs 38%). TT was increasingly preferred over single SCT (48% of SCTs were TT in 2000-04 vs. 81% in 2010-15). In MVA, non-seminoma histology, residual tumor at SCT, receipt of > 1 line of pre-SCT chemotherapy, and single SCT (vs. TT) were associated with worse PFS and OS. Year of SCT was not significant when adjusted for these covariates. Conclusions: In this large longitudinal cohort, improvements in PFS and OS were observed in recent years. 
SCT earlier in the disease course and tandem SCT were associated with superior outcomes. These data from a large cohort reported by 225 centers confirm specialized centers' data in a real-world setting.


1995 ◽  
Vol 114 (2) ◽  
pp. 361-372 ◽  
Author(s):  
C. A. Sabin ◽  
A. N. Phillips ◽  
C. A. Lee ◽  
G. Janossy ◽  
V. Emery ◽  
...  

Summary The effect of prior infection with cytomegalovirus (CMV) on progression of HIV disease in a cohort of 111 men with haemophilia was studied after 13 years of follow-up. The relative hazards associated with CMV positivity for progression to AIDS, death and a CD4 count of 0·05 × 10⁹/l were 2·28, 2·42 and 2·34, respectively. CMV-seropositive patients were significantly older than the seronegative patients, and this was controlled for by using a Cox proportional hazards model. The relative hazards for the three endpoints decreased to 1·89, 1·82 and 1·93 respectively and were marginally non-significant (P = 0·05, 0·08 and 0·08 for the three endpoints respectively). We conclude that this cohort continues to show evidence of a 'co-factor' effect associated with prior infection with CMV which is confounded by age but not completely explained by age differences. The potential biological significance of these results is discussed in the context of recent controlled clinical trials which show a survival benefit from long-term high-dose acyclovir, a drug with activity in vivo against CMV and other herpesviruses.


2020 ◽  
Vol 38 (15_suppl) ◽  
pp. e14008-e14008
Author(s):  
Scott C. Howard ◽  
Nicholas Napier ◽  
Xueyuan Cao ◽  
Ryan Combs ◽  
Mark Layton Watson ◽  
...  

e14008 Background: Primary central nervous system lymphoma (PCNSL) can often be cured, especially in younger patients, but requires intense chemotherapy with high-dose methotrexate (HDMTX) and rituximab to optimize outcomes. Toxicities can lead to dose reduction or omission that may increase relapse risk, or lead clinicians to select less effective regimens that do not contain HDMTX. Methods: Anonymized, de-identified data on patients from 110 community oncology practices of the Guardian Research Network (GRN, www.GuardianResearch.org ) were analyzed to determine treatments, toxicities, and outcomes of adults with PCNSL. All data from the medical record are available from GRN (diagnoses, demographics, labs, medicines, toxicities, radiology, pathology, procedures, and encounters), so each patient's journey can be fully characterized. Results: Of 533,805 adults with cancer, 49 were treated for PCNSL with HDMTX-containing regimens (n = 35), other chemotherapy regimens (n = 3), or radiation therapy (RT) alone (n = 11). Among HDMTX patients, 8 received HDMTX only, 23 received HDMTX plus rituximab, 11 additionally received RT, and 3 received HDMTX with other chemotherapy but no rituximab. Survival at 5 years was 53% (standard error [SE] 8.6%) for patients treated with HDMTX versus 33% (SE 13%) for those treated with other therapies. Of those treated with HDMTX, survival was 0% for patients who experienced early toxicity that required cessation of HDMTX prior to receiving 3 doses and having response evaluated, versus 62% (SE 9.1%) for patients who received 3 or more courses of HDMTX (p < 0.001). In a multivariable Cox proportional hazards model including completion of at least 3 doses of HDMTX, age, race, and sex, only lack of HDMTX toxicity was associated with survival (hazard ratio 0.22, 95% confidence interval 0.07 to 0.70, p = 0.01). Conclusions: Use of HDMTX and prevention of toxicity improve outcomes for PCNSL patients treated in the community.


Author(s):  
Ziyun Shao ◽  
Yongwen Feng ◽  
Li Zhong ◽  
Qifeng Xie ◽  
Ming Lei ◽  
...  

Abstract Importance Coronavirus disease 2019 (COVID-19) has become pandemic, causing more than 1.5 million infections and tens of thousands of deaths worldwide in a short period of time. However, little is known about its pathological mechanism, and reports of clinical studies on specific treatments are few. Objective The purpose of this study was to determine the clinical efficacy of intravenous immunoglobulin (IVIG) therapy in COVID-19 patients. Design, setting and participants This multicenter retrospective cohort study enrolled 325 adult critical COVID-19 patients, comprising severe type and critical type according to the clinical classification defined by the National Health Commission of China, in 8 government-designated treatment centers in China from Dec 23, 2019 to Mar 31, 2020. Demographic, clinical, treatment, and laboratory data as well as prognosis were extracted from electronic medical records. Exposure IVIG was the exposure factor. Main outcomes and measures Primary outcomes were 28-day and 60-day mortality; secondary outcomes were the total length of hospital stay and the total duration of the disease. Parameters of inflammatory responses and organ functions were also measured. Risk factors were determined by a Cox proportional hazards model. Subgroup analysis was carried out according to clinical classification of COVID-19, IVIG dosage, and timing. Results Of the 325 enrolled patients, 222 (68%) were severe type and 103 (32%) were critical type; 42 (13%) died within 28 days of hospitalization, and 54 (17%) died within 60 days. The 60-day deaths comprised 6 (3%) severe type patients and 48 (47%) critical type patients. 174 patients received IVIG, and 151 did not. 
Comparison of baseline characteristics between the two groups showed that patients in the IVIG group presented higher Acute Physiology and Chronic Health Evaluation (APACHE II) and Sequential Organ Failure Assessment (SOFA) scores, higher plasma levels of IL-6 and lactate, and lower lymphocyte count and oxygenation index (all P < 0.05). The 28-day and 60-day mortality were not improved with IVIG in the overall cohort. In-hospital stay and the total duration of disease were longer in the IVIG group (P < 0.001). In the Cox proportional hazards model, risk factors were clinical classification (hazard ratio 0.126, 95% confidence interval 0.039-0.413, P = 0.001) and use of IVIG (hazard ratio 0.252, 95% confidence interval 0.107-0.591, P = 0.002). Subgroup analysis showed that only in patients of critical type did IVIG significantly reduce 28-day mortality, decrease the inflammatory response, and improve some organ functions (all P < 0.05); and application of IVIG in the early stage (admission ≤ 7 days) at a high dose (> 15 g/d) exhibited significant reduction of 60-day mortality in critical type patients. Conclusions and relevance Early administration of high-dose IVIG improves the prognosis of critical type patients with COVID-19. 
This study provides important information on the clinical application of IVIG in the treatment of SARS-CoV-2 infection, including patient selection and administration timing and dosage. Key points Question Intravenous immunoglobulin (IVIG) has been recommended to treat critical Coronavirus disease 2019 (COVID-19) patients in a few reviews, but clinical evidence of its efficacy in COVID-19 patients was lacking. Finding In this multicenter cohort study that included 325 adult critical patients from 8 treatment centers, early administration (admission ≤ 7 days) of high-dose IVIG (> 15 g/d) improved the prognosis of critical type patients with COVID-19. Meaning This study provides important information on the clinical application of IVIG in the treatment of SARS-CoV-2 infection, including patient selection, administration timing and dosage.


Crisis ◽  
2018 ◽  
Vol 39 (1) ◽  
pp. 27-36 ◽  
Author(s):  
Kuan-Ying Lee ◽  
Chung-Yi Li ◽  
Kun-Chia Chang ◽  
Tsung-Hsueh Lu ◽  
Ying-Yeh Chen

Abstract. Background: We investigated the age at exposure to parental suicide and the risk of subsequent suicide completion in young people. The impact of parental and offspring sex was also examined. Method: Using a cohort study design, we linked Taiwan's Birth Registry (1978–1997) with Taiwan's Death Registry (1985–2009) and identified 40,249 children who had experienced maternal suicide (n = 14,431), paternal suicide (n = 26,887), or the suicide of both parents (n = 281). Each exposed child was matched to 10 children of the same sex and birth year whose parents were still alive. This yielded a total of 398,081 children for our non-exposed cohort. A Cox proportional hazards model was used to compare the suicide risk of the exposed and non-exposed groups. Results: Compared with the non-exposed group, offspring who were exposed to parental suicide were 3.91 times (95% confidence interval [CI] = 3.10–4.92) more likely to die by suicide after adjusting for baseline characteristics. The risk of suicide seemed to be lower in older male offspring (HR = 3.94, 95% CI = 2.57–6.06), but higher in older female offspring (HR = 5.30, 95% CI = 3.05–9.22). Stratified analyses based on parental sex revealed similar patterns as the combined analysis. Limitations: As only register-based data were used, we were not able to explore the impact of variables not contained in the data set, such as the role of mental illness. Conclusion: Our findings suggest a prominent elevation in the risk of suicide among offspring who lost their parents to suicide. The risk elevation differed according to the sex of the afflicted offspring as well as to their age at exposure.


2020 ◽  
Vol 132 (4) ◽  
pp. 998-1005 ◽  
Author(s):  
Haihui Jiang ◽  
Yong Cui ◽  
Xiang Liu ◽  
Xiaohui Ren ◽  
Mingxiao Li ◽  
...  

OBJECTIVE The aim of this study was to investigate the relationship between extent of resection (EOR) and survival in terms of clinical, molecular, and radiological factors in high-grade astrocytoma (HGA). METHODS Clinical and radiological data from 585 cases of molecularly defined HGA were reviewed. In each case, the EOR was evaluated twice: once according to contrast-enhanced T1-weighted images (CE-T1WI) and once according to fluid attenuated inversion recovery (FLAIR) images. The ratio of the volume of the region of abnormality in FLAIR images to that in CE-T1WI (VFLAIR/VCE-T1WI) was calculated, and a receiver operating characteristic curve was used to determine the optimal cutoff value for that ratio. Univariate and multivariate analyses were performed to identify the prognostic value of each factor. RESULTS Both the EOR evaluated from CE-T1WI and the EOR evaluated from FLAIR could divide the whole cohort into 4 subgroups with different survival outcomes (p < 0.001). Cases were stratified into 2 subtypes based on VFLAIR/VCE-T1WI with a cutoff of 10: a proliferation-dominant subtype and a diffusion-dominant subtype. Kaplan-Meier analysis showed a significant survival advantage for the proliferation-dominant subtype (p < 0.0001). The prognostic implication was further confirmed in the Cox proportional hazards model (HR 1.105, 95% CI 1.078–1.134, p < 0.0001). The survival of patients with proliferation-dominant HGA was significantly prolonged in association with extensive resection of the FLAIR abnormality region beyond contrast-enhancing tumor (p = 0.03), while no survival benefit was observed in association with extensive resection in the diffusion-dominant subtype (p = 0.86). CONCLUSIONS VFLAIR/VCE-T1WI is an important classifier that could divide the HGA into 2 subtypes with distinct invasive features. 
Patients with proliferation-dominant HGA can benefit from extensive resection of the FLAIR abnormality region, which provides the theoretical basis for a personalized resection strategy.
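The receiver operating characteristic step above selects a cutoff for a continuous ratio such as VFLAIR/VCE-T1WI. One common criterion, sketched here on invented values and labels (the abstract does not state which ROC criterion was used), is the cutoff maximizing Youden's J = sensitivity + specificity − 1:

```python
# Sketch: optimal ROC cutoff by Youden's J on hypothetical data.

def best_cutoff(values, labels):
    """Scan candidate cutoffs; a case is called positive when
    value >= cutoff. Returns the cutoff maximizing Youden's J."""
    best_j, best_c = -1.0, None
    for c in sorted(set(values)):
        tp = sum(1 for v, y in zip(values, labels) if v >= c and y == 1)
        fn = sum(1 for v, y in zip(values, labels) if v < c and y == 1)
        tn = sum(1 for v, y in zip(values, labels) if v < c and y == 0)
        fp = sum(1 for v, y in zip(values, labels) if v >= c and y == 0)
        j = tp / (tp + fn) + tn / (tn + fp) - 1  # sensitivity + specificity - 1
        if j > best_j:
            best_j, best_c = j, c
    return best_c

# Invented ratios and outcome labels (1 = event of interest):
cut = best_cutoff([2, 5, 8, 12, 15, 20], [0, 0, 0, 1, 1, 1])
```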


Risks ◽  
2021 ◽  
Vol 9 (6) ◽  
pp. 103
Author(s):  
Morne Joubert ◽  
Tanja Verster ◽  
Helgard Raubenheimer ◽  
Willem D. Schutte

Survival analysis is one of the techniques that can be used to predict loss given default (LGD) for regulatory capital (Basel) purposes. When using survival analysis to model LGD, a proposed methodology is the default weighted survival analysis (DWSA) method. This paper is aimed at adapting the DWSA method (used to model Basel LGD) to estimate the LGD for International Financial Reporting Standard (IFRS) 9 impairment requirements. The DWSA methodology allows for over-recoveries, default weighting and negative cashflows. For IFRS 9, this methodology should be adapted, as the estimated LGD is an input into the expected credit losses (ECL). Our proposed IFRS 9 LGD methodology makes use of survival analysis to estimate the LGD. The Cox proportional hazards model allows for a baseline survival curve to be adjusted to produce survival curves for different segments of the portfolio. The forward-looking LGD values are adjusted for different macro-economic scenarios and the ECL is calculated for each scenario. These ECL values are probability-weighted to produce a final ECL estimate. We illustrate our proposed IFRS 9 LGD methodology and ECL estimation on a dataset from a retail portfolio of a South African bank.
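The final step described above, probability-weighting per-scenario ECL values into a single estimate, can be sketched as follows. All inputs (EAD, PD, scenario LGDs, weights) are hypothetical, and the one-period ECL = EAD × PD × LGD form is a simplification of the paper's survival-analysis LGD:

```python
# Sketch of probability-weighted ECL across macro-economic scenarios.
# All figures are hypothetical illustration values.

def expected_credit_loss(ead, pd_, lgd_by_scenario, scenario_weights):
    """One-period ECL = EAD x PD x LGD per scenario,
    then weighted by scenario probabilities."""
    assert abs(sum(scenario_weights) - 1.0) < 1e-9  # weights must sum to 1
    per_scenario = [ead * pd_ * lgd for lgd in lgd_by_scenario]
    return sum(w * e for w, e in zip(scenario_weights, per_scenario))

# Base / upside / downside macro-economic scenarios:
ecl = expected_credit_loss(
    ead=100_000.0, pd_=0.05,
    lgd_by_scenario=[0.40, 0.30, 0.60],
    scenario_weights=[0.5, 0.25, 0.25],
)
```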


2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Maryam Farhadian ◽  
Sahar Dehdar Karsidani ◽  
Azadeh Mozayanimonfared ◽  
Hossein Mahjub

Abstract Background Due to the limited number of studies with long-term follow-up of patients undergoing Percutaneous Coronary Intervention (PCI), we investigated the occurrence of Major Adverse Cardiac and Cerebrovascular Events (MACCE) during 10 years of follow-up after coronary angioplasty using Random Survival Forest (RSF) and Cox proportional hazards models. Methods This retrospective cohort study was performed on 220 patients (69 women and 151 men) undergoing coronary angioplasty from March 2009 to March 2012 in Farchshian Medical Center in Hamadan city, Iran. Survival time (in months), the response variable, was measured from the date of angioplasty to the main endpoint or the end of the follow-up period (September 2019). To identify the factors influencing the occurrence of MACCE, the performance of the Cox and RSF models was compared in terms of C index, Integrated Brier Score (IBS) and prediction error. Results Ninety-six patients (43.7%) experienced MACCE by the end of the follow-up period, and the median survival time was estimated to be 98 months. Survival decreased from 99% during the first year to 39% at 10 years of follow-up. By applying the Cox model, the predictors were identified as follows: age (HR = 1.03, 95% CI 1.01–1.05), diabetes (HR = 2.17, 95% CI 1.29–3.66), smoking (HR = 2.41, 95% CI 1.46–3.98), and stent length (HR = 1.74, 95% CI 1.11–2.75). The predictive performance was slightly better with the RSF model (IBS of 0.124 vs. 0.135, C index of 0.648 vs. 0.626 and out-of-bag error rate of 0.352 vs. 0.374 for RSF). In addition to age, diabetes, smoking, and stent length, RSF also identified coronary artery disease (acute or chronic) and hyperlipidemia among the most important variables. Conclusion Machine-learning prediction models such as RSF showed better performance than the Cox proportional hazards model for the prediction of MACCE during long-term follow-up after PCI.
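The C index used above to compare the Cox and RSF models is Harrell's concordance: the fraction of usable pairs in which the subject who fails earlier carries the higher predicted risk. A minimal sketch on toy data, not the study's:

```python
# Pure-Python Harrell's C-index for right-censored survival data.
# Toy inputs below are for illustration only.

def c_index(times, events, risk_scores):
    """Fraction of usable pairs where the subject with the earlier
    observed event has the higher predicted risk (ties count half)."""
    concordant = ties = usable = 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # a pair is usable when subject i has an observed event
            # strictly before subject j's follow-up time
            if events[i] == 1 and times[i] < times[j]:
                usable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1
                elif risk_scores[i] == risk_scores[j]:
                    ties += 1
    return (concordant + 0.5 * ties) / usable

c = c_index([10, 20, 30, 40], [1, 1, 0, 1], [0.9, 0.5, 0.6, 0.1])
```

A value of 0.5 indicates no discrimination and 1.0 perfect ranking, which is why the reported 0.648 vs. 0.626 is only a modest difference.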

