Survival in elderly glioblastoma patients treated with bevacizumab-based regimens in the United States

2018 ◽  
Vol 5 (4) ◽  
pp. 251-261 ◽  
Author(s):  
Jessica Davies ◽  
Irmarie Reyes-Rivera ◽  
Thirupathi Pattipaka ◽  
Stephen Skirboll ◽  
Beatrice Ugiliweneza ◽  
...  

Abstract
Background: The efficacy of bevacizumab (BEV) in elderly patients with glioblastoma remains unclear. We evaluated the effect of BEV on survival in this patient population using the Surveillance, Epidemiology, and End Results (SEER)-Medicare database.
Methods: This retrospective cohort study analyzed SEER-Medicare data for patients (aged ≥66 years) diagnosed with glioblastoma from 2006 to 2011. Two cohorts were constructed: one comprised patients who had received BEV (BEV cohort); the other comprised patients who had received any anticancer treatment other than BEV (NBEV cohort). The primary analysis used a multivariate Cox proportional hazards model to compare overall survival in the BEV and NBEV cohorts, with initiation of BEV as a time-dependent variable, adjusting for potential confounders (age, gender, Charlson comorbidity index, region, race, radiotherapy after initial surgery, and diagnosis of coronary artery disease). Sensitivity analyses were conducted using landmark survival, propensity score modeling, and the impact of poor Karnofsky Performance Status.
Results: We identified 2603 patients (BEV, n = 597; NBEV, n = 2006). In the BEV cohort, most patients were Caucasian males and were younger, with fewer comorbidities and more initial resections. In the primary analysis, the BEV cohort showed a lower risk of death compared with the NBEV cohort (hazard ratio, 0.80; 95% confidence interval, 0.72-0.89; P < .01). The survival benefit of BEV appeared independent of the number of temozolomide cycles or frontline treatment with radiotherapy and temozolomide.
Conclusion: BEV exposure was associated with a lower risk of death, suggesting a potential benefit of BEV in elderly patients with glioblastoma.
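Modeling BEV initiation as a time-dependent variable avoids immortal-time bias: each patient contributes unexposed person-time until the day BEV starts and exposed person-time thereafter. A minimal sketch of the counting-process ("episode splitting") data setup this implies; the function name and day-based units are illustrative, not from the paper:

```python
def split_episodes(follow_up, bev_start=None):
    """Split one patient's follow-up into counting-process intervals
    (start, stop, on_bev) for a time-dependent Cox model.
    bev_start is the day BEV began, or None if never exposed."""
    if bev_start is None or bev_start >= follow_up:
        return [(0, follow_up, 0)]   # never exposed during follow-up
    if bev_start == 0:
        return [(0, follow_up, 1)]   # exposed from baseline
    return [(0, bev_start, 0), (bev_start, follow_up, 1)]

# A patient followed 300 days who started BEV on day 120 contributes
# one unexposed and one exposed interval:
print(split_episodes(300, 120))   # [(0, 120, 0), (120, 300, 1)]
print(split_episodes(200))        # [(0, 200, 0)]
```

Each interval then enters the partial likelihood only while the patient is at risk, so early deaths in never-treated patients are not misattributed to the exposed group.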

Circulation ◽  
2014 ◽  
Vol 129 (suppl_1) ◽  
Author(s):  
Mary Cushman ◽  
Suzanne E Judd ◽  
Virginia J Howard ◽  
Neil A Zakai ◽  
Brett Kissela ◽  
...  

Background: The Life’s Simple 7 (LSS) metric is being used by AHA to track the cardiovascular health of the United States population and move toward a 2020 impact goal for improvement. Levels of LSS are associated with mortality risk, but there are limited data on whether this association differs by race or sex. Hypothesis: There will be sex and race differences in the association of LSS with mortality in the REGARDS cohort study. Methods: We studied 29,692 REGARDS participants, a population sample of black and white men and women aged 45-98 from across the US enrolled in 2003-2007. Extensive baseline risk factor data were measured in participants’ homes. The 7 LSS components (blood pressure, cholesterol, glucose, body-mass index, smoking, physical activity, diet) were each scored in AHA-defined categories of poor (0 points), intermediate (1 point) and ideal (2 points), and were summed to yield scores ranging from poor for all (0) to ideal for all (14). Over 6.4 years of follow-up there were 3709 deaths. Results: The LSS score was normally distributed with mean (SD) of 7.9 (2.0) in whites and 6.9 (2.0) in blacks. The age-, region-, income- and education-adjusted hazard ratios (HRs) of death for a 1-unit worse LSS score, stratified by race and sex, are shown in the table. Race and sex interactions were tested individually in separate models. While better LSS scores were strongly associated with lower mortality, associations differed by race and sex, being weaker in blacks than whites and in men than women. Conclusion: LSS score was strongly associated with mortality risk in the REGARDS national sample; a 1-point difference in score, corresponding to movement from poor to intermediate or from intermediate to ideal on 1 of the 7 factors, was associated with a 16% lower risk of death in white women, a 14% lower risk in white men and black women, but only an 11% lower risk in black men.
Observed differences in the association of LSS with mortality by race and sex should be considered in efforts to gauge the impact of LSS interventions on health disparities.
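The scoring scheme described in Methods assigns each of the 7 components 0, 1, or 2 points and sums them to a 0-14 total. A minimal sketch (the component keys are my naming; the per-point HR illustration uses the 0.84 figure reported for white women):

```python
# Each of the 7 LSS components is scored 0 (poor), 1 (intermediate),
# or 2 (ideal) per the AHA definitions; the sum ranges from 0 to 14.
COMPONENTS = ("blood_pressure", "cholesterol", "glucose", "bmi",
              "smoking", "physical_activity", "diet")
POINTS = {"poor": 0, "intermediate": 1, "ideal": 2}

def lss_score(levels):
    """levels maps each component name to 'poor'/'intermediate'/'ideal'."""
    return sum(POINTS[levels[c]] for c in COMPONENTS)

# In white women, each 1-point better score was associated with a 16%
# lower risk of death (HR 0.84); a k-point difference scales as 0.84**k.
print(lss_score(dict.fromkeys(COMPONENTS, "intermediate")))   # 7
print(round(0.84 ** 2, 3))                                    # 0.706
```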


2020 ◽  
Vol 41 (Supplement_2) ◽  
Author(s):  
S.P Patil ◽  
K Gonuguntla ◽  
C Rojulpote ◽  
A.J Borja ◽  
V Zhang ◽  
...  

Abstract Introduction Influenza vaccination is associated with a lower risk of death as well as of major adverse cardiovascular events, including acute myocardial infarction (AMI), heart failure and stroke. Purpose The impact of influenza vaccination on in-hospital mortality in patients with AMI and a prior history of percutaneous coronary intervention (PCI) or coronary artery bypass graft (CABG) is largely unknown. We hypothesized that such individuals who develop AMI have better outcomes if they had received the influenza vaccine. Methods We analyzed the United States National Inpatient Sample Database from 2010-2014 to identify patients with a primary discharge diagnosis of AMI (STEMI, NSTEMI) and a history of prior PCI or CABG. In this cohort, patients with influenza vaccination were identified using ICD-9 code V04.81. The primary outcome was in-hospital mortality. The chi-square test and a multivariate regression model controlling for age, gender, race, type of AMI and co-morbidities were employed for statistical analysis. Results A total of 495,619 patients with prior PCI or CABG were identified, of whom 6525 had positive influenza vaccination status. Influenza vaccination was independently associated with a lower risk of in-hospital mortality in patients with AMI (aOR = 0.253, 95% CI: 0.196-0.328; P < 0.001). Conclusion Vaccination against influenza was associated with a lower risk of in-hospital mortality in patients with prior PCI or CABG who developed AMI. Figure 1. Funding Acknowledgement: Type of funding source: None
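The reported aOR of 0.253 comes from a multivariate regression model; for orientation, the crude (unadjusted) odds ratio from a 2x2 table is (a·d)/(b·c), with a Woolf-type confidence interval on the log scale. A sketch with purely illustrative counts, not the study's data:

```python
import math

def crude_odds_ratio(a, b, c, d):
    """Crude OR with a Woolf 95% CI for a 2x2 table:
    a = exposed deaths, b = exposed survivors,
    c = unexposed deaths, d = unexposed survivors."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# Illustrative counts only; the study's aOR of 0.253 was estimated from
# a multivariate model adjusting for age, gender, race, AMI type and
# co-morbidities, not from a crude table like this one.
or_, ci = crude_odds_ratio(20, 980, 300, 3700)
print(round(or_, 3), tuple(round(x, 3) for x in ci))
```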


2011 ◽  
Vol 31 (6) ◽  
pp. 663-671 ◽  
Author(s):  
Wai H. Lim ◽  
Gursharan K. Dogra ◽  
Stephen P. McDonald ◽  
Fiona G. Brown ◽  
David W. Johnson

Background: The number of elderly patients with end-stage kidney disease (ESKD) is increasing worldwide, but the proportion of elderly patients commencing peritoneal dialysis (PD) is falling. The reluctance of elderly ESKD patients to consider PD may be related to a perception that PD is associated with greater rates of complications. In the present study, we compared outcomes between younger and older PD patients.
Methods: Using Australia and New Zealand Dialysis Registry data, all adult ESKD patients commencing PD between 1991 and 2007 were categorized into under 50, 50-64.9, and 65 years of age or older groups. Time to first peritonitis, death-censored technique failure, and peritonitis-associated and all-cause mortality were evaluated by multivariate Cox proportional hazards model analysis.
Results: Of the 12,932 PD patients included in the study, 3370 (26%) were under 50 years of age, 4386 (34%) were 50-64.9 years of age, and 5176 (40%) were 65 years of age or older. Compared with younger patients (<50 years), elderly patients (≥65 years) had a similar peritonitis-free survival and a lower risk of death-censored technique failure [hazard ratio (HR): 0.85; 95% confidence interval (CI): 0.79 to 0.93], but they had higher peritonitis-related (HR: 2.31; 95% CI: 1.68 to 3.18) and all-cause mortality (HR: 2.90; 95% CI: 2.60 to 3.23).
Conclusions: Not unexpectedly, elderly patients have higher peritonitis-related and all-cause mortality, which is likely a consequence of a greater prevalence of comorbid disease. However, compared with younger patients, elderly patients have superior technique survival and similar peritonitis-free survival, suggesting that PD is a viable renal replacement therapy in this group of patients.
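The three age strata used in the registry analysis translate directly into code; the sketch below (function name mine) also checks that the reported group counts reproduce the stated percentages:

```python
def age_group(age):
    """Assign the registry study's three age strata."""
    if age < 50:
        return "<50"
    if age < 65:
        return "50-64.9"
    return ">=65"

print([age_group(a) for a in (43, 50, 64.9, 65, 72)])
# ['<50', '50-64.9', '50-64.9', '>=65', '>=65']

# The reported group sizes sum to 12,932 and reproduce 26%/34%/40%:
counts = {"<50": 3370, "50-64.9": 4386, ">=65": 5176}
total = sum(counts.values())
print([round(100 * n / total) for n in counts.values()])   # [26, 34, 40]
```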


2021 ◽  
pp. 088307382110001
Author(s):  
Jody L. Lin ◽  
Joseph Rigdon ◽  
Keith Van Haren ◽  
MyMy Buu ◽  
Olga Saynina ◽  
...  

Background: Gastrostomy tube (G-tube) placement for children with neurologic impairment with dysphagia has been suggested for pneumonia prevention. However, prior studies demonstrated an association between G-tube placement and increased risk of pneumonia. We evaluated the association between timing of G-tube placement and death or severe pneumonia in children with neurologic impairment. Methods: We included all children enrolled in California Children’s Services between July 1, 2009, and June 30, 2014, with neurologic impairment and at least 1 pneumonia hospitalization. Prior to analysis, children with new G-tubes and those without were 1:2 propensity score matched on sociodemographics, medical complexity, and severity of index hospitalization. We used a time-varying Cox proportional hazards model for subsequent death or a composite outcome of death or severe pneumonia to compare those with new G-tubes vs those without, adjusting for the covariates described above. Results: A total of 2490 children met eligibility criteria, of whom 219 (9%) died and 789 (32%) had severe pneumonia. Compared to children without G-tubes, children with new G-tubes had decreased risk of death (hazard ratio [HR] 0.47, 95% confidence interval [CI] 0.39-0.55) but increased risk of the composite outcome (HR 1.21, CI 1.14-1.27). Sensitivity analyses using varied time criteria for definitions of G-tube and outcome found that more recent G-tube placement had greater associated risk reduction for death but increased risk of severe pneumonia. Conclusion: Recent G-tube placement is associated with reduced risk of death but increased risk of severe pneumonia. Decisions to place G-tubes for pulmonary indications in children with neurologic impairment should weigh the impact of severe pneumonia on quality of life.
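1:2 propensity score matching pairs each child with a new G-tube to the two comparison children with the closest propensity scores. A greedy nearest-neighbour sketch under a caliper; the scores, caliper, and algorithmic details here are illustrative assumptions, since the paper does not specify its matching routine:

```python
def greedy_match_1to2(treated, controls, caliper=0.05):
    """Greedy 1:2 nearest-neighbour matching on propensity scores.
    treated/controls are {id: score} dicts; each treated unit gets up
    to two controls within the caliper, without replacement."""
    pool = dict(controls)
    matches = {}
    for t_id, t_score in sorted(treated.items(), key=lambda kv: kv[1]):
        picks = []
        for _ in range(2):
            best = min(pool, key=lambda c: abs(pool[c] - t_score),
                       default=None)
            if best is None or abs(pool[best] - t_score) > caliper:
                break   # no eligible control left within the caliper
            picks.append(best)
            del pool[best]   # matching without replacement
        matches[t_id] = picks
    return matches

treated = {"t1": 0.30, "t2": 0.60}
controls = {"c1": 0.29, "c2": 0.31, "c3": 0.58, "c4": 0.61, "c5": 0.90}
print(greedy_match_1to2(treated, controls))
```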


2020 ◽  
Vol 7 (Supplement_1) ◽  
pp. S489-S490
Author(s):  
John T Henderson ◽  
Evelyn Villacorta Cari ◽  
Nicole Leedy ◽  
Alice Thornton ◽  
Donna R Burgess ◽  
...  

Abstract Background There has been a dramatic rise in IV drug use (IVDU) and its associated mortality and morbidity; however, the scope of this effect has not been described. Kentucky is at the epicenter of this epidemic and is an ideal place to better understand the health complications of IVDU in order to improve outcomes. Methods All adult in-patient admissions to University of Kentucky hospitals in 2018 with an Infectious Diseases (ID) consult and an ICD 9/10 code associated with IVDU underwent thorough retrospective chart review. Demographic, descriptive, and outcome data were collected and analyzed by standard statistical analysis. Results 390 patients (467 visits) met study criteria. The most commonly used illicit substances were heroin (38.2%), methamphetamine (37.2%), and cocaine (10.3%). While only 4.1% of tested patients were HIV-positive, 74.2% were HCV antibody positive. Endocarditis (41.1%), vertebral osteomyelitis (20.8%), bacteremia without endocarditis (14.1%), abscess (12.4%), and septic arthritis (10.4%) were the most common infectious complications. The in-patient death rate was 3.0%, and 32.2% of patients were readmitted within the study period. The average length of stay was 26 days. In multivariable analysis, infective endocarditis was associated with a statistically significant increase in the risk of death, ICU admission, and hospital readmission. Although not statistically significant, trends toward mortality and ICU admission were identified for patients with prior endocarditis, and methadone was correlated with decreased risk of readmission and ICU stay. FIGURE 1: Reported Substances Used FIGURE 2: Comorbidities FIGURE 3: Types of Severe Infectious Complications Conclusion We report a novel, comprehensive perspective on the serious infectious complications of IVDU in an attempt to measure its cumulative impact in an unbiased way.
This preliminary analysis of a much larger dataset (2008-2019) reveals some sobering statistics about the impact of IVDU in the United States. While it confirms the well-accepted mortality and morbidity associated with infective endocarditis and bacteremia, there is a significant unrecognized impact of other infectious etiologies. Additional analysis of this dataset will be aimed at identifying key predictive factors of poor outcomes in hopes of mitigating them. Disclosures All Authors: No reported disclosures


2006 ◽  
Author(s):  
Genia Long ◽  
David Cutler ◽  
Ernst Berndt ◽  
Jimmy Royer ◽  
Andrée-Anne Fournier ◽  
...  

Circulation ◽  
2020 ◽  
Vol 142 (Suppl_3) ◽  
Author(s):  
Emily P Zeitler ◽  
Andrea Austin ◽  
Daniel J Friedman ◽  
Christopher G Leggett ◽  
Lauren Gilstrap ◽  
...  

Introduction: Despite growing numbers of older HF patients, clinical trials of implantable defibrillators (ICDs) and cardiac resynchronization therapy (CRT) rarely include older patients (≥75 yrs). Hypotheses: (1) Among Medicare beneficiaries, older CRT-D patients have a higher risk of procedure-related complications than older ICD patients. (2) Compared with older ICD patients, older CRT-D patients have lower risk of death. Methods: We identified Medicare beneficiaries with HF and reduced LVEF who underwent ICD or CRT-D implant based on CPT codes (1/2008-8/2015) by age group (65-74, 75-84, and 85+). After matching device groups with inverse probability weighting (IPW), we estimated the comparative hazard ratio (HR) of death by age group and device type using a Cox proportional hazards model. Results: Compared with the ICD group, the CRT-D group was older and more likely to be white and female and have atrial fibrillation; CRT-D patients were less likely to have ischemic heart disease. Use of guideline directed medical therapy was similar between groups. In all age groups, complications were more common in the CRT-D group. IPW was successful, and after matching, the HR for death was lower in the CRT-D versus the ICD group; this finding was most pronounced in the 85+ age group in which the HR for death in the CRT-D versus ICD group was 0.76 (95% CI 0.64-0.88). (Table) Conclusions: Procedure-related complications in older HF patients were higher in CRT-D versus ICD patients and generally increased with age. Overall high post-implant mortality in ICD patients (± CRT) highlights the difficulty in assessing competing mortality risk when considering patients for an ICD especially in the oldest patients in whom clinical trial data are absent. However, in matched Medicare beneficiaries, CRT-D was associated with a lower risk of mortality in all age groups compared with ICD alone. These findings support the use of CRT in eligible older patients undergoing ICD implantation.
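Inverse probability weighting balances the device groups by weighting each patient by the inverse of the probability of the treatment actually received, so each weighted group resembles the full cohort. A minimal sketch of standard ATE-style weights; the propensity values are illustrative, not from the study:

```python
def ipw_weights(treated_flags, propensity):
    """Inverse probability of treatment weights: 1/p for treated
    (CRT-D), 1/(1-p) for comparison (ICD), where p is the estimated
    propensity of receiving CRT-D."""
    return [1 / p if t else 1 / (1 - p)
            for t, p in zip(treated_flags, propensity)]

flags = [1, 1, 0, 0]        # 1 = CRT-D, 0 = ICD
ps    = [0.8, 0.5, 0.5, 0.2]  # illustrative propensity scores
print(ipw_weights(flags, ps))   # [1.25, 2.0, 2.0, 1.25]
```

A weighted Cox model fit to these pseudo-populations then yields the hazard ratios compared across age groups.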


2021 ◽  
Vol 39 (6_suppl) ◽  
pp. 240-240
Author(s):  
Neal D. Shore ◽  
Karim Fizazi ◽  
Teuvo Tammela ◽  
Murilo Luz ◽  
Manuel Philco Salas ◽  
...  

240 Background: DARO is a structurally distinct androgen receptor inhibitor approved for the treatment of non-metastatic castration-resistant prostate cancer (nmCRPC) based on significantly prolonged metastasis-free survival compared with PBO (median 40.4 vs 18.4 months; hazard ratio [HR] 0.41; 95% confidence interval [CI] 0.34–0.50; P < 0.0001) and a favorable safety profile in the phase III ARAMIS trial. Following unblinding at the primary analysis, crossover from PBO to DARO was permitted for the subsequent open-label treatment phase. Sensitivity analyses were performed to assess the effect of PBO–DARO crossover on OS benefit. Methods: Patients (pts) with nmCRPC receiving androgen deprivation therapy were randomized 2:1 to DARO (n = 955) or PBO (n = 554). In addition to OS, secondary endpoints included times to pain progression, first cytotoxic chemotherapy, first symptomatic skeletal event, and safety. The OS analysis was planned to occur after approximately 240 deaths, and secondary endpoints were evaluated in a hierarchical order. Iterative parameter estimation (IPE) and rank-preserving structural failure time (RPSFT) analyses were performed as pre-planned sensitivity analyses to adjust for the treatment effect of PBO–DARO crossover. The IPE method used a parametric model for the survival times and iteratively determined the model parameter describing the magnitude of the treatment effect, whereas a grid search and non-parametric log-rank test were used for the RPSFT analysis. The IPE and RPSFT analyses both generated a Kaplan–Meier curve for the PBO arm that predicts what would have been observed in the absence of PBO–DARO crossover. Results: After unblinding, 170 pts (30.7% of those randomized to PBO) crossed over from PBO to DARO; median treatment duration from unblinding to the final data cut-off was 11 months. 
Final analysis of the combined double-blind and open-label periods was conducted after 254 deaths (15.5% of DARO and 19.1% of PBO pts) and showed a statistically significant OS benefit for DARO vs PBO (HR 0.69; 95% CI 0.53-0.88; P = 0.003). Results from the IPE (HR 0.66; 95% CI 0.51-0.84; P < 0.001) and RPSFT (HR 0.68; 95% CI 0.51-0.90; P = 0.007) analyses were similar to those from the intention-to-treat population, showing that the impact of PBO-DARO crossover was small. Additional analyses accounting for the effect of PBO-DARO crossover will be presented. The safety profile of DARO continued to be favorable at the final analysis, and discontinuation rates at the end of the double-blind period remained unchanged from the primary analysis (8.9% with DARO and 8.7% with PBO). Conclusions: Early treatment with DARO in men with nmCRPC is associated with a significant improvement in OS regardless of pts crossing over from PBO to DARO. The safety profile of DARO remained favorable at the final analysis. Clinical trial information: NCT02200614.
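The RPSFT method reconstructs the survival time each crossover patient would have had without DARO by rescaling only the on-treatment portion of follow-up. A minimal sketch under one common parameterization (the convention and the illustrative numbers are assumptions, not trial data):

```python
import math

def counterfactual_time(t_off, t_on, psi):
    """RPSFT counterfactual survival time U(psi) = T_off + exp(psi)*T_on,
    where T_on is the time a PBO patient spent on DARO after crossover.
    psi = 0 recovers the observed time; the analysis grid-searches for
    the psi at which a log-rank test comparing arms on U(psi) shows no
    residual treatment effect."""
    return t_off + math.exp(psi) * t_on

# Illustrative: 10 months on placebo, then 8 months on DARO after
# crossover; under this convention a beneficial drug implies psi < 0,
# shrinking the counterfactual 'untreated' survival time.
print(counterfactual_time(10, 8, 0.0))            # 18.0 (observed)
print(round(counterfactual_time(10, 8, -0.4), 2))
```

The IPE method pursues the same goal parametrically, iterating a survival-model estimate of the treatment effect instead of a log-rank grid search.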


2018 ◽  
Vol 28 (2) ◽  
pp. 151-156
Author(s):  
Yefei Zhang ◽  
Maha R. Boktour

Introduction: The United Network for Organ Sharing (UNOS) instituted the Share 35 policy in June 2013 in order to reduce deaths on the liver transplant waitlist. The effect of this policy on patient survival among patients with gender- and race-mismatched donors has not been examined. Research Question: To assess the impact of the Share 35 policy on posttransplantation patient survival among patients with end-stage liver disease (ESLD) transplanted with gender- and race-mismatched donors. Design: A total of 16,467 adult patients with ESLD who underwent liver transplantation between 2012 and 2015 were identified from UNOS. An overall Cox proportional hazards model adjusting for demographic, clinical, and geographic factors, and separate models with a dummy variable for the pre- and post-Share 35 periods as well as its interactions with other factors, were used to model the effect of gender and race mismatch on posttransplantation patient survival and to compare patient survival between the first 18 months of the Share 35 policy and an equivalent time period before. Results: Comparison of the pre- and post-Share 35 periods did not show significant changes in the numbers of gender- and race-mismatched transplants or in the risk of death for gender-mismatched recipients. However, black recipients with Hispanic donors (hazard ratio: 0.51; 95% confidence interval, 0.29-0.90) had significantly improved patient survival after the Share 35 policy took effect. Conclusion: The Share 35 policy had a moderate impact on posttransplantation patient survival among recipients with racially mismatched donors according to the first 18-month experience. Future research is recommended to explore long-term posttransplantation outcomes.
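The pre/post-policy comparison rests on a dummy variable for period and its interaction with mismatch status; the interaction coefficient captures how the mismatch effect changed after Share 35. One row of such a design matrix can be sketched as follows (variable names are mine, not the paper's):

```python
def design_row(post_share35, mismatch):
    """One row of a dummy/interaction design for the period model:
    [intercept, post-Share 35 period, donor-recipient mismatch,
     period x mismatch interaction]. Inputs are 0/1 indicators."""
    return [1, post_share35, mismatch, post_share35 * mismatch]

# Mismatched transplant after the policy vs before:
print(design_row(1, 1))   # [1, 1, 1, 1]
print(design_row(0, 1))   # [1, 0, 1, 0]
```

In the fitted Cox model, the coefficient on the last column is the log of the ratio of mismatch hazard ratios between the two periods.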


Author(s):  
T. L. Dickson ◽  
F. A. Simonen

The United States Nuclear Regulatory Commission (USNRC) initiated a comprehensive project in 1999 to determine if improved technologies can provide a technical basis to reduce the conservatism in the current regulations for pressurized thermal shock (PTS) while continuing to provide reasonable assurance of adequate protection to public health and safety. A relaxation of PTS regulations could have profound implications for plant license renewal considerations. During the PTS re-evaluation study, an improved risk-informed computational methodology was developed that provides a more realistic characterization of PTS risk. This updated methodology was recently applied to three commercial PWRs. The results of this study provide encouragement that a technical basis can be established to support a relaxation of current PTS regulations. One significant model improvement applied in the PTS re-evaluation study was the development of flaw databases derived from the non-destructive and destructive examinations of material from cancelled reactor pressure vessels (RPV). Empirically based statistical distributions derived from these databases and expert elicitation were used to postulate the number, size, and location of flaws in welded and base metal (plate and forging) regions of an RPV during probabilistic fracture mechanics (PFM) analyses of RPVs subjected to transient loading conditions such as PTS. However, limitations in the available flaw data have required assumptions to be made to complete the risk-based flaw models. Sensitivity analyses were performed to evaluate the impact of four flaw-related assumptions. Analyses addressed: 1) truncations of distributions to exclude flaws of extreme depth dimensions, 2) vessel-to-vessel differences in flaw data, 3) large flaws observed in weld repair regions, and 4) the basis for estimating the number of surface-breaking flaws.
None of the four alternate weld flaw models significantly impacted calculated vessel failure frequencies or invalidated the tentative conclusions derived from the USNRC PTS re-evaluation study.
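In a PFM analysis of this kind, flaw characteristics are drawn by Monte Carlo sampling from the empirically based distributions, and the truncation assumption (item 1 above) caps the sampled depth. A sketch of the sampling step; the bins, probabilities, and truncation depth below are entirely illustrative, not values from the USNRC flaw databases:

```python
import random

# Hypothetical empirical flaw-depth distribution (depth as a fraction
# of vessel wall thickness), of the kind derived from destructive RPV
# examinations; bins and weights here are illustrative only.
DEPTH_BINS = [(0.00, 0.02), (0.02, 0.05), (0.05, 0.10)]
PROBS      = [0.85, 0.13, 0.02]

def sample_flaw_depth(rng):
    """Draw one flaw depth, with the distribution truncated at the
    deepest bin edge (one of the sensitivity-study assumptions)."""
    lo, hi = rng.choices(DEPTH_BINS, weights=PROBS)[0]
    return rng.uniform(lo, hi)

rng = random.Random(42)
depths = [sample_flaw_depth(rng) for _ in range(1000)]
print(max(depths) <= 0.10)   # True: no flaw exceeds the truncation depth
```

The sensitivity analyses then amount to re-running the failure-frequency calculation with the bins, weights, or truncation point varied.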

