Overall survival (OS) with docetaxel (D) vs novel hormonal therapy (NHT) with abiraterone (A) or enzalutamide (E) after a prior NHT in patients (Pts) with metastatic prostate cancer (mPC): Results from a real-world dataset.

2020 ◽  
Vol 38 (15_suppl) ◽  
pp. 5537-5537 ◽  
Author(s):  
Umang Swami ◽  
Jennifer Anne Sinnott ◽  
Ben Haaland ◽  
Benjamin Louis Maughan ◽  
Nityam Rathi ◽  
...  

5537 Background: NHTs (A and E) are approved first-line (1L) treatments (Rx) for mPC. After progression on an NHT, Rx options include either the alternate NHT or D. However, OS from a randomized trial comparing NHT vs D after progression on 1L NHT has not been reported. Methods: Pt data were extracted from the Flatiron Health EHR-derived de-identified database. Inclusion: diagnosis of mPC; 1L Rx with single-agent A or E only; single-agent Rx with the alternate NHT (E or A) or D in second line (2L). Exclusion: > 180 days between date of diagnosis of mPC and date of next visit (to ensure Pts were actively engaged in care at the data-providing site); Rx with NHT in the non-metastatic setting; any prior exposure to D. OS was compared using a Cox proportional hazards model stratified by Rx propensity score. Each Pt's probability of receiving D (rather than NHT) was modeled via a random forest based on Pt and disease characteristics that may drive treatment selection. These included pre-2L Rx ECOG scores, PSA, LDH, ALP, Hb, age, ICD codes for liver metastasis, diabetes, neuropathy, and heart failure; insurance payer; year of start of 2L Rx; time on 1L NHT; Gleason score; and PSA at the original diagnosis of mPC. Subgroup analyses included 1L Rx duration < 12 mos. Results: 1165 Pts treated between 2/5/2013 and 9/27/2019 were eligible. Median follow-up was 8 mos (range 0.1-64.5). Median OS after 1L A was higher with E as compared to D (15.7 vs. 9.4 mos). Median OS after 1L E was higher with A as compared to D (13.3 vs. 9.7 mos) (table). Propensity distributions were overlapping among Rx arms and showed only modest imbalance. In 2L, D had worse adjusted hazard ratios of 1.29 and 1.35 as compared to E and A, respectively (p < 0.05). Similar results were seen with 1L Rx duration of < 12 mos (p < 0.05). Conclusions: These hypothesis-generating data provide real-world OS estimates with 2L D & NHT in mPC. In propensity-stratified analyses, mPC Pts who progressed on NHT had worse OS with 2L D as compared to the alternate NHT.
Results were consistent in unadjusted analysis & subgroup analyses of 1L Rx < 12 mos. Results are subject to residual confounding and missingness. After prospective validation these data may aid in Rx sequencing, Pts counselling, and design of future clinical trials in this setting. [Table: see text]
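The propensity-stratified comparison in this abstract can be illustrated in miniature. The sketch below is hypothetical (the patient tuples and the two-stratum cut are invented, and the study's actual analysis stratified a Cox model on a random-forest propensity score, which is more involved): patients are binned by their modeled probability of receiving D, and event rates are compared only within strata.

```python
# Hypothetical sketch of propensity-score stratification; invented data,
# not the Flatiron cohort.
patients = [
    # (received_docetaxel, died_during_followup, propensity_for_docetaxel)
    (1, 1, 0.80), (1, 0, 0.70), (0, 0, 0.75),
    (0, 1, 0.30), (1, 1, 0.35), (0, 0, 0.20),
]

def stratified_rate_difference(data, cut=0.5):
    """Average the treated-minus-control event-rate difference within
    propensity strata, so treated and control Pts are only compared to
    others with a similar modeled probability of receiving docetaxel."""
    diffs = []
    for lo, hi in [(0.0, cut), (cut, 1.01)]:
        stratum = [p for p in data if lo <= p[2] < hi]
        treated = [p for p in stratum if p[0] == 1]
        control = [p for p in stratum if p[0] == 0]
        if treated and control:
            rate_t = sum(p[1] for p in treated) / len(treated)
            rate_c = sum(p[1] for p in control) / len(control)
            diffs.append(rate_t - rate_c)
    return sum(diffs) / len(diffs) if diffs else None
```

Stratifying first, then comparing, keeps Pts with very different treatment-selection profiles from being contrasted directly, which is the point of the propensity adjustment.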

2019 ◽  
Vol 98 (12) ◽  
pp. 2749-2760 ◽  
Author(s):  
Gilles Salles ◽  
Emmanuel Bachy ◽  
Lukas Smolej ◽  
Martin Simkovic ◽  
Lucile Baseggio ◽  
...  

Abstract After analyzing treatment patterns in chronic lymphocytic leukemia (CLL) (objective 1), we investigated the relative effectiveness of ibrutinib versus other commonly used treatments (objective 2) in patients with treatment-naïve and relapsed/refractory CLL, comparing patient-level data from two randomized registration trials with two real-world databases. Hazard ratios (HR) and 95% confidence intervals (CIs) were estimated using a multivariate Cox proportional hazards model, adjusted for differences in baseline characteristics. Rituximab-containing regimens were often prescribed in clinical practice. The most frequently prescribed regimens were fludarabine + cyclophosphamide + rituximab (FCR, 29.3%), bendamustine + rituximab (BR, 17.7%), and other rituximab-containing regimens (22.0%) in the treatment-naïve setting (n = 604), and other non-FCR/BR rituximab-containing regimens (38.7%) and non-rituximab–containing regimens (28.5%) in the relapsed/refractory setting (n = 945). Adjusted HRs (95% CI) for progression-free survival (PFS) and overall survival (OS), respectively, with ibrutinib versus real-world regimens were 0.23 (0.14–0.37; p < 0.0001) and 0.40 (0.22–0.76; p = 0.0048) in the treatment-naïve setting, and 0.21 (0.16–0.27; p < 0.0001) and 0.29 (0.21–0.41; p < 0.0001) in the relapsed/refractory setting. When comparing real-world use of ibrutinib (n = 53) versus other real-world regimens in relapsed/refractory CLL (objective 3), adjusted HRs (95% CI) were 0.37 (0.22–0.63; p = 0.0003) for PFS and 0.53 (0.27–1.03; p = 0.0624) for OS. This adjusted analysis, based on nonrandomized patient data, suggests that ibrutinib is more effective than other commonly used regimens for CLL.
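As a reminder of how the reported figures relate to each other, a hazard ratio and its 95% CI come from exponentiating a fitted log-hazard coefficient and its Wald interval. A minimal sketch (the log-HR and standard error below are made-up numbers for illustration, not the study's fitted values):

```python
import math

def hr_with_ci(log_hr, se, z=1.96):
    """Exponentiate a log-hazard coefficient and its Wald interval to get
    the hazard ratio with a 95% CI. Inputs are illustrative only."""
    return (math.exp(log_hr),
            math.exp(log_hr - z * se),
            math.exp(log_hr + z * se))

# Hypothetical values chosen so the HR lands near 0.23.
hr, lo, hi = hr_with_ci(-1.47, 0.24)
```

When the whole interval sits below 1 (as for the PFS comparisons above), the corresponding Wald p-value is below 0.05; the OS interval 0.27–1.03 crossing 1 is why its p-value (0.0624) is not significant.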


Blood ◽  
2021 ◽  
Vol 138 (Supplement 1) ◽  
pp. 5033-5033
Author(s):  
David L. Grinblatt ◽  
Wei Han ◽  
David Nimke ◽  
Qi Feng ◽  
Loretta Sullivan ◽  
...  

Abstract Background: FLT3 tyrosine kinase inhibitors (TKIs) have improved outcomes in clinical trials for patients with FLT3 mutation-positive acute myeloid leukemia (FLT3mut+ AML). Gilteritinib is the first FDA-approved (11/28/2018) targeted therapy for relapsed/refractory (R/R) FLT3mut+ AML in adults, although two other multikinase inhibitors, midostaurin and sorafenib, are used off-label in this population. This analysis characterized treatment duration for these three drugs in R/R FLT3mut+ AML, as little is known about the treatment of patients receiving a FLT3 inhibitor in the real world. Aim/Objective: To describe real-world treatment patterns of patients newly initiating a FLT3 TKI for R/R AML following the launch of gilteritinib. Methods: This was a post hoc analysis of a US-based retrospective cohort study using IBM MarketScan claims data (1/1/2007-10/30/2020) to evaluate real-world treatment patterns. Adult patients (≥18 years) with R/R AML who newly initiated a FLT3 TKI (ie, ≥1 claim for gilteritinib, midostaurin, or sorafenib) on or after 12/1/2018 were eligible for inclusion. Included patients were required to have continuous enrollment starting 180 days prior to first AML diagnosis through first FLT3 TKI initiated for R/R AML on or after 12/1/2018 (index date). FLT3 TKI treatment duration was calculated as the difference between the first claim and the last day of supply or end of enrollment/study, whichever occurred first. Treatments that were not discontinued prior to the end of the study period (10/30/2020) were censored. Treatment duration was estimated using Kaplan-Meier analysis, and the hazard ratio for discontinuation was estimated with a Cox proportional hazards model. Additional subgroup analyses were conducted based on prior FLT3 TKI exposure for R/R AML and use of FLT3 TKI therapy alone or in combination with chemotherapy. 
Data from the ProMetrics specialty pharmacy database (12/6/2018-3/3/2021) were also analyzed to establish a benchmark for gilteritinib and validate the treatment duration in the MarketScan results. Results: A total of 65 patients newly initiating FLT3 TKIs for R/R AML were identified in the MarketScan database. Mean patient age was 53.4 years and mean Quan-Charlson Comorbidity index score at baseline was 5.1. Most patients initiating FLT3 TKI therapy received gilteritinib (n=44 [68%]). Patients initiating gilteritinib compared with other FLT3 TKIs had the highest prior history of high-intensity chemotherapy (n=24 [47%]) and hematopoietic stem cell transplantation (n=16 [31%]). Midostaurin was the most common prior TKI for patients who initiated sorafenib (n=6 [50%]) or gilteritinib (n=28 [55%]). The median (95% CI) treatment duration was 150 (73-260) days for gilteritinib (n=51), 60 (15-210) days for sorafenib (n=12), and 54 (28-268) days for midostaurin (n=12). Treatment duration was significantly longer for patients receiving gilteritinib compared with midostaurin (P=.0018) and sorafenib (P=.0016) in the Cox proportional hazards model (Table) and the Kaplan-Meier analysis (P=.0021) (Figure). Differences in gilteritinib duration in patients with prior TKI treatment versus no prior TKI treatment and patients who used a FLT3 TKI alone versus in combination with chemotherapy were not statistically significant (Table). The median gilteritinib treatment duration in the MarketScan data (150 days) was aligned with the median duration in the ProMetrics specialty pharmacy data (154 days), which validates the MarketScan results. Conclusions: This early look at treatment patterns suggests a median duration of therapy for gilteritinib of 150 days. Importantly, gilteritinib treatment duration does not appear to differ based on prior TKI treatment or overlapping use with chemotherapy. Small sample sizes precluded adjusted comparisons between the treatments. 
Gilteritinib was the most commonly used treatment. The availability of new targeted therapies such as gilteritinib is promising for patients with R/R FLT3mut+ AML and is changing the therapeutic landscape for this aggressive AML subtype. Figure 1. Disclosures Grinblatt: Astellas Pharma, Inc.: Consultancy; Bristol Myers Squibb: Consultancy; AstraZeneca: Consultancy; AbbVie: Consultancy. Han: Astellas Pharma, Inc.: Current Employment. Nimke: Astellas Pharma, Inc.: Current Employment. Feng: Astellas Pharma, Inc.: Current Employment. Sullivan: Astellas Pharma, Inc.: Current Employment. Pandya: Astellas Pharma, Inc.: Current Employment.
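The duration rule in the Methods (difference between the first claim and the last day of supply, censored at end of enrollment/study) can be sketched as follows. The `(fill_date, days_supply)` tuple layout is a simplification for illustration, not the MarketScan claims schema:

```python
from datetime import date, timedelta

def treatment_duration(claims, cutoff):
    """Days from the first claim to the last day of supply, censored at
    `cutoff` (end of enrollment or study). `claims` is a list of
    (fill_date, days_supply) tuples. Returns (duration_days, censored)."""
    first_fill = min(fill for fill, _ in claims)
    supply_end = max(fill + timedelta(days=supply) for fill, supply in claims)
    if supply_end >= cutoff:
        # Still covered at the cutoff: censor rather than call it discontinued.
        return (cutoff - first_fill).days, True
    return (supply_end - first_fill).days, False
```

Durations computed this way feed directly into a Kaplan-Meier analysis, with the censoring flag marking treatments not discontinued before the study end.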


Circulation ◽  
2015 ◽  
Vol 132 (suppl_3) ◽  
Author(s):  
Steven Deitelzweig ◽  
Amanda Bruno ◽  
Natalie Tate ◽  
Augustina Ogbonnaya ◽  
Manan Shah ◽  
...  

Real-world evidence highlighting the risks and benefits of novel oral anticoagulants (NOACs) is lacking. This study compared major and clinically relevant non-major (CRNM) bleeding risk and costs among non-valvular atrial fibrillation (NVAF) patients newly treated with apixaban, dabigatran, rivaroxaban, or warfarin. A retrospective analysis of NVAF patients newly treated with apixaban, dabigatran, rivaroxaban, or warfarin was conducted using PharMetrics Plus data from 1/2012 to 9/2014. Patients were indexed on the date of the first anticoagulant prescription, and were required to be ≥18 years old and have a CHA₂DS₂-VASc score > 0 and ≥1 month of follow-up. Patients were followed until discontinuation (≥30-day gap in treatment), treatment switch, end of continuous enrollment, 1 year post-index, or end of study. Major and CRNM bleeding, and bleeding-related costs, were measured. A Cox proportional hazards model was used to examine the association between anticoagulants and risk of bleeding, and a GLM was used to evaluate bleeding-related costs. The study included 24,573 NVAF patients, distributed as apixaban 11.7%, dabigatran 12.0%, rivaroxaban 36.7%, and warfarin 39.6%. Mean age was 64.4 years and 66.5% were male. HAS-BLED and CHA₂DS₂-VASc scores averaged 2.0 and 2.7, respectively. After adjusting for differences in baseline characteristics, when compared to apixaban patients, rivaroxaban (HR: 1.5; P = 0.0013) and warfarin (HR: 1.7; P < 0.0001) patients were more likely to have major bleeding, and dabigatran (HR: 1.3; P = 0.0030), rivaroxaban (HR: 1.7; P < 0.0001), and warfarin (HR: 1.4; P < 0.0001) patients were more likely to have CRNM bleeding. Major bleeding risk was similar between apixaban and dabigatran patients.
Major and CRNM bleeding costs, when compared to apixaban patients ($154 and $18), were significantly higher for dabigatran ($457; P < 0.0001 and $39; P < 0.0001), rivaroxaban ($420; P < 0.0001 and $61; P < 0.0001), and warfarin ($511; P < 0.0001 and $63; P < 0.0001) patients. Among anticoagulant-naive, moderate-to-high-risk NVAF patients encountered in a real-world clinical setting, the risk of major bleeding was lower with apixaban compared to warfarin and rivaroxaban. Bleeding costs were lower with apixaban compared to the alternative NOACs and warfarin.
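The ≥30-day-gap discontinuation rule used to end follow-up can be illustrated with a small sketch. The fill-record layout here is hypothetical, and the study's exact algorithm may differ:

```python
from datetime import date, timedelta

def discontinuation_start(fills, gap_days=30):
    """Scan prescription fills in date order; if a gap of >= gap_days opens
    between the end of covered supply and the next fill, return the date
    coverage lapsed, else None. Simplified sketch of the gap rule."""
    covered_until = None
    for fill_date, days_supply in sorted(fills):
        if covered_until and (fill_date - covered_until).days >= gap_days:
            return covered_until              # discontinuation begins here
        end = fill_date + timedelta(days=days_supply)
        covered_until = max(covered_until, end) if covered_until else end
    return None
```

Overlapping fills extend `covered_until` rather than resetting it, so early refills do not create spurious gaps.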


2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Maryam Farhadian ◽  
Sahar Dehdar Karsidani ◽  
Azadeh Mozayanimonfared ◽  
Hossein Mahjub

Abstract Background Due to the limited number of studies with long-term follow-up of patients undergoing Percutaneous Coronary Intervention (PCI), we investigated the occurrence of Major Adverse Cardiac and Cerebrovascular Events (MACCE) during 10 years of follow-up after coronary angioplasty using Random Survival Forest (RSF) and Cox proportional hazards models. Methods The current retrospective cohort study was performed on 220 patients (69 women and 151 men) undergoing coronary angioplasty from March 2009 to March 2012 in Farchshian Medical Center in Hamadan city, Iran. Survival time (months) as the response variable was measured from the date of angioplasty to the main endpoint or the end of the follow-up period (September 2019). To identify the factors influencing the occurrence of MACCE, the performance of the Cox and RSF models was compared in terms of the C index, Integrated Brier Score (IBS), and prediction error criteria. Results Ninety-six patients (43.7%) experienced MACCE by the end of the follow-up period, and the median survival time was estimated to be 98 months. Survival decreased from 99% during the first year to 39% at 10 years of follow-up. By applying the Cox model, the predictors were identified as follows: age (HR = 1.03, 95% CI 1.01–1.05), diabetes (HR = 2.17, 95% CI 1.29–3.66), smoking (HR = 2.41, 95% CI 1.46–3.98), and stent length (HR = 1.74, 95% CI 1.11–2.75). The predictive performance of the RSF model was slightly better (RSF vs. Cox: IBS 0.124 vs. 0.135, C index 0.648 vs. 0.626, out-of-bag error rate 0.352 vs. 0.374). In addition to age, diabetes, smoking, and stent length, RSF also identified coronary artery disease (acute or chronic) and hyperlipidemia among the most important variables. Conclusion Machine-learning prediction models such as RSF showed better performance than the Cox proportional hazards model for the prediction of MACCE during long-term follow-up after PCI.
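The C index used to compare the two models counts pairwise agreement between predicted risk and observed outcome: among usable pairs, how often does the patient with the shorter observed event time carry the higher predicted risk? A toy implementation for right-censored data (illustration only, not the authors' code):

```python
def concordance_index(times, events, risk_scores):
    """Harrell's C: among usable pairs (the earlier of the two times must be
    an observed event, flag 1, not a censoring, flag 0), the fraction where
    the shorter survival time has the higher predicted risk; risk ties
    count 0.5."""
    concordant, usable = 0.0, 0
    for i in range(len(times)):
        for j in range(len(times)):
            if events[i] == 1 and times[i] < times[j]:
                usable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1.0
                elif risk_scores[i] == risk_scores[j]:
                    concordant += 0.5
    return concordant / usable
```

A value of 0.5 is chance-level discrimination, which is why the reported 0.648 vs. 0.626 is a modest, not dramatic, edge for RSF.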


Author(s):  
Yuko Yamaguchi ◽  
Marta Zampino ◽  
Toshiko Tanaka ◽  
Stefania Bandinelli ◽  
Yusuke Osawa ◽  
...  

Abstract Background Anemia is common in older adults and associated with greater morbidity and mortality. The causes of anemia in older adults have not been completely characterized. Although elevated circulating growth and differentiation factor 15 (GDF-15) has been associated with anemia in older adults, it is not known whether elevated GDF-15 predicts the development of anemia. Methods We examined the relationship between plasma GDF-15 concentrations at baseline in 708 non-anemic adults, aged 60 years and older, and incident anemia during 15 years of follow-up among participants in the Invecchiare in Chianti (InCHIANTI) Study. Results During follow-up, 179 (25.3%) participants developed anemia. The proportion of participants who developed anemia from the lowest to highest quartile of plasma GDF-15 was 12.9%, 20.1%, 21.2%, and 45.8%, respectively. Adults in the highest quartile of plasma GDF-15 had an increased risk of developing anemia (Hazard Ratio 1.15, 95% Confidence Interval 1.09–1.21, P < .0001) compared to those in the lower three quartiles in a multivariable Cox proportional hazards model adjusting for age, sex, serum iron, soluble transferrin receptor, ferritin, vitamin B12, congestive heart failure, diabetes mellitus, and cancer. Conclusions Circulating GDF-15 is an independent predictor of the development of anemia in older adults.
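The exposure grouping (highest GDF-15 quartile versus the lower three) reduces to a simple indicator variable. The rank-based 75th-percentile cut below is an assumption for illustration, not necessarily the study's exact quantile convention:

```python
def top_quartile_flags(values):
    """Indicator for membership in the highest quartile (1) vs the lower
    three quartiles (0), using the value at the sorted 75th-percentile rank
    as the cut. Illustrative sketch; quantile conventions vary."""
    cut = sorted(values)[int(len(values) * 0.75)]
    return [1 if v >= cut else 0 for v in values]
```

The resulting 0/1 flag is what enters the Cox model as the exposure of interest, alongside the listed adjustment covariates.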


2020 ◽  
Vol 4 (Supplement_1) ◽  
pp. 161-161
Author(s):  
Jane Banaszak-Holl ◽  
Xiaoping Lin ◽  
Jing Xie ◽  
Stephanie Ward ◽  
Henry Brodaty ◽  
...  

Abstract Research Aims: This study seeks to understand whether those with dementia experience higher risk of death, using data from the ASPREE (ASPirin in Reducing Events in the Elderly) clinical trial. Methods: ASPREE was a primary prevention trial of low-dose aspirin among healthy older people. The Australian cohort included 16,703 dementia-free participants aged 70 years and over at enrolment. Participants were triggered for dementia adjudication if cognitive test results were poorer than expected, if they self-reported a dementia diagnosis or memory problems, or if dementia medications were detected. Incident dementia was adjudicated by an international adjudication committee using the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV) criteria and the results of a neuropsychological battery and functional measures, with medical record substantiation. Statistical analyses used a Cox proportional hazards model. Results: As previously reported, 1052 participants (5.5%) died during a median of 4.7 years of follow-up and 964 participants had a dementia trigger, of whom 575 (60%) were adjudicated as having dementia. Preliminary analyses showed that the mortality rate was higher among participants with a dementia trigger, regardless of dementia adjudication outcome, than those without (15% vs 5%, Χ2 = 205, p < .001). Conclusion: This study will provide important analyses of differences in the hazard ratio for mortality and causes of death among people with and without cognitive impairment and has important implications for service planning.
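The Χ² comparison of mortality proportions is a standard Pearson test on a 2×2 table (died/survived by dementia-trigger status). A minimal sketch using the shortcut formula (the counts in the tests are invented, not ASPREE's):

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic, no continuity correction, for the
    2x2 table [[a, b], [c, d]] via the shortcut
    n * (ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d))."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
```

With 1 degree of freedom, a statistic anywhere near the reported 205 corresponds to p far below .001.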


Risks ◽  
2021 ◽  
Vol 9 (7) ◽  
pp. 121
Author(s):  
Beata Bieszk-Stolorz ◽  
Krzysztof Dmytrów

The aim of our research was to compare the intensity of decline and then increase in the value of basic stock indices during the SARS-CoV-2 coronavirus pandemic in 2020. The survival analysis methods used to assess the risk of decline and chance of increase of the indices were the Kaplan–Meier estimator, the logit model, and the Cox proportional hazards model. We observed the highest intensity of decline in the European stock exchanges, followed by the American and Asian plus Australian ones (after the fourth and eighth week since the peak). The highest risk of decline was in America, then in Europe, followed by Asia and Australia. The lowest risk was in Africa. The intensity of increase was the highest in the fourth and eleventh week since the minimal value had been reached. The highest odds of increase were in the American stock exchanges, followed by the European and Asian (including Australia and Oceania), and the lowest in the African ones. The odds and intensity of increase in the stock exchange indices varied from continent to continent. The increase was faster than the initial decline.
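The Kaplan–Meier estimator used here multiplies, at each distinct event time, the fraction of at-risk observations that survive it. A minimal sketch for right-censored durations (illustration only, not the authors' code; here the "event" would be an index reaching its decline or recovery threshold):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimates. `times` are observed durations and
    `events` flags 1 = event observed, 0 = censored. At each distinct event
    time t: S(t) = S(prev) * (at_risk - deaths) / at_risk.
    Returns a list of (time, S(t)) pairs at the event times."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk, surv, curve = len(times), 1.0, []
    i = 0
    while i < len(order):
        t = times[order[i]]
        tied = [j for j in order[i:] if times[j] == t]
        deaths = sum(events[j] for j in tied)
        if deaths:
            surv *= (at_risk - deaths) / at_risk
            curve.append((t, surv))
        at_risk -= len(tied)   # both events and censorings leave the risk set
        i += len(tied)
    return curve
```

Censored observations contribute to the at-risk counts up to their censoring time but never trigger a drop in the curve, which is what distinguishes this from a naive empirical distribution.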


BMC Nutrition ◽  
2021 ◽  
Vol 7 (1) ◽  
Author(s):  
Akiko Nakanishi ◽  
Erika Homma ◽  
Tsukasa Osaki ◽  
Ri Sho ◽  
Masayoshi Souri ◽  
...  

Abstract Background Dairy products are known as health-promoting foods. This study prospectively examined the association between milk and yogurt intake and mortality in a community-based population. Methods The study population comprised 14,264 subjects aged 40–74 years who participated in an annual health checkup. The frequency of yogurt and milk intake was categorized as none (< 1/month), low (< 1/week), moderate (1–6/week), and high (> 1/day) intake. The association between yogurt and milk intake and total, cardiovascular, and cancer-related mortalities was determined using the Cox proportional hazards model. Results During the follow-up period, there were 265 total deaths, 40 cardiovascular deaths, and 90 cancer-related deaths. Kaplan–Meier analysis showed that total mortality in the high/moderate/low yogurt intake and moderate/low milk intake groups was lower than that in the no-intake groups (log-rank, P < 0.01). In the multivariate Cox proportional hazards analysis adjusted for possible confounders, the hazard ratio (HR) for total mortality significantly decreased in the high/moderate yogurt intake groups (HR: 0.62, 95% confidence interval [CI]: 0.42–0.91 for high intake; HR: 0.70, 95% CI: 0.49–0.99 for moderate intake) and the moderate milk intake group (HR: 0.67, 95% CI: 0.46–0.97) compared with the groups with no yogurt or milk intake. A similar association was observed for cancer-related mortality, but not for cardiovascular mortality. Conclusions Our study showed that yogurt and milk intake was independently associated with a decrease in total and cancer-related mortalities in the Japanese population.
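The four intake bands can be expressed as a mapping from a monthly frequency. The numeric thresholds below are assumptions that approximate the abstract's cut-points (none < 1/month, low < 1/week, moderate 1–6/week, high > 1/day), not the questionnaire's exact coding:

```python
def intake_category(times_per_month):
    """Map a reported intake frequency (times per month) onto the study's
    four bands. Thresholds are illustrative approximations."""
    if times_per_month < 1:
        return "none"
    if times_per_month < 4:        # fewer than roughly 1 per week
        return "low"
    if times_per_month < 28:       # roughly 1-6 per week
        return "moderate"
    return "high"                  # about daily or more
```

Each subject's band then enters the Cox model as a categorical exposure, with the "none" group as the comparison level for the reported HRs.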


2020 ◽  
Vol 41 (Supplement_2) ◽  
Author(s):  
A Fujino ◽  
H Ogawa ◽  
S Ikeda ◽  
K Doi ◽  
Y Hamatani ◽  
...  

Abstract Background Atrial fibrillation (AF) commonly progresses from paroxysmal to sustained type in the natural course of the disease, and we previously demonstrated that progression of AF was associated with an increased risk of clinical adverse events. There are some patients, though less frequently, who regress from sustained to paroxysmal AF, but the clinical impact of regression of AF remains unknown. Purpose We sought to investigate whether regression from sustained to paroxysmal AF is associated with better clinical outcomes. Methods Using the dataset of the Fushimi AF Registry, patients who were diagnosed with sustained (persistent or permanent) AF at baseline were studied. Conversion of sustained AF to paroxysmal AF during follow-up was defined as regression of AF. Major adverse cardiac events (MACE) were defined as the composite of cardiac death, stroke, and hospitalization for heart failure (HF). Event rates were compared between patients with and without regression of AF. In patients with sustained AF at baseline, predictors of MACE were identified using a Cox proportional hazards model. Results Among 2,253 patients diagnosed with sustained AF at baseline, regression of AF was observed in 9.0% (202/2,253; 2.0 per 100 patient-years) during a median follow-up of 4.0 years. Of these, 24.3% (49/202; 4.6 per 100 patient-years) eventually reverted to sustained AF during follow-up. The proportion of asymptomatic patients was lower in patients with regression of AF than in those without (with vs without regression: 49.0% vs 69.5%, p < 0.01). The percentage of beta-blocker use at baseline was similar between the two groups (37.2% vs 33.8%, p = 0.34). The proportion of patients who underwent catheter ablation or electrical cardioversion during follow-up was higher in patients with regression of AF (catheter ablation: 15.8% vs 5.5%, p < 0.01; cardioversion: 4.0% vs 1.4%, p < 0.01).
The rate of MACE was significantly lower in patients with regression of AF than in patients who maintained sustained AF (3.7 vs 6.2 per 100 patient-years, log-rank p < 0.01). The Figure shows the Kaplan-Meier curves for MACE, cardiac death, hospitalization for heart failure, and stroke. In patients with sustained AF at baseline, a multivariable Cox proportional hazards model demonstrated that regression of AF was independently associated with a lower risk of MACE (adjusted hazard ratio [HR]: 0.50, 95% confidence interval [CI]: 0.28 to 0.88, p=0.02), stroke (HR: 0.51, 95% CI: 0.30 to 0.88, p=0.02), and hospitalization for HF (HR: 0.50, 95% CI: 0.29 to 0.85, p=0.01). Conclusion Regression from sustained to paroxysmal AF was associated with a lower incidence of adverse cardiac events. Funding Acknowledgement Type of funding source: None
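The event rates quoted in this abstract are crude incidence rates per 100 patient-years: events divided by accumulated person-time, scaled by 100. For example, 202 regression events at the quoted 2.0 per 100 patient-years implies roughly 10,100 patient-years of follow-up. A one-line sketch:

```python
def rate_per_100_patient_years(n_events, patient_years):
    """Crude incidence rate per 100 patient-years, the unit used for the
    regression (2.0) and MACE (3.7 vs 6.2) rates in the abstract."""
    return 100.0 * n_events / patient_years
```

Unlike a simple proportion, this rate accounts for patients contributing different lengths of follow-up, which is why it pairs naturally with Kaplan-Meier and Cox analyses.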


Author(s):  
Majdi Imterat ◽  
Tamar Wainstock ◽  
Eyal Sheiner ◽  
Gali Pariente

Abstract Recent evidence suggests that a long inter-pregnancy interval (IPI: the time between a live birth and the estimated conception of the subsequent pregnancy) poses a risk for adverse short-term perinatal outcomes. We aimed to study the effect of short (<6 months) and long (>60 months) IPIs on long-term cardiovascular morbidity of the offspring. A population-based cohort study was performed in which all singleton live births in parturients with at least one previous birth were included. Hospitalizations of the offspring up to the age of 18 years involving cardiovascular disease were evaluated according to IPI length. The intermediate interval, between 6 and 60 months, was considered the reference. Kaplan–Meier survival curves were used to compare the cumulative morbidity incidence between the groups, and a Cox proportional hazards model was used to control for confounders. During the study period, 161,793 deliveries met the inclusion criteria. Of them, 14.1% (n = 22,851) occurred in parturients following a short IPI, 78.6% (n = 127,146) following an intermediate IPI, and 7.3% (n = 11,796) following a long IPI. Total hospitalizations of the offspring involving cardiovascular morbidity were comparable between the groups, and the Kaplan–Meier survival curves demonstrated similar cumulative incidences of cardiovascular morbidity in all groups. In a Cox proportional hazards model, short and long IPIs did not appear as independent risk factors for later pediatric cardiovascular morbidity of the offspring (adjusted HR 0.97, 95% CI 0.80–1.18; adjusted HR 1.01, 95% CI 0.83–1.37, for short and long IPI, respectively). In our population, extreme IPIs do not appear to impact long-term cardiovascular hospitalizations of offspring.
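The three IPI bands, with the intermediate interval as the reference, translate into two indicator variables in the Cox model (one adjusted HR each for short and long, as reported). A minimal sketch of this standard dummy coding (not the authors' code):

```python
def ipi_group(months):
    """Classify an inter-pregnancy interval into the study's bands:
    short (<6 months), intermediate (6-60 months, the reference),
    long (>60 months)."""
    if months < 6:
        return "short"
    if months <= 60:
        return "intermediate"
    return "long"

def ipi_indicators(group):
    """Dummy-code the bands for a Cox model with 'intermediate' as the
    reference level; the reference maps to both indicators being zero."""
    return {"short": int(group == "short"), "long": int(group == "long")}
```

Because the reference level carries no indicator, each fitted coefficient is interpretable directly as a log-hazard ratio against the intermediate-IPI group.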

