Comparative effectiveness and resource utilization of nab-paclitaxel plus gemcitabine (nab-P+G) versus FOLFIRINOX (FFX) in first-line treatment of advanced pancreatic adenocarcinoma (PDAC) in a U.S. community oncology setting.

2016, Vol 34 (4_suppl), pp. 433-433
Author(s):  
Fadi S. Braiteh ◽  
Manish Patel ◽  
Monika Parisi ◽  
Quanhong Ni ◽  
Si yeon Park ◽  
...  

433 Background: Both nab-P+G and FFX have demonstrated superior overall survival (OS) over gemcitabine monotherapy in the treatment of PDAC, but no real-world study has compared the effectiveness and resource utilization of nab-P+G versus FFX. The objective of this study was to compare real-world treatment patterns of patients receiving nab-P+G vs. FFX as first-line treatment for advanced PDAC. Methods: A retrospective cohort study was performed using fully de-identified data from a nationally representative electronic medical record platform of 1,300 community oncology physicians. Patients diagnosed with PDAC between September 2013 and October 2014 who received either nab-P+G or FFX as first-line therapy were included in the analysis. We calculated median time to treatment discontinuation (TTD) and database persistence (DP), a proxy for OS, using the Kaplan-Meier method, and assessed supportive care usage with Poisson regression. Results: 202 of 851 patients met eligibility criteria (nab-P+G, n = 122; FFX, n = 80). Patients on nab-P+G were older than patients on FFX (mean age 67.0 v 61.4; p < 0.01), but other baseline characteristics were comparable. TTD (3.4 v 3.8 mos) and DP (8.6 v 8.6 mos) did not differ significantly between nab-P+G and FFX, respectively. The rate of AE-related discontinuation was similar for nab-P+G and FFX (18% v 21%); however, patients on nab-P+G used fewer doses of granulocyte colony-stimulating factor (GCSF) agents (2.0 v 4.4; p < 0.01) but more doses of erythropoiesis-stimulating agents (ESA) (0.9 v 0.1; p < 0.01) and steroids (7.9 v 5.8; p < 0.01) per 100 days. Conclusions: TTD and DP for patients with advanced PDAC did not differ significantly between nab-P+G and FFX, although supportive care medication use did. Patients treated with nab-P+G required fewer doses of GCSF and more doses of ESA and steroids, though the rate of AEs leading to discontinuation was not statistically different.
Management of chemotherapy-related toxicities may impose a substantial burden on the U.S. healthcare system, especially when an alternative therapy with similar clinical outcomes exists.
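The median TTD and DP figures above come from Kaplan-Meier estimation. As a minimal sketch of how a median is read off a product-limit curve, the snippet below implements the textbook estimator in plain Python; the durations and censoring flags are invented for illustration and are not the study's data.

```python
# Illustrative Kaplan-Meier median, e.g. for time to treatment
# discontinuation (TTD). All values below are hypothetical.

def kaplan_meier_median(durations, events):
    """Return the earliest time at which the Kaplan-Meier survival
    estimate S(t) drops to 0.5 or below, or None if never reached.
    events[i] is True if the event was observed, False if censored."""
    s = 1.0
    # Distinct observed-event times in ascending order.
    times = sorted(set(t for t, e in zip(durations, events) if e))
    for t in times:
        at_risk = sum(1 for d in durations if d >= t)
        d_t = sum(1 for d, e in zip(durations, events) if e and d == t)
        s *= 1.0 - d_t / at_risk  # product-limit update
        if s <= 0.5:
            return t
    return None  # median not reached within follow-up

# Hypothetical months on therapy (True = discontinued, False = censored)
months = [1.0, 2.0, 2.5, 3.0, 3.4, 4.0, 5.0, 6.0]
status = [True, True, False, True, True, True, False, True]
print(kaplan_meier_median(months, status))  # -> 3.4
```

Censored patients (still on therapy, or lost to follow-up) stay in the risk set until their last observed time, which is why the estimator differs from a naive median of event times.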

2016, Vol 34 (4_suppl), pp. 429-429
Author(s):  
Fadi S. Braiteh ◽  
Manish Patel ◽  
Monika Parisi ◽  
Quanhong Ni ◽  
Si yeon Park ◽  
...  

429 Background: In a randomized clinical trial (RCT), nab-P+G demonstrated superior overall survival (OS) compared with G for the treatment of advanced PDAC, but limited data are available comparing the effectiveness of these treatment options in a real-world setting. The objective of this study was to compare treatment patterns of patients receiving nab-P+G versus G as first-line treatment for PDAC. Methods: A retrospective cohort study was performed using fully de-identified data from a nationally representative electronic medical record platform of 1,300 community oncology physicians. Patients diagnosed with advanced PDAC between September 2013 and October 2014 who received first-line therapy with either nab-P+G or G were included in the analysis. We calculated the median time to treatment discontinuation (TTD) and database persistence (DP), a proxy for OS, using the Kaplan-Meier method, and assessed supportive care usage with Poisson regression. Results: Of 851 patients, 168 met eligibility criteria for the analysis (nab-P+G, n = 122; G, n = 46). Patients in the nab-P+G arm were younger (mean age 67.0 v 72.0; p < 0.01) and more often male (60% v 44%); other baseline characteristics were comparable. Patients treated with nab-P+G had a statistically significant longer median TTD (3.4 v 2.2 mos; p < 0.01) and median DP (8.6 v 5.3 mos; p = 0.03). Patients receiving nab-P+G had fewer AE-related discontinuations (18% v 26%), but they used more doses of granulocyte colony-stimulating factor (2.02 v 0.73 doses; p < 0.01), erythropoiesis-stimulating agents (0.90 v 0.54; p < 0.01) and steroids (7.89 v 0.58 doses; p < 0.01) per 100 days compared with patients receiving G. Conclusions: Consistent with the RCT comparing nab-P+G with G, patients receiving nab-P+G in this real-world analysis experienced significantly longer TTD and DP than patients receiving G. Additionally, patients receiving nab-P+G had fewer AEs leading to discontinuation.
More supportive care may have been used in the nab-P+G group due to longer treatment duration.
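The supportive-care comparison is expressed as doses per 100 days on therapy, which is the exposure-adjusted rate a Poisson regression with a log(follow-up) offset estimates. A minimal sketch of the underlying crude rate, with invented aggregate counts:

```python
# Illustrative "doses per 100 patient-days" rate behind the Poisson
# comparison. Dose counts and follow-up time below are hypothetical.

def doses_per_100_days(total_doses, total_days):
    """Crude utilization rate: doses per 100 patient-days of therapy.
    A Poisson model with log(total_days) as an offset estimates the
    same quantity while adjusting for covariates."""
    return 100.0 * total_doses / total_days

# Hypothetical aggregate GCSF counts for two cohorts with equal exposure
rate_nabpg = doses_per_100_days(total_doses=41, total_days=2050)
rate_g = doses_per_100_days(total_doses=15, total_days=2050)
print(round(rate_nabpg, 2), round(rate_g, 2))  # -> 2.0 0.73
```

Normalizing by exposure matters here because, as the conclusion notes, patients on nab-P+G stayed on therapy longer, so raw dose counts alone would overstate their supportive-care use.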


Author(s):  
Eric Nadler ◽  
Bhakti Arondekar ◽  
Kathleen Marie Aguilar ◽  
Jie Zhou ◽  
Jane Chang ◽  
...  

Abstract. Purpose: Treatments for advanced non-small cell lung cancer (NSCLC) have evolved to include targeted and immuno-oncology therapies, which have demonstrated clinical benefits in clinical trials. However, few real-world studies have evaluated these treatments in the first-line setting. Methods: Adult patients with advanced NSCLC who initiated first-line treatment with chemotherapy, targeted therapies (TT), or immuno-oncology–based regimens in the US Oncology Network (USON) between March 1, 2015, and August 1, 2018, were included and followed up through February 1, 2019. Data were sourced from structured fields of USON electronic health records. Patient and treatment characteristics were assessed descriptively, with Kaplan-Meier methods used to evaluate time-to-event outcomes, including time to treatment discontinuation (TTD) and overall survival (OS). Adjusted Cox regression analyses and inverse probability of treatment weighting (IPTW) were performed to control for covariates that may have affected treatment selection and outcomes. Results: Of 7746 patients, 75.6% received first-line systemic chemotherapy, 11.7% received immuno-oncology monotherapies, 8.5% received TT, and 4.2% received immuno-oncology combination regimens. Patients who received immuno-oncology monotherapies had the longest median TTD (3.5 months; 95% confidence interval [CI], 2.8–4.2) and OS (19.9 months; 95% CI, 16.6–24.1). On the basis of multivariable Cox regression and IPTW, immuno-oncology monotherapy was associated with reduced risk of death and treatment discontinuation relative to other treatments. Conclusion: These results suggest that real-world outcomes in this community oncology setting improved with the introduction of immuno-oncology therapies. However, clinical benefits are limited in certain subgroups and tend to be reduced compared with clinical trial observations.
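The IPTW adjustment named in the Methods reweights each patient by the inverse of their estimated probability of receiving the treatment they actually got, so the weighted cohorts resemble each other on baseline covariates. The weighting step itself is simple; the sketch below uses invented propensity scores (in practice they come from a model of treatment assignment given baseline covariates):

```python
# Illustrative inverse probability of treatment weighting (IPTW).
# Propensity scores below are hypothetical, not from the study.

def iptw_weight(treated, propensity):
    """Standard IPTW: 1/p for treated patients, 1/(1-p) for untreated,
    where p is the estimated probability of receiving treatment."""
    return 1.0 / propensity if treated else 1.0 / (1.0 - propensity)

# Hypothetical patients: (received immuno-oncology therapy?, propensity)
cohort = [(True, 0.8), (True, 0.5), (False, 0.5), (False, 0.2)]
weights = [iptw_weight(t, p) for t, p in cohort]
print(weights)  # -> [1.25, 2.0, 2.0, 1.25]
```

Note how patients who received an unlikely assignment (e.g. untreated despite a high propensity) get up-weighted; the weighted sample then feeds into the Cox regression in place of raw counts.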


2021, Vol 21, pp. S332-S333
Author(s):  
Fadi Nasr ◽  
Intissar Yehia ◽  
Reem El Khoury ◽  
Saada Diab ◽  
Ahmad Al Ghoche ◽  
...  

Blood, 2019, Vol 134 (Supplement_1), pp. 5571-5571
Author(s):  
Jesus D Gonzalez-Lugo ◽  
Ana Acuna-Villaorduna ◽  
Joshua Heisler ◽  
Niyati Goradia ◽  
Daniel Cole ◽  
...  

Introduction: Multiple myeloma (MM) is a disease of the elderly, with approximately two-thirds of cases diagnosed after age 65. However, this population has been underrepresented in clinical trials, so there are no evidence-based guidelines for selecting the treatment that best balances effectiveness against the risk of side effects in the real world. Current guidelines advise that doublet regimens be considered for frail, elderly patients, but more detailed recommendations are lacking. This study aims to describe treatment patterns in older patients with MM and to compare treatment response and side effects between doublet and triplet regimens. Methods: Patients diagnosed with MM at age 70 or older and treated at Montefiore Medical Center between 2000 and 2017 were identified using Clinical Looking Glass, an institutional software tool. Recipients of autologous stem cell transplant were excluded. We collected demographic data and calculated comorbidity burden based on the age-adjusted Charlson Comorbidity Index (CCI). Laboratory parameters included complete blood counts, renal function, serum protein electrophoresis and free kappa/lambda ratio before and after first-line treatment. Treatment was categorized into doublet [bortezomib/dexamethasone (VD) and lenalidomide/dexamethasone (RD)] or triplet regimens [lenalidomide/bortezomib/dexamethasone (RVD) and cyclophosphamide/bortezomib/dexamethasone (CyBorD)]. Disease response was reported as VGPR, PR, SD or PD using pre-established criteria. Side effects included cytopenias, diarrhea, thrombosis and peripheral neuropathy. Clinical and laboratory data were obtained by manual chart review. Event-free survival was defined as time to treatment change, death or disease progression. Data were analyzed by treatment group using Stata 14.1. Results: A total of 97 patients were included, of whom 46 (47.4%) were male, 47 (48.5%) were non-Hispanic Black and 23 (23.7%) were Hispanic.
Median age at diagnosis was 77 years (range 70-90). Median baseline hemoglobin was 9.4 g/dL (8.5-10.5), and 14 patients (16.1%) had grade 3/4 anemia. Baseline thrombocytopenia and neutropenia of any grade were less common (18.4% and 17.7%, respectively), and 11 patients (20%) had GFR ≤ 30. Treatment regimens included VD (51, 52.6%), CyBorD (18, 18.6%), RD (15, 15.5%) and RVD (13, 13.4%). Overall, doublets were used more often than triplets (66 [68%] vs 31 [32%]). Baseline characteristics were similar across treatment groups. Treatment selection did not differ among patients with baseline anemia or neutropenia; however, doublets were preferred over triplets for patients with underlying thrombocytopenia (93.8% vs 6.2%, p < 0.01). Median first-line treatment duration was 4.1 months and did not differ between groups (3.9 vs 4.3 months for doublets and triplets, respectively; p = 0.88). At least a partial response was achieved in 47 cases (63.5%) and did not differ between doublets and triplets (61.7% vs 66.7%). Overall, first-line treatment was changed in 50 patients (51.5%), and the change frequency was higher for triplets than doublets (71% vs 42.4%, p < 0.01). Among patients who changed treatment, 17 (34.7%) switched from a doublet to a triplet, 15 (30.6%) from a triplet to a doublet, and 17 (34.7%) changed regimens while remaining within the doublet or triplet class. There was no difference in the frequency of cytopenias, diarrhea, thrombosis or peripheral neuropathy among groups. Median event-free survival was longer in patients receiving doublet vs triplet therapy, although the difference was not statistically significant (7.3 vs 4.3 months; p = 0.06). Conclusions: We describe the real-world experience of an inner-city, elderly MM cohort ineligible for autologous transplantation. A doublet combination, most often the VD regimen, was the treatment of choice in the majority of cases.
In this cohort, triplet regimens did not show better response rates and led to treatment changes more often than doublets. Among patients requiring a treatment change, approximately a third switched from doublet to triplet or vice versa, which suggests that current evaluation of patient frailty at diagnosis is suboptimal. Despite a similar frequency of side effects among groups, there was a trend toward longer event-free survival in patients receiving doublets. Larger retrospective studies are needed to confirm these results. Disclosures: Verma: Janssen: Research Funding; BMS: Research Funding; Stelexis: Equity Ownership, Honoraria; Acceleron: Honoraria; Celgene: Honoraria.
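The event-free survival endpoint above is a composite: time from treatment start to the earliest of treatment change, death, or disease progression, censored at last follow-up if none occurred. A minimal sketch of that definition, with invented dates:

```python
# Illustrative event-free survival (EFS) per the abstract's definition:
# time to the first of treatment change, death, or progression,
# censored at last follow-up otherwise. All dates are hypothetical.
from datetime import date

AVG_DAYS_PER_MONTH = 30.44  # simple calendar-month approximation

def efs_months(start, change=None, death=None, progression=None, last_fu=None):
    """Return (months from start to first event, event_observed).
    last_fu is required when no event occurred (censored case)."""
    events = [d for d in (change, death, progression) if d is not None]
    if events:
        first = min(events)  # earliest component of the composite endpoint
        return ((first - start).days / AVG_DAYS_PER_MONTH, True)
    return ((last_fu - start).days / AVG_DAYS_PER_MONTH, False)

t, observed = efs_months(date(2015, 1, 1), progression=date(2015, 5, 10))
print(round(t, 1), observed)  # -> 4.2 True
```

Taking the minimum over the component event dates is what makes the endpoint composite; pairs of (time, observed) like these are exactly the inputs a Kaplan-Meier or log-rank comparison of doublets vs triplets would consume.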

