Differences in survival for advanced-stage colorectal cancer (CRC) patients by lines of therapy received in a U.S. local oncology practice.

2012 ◽  
Vol 30 (15_suppl) ◽  
pp. e14106-e14106
Author(s):  
Susan H Foltz Boklage ◽  
Charles Kreilick ◽  
Carl V Asche ◽  
Sally Haislip ◽  
James W. Gilmore ◽  
...  

e14106 Background: Improvements in survival for advanced-stage colorectal cancer (CRC) patients who receive chemotherapy have been reported. We compared survival rates for patients with 3+ vs. <3 lines of therapy using electronic medical records of a local oncology practice in Georgia, USA. Methods: The Georgia Cancer Specialists (GCS) EMR Database (1/1/2005–7/31/2010) was used. The database contains data on patient demographics, cancer diagnostic information, chemotherapy and non-chemotherapy drugs administered, written prescriptions, chemotherapy/radiation protocols, chemotherapy protocol changes, office visit information, and hospitalizations. Patients newly diagnosed with CRC between 01/01/05 and 06/30/10 and treated with systemic therapy for CRC were identified. Patients were followed from initial CRC diagnosis to death, loss to follow-up, or end of study. Patients were categorized by number of lines of therapy received (1, 2, 3+) and original stage at diagnosis (IIIb/c, IV, unknown). Survival following the initial line of therapy was evaluated using Cox proportional hazards models controlling for stage at diagnosis, type of 1st-line treatment, and other patient characteristics. Results: The study included 704 patients with a median age of 63 years (range 26-85 years) at diagnosis; 49% (n=345) were female. 45% (n=317) and 42% (n=296) had stage IV and IIIb/c CRC at diagnosis, respectively. 53% (n=373) received only 1st-line treatment, 27% (n=190) received 1st- and 2nd-line treatment, and 20% (n=141) received 3rd-line treatment and beyond. The median follow-up was 431 days, and death was reported in 27% (n=190) of subjects. The multivariate Cox proportional hazards analysis indicated no statistically significant difference in survival between patients who received 2 lines of therapy vs. 3+ lines (HR=1.42; p=0.067). 
Conclusions: No statistically significant association between receiving 2 vs. 3+ total lines of therapy and survival was found in patients diagnosed with stage IIIb/c or IV disease. However, a trend toward improved survival was present, suggesting that some patients could benefit from the addition of a 3rd line; additional studies would be required to confirm this.
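Survival comparisons like the one above typically begin with Kaplan-Meier estimates before covariate adjustment in a Cox model. A minimal pure-Python sketch of the product-limit estimator, using toy data (not the study's):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier (product-limit) estimate of the survival function S(t).

    times  -- follow-up time for each subject
    events -- 1 if death observed, 0 if censored
    Returns a list of (event time, survival probability) pairs.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        ties = sum(1 for tt, _ in data if tt == t)
        if deaths:
            surv *= 1 - deaths / n_at_risk   # multiply in the step at time t
            curve.append((t, surv))
        n_at_risk -= ties                    # drop everyone leaving at time t
        i += ties
    return curve

# Toy cohort: times in days, 1 = death, 0 = censored
print(kaplan_meier([100, 200, 200, 300, 400, 450], [1, 1, 0, 1, 0, 1]))
```

Censored subjects leave the risk set without contributing a step, which is what distinguishes this from a naive survival fraction.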

2007 ◽  
Vol 25 (18_suppl) ◽  
pp. 4048-4048
Author(s):  
Y. Yeh ◽  
Q. Cai ◽  
J. Chao ◽  
M. Russell

4048 Background: NCCN guidelines recommend assessment of ≥12 lymph nodes (LN) to improve accuracy in colorectal cancer (CRC) staging. Previous studies have used various cut-points to assess the relationship between the number of LN sampled and survival. Here, the association between NCCN guideline-compliant nodal sampling and survival is assessed while controlling for other risk factors. Methods: We selected 145,485 adult patients newly diagnosed with stage II or III CRC from SEER during 1990–2003. Kaplan-Meier curves were compared using the log-rank test. Cox proportional hazards models were constructed to determine the effect of sampling ≥12 LN on survival. Results: Median patient follow-up was 5.7 years. The table shows overall survival rates in CRC patients with <12 versus ≥12 LN assessed. After adjusting for age, sex, tumor size, and grade, sampling ≥12 LN was independently associated with improved survival. For patients with ≥12 versus <12 LN assessed, survival increased by 13% for stage IIa [HR=0.75; 95% CI 0.72–0.78; p<.001], 16% for stage IIb [HR=0.69; 95% CI 0.67–0.71; p<.001], 12% for stage IIIb [HR=0.75; 95% CI 0.72–0.77], and 10% for stage IIIc [HR=0.85; 95% CI 0.81–0.89]. The association was not statistically significant for stage IIIa patients. Conclusion: Consistent with previous reports, this analysis found that optimal nodal sampling, specifically ≥12 LN, was associated with increased survival across stages II and III when controlling for other risk factors. The results underscore the need for adherence to NCCN guidelines. The lack of a statistically significant association in stage IIIa patients may be due to the small size of that cohort. [Table: see text]
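The log-rank test used above to compare Kaplan-Meier curves reduces to comparing observed versus expected deaths in one group across all event times. A minimal two-sample sketch with toy data (not SEER data):

```python
def logrank_chi2(times1, events1, times2, events2):
    """Two-sample log-rank chi-square statistic.

    times*  -- follow-up times; events* -- 1 = death observed, 0 = censored.
    Larger values indicate greater separation between the survival curves.
    """
    g1 = list(zip(times1, events1))
    g2 = list(zip(times2, events2))
    event_times = sorted({t for t, e in g1 + g2 if e == 1})
    obs1 = exp1 = var = 0.0
    for t in event_times:
        n1 = sum(1 for tt, _ in g1 if tt >= t)            # at risk, group 1
        n2 = sum(1 for tt, _ in g2 if tt >= t)            # at risk, group 2
        d1 = sum(1 for tt, e in g1 if tt == t and e == 1)
        d2 = sum(1 for tt, e in g2 if tt == t and e == 1)
        n, d = n1 + n2, d1 + d2
        if n < 2:
            continue
        obs1 += d1
        exp1 += d * n1 / n                                # expected deaths, group 1
        var += d * (n1 / n) * (n2 / n) * (n - d) / (n - 1)  # hypergeometric variance
    return (obs1 - exp1) ** 2 / var if var > 0 else 0.0

# Early deaths in group 1, no deaths observed in group 2 -> large statistic
print(logrank_chi2([1, 2, 3], [1, 1, 1], [10, 11, 12], [0, 0, 0]))
```

The statistic is referred to a chi-square distribution with one degree of freedom; identical groups give a statistic of zero.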


2019 ◽  
Vol 37 (15_suppl) ◽  
pp. e20015-e20015
Author(s):  
Ani John ◽  
Roma Shah ◽  
William Bruce Wong ◽  
Charles Schneider ◽  
Hamid H. Gari ◽  
...  

e20015 Background: Five-year survival rates as low as 2.8% have been reported in patients with non-small cell lung cancer (NSCLC), highlighting the need for individualized diagnosis and treatment. Companion diagnostic testing (CDx) identifies patients with molecular targets likely to respond better to particular therapies; however, not all cancer patients receive CDx in the real-world setting. This study evaluated the clinical value of CDx in the real world with respect to overall survival among patients with non-squamous advanced (Stage IIIB/IV) NSCLC (aNSCLC). Methods: Patients were from the Flatiron Health electronic health record-derived database, treated with systemic therapy, and diagnosed with aNSCLC between January 1, 2011 and May 31, 2018; those who received CDx with their first line of treatment were compared with those who did not. Logistic regression based on components of the modified Lung Cancer Prognostic Index (LCPI; age, sex, stage, actionable mutation(s), smoking, respiratory comorbidity; Alexander et al. Br J Cancer. 2017) and other factors was used to identify characteristics associated with receiving CDx. Overall survival was evaluated using Kaplan-Meier analysis. Unadjusted and adjusted Cox proportional hazards regression models were used to evaluate the association between CDx and overall survival. Results: A total of 17,143 patients with aNSCLC (CDx, n = 14,389; no CDx, n = 2754) and a mean (SD) age at diagnosis of 67.2 (10.0) years (CDx, 67.1 [10.1]; no CDx, 67.5 [9.2]) were included. There were more nonsmokers in the CDx group (17.4%) than the no CDx group (5.5%). Patients who were female, diagnosed after 2014, receiving multiple lines of therapy, or had advanced stage at diagnosis were more likely to receive CDx. Patients receiving CDx had decreased mortality risk (unadjusted HR [95% CI] = 0.54 [0.52-0.57]) and lived longer than those not receiving CDx (median survival = 14 vs 7 months). 
The significant reduction in mortality associated with CDx remained after adjusting for factors included in the modified LCPI (adjusted HR [95% CI] = 0.78 [0.75-0.82]) as well as a model without actionable mutations (adjusted HR [95% CI] = 0.70 [0.66-0.73]). Conclusions: Among patients with non-squamous aNSCLC, use of CDx was associated with reduced risk of mortality compared with no CDx.
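Median survival figures like the 14 vs 7 months above are read off the Kaplan-Meier curve as the first time point at which estimated survival falls to 0.5 or below. A minimal sketch, with a hypothetical step-function curve:

```python
def median_survival(curve):
    """First event time at which the Kaplan-Meier estimate drops to <= 0.5.

    curve -- list of (time, survival) pairs with non-increasing survival.
    Returns None if the curve never reaches 0.5 (median not reached).
    """
    for t, s in curve:
        if s <= 0.5:
            return t
    return None

# Hypothetical curve: survival estimates at months 3, 7, and 14
print(median_survival([(3, 0.8), (7, 0.6), (14, 0.45)]))  # -> 14
```

If heavy censoring keeps the curve above 0.5 for the whole follow-up period, the median is reported as "not reached", which the `None` return models here.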


2021 ◽  
Vol 15 (Supplement_1) ◽  
pp. S201-S202
Author(s):  
M Kabir ◽  
K Curtius ◽  
P Kalia ◽  
I Al Bakir ◽  
C H R Choi ◽  
...  

Abstract Background Racial disparities in inflammatory bowel disease (IBD) phenotypic presentations and outcomes are recognised. However, there are conflicting data from Western population-based cohort studies as to whether racial differences in colitis-associated colorectal cancer (CRC) incidence exist. To our knowledge this is the first study to investigate the impact of ethnicity on the natural history of dysplasia in ulcerative colitis (UC). Methods We performed a retrospective multi-centre cohort study of adult patients with UC whose first low-grade dysplasia (LGD) diagnosis within the extent of colitis was made between 1 January 2001 and 30 December 2018. Only patients with at least one follow-up colonoscopy or colectomy by 30 August 2019 were included. The study end point was time to CRC or end of follow-up. Statistical differences between groups were evaluated using Mann-Whitney U tests and Chi-squared tests. Survival analyses were performed using Kaplan-Meier estimation and multivariate Cox proportional hazards models. Results 408 patients met the inclusion criteria (see Figure 1 for patient and clinical demographics). More patients from a Black or Asian (BAME) background progressed to CRC [13.4% vs. 6.4%; p=0.036] than their White Caucasian counterparts, despite having surveillance follow-up. Figure 2 displays Kaplan-Meier curves demonstrating the probability of remaining CRC-free after LGD diagnosis, categorised by ethnicity. BAME patients were more likely to have moderate-severe inflammatory activity on colonic biopsy within the 5 preceding years [42.0% vs. 28.9%; p=0.023] and had a longer median interval from LGD diagnosis to colectomy [32 months vs. 11 months; p=0.021], with no significant differences in medication use. 
After adjusting for sex, age, UC duration at the time of LGD diagnosis, and presence of moderate-severe histological inflammation, being Black or Asian was a predictive factor for CRC progression on multivariate Cox proportional hazards analysis [HR 2.97 (95% CI 1.22 – 7.20); p = 0.016]. However, ethnicity was no longer predictive of CRC progression on sub-analysis of the 317 patients who did not have a colectomy during the follow-up period. Conclusion In this UK multi-centre cohort of UC surveillance patients diagnosed with LGD, delays in receiving cancer-preventative colectomy may contribute to an increased CRC incidence in certain ethnic groups. Further work is required to elucidate whether these delays are related to institutional factors (e.g. inequity in the content of decision-making support given or access to healthcare) or cultural factors.
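As a sketch of the machinery behind a hazard ratio such as the 2.97 reported above, the Cox partial likelihood for a single binary covariate can be maximised by Newton-Raphson. This is an illustrative single-covariate version with toy data, not the study's multivariate model:

```python
from math import exp

def cox_binary_hr(times, events, group, iters=25):
    """Hazard ratio for one 0/1 covariate via Newton-Raphson maximisation
    of the Cox partial log-likelihood (Breslow handling of ties).

    times  -- follow-up times; events -- 1 = event, 0 = censored
    group  -- binary covariate per subject (e.g. 1 = exposed).
    """
    beta = 0.0
    for _ in range(iters):
        grad = hess = 0.0
        for t_i, e_i, x_i in zip(times, events, group):
            if not e_i:
                continue                                   # censored: no term
            risk = [x for t, x in zip(times, group) if t >= t_i]  # risk set
            s0 = sum(exp(beta * x) for x in risk)
            s1 = sum(x * exp(beta * x) for x in risk)
            mean = s1 / s0                   # weighted mean of x in risk set
            grad += x_i - mean
            hess -= mean * (1 - mean)        # x^2 == x for a 0/1 covariate
        if hess == 0:
            break
        beta -= grad / hess                  # Newton step
    return exp(beta)

# Group-1 events tend to occur earlier, so the estimated HR exceeds 1
print(cox_binary_hr([1, 2, 3, 4, 5, 6], [1] * 6, [1, 0, 1, 0, 1, 0]))
```

Relabelling the groups inverts the hazard ratio exactly, a useful sanity check on any implementation.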


Blood ◽  
2015 ◽  
Vol 126 (23) ◽  
pp. 1751-1751 ◽  
Author(s):  
Anders Österborg ◽  
Anna Asklid ◽  
Joris Diels ◽  
Johanna Repits ◽  
Frans Söltoft ◽  
...  

Abstract Background Ibrutinib (Ibr), an oral, first-in-class covalent Bruton's tyrosine kinase inhibitor, significantly improved progression-free survival (PFS; hazard ratio [HR]=0.22, p<0.001) and overall survival (OS; HR=0.39, p=0.001) compared with ofatumumab (ofa) in the Phase 3 RESONATE trial in patients with previously treated CLL who were not eligible for chemoimmunotherapy (Byrd et al, NEJM 2013). Long-term follow-up data from a single-arm Phase 2 study have also demonstrated that patients treated with ibrutinib have long, durable responses, with a PFS at 2.5 years of 69% (Byrd et al, Blood 2015). While ofatumumab is a licensed comparator and included in treatment guidelines, some Health Technology Assessment (HTA) bodies require comparisons with a wider range of treatments. In the absence of a direct head-to-head comparison of single-agent ibrutinib with other frequently used treatments in this patient population, additional comparative evidence against standard of care as observed in clinical practice can provide useful insights on the relative efficacy of ibrutinib. Naïve (unadjusted) comparisons of outcomes from different sources are prone to bias due to confounding, as treatments were not randomly assigned and populations can vary in important prognostic factors. The objective of this analysis was to compare the relative efficacy of Ibr versus physician's choice in R/R CLL patients based on patient-level data from RESONATE pooled with an observational cohort, adjusting for confounders using multivariate statistical modelling. Methods Patient-level data from the Phase 3 RESONATE trial (Ibr: n=195; ofa: n=196) were pooled with data from a retrospective observational study conducted in the Stockholm area in Sweden. 
This retrospective study collected efficacy and safety data from a detailed, in-depth retrospective review of individual patient files from 148 consecutively identified patients with R/R CLL initiated on second- or later-line treatment between 2002 and 2013 at the four CLL-treating centers in Stockholm, Sweden, with complete follow-up. Longitudinal follow-up in subsequent treatment lines was available for patients in 3rd (n=91), 4th (n=51), 5th (n=29), and 6+ (n=15) line, and as such individual patients could contribute information to the analysis for multiple lines of therapy, with baseline defined as the date of initiation of the actual treatment line. A multivariate Cox proportional hazards model was developed to compare PFS and OS between treatments, including line of therapy, age, gender, Binet stage, ECOG performance status, and refractory disease as covariates. Adjusted HRs and 95% CIs are presented vs. Ibr. Results Across all treatment lines, fludarabine-cyclophosphamide (FC) (n=64), chlorambucil (n=59), alemtuzumab (n=33), FC+rituximab (FCR) (n=30), bendamustine+rituximab (BR) (n=28), and other rituximab-based combination chemotherapy (n=28) were the most frequently used treatments. Line of therapy, age, gender, Binet stage, ECOG performance status, and refractory disease were all independent risk factors for worse outcome on both PFS and OS. The adjusted HRs for PFS and OS for the pooled observational data versus Ibr were 6.80 [4.72; 9.80] (p<0.0001) and 2.90 [1.80; 4.69] (p<0.0001), respectively. HRs for PFS/OS versus the most frequent treatment regimens ranged between 2.50/1.82 (FCR) and 14.00/5.34 (anti-CD20 Mab). Baseline-adjusted results for the ofa arm in RESONATE were comparable for both PFS and OS to outcome data from the consecutive historical cohort; however, OS outcomes for ofa were partly confounded by cross-over to Ibr. 
Conclusions Comparison of results from the Phase 3 RESONATE study with treatments used as part of previous standard of care in a well-defined cohort of consecutive Swedish patients shows that ibrutinib is superior to physician's choice in patients with relapsed/refractory CLL, suggesting a more than 6-fold improvement in PFS and an almost 3-fold improvement in OS. Results were consistent across all different physician-chosen treatments and provide further evidence that ibrutinib improves both PFS and OS vs. current and prior standard-of-care regimens. Figure 1. Adjusted hazard ratios for PFS and OS of physician's choice versus ibrutinib (RESONATE) (multivariate Cox proportional hazards regression). a. Progression-free survival. b. Overall survival. Disclosures Österborg: Janssen Cilag: Research Funding. Asklid: Janssen Cilag: Research Funding. Diels: Janssen: Employment. Repits: Janssen Cilag: Employment. Söltoft: Janssen Cilag: Employment. Hansson: Janssen Cilag: Research Funding. Jäger: Janssen Cilag: Research Funding.


Cancers ◽  
2021 ◽  
Vol 14 (1) ◽  
pp. 161
Author(s):  
Troels Gammeltoft Dolin ◽  
Ib Jarle Christensen ◽  
Astrid Zedlitz Johansen ◽  
Hans Jørgen Nielsen ◽  
Henrik Loft Jakobsen ◽  
...  

The association between pre- and perioperative inflammatory biomarkers, major complications, and survival rates after resection of colorectal cancer (CRC) in older patients is largely unknown. The aim was to investigate age-dependent differences in these associations. Serum CRP, IL-6, and YKL-40 were measured preoperatively and on the first and second day after resection of CRC (stages I–III) in 210 older (≥70 years) and 191 younger (<70 years) patients. Associations with major complications were presented as odds ratios (OR, with 95% confidence intervals (CI)) from logistic regression, and associations with mortality as hazard ratios (HR, with 95% CI) from Cox proportional hazards regression. The preoperative inflammatory biomarkers were higher in the older vs. the younger patients. The risk of complications was increased in older patients with a high preoperative CRP (OR = 1.25, 95% CI 1.03–1.53), IL-6 (OR = 1.57, 95% CI 1.18–2.08), or YKL-40 (OR = 1.66, 95% CI 1.20–2.28), but not in younger patients. Mortality was higher in younger patients with high preoperative YKL-40 (HR = 1.66, 95% CI 1.06–2.60); this was not found in older patients. Elevated preoperative inflammatory biomarkers among older patients were associated with an increased risk of complications, but not mortality. Preoperative inflammatory biomarkers may be useful in assessing the risk of a complicated surgical course in older patients with CRC.
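The complication odds ratios above come from logistic regression; for a single binary exposure, the unadjusted OR and its Woolf 95% CI can be computed directly from the 2x2 table. A minimal sketch with made-up counts (not the study's data):

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf confidence interval for a 2x2 table:

    a = exposed with outcome,   b = exposed without outcome
    c = unexposed with outcome, d = unexposed without outcome
    """
    or_ = (a * d) / (b * c)
    se_log = sqrt(1/a + 1/b + 1/c + 1/d)       # standard error of log(OR)
    lo = exp(log(or_) - z * se_log)
    hi = exp(log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical: 20/100 complications with high CRP vs 10/100 with low CRP
print(odds_ratio_ci(20, 80, 10, 90))
```

The interval is symmetric on the log scale, which is why published ORs such as 1.25 (1.03–1.53) sit closer to the lower bound.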


2019 ◽  
Vol 37 (15_suppl) ◽  
pp. e15087-e15087 ◽  
Author(s):  
Francois-Xavier Lamy ◽  
Michael Batech ◽  
Shaista Salim ◽  
Emmanuelle Boutmy ◽  
Chris Pescott ◽  
...  

e15087 Background: After an initial dose of 400 mg/m², cetuximab (CET) at a dose of 250 mg/m² in combination with chemotherapy (CT) is approved for once-weekly (q1w) use in the treatment of RAS wild-type metastatic colorectal cancer (mCRC). However, off-label use of CET 500 mg/m² administered every other week (q2w) has been observed in clinical practice. This study aimed to test the noninferiority of q2w vs q1w administration on overall survival (OS) using US claims data. Methods: Using IBM MarketScan, a large US insurance claims database, a cohort of patients with mCRC treated with CET + CT between 2010 and 2016 was identified and classified as q1w or q2w based on observed infusion patterns. The initial CET prescription was defined as the index date, and patient death was determined using a previously published algorithm. Confounding was accounted for using high-dimensional propensity scoring (hdPS) methodology with inverse probability of treatment weighting (IPTW). OS for both groups was compared using Cox proportional hazards regression. Confounders that remained imbalanced after hdPS with IPTW were added to the Cox model. The noninferiority of the q2w regimen was tested with a margin hazard ratio (HR) of 1.25 for q2w vs q1w. Results: 2,869 patients with mCRC exposed to CET were identified, of which 1,865 (65.0%) and 1,004 (35.0%) were classified in the q1w and q2w groups, respectively. The mean age of patients was 60.1±11.7 years for q1w and 58.1±11.1 years for q2w. Most patients were male: 57.5% and 60.8% in q1w and q2w, respectively. Approximately 70% of patients in both groups had received prior treatment for mCRC. The most frequently used CT with CET was irinotecan based (64.5% in q1w and 76.5% in q2w). There were 1,628 deaths observed during follow-up (56.7%). After hdPS with IPTW adjustment, differences remained in associated CT (standardized difference <0.25). 
Crude HR for OS was 1.05 (95% CI, 0.94-1.18), and adjusted HR for OS was 1.04 (95% CI, 0.93-1.17). The inferiority hypothesis was rejected at p<0.001. Conclusions: In this large US claims database, when assessing OS, the q2w administration schedule was found to be noninferior to the q1w schedule in patients with mCRC.
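The noninferiority logic above reduces to checking whether the upper confidence limit of the hazard ratio lies below the prespecified margin. A minimal sketch:

```python
def noninferior(hr_upper_ci, margin=1.25):
    """Declare noninferiority on the hazard-ratio scale when the upper
    confidence limit of the HR falls below the prespecified margin."""
    return hr_upper_ci < margin

# Adjusted HR 1.04 (95% CI 0.93-1.17) against a margin of 1.25
print(noninferior(1.17))   # -> True
print(noninferior(1.30))   # -> False
```

Note the asymmetry with superiority testing: here the CI may cross 1.0, as it does in this study, without invalidating the noninferiority claim.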


Biomedicines ◽  
2021 ◽  
Vol 9 (11) ◽  
pp. 1521
Author(s):  
Jacqueline Roshelli Baker ◽  
Sushma Umesh ◽  
Mazda Jenab ◽  
Lutz Schomburg ◽  
Anne Tjønneland ◽  
...  

A higher selenium (Se) status has been shown to be associated with lower risk for colorectal cancer (CRC), but the importance of Se in survival after CRC diagnosis is not well studied. The associations of prediagnostic circulating Se status (as indicated by serum Se and selenoprotein P (SELENOP) measurements) with overall and CRC-specific mortality were estimated using multivariable Cox proportional hazards regression among 995 CRC cases (515 deaths, 396 from CRC) in the European Prospective Investigation into Cancer and Nutrition (EPIC) cohort. Se and SELENOP serum concentrations were measured on average 46 months before CRC diagnosis. Median follow-up time was 113 months. Participants with Se concentrations in the highest quintile (≥100 µg/L) had a multivariable-adjusted hazard ratio (HR) of 0.73 (95% CI: 0.52–1.02; Ptrend = 0.06) for CRC-specific mortality and 0.77 (95% CI: 0.57–1.03; Ptrend = 0.04) for overall mortality, compared with the lowest quintile (≤67.5 µg/L). Similarly, participants with SELENOP concentrations in the highest (≥5.07 mg/L) compared with the lowest quintile (≤3.53 mg/L) had HRs of 0.89 (95% CI: 0.64–1.24; Ptrend = 0.39) for CRC-specific mortality and 0.83 (95% CI: 0.62–1.11; Ptrend = 0.17) for overall mortality. Higher prediagnostic exposure to Se within an optimal concentration (100–150 µg/L) might be associated with improved survival among CRC patients, although our results were not statistically significant and additional studies are needed to confirm this potential association. Our findings may stimulate further research on selenium’s role in survival among CRC patients especially among those residing in geographic regions with suboptimal Se availability.


2020 ◽  
Author(s):  
Ciyuan Sun ◽  
Jyming Chiang ◽  
Tseching Chen ◽  
Hsinyun Hung ◽  
Jengfu You

Abstract Background: Although hereditary non-polyposis colorectal cancer (HNPCC) can be subtyped by proficient or deficient mismatch repair gene expression (pMMR or dMMR), distinct clinical features of these two patient subgroups have rarely been reported. Methods: We retrospectively analyzed 175 HNPCC patients treated between January 1995 and December 2012. A Cox proportional hazards model was used to compare the differences between the two subgroups. Results: Significant differences in disease-free survival (DFS) and overall survival (OS) existed between dMMR and pMMR. The subgroups also differed in mean age at diagnosis (younger for dMMR: 48.6 vs. 54.3 years), operation type (more extended colectomy for dMMR: 35.8% vs. 14.5%), tumor location (right-colon predominance for dMMR: 61.7% vs. 27.3%; more rectal cases for pMMR: 41.8% vs. 11.7%), tumor differentiation (more poor differentiation for dMMR: 23.3% vs. 9.0%), N staging (more N0 cases for dMMR: 70.8% vs. 50.9%), presence of extra-colonic tumors (more frequent for dMMR: 16.7% vs. 1.8%), and recurrence rates (lower for dMMR: 9.1% vs. 35.3%). The cumulative incidence of developing metachronous colorectal cancer also differed significantly (6.18 for pMMR vs. 20.57 for dMMR per person-years; p<0.001). Conclusions: Distinct clinicopathological features exist between dMMR and pMMR subtype patients; MMR status should be considered to tailor operation type and follow-up surveillance in these two subgroups, all of whom fulfilled the Amsterdam II criteria.



Cancers ◽  
2019 ◽  
Vol 11 (10) ◽  
pp. 1468 ◽  
Author(s):  
Yi-Chun Kuan ◽  
Kuang-Wei Huang ◽  
Cheng-Li Lin ◽  
Jiing-Chyuan Luo ◽  
Chia-Hung Kao

Background: The effect of clopidogrel, whose mechanism of action differs from that of aspirin, on CRC risk remains unknown. We investigated the effects of clopidogrel and aspirin, either as monotherapy or combined, on colorectal cancer (CRC) risk in patients with Type 2 diabetes mellitus (T2DM). Methods: We conducted a cohort study using the Taiwan National Health Insurance Research Database. Four groups comprising 218,903 patients using aspirin monotherapy, 20,158 patients using clopidogrel monotherapy, 42,779 patients using dual antiplatelet therapy, and 281,840 nonuser matched controls were created using propensity score matching. Cox proportional hazards regression was used to evaluate the CRC risk during follow-up. Results: During the 13-year follow-up period, we found 9431 cases of CRC over 3,409,522 person-years. The overall incidence rates of CRC were 2.04, 3.45, 1.55, and 3.52 per 1000 person-years in the aspirin, clopidogrel, dual antiplatelet, and nonuser cohorts, respectively. Relative to nonusers, the adjusted hazard ratios (aHRs) were 0.59 (95% confidence interval [CI], 0.56–0.61), 0.77 (95% CI, 0.68–0.87), and 0.37 (95% CI, 0.33–0.40) for the aspirin, clopidogrel, and dual antiplatelet cohorts, respectively. Dose- and duration-dependent chemopreventive effects were observed in the three cohorts.
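Incidence rates like those above are simply cases divided by accumulated person-time; for example, the pooled figure can be reproduced from the reported totals:

```python
def incidence_rate(cases, person_years, per=1000):
    """Crude incidence rate per `per` person-years of follow-up."""
    return cases / person_years * per

# 9431 CRC cases over 3,409,522 person-years in the full study population
print(round(incidence_rate(9431, 3_409_522), 2))  # -> 2.77
```

Person-years, rather than a simple head count, is what makes rates comparable across cohorts with different follow-up durations.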

