Vitamin D Status in Patients With Stage IV Colorectal Cancer: Findings From Intergroup Trial N9741

2011 ◽  
Vol 29 (12) ◽  
pp. 1599-1606 ◽  
Author(s):  
Kimmie Ng ◽  
Daniel J. Sargent ◽  
Richard M. Goldberg ◽  
Jeffrey A. Meyerhardt ◽  
Erin M. Green ◽  
...  

Purpose Previous studies have suggested that higher plasma 25-hydroxyvitamin D3 [25(OH)D] levels are associated with decreased colorectal cancer risk and improved survival, but the prevalence of vitamin D deficiency in advanced colorectal cancer and its influence on outcomes are unknown. Patients and Methods We prospectively measured plasma 25(OH)D levels in 515 patients with stage IV colorectal cancer participating in a randomized trial of chemotherapy. Vitamin D deficiency was defined as 25(OH)D lower than 20 ng/mL, insufficiency as 20 to 29 ng/mL, and sufficiency as ≥ 30 ng/mL. We examined the association between baseline 25(OH)D level and selected patient characteristics. Cox proportional hazards models were used to calculate hazard ratios (HR) for death, disease progression, and tumor response, adjusted for prognostic factors. Results Among the 515 eligible patients, 50% were vitamin D deficient and 82% were vitamin D insufficient. Plasma 25(OH)D levels were lower in black patients compared with white patients and patients of other races (median, 10.7 v 21.1 v 19.3 ng/mL, respectively; P < .001), and in females compared with males (median, 18.3 v 21.7 ng/mL, respectively; P = .0005). Baseline plasma 25(OH)D levels were not associated with patient outcome, although given the distribution of plasma levels in this cohort, statistical power for the survival analyses was limited. Conclusion Vitamin D deficiency is highly prevalent among patients with stage IV colorectal cancer receiving first-line chemotherapy, particularly in black and female patients.
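The trial's three-level vitamin D classification can be expressed as a small helper function. This is an illustrative sketch of the cutoffs stated in the abstract (<20, 20 to 29, ≥30 ng/mL), not code from the study:

```python
def classify_25ohd(level_ng_ml):
    """Categorize a plasma 25(OH)D level (ng/mL) by the study's cutoffs:
    <20 deficient, 20-29 insufficient, >=30 sufficient."""
    if level_ng_ml < 20:
        return "deficient"
    if level_ng_ml < 30:
        return "insufficient"
    return "sufficient"
```

Note that all three median levels reported above (10.7, 18.3, and 21.7 ng/mL) fall below the 30 ng/mL sufficiency threshold.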

2022 ◽  
Author(s):  
Samo Rozman ◽  
Nina Ružić Gorenjec ◽  
Barbara Jezeršek Novaković

Abstract This retrospective study was undertaken to investigate the association of relative dose intensity (RDI) with the outcome of Hodgkin lymphoma (HL) patients with advanced-stage disease receiving the ABVD regimen (doxorubicin, bleomycin, vinblastine, dacarbazine) or the escalated BEACOPP regimen (bleomycin, etoposide, doxorubicin, cyclophosphamide, vincristine, procarbazine, prednisone). A total of 114 HL patients treated between 2004 and 2013 were enrolled for evaluation. RDI calculations were based on Hryniuk's model. The association of variables with overall survival (OS) and progression-free survival (PFS) was analysed using univariate and multivariate Cox proportional hazards models. The median age of patients was 39 years; the majority of patients were male and had stage IV disease. Fifty-four patients received ABVD and 60 received BEACOPP chemotherapy, with 24 and 4 deaths, respectively. Patients in the BEACOPP group were significantly younger and had a lower Charlson comorbidity index (CCI) than those in the ABVD group, making a direct comparison of the groups impossible. In the ABVD group, RDI was not significantly associated with OS (p=0.590) or PFS (p=0.354) in a multivariate model controlling for age. The low number of events prevented this analysis in the BEACOPP group. Patients' age was strongly associated with both OS and PFS: all statistically significant predictors of OS and PFS from the univariate analyses (chemotherapy regimen, CCI, RDI) lost their effect in multivariate analyses controlling for age. Based on our observations, we conclude that RDI is not associated with OS or PFS once age is controlled for, either in all patients combined or in the individual chemotherapy groups.
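Hryniuk-style dose-intensity calculations reduce each drug's delivered dose to mg/m² per week and compare it with the protocol plan. The sketch below shows one common formulation (the mean of per-drug RDIs); the data structures and the averaging rule are illustrative assumptions, not the paper's exact implementation:

```python
def dose_intensity(total_dose_mg_m2, weeks):
    """Dose intensity in mg/m2 per week."""
    return total_dose_mg_m2 / weeks

def relative_dose_intensity(delivered, planned):
    """Mean of per-drug relative dose intensities. `delivered` and `planned`
    map drug name -> (cumulative dose in mg/m2, duration in weeks).
    Drug names and tuple layout are illustrative."""
    ratios = [
        dose_intensity(*delivered[drug]) / dose_intensity(*planned[drug])
        for drug in planned
    ]
    return sum(ratios) / len(ratios)
```

Under this formulation, a patient who receives the full planned dose of every drug but over twice the planned time has an RDI of 0.5, since delays dilute dose intensity as much as reductions do.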


2021 ◽  
Vol 39 (15_suppl) ◽  
pp. 4578-4578
Author(s):  
Bradley Alexander McGregor ◽  
Daniel M. Geynisman ◽  
Mauricio Burotto ◽  
Camillo Porta ◽  
Cristina Suarez Rodriguez ◽  
...  

4578 Background: Nivolumab in combination with cabozantinib (N+C) has demonstrated significantly improved progression-free survival (PFS), objective response rate (ORR), and overall survival (OS) compared with sunitinib as a first-line (1L) treatment for advanced renal cell carcinoma (aRCC) in the phase 3 CheckMate (CM) 9ER trial. As there are no head-to-head trials comparing N+C with pembrolizumab in combination with axitinib (P+A), this matching-adjusted indirect comparison (MAIC) compared the efficacy of N+C with that of P+A as 1L treatment for aRCC. Methods: An MAIC was conducted using individual patient data on N+C (N = 323) from the CM 9ER trial (median follow-up: 23.5 months) and published data on P+A (N = 432) from the KEYNOTE (KN)-426 trial (median follow-up: 30.6 months). Individual patients in the CM 9ER trial population were reweighted to match the key patient characteristics published for the KN-426 trial, including age, gender, previous nephrectomy, International Metastatic RCC Database Consortium risk score, and sites of metastasis. After weighting, hazard ratios (HRs) for PFS, duration of response (DoR), and OS comparing N+C vs. P+A were estimated using weighted Cox proportional hazards models, and ORR was compared using a weighted Wald test. All comparisons were conducted using the corresponding sunitinib arms as an anchor. Results: After weighting, patient characteristics in the CM 9ER trial were comparable to those in the KN-426 trial. In the weighted population, N+C had a median PFS of 19.3 months (95% CI: 15.2, 22.4) compared with a median PFS of 15.7 months (95% CI: 13.7, 20.6) for P+A. Using sunitinib as an anchor arm, N+C was associated with a 30% reduction in the risk of progression or death compared with P+A (HR: 0.70, 95% CI: 0.53, 0.93; P = 0.015; table). In addition, N+C was associated with a numerically, although not statistically significantly, greater improvement in ORR vs. sunitinib (difference: 8.4%, 95% CI: -1.7%, 18.4%; P = 0.105) and improved DoR (HR: 0.79; 95% CI: 0.47, 1.31; P = 0.359). Similar OS outcomes were observed for N+C and P+A (HR: 0.99; 95% CI: 0.67, 1.44; P = 0.940). Conclusions: After adjusting for cross-trial differences, N+C had a more favorable efficacy profile than P+A, including statistically significant PFS benefits, numerically improved ORR and DoR, and similar OS. [Table: see text]
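In an MAIC, individual patient data from one trial are reweighted so that weighted summary statistics match the published aggregates of the comparator trial. The single-covariate sketch below solves the method-of-moments weights w_i = exp(a·x_i) by bisection; real analyses match several characteristics simultaneously, and the variable names here are illustrative:

```python
import math

def maic_weights(x, target_mean, lo=-5.0, hi=5.0):
    """Weights w_i = exp(a * (x_i - mean(x))) with the tilt parameter `a`
    chosen so the weighted mean of covariate x matches target_mean.
    Centering x keeps the exponents numerically tame; the weighted mean
    is monotonically increasing in `a`, so bisection suffices."""
    xbar = sum(x) / len(x)
    xc = [xi - xbar for xi in x]

    def weighted_mean(a):
        w = [math.exp(a * ci) for ci in xc]
        return sum(wi * xi for wi, xi in zip(w, x)) / sum(w)

    for _ in range(200):  # bisect until the bracket is negligibly small
        mid = (lo + hi) / 2
        if weighted_mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    a = (lo + hi) / 2
    return [math.exp(a * ci) for ci in xc]
```

After weighting, effect estimates are computed with weighted Cox models anchored on the common sunitinib arms, as the abstract describes.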


2019 ◽  
Vol 35 (3) ◽  
pp. 488-495 ◽  
Author(s):  
Thijs T Jansz ◽  
Marlies Noordzij ◽  
Anneke Kramer ◽  
Eric Laruelle ◽  
Cécile Couchoud ◽  
...  

Abstract Background Previous US studies have indicated that haemodialysis with ≥6-h sessions [extended-hours haemodialysis (EHD)] may improve patient survival. However, patient characteristics and treatment practices vary between the USA and Europe. We therefore investigated the effect of EHD three times weekly on survival compared with conventional haemodialysis (CHD) among European patients. Methods We included patients who were treated with haemodialysis between 2010 and 2017 from eight countries providing data to the European Renal Association–European Dialysis and Transplant Association Registry. Haemodialysis session duration and frequency were recorded once every year or at every change of haemodialysis prescription and were categorized into three groups: CHD (three times weekly, 3.5–4 h/treatment), EHD (three times weekly, ≥6 h/treatment) or other. In the primary analyses we attributed death to the treatment at the time of death and in secondary analyses to EHD if ever initiated. We compared mortality risk for EHD to CHD with causal inference from marginal structural models, using Cox proportional hazards models weighted for the inverse probability of treatment and censoring and adjusted for potential confounders. Results From a total of 142 460 patients, 1338 patients were ever treated with EHD (three times, 7.1 ± 0.8 h/week) and 89 819 patients were treated exclusively with CHD (three times, 3.9 ± 0.2 h/week). Crude mortality rates were 6.0 and 13.5/100 person-years. In the primary analyses, patients treated with EHD had an adjusted hazard ratio (HR) of 0.73 [95% confidence interval (CI) 0.62–0.85] compared with patients treated with CHD. When we attributed all deaths to EHD after initiation, the HR for EHD was comparable to the primary analyses [HR 0.80 (95% CI 0.71–0.90)]. Conclusions EHD is associated with better survival in European patients treated with haemodialysis three times weekly.
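Marginal structural models of this kind weight each patient by the inverse probability of the treatment actually received; stabilized weights put the marginal treatment probability in the numerator to tame extreme values. A minimal sketch, assuming the propensities have already been estimated (e.g. by logistic regression):

```python
def stabilized_iptw(treated, propensity):
    """Stabilized inverse-probability-of-treatment weights.
    treated: 1 if the patient received EHD, 0 if CHD (illustrative coding);
    propensity: estimated P(treatment=1 | covariates) per patient.
    Numerator is the marginal treatment probability, denominator the
    covariate-conditional propensity for the treatment received."""
    p_marg = sum(treated) / len(treated)
    weights = []
    for a, p in zip(treated, propensity):
        if a == 1:
            weights.append(p_marg / p)
        else:
            weights.append((1 - p_marg) / (1 - p))
    return weights
```

A patient whose covariates made the received treatment unlikely gets upweighted, which is what removes confounding by indication in the weighted Cox model.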


2007 ◽  
Vol 25 (18_suppl) ◽  
pp. 4048-4048
Author(s):  
Y. Yeh ◽  
Q. Cai ◽  
J. Chao ◽  
M. Russell

4048 Background: NCCN guidelines recommend assessment of ≥12 lymph nodes (LN) to improve accuracy in colorectal cancer (CRC) staging. Previous studies have used various cut-points to assess the relationship between the number of LN sampled and survival. The association between NCCN guideline-compliant nodal sampling and survival is assessed here, while controlling for other risk factors. Methods: We selected 145,485 adult patients newly diagnosed with stage II or III CRC from SEER during 1990–2003. Kaplan-Meier curves were compared using the log-rank test. Cox proportional hazards models were constructed to determine the effect of sampling ≥12 LN on survival. Results: Median patient follow-up was 5.7 years. The table shows overall survival rates in CRC patients with <12 versus ≥12 LN assessed. After adjusting for age, sex, tumor size, and grade, sampling ≥12 LN was independently associated with improved survival. For patients with ≥12 versus <12 LN assessed, survival increased by 13% for stage IIa [HR=0.75; 95%CI 0.72–0.78; p< .001], 16% for stage IIb [HR=0.69; 95%CI 0.67–0.71; p< .001], 12% for stage IIIb [HR=0.75; 95%CI 0.72–0.77], and 10% for stage IIIc [HR=0.85; 95%CI 0.81–0.89]. The association was not statistically significant for stage IIIa patients. Conclusion: Consistent with previous reports, this analysis found that optimal nodal sampling, specifically of ≥12 LN, was associated with improved survival across stage II and III when controlling for other risk factors. Furthermore, the results underscore the need for adherence to the NCCN guidelines. The lack of a statistically significant association in stage IIIa patients may be due to the small size of that cohort. [Table: see text]
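The Kaplan-Meier curves compared here with the log-rank test are built by multiplying, at each event time, the fraction of at-risk patients who survive that time. A from-scratch sketch of the estimator (illustrative, not the SEER analysis code):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve. times: follow-up times; events: 1 if
    death observed, 0 if censored. Returns (time, S(t)) pairs at each
    time where at least one death occurred."""
    pairs = sorted(zip(times, events))
    at_risk = len(pairs)
    s = 1.0
    curve = []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        deaths = removed = 0
        while i < len(pairs) and pairs[i][0] == t:  # group ties at time t
            deaths += pairs[i][1]
            removed += 1
            i += 1
        if deaths:
            s *= 1 - deaths / at_risk  # survive this event time
            curve.append((t, s))
        at_risk -= removed  # deaths and censorings both leave the risk set
    return curve
```

Censored patients contribute to the risk set up to their censoring time but never drop the curve, which is what distinguishes this from a naive survival fraction.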


2021 ◽  
Vol 19 (3) ◽  
pp. 307-318
Author(s):  
Johannes Uhlig ◽  
Michael Cecchini ◽  
Amar Sheth ◽  
Stacey Stein ◽  
Jill Lacy ◽  
...  

Background: This study sought to assess microsatellite and KRAS status, prevalence, and impact on outcome in stage IV colorectal cancer (CRC). Materials and Methods: The 2010 to 2016 US National Cancer Database was queried for adult patients with stage IV CRC. Prevalence of microsatellite status (microsatellite instability–high [MSI-H] or microsatellite stable [MSS]) and KRAS status (KRAS mutation or wild-type) of the primary CRC was assessed. Overall survival (OS) was evaluated using multivariable Cox proportional hazards models in patients with complete data on both microsatellite and KRAS status and information on follow-up. Results: Information on microsatellite and KRAS status was available for 10,844 and 25,712 patients, respectively, and OS data were available for 5,904 patients. The overall prevalence of MSI-H status and KRAS mutation was 3.1% and 42.4%, respectively. Prevalence of MSI-H ranged between 1.6% (rectosigmoid junction) and 5.2% (transverse colon), and between 34.7% (sigmoid colon) and 58.2% (cecum) for KRAS mutation. MSI-H rates were highest in East North Central US states (4.1%), and KRAS mutation rates were highest in West South Central US states (44.1%). Multivariable analyses revealed longer OS for patients with KRAS wild-type versus mutation status (hazard ratio [HR], 0.91; 95% CI, 0.85–0.97; P=.004), those with MSS versus MSI-H status (HR, 0.75; 95% CI, 0.62–0.9; P=.003), and those with left-sided versus right-sided CRC (multivariable HR, 0.65; 95% CI, 0.6–0.7; P<.001). The effect of KRAS mutation further varied with CRC site and microsatellite status (P=.002 for interaction). Conclusions: Depending on the primary site and US geography, stage IV CRC shows distinct mutational behavior. KRAS mutation, MSI-H, and primary CRC sidedness independently affect OS and interact with distinct prognostic profiles. Generically classifying adenocarcinomas at different sites as CRC might deprecate this diversity.
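The hazard ratios reported throughout these studies come from maximizing the Cox partial likelihood. For a single binary covariate this can be done from scratch with Newton-Raphson; the sketch below (which assumes untied event times) is illustrative, not the authors' multivariable code:

```python
import math

def cox_hr_binary(times, events, group):
    """Hazard ratio exp(b) for a binary covariate `group`, maximizing the
    Cox partial likelihood by Newton-Raphson. events: 1 = event, 0 =
    censored. Assumes untied event times for an exact risk set."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    b = 0.0
    for _ in range(50):
        grad = hess = 0.0
        for pos, i in enumerate(order):
            if not events[i]:
                continue
            risk = order[pos:]  # subjects still under observation at times[i]
            s0 = sum(math.exp(b * group[j]) for j in risk)
            s1 = sum(group[j] * math.exp(b * group[j]) for j in risk)
            mu = s1 / s0            # risk-set mean of the covariate
            grad += group[i] - mu   # score contribution
            hess += mu * (1 - mu)   # since x is binary, s2 == s1
        if hess == 0:
            break
        step = grad / hess
        b += step
        if abs(step) < 1e-12:
            break
    return math.exp(b)
```

Each event compares the covariate of the patient who failed against the weighted average over everyone still at risk; the log-HR is the value of b that balances those comparisons.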


2012 ◽  
Vol 30 (15_suppl) ◽  
pp. e14016-e14016
Author(s):  
Brian S. Seal ◽  
Benjamin Chastek ◽  
Mahesh Kulakodlu ◽  
Satish Valluri

e14016 Background: Improvements in survival for advanced-stage CRC patients who receive chemotherapy have been reported. We compared survival rates for patients with 3+ vs. <3 lines of therapy. Methods: Adult patients with a diagnosis of CRC between 01/01/05 and 05/31/10 were identified from the Impact Intelligence Oncology Management (IIOM) registry. Patients with either stage 4 CRC at original diagnosis or development of metastasis were included. Registry data included original stage and date of diagnosis. Linked healthcare claims from the Life Sciences Research Database, a large US health insurance database affiliated with OptumInsight, were used to identify lines of therapy after metastases and patient characteristics. Death data were obtained from the Social Security Administration’s master death file. Patients were categorized by number of lines of therapy received (0, 1, 2, 3+) and original stage at diagnosis (0-2, 3, 4, unknown). Survival following metastases was evaluated using Cox proportional hazards models controlling for lines of therapy received, stage, and other patient characteristics. Results: 598 patients, followed for a mean of 653 days after becoming metastatic, were included. Mean unadjusted length of follow-up was lowest among patients who received no chemotherapy (516 days) or only 1 line (511 days), and increased to 627 days for those with 2 lines and 930 days for those with 3+ lines. However, multivariate analysis indicated that patients with 3+ lines had comparable survival vs. those with 0 (HR=0.79), 1 (HR=1.59), or 2 (HR=1.15) lines of therapy (p>0.05 for all comparisons). Compared to patients who presented with stage 4 CRC, those who progressed from stage 0-2 (HR=1.22), stage 3 (HR=0.83), or unknown stage (HR=1.18) had similar survival after metastases (p>0.05 for all comparisons). 
After excluding 94 patients who did not receive chemotherapy, patients treated with a first-line oxaliplatin-based regimen (HR=1.28; p=0.24) had survival similar to that of patients treated with a first-line irinotecan-based or anti-EGFR regimen. Conclusions: Lines of therapy received and initial stage were not associated with survival after the development of metastases.


2016 ◽  
Vol 34 (4_suppl) ◽  
pp. 421-421
Author(s):  
Mariam F. Eskander ◽  
Gyulnara G. Kasumova ◽  
Chun Li ◽  
Sing Chau Ng ◽  
Rebecca A. Miksad ◽  
...  

421 Background: There are increasing therapeutic options for patients with advanced pancreatic cancer, but it is unknown whether the overall prognosis of unresectable patients is improving. Here, we examine trends in treatment and survival in stage III/IV pancreatic cancer. Methods: The National Cancer Data Base 1998-2012 was queried for unresected pancreatic adenocarcinoma patients from Commission on Cancer hospitals with stage III and IV disease. Trends in stage at diagnosis and type of chemotherapy (single vs. multi-agent) were assessed via Cochran-Armitage trend tests. Timing of treatment was compared by Kruskal-Wallis test. Kaplan-Meier analysis and Cox proportional hazards models were used to assess the association between 2-year time intervals (1998-2011) and survival. Results: 34,163 unresected patients with stage III and 100,396 with stage IV disease were identified. Rates of chemotherapy increased over time for stage III (p<0.0001) and stage IV (p<0.0001). Among patients who received systemic therapy, rates of multiagent chemotherapy increased for both stage III (p<0.0001) and IV (p<0.0001). Time from diagnosis to treatment did not change (p=0.5121). Overall survival differed by year group for stage III (5.2 mos in 1998-1999 vs. 9.0 mos in 2010-2011, log-rank p<0.0001) and stage IV (3.1 vs. 3.6 mos; log-rank p<0.0001). Among patients who received chemotherapy, overall survival also differed (stage III, 7.6 vs. 11.4 mos, log-rank p<0.0001; stage IV, 5.0 vs. 6.0 mos, log-rank p<0.0001). After stratification by clinical stage, type of chemotherapy, tumor location, and facility type, year remained a significant predictor of survival (p<0.0001). Conclusions: Survival of patients with stage III and IV pancreatic cancer has significantly improved over the last fifteen years. This improvement in survival is not fully explained by changes in chemotherapy. [Table: see text]
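The Cochran-Armitage test used here checks for a linear trend in a proportion (e.g. chemotherapy receipt) across ordered groups (e.g. 2-year periods). A from-scratch sketch of the usual Z statistic:

```python
import math

def cochran_armitage_z(events, totals, scores=None):
    """Cochran-Armitage test for trend in proportions across ordered groups.
    events[i]/totals[i] is the proportion in group i; scores default to
    0, 1, 2, ... (equally spaced periods). Returns the Z statistic, which
    is approximately standard normal under no trend."""
    k = len(events)
    if scores is None:
        scores = list(range(k))
    N = sum(totals)
    R = sum(events)
    p = R / N  # pooled proportion
    # Score-weighted excess of observed over expected events
    t = sum(s * (r - n * p) for s, r, n in zip(scores, events, totals))
    mean_s = sum(s * n for s, n in zip(scores, totals)) / N
    var = p * (1 - p) * sum(n * (s - mean_s) ** 2 for s, n in zip(scores, totals))
    return t / math.sqrt(var)
```

A positive Z indicates the proportion rises with the group score, matching the increasing chemotherapy rates reported above.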


2019 ◽  
Vol 189 (3) ◽  
pp. 224-234
Author(s):  
Jamie M Madden ◽  
Finbarr P Leacy ◽  
Lina Zgaga ◽  
Kathleen Bennett

Abstract Studies have shown that accounting for time-varying confounding through time-dependent Cox proportional hazards models may provide biased estimates of the causal effect of treatment when the confounder is also a mediator. We explore 2 alternative approaches to addressing this problem while examining the association between vitamin D supplementation initiated after breast cancer diagnosis and all-cause mortality. Women aged 50–80 years were identified in the National Cancer Registry Ireland (n = 5,417) between 2001 and 2011. Vitamin D use was identified from linked prescription data (n = 2,570). We sought to account for the time-varying nature of vitamin D use and time-varying confounding by bisphosphonate use using 1) marginal structural models (MSMs) and 2) G-estimation of structural nested accelerated failure-time models (SNAFTMs). Using standard adjusted Cox proportional hazards models, we found a reduction in all-cause mortality in de novo vitamin D users compared with nonusers (hazard ratio (HR) = 0.84, 95% confidence interval (CI): 0.73, 0.99). Additional adjustment for vitamin D and bisphosphonate use in the previous month reduced the hazard ratio (HR = 0.45, 95% CI: 0.33, 0.63). Results derived from MSMs (HR = 0.44, 95% CI: 0.32, 0.61) and SNAFTMs (HR = 0.45, 95% CI: 0.34, 0.52) were similar. Utilizing MSMs and SNAFTMs to account for time-varying bisphosphonate use did not alter conclusions in this example.
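The MSM approach handles a confounder that is also a mediator (here, bisphosphonate use) by weighting rather than covariate adjustment: each patient-interval gets a stabilized weight whose running product enters a weighted outcome model. A sketch of the weight assembly alone, assuming the interval-specific probabilities have already been fit (e.g. with pooled logistic models; all names are illustrative):

```python
def cumulative_stabilized_weights(treat_hist, p_num, p_den):
    """Running product of per-interval stabilized weights for an MSM with
    time-varying treatment. treat_hist[k]: 1 if treated in interval k;
    p_num[k]: P(treated | treatment history) from the numerator model;
    p_den[k]: P(treated | treatment history and time-varying confounders)
    from the denominator model."""
    w = 1.0
    weights = []
    for a, pn, pd in zip(treat_hist, p_num, p_den):
        num = pn if a == 1 else 1 - pn
        den = pd if a == 1 else 1 - pd
        w *= num / den
        weights.append(w)
    return weights
```

When the confounder carries no extra information about treatment, numerator and denominator agree and the weights stay at 1; diverging probabilities inflate or deflate the patient-interval's contribution instead of blocking the mediated pathway.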


Author(s):  
Barbara Putman ◽  
Lies Lahousse ◽  
David G. Goldfarb ◽  
Rachel Zeig-Owens ◽  
Theresa Schwartz ◽  
...  

The factors that predict treatment of lung injury in occupational cohorts are poorly defined. We aimed to identify patient characteristics associated with initiation of treatment with an inhaled corticosteroid/long-acting beta-agonist (ICS/LABA) for >2 years among World Trade Center (WTC)-exposed firefighters. The study population included 8530 WTC-exposed firefighters. Multivariable logistic regression assessed the association of patient characteristics with ICS/LABA treatment for >2 years over two-year intervals from 11 September 2001 to 10 September 2017. Cox proportional hazards models measured the association of a high predicted probability of ICS/LABA initiation with actual ICS/LABA initiation in subsequent intervals. Between 11 September 2001 and 1 July 2018, 1629/8530 (19.1%) firefighters initiated ICS/LABA treatment for >2 years. Forced Expiratory Volume in 1 s (FEV1), wheeze, and dyspnea were consistently and independently associated with ICS/LABA treatment. High-intensity WTC exposure was associated with ICS/LABA between 11 September 2001 and 10 September 2003. The 10th percentile of risk for ICS/LABA between 11 September 2005 and 10 September 2007 was associated with a 3.32-fold increased hazard of actual ICS/LABA initiation in the subsequent 4 years. In firefighters with WTC exposure, FEV1, wheeze, and dyspnea were independently associated with prolonged ICS/LABA treatment. A high risk for treatment was identifiable from routine monitoring exam results years before treatment initiation.


2017 ◽  
Vol 2017 ◽  
pp. 1-9 ◽  
Author(s):  
Chih-Pei Su ◽  
Jr-Hau Wu ◽  
Mei-Chueh Yang ◽  
Ching-Hui Liao ◽  
Hsiu-Ying Hsu ◽  
...  

The outcome of patients suffering from out-of-hospital cardiac arrest (OHCA) is very poor, and postresuscitation comorbidities increase long-term mortality. This study aims to analyze new-onset postresuscitation comorbidities in patients who survived OHCA for over one year. The Taiwan National Health Insurance (NHI) Database was used in this study. Study and comparison groups were created to analyze the risk of suffering from new-onset postresuscitation comorbidities from 2011 to 2012 (until December 31, 2013). The study group included 1,346 long-term OHCA survivors; the comparison group consisted of 4,038 matched non-OHCA patients. Demographics, patient characteristics, and risk of suffering comorbidities (using Cox proportional hazards models) were analyzed. We found that urinary tract infections (n=225, 16.72%), pneumonia (n=206, 15.30%), septicemia (n=184, 13.67%), heart failure (n=111, 8.25%), gastrointestinal hemorrhage (n=108, 8.02%), epilepsy or recurrent seizures (n=98, 7.28%), and chronic kidney disease (n=62, 4.61%) were the most common comorbidities. Furthermore, OHCA survivors were at much higher risk (than comparison patients) of experiencing epilepsy or recurrent seizures (HR = 20.83; 95% CI: 12.24–35.43), septicemia (HR = 8.98; 95% CI: 6.84–11.79), pneumonia (HR = 5.82; 95% CI: 4.66–7.26), and heart failure (HR = 4.88; 95% CI: 3.65–6.53). Most importantly, most comorbidities occurred within the first half year after OHCA.

