Is more aggressive lymph node assessment associated with improved survival in stage II-III colorectal cancer? Evidence from the Surveillance, Epidemiology and End Results (SEER) cancer registry

2007 ◽  
Vol 25 (18_suppl) ◽  
pp. 4048-4048
Author(s):  
Y. Yeh ◽  
Q. Cai ◽  
J. Chao ◽  
M. Russell

4048 Background: NCCN guidelines recommend assessment of ≥12 lymph nodes (LN) to improve accuracy in colorectal cancer (CRC) staging. Previous studies have used various cut points to assess the relationship between the number of LN sampled and survival. We assessed the association between NCCN guideline-compliant nodal sampling and survival while controlling for other risk factors. Methods: We selected 145,485 adult patients newly diagnosed with stage II or III CRC from SEER during 1990–2003. Kaplan-Meier curves were compared using the log-rank test. Cox proportional hazards models were constructed to determine the effect of sampling ≥12 LN on survival. Results: Median patient follow-up was 5.7 years. The table shows overall survival rates in CRC patients with <12 versus ≥12 LN assessed. After adjusting for age, sex, tumor size, and grade, sampling ≥12 LN was independently associated with improved survival. For patients with ≥12 versus <12 LN assessed, survival increased by 13% for stage IIa [HR=0.75; 95% CI 0.72–0.78; p<.001], 16% for stage IIb [HR=0.69; 95% CI 0.67–0.71; p<.001], 12% for stage IIIb [HR=0.75; 95% CI 0.72–0.77], and 10% for stage IIIc [HR=0.85; 95% CI 0.81–0.89]. The association was not statistically significant for stage IIIa patients. Conclusion: Consistent with previous reports, this analysis found that optimal nodal sampling, specifically ≥12 LN, was associated with improved survival across stages II and III when controlling for other risk factors. The results underscore the need for adherence to the NCCN guidelines. The lack of a statistically significant association in stage IIIa patients may be due to small cohort size. [Table: see text]
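
The workflow this abstract describes (Kaplan-Meier curves compared by log-rank test, then a multivariable Cox model for the ≥12-LN effect) can be sketched in Python with the lifelines library. This is a minimal illustration on simulated data, not the authors' SEER analysis; every column name and number below is invented.

```python
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "ln_ge_12": rng.integers(0, 2, n),        # 1 if >= 12 nodes assessed
    "age": rng.normal(68, 10, n),
    "male": rng.integers(0, 2, n),
    "tumor_size_cm": rng.gamma(2.0, 2.0, n),
    "high_grade": rng.integers(0, 2, n),
    "died": rng.integers(0, 2, n),
})
# Simulated follow-up; the exposed group gets longer times for illustration only.
df["years"] = rng.exponential(8.0, n) * np.where(df["ln_ge_12"] == 1, 1.3, 1.0)

a, b = df[df.ln_ge_12 == 1], df[df.ln_ge_12 == 0]
kmf = KaplanMeierFitter()
print(kmf.fit(a.years, a.died, label=">=12 LN").median_survival_time_)
print(kmf.fit(b.years, b.died, label="<12 LN").median_survival_time_)
print(logrank_test(a.years, b.years,
                   event_observed_A=a.died, event_observed_B=b.died).p_value)

# Multivariable Cox: HR < 1 for ln_ge_12 means better survival with >= 12 LN.
cph = CoxPHFitter().fit(df, duration_col="years", event_col="died")
cph.print_summary()
```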

2012 ◽  
Vol 30 (26) ◽  
pp. 3229-3233 ◽  
Author(s):  
Hamed Khalili ◽  
Edward S. Huang ◽  
Shuji Ogino ◽  
Charles S. Fuchs ◽  
Andrew T. Chan

Purpose Bisphosphonates are used for the treatment of bone metastases and have been associated with a lower risk of breast cancer. A recent case-control study showed an inverse association between bisphosphonate use and colorectal cancer; data from prospective cohorts are lacking. Patients and Methods We prospectively examined the relationship between bisphosphonate use and risk of colorectal cancer among 86,277 women enrolled onto the Nurses' Health Study (NHS). Since 1998, participants have returned biennial questionnaires in which they were specifically queried about the regular use of bisphosphonates. We used Cox proportional hazards models to calculate hazard ratios (HRs) and 95% CIs for risk of colorectal cancer. Results Through 2008, we documented 801 cases of colorectal cancer over 814,406 person-years of follow-up. The age-adjusted HR for women who regularly used bisphosphonates was 0.92 (95% CI, 0.73 to 1.14) and was further attenuated after adjustment for other risk factors (multivariate HR, 1.04; 95% CI, 0.82 to 1.33). The risk was not influenced by duration of use (Ptrend = 0.79). Compared with nonusers, the multivariate-adjusted HRs of colorectal cancer were 1.24 (95% CI, 0.94 to 1.64) for women with 1 to 2 years of use, 1.16 (95% CI, 0.79 to 1.69) for 3 to 4 years of use, and 0.97 (95% CI, 0.60 to 1.56) for ≥ 5 years of use. There was no association between bisphosphonate use and colorectal cancer within strata of other risk factors. Conclusion In a large prospective cohort, we did not observe an association between long-term use of bisphosphonates and risk of colorectal cancer.
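
A hedged sketch of the duration-of-use pattern above: indicator terms for each use category against nonusers, plus a refit with an ordinal score as one common way to obtain a Ptrend (the NHS analysis may have computed it differently). All data and variable names are invented.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 5000
use_years = rng.choice(["none", "1-2", "3-4", "5+"], n, p=[0.70, 0.15, 0.10, 0.05])
df = pd.DataFrame({
    "py": rng.exponential(9.4, n),        # person-years of follow-up
    "crc": rng.binomial(1, 0.01, n),      # incident colorectal cancer
    "age": rng.normal(62, 7, n),
    "bmi": rng.normal(26, 4, n),
})
df = df.join(pd.get_dummies(use_years, prefix="use", dtype=float)
               .drop(columns="use_none"))          # nonusers are the reference

cph = CoxPHFitter().fit(df, duration_col="py", event_col="crc")
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])

# P-trend recipe: refit with an ordinal score in place of the indicators.
df["use_score"] = pd.Series(use_years).map({"none": 0, "1-2": 1, "3-4": 2, "5+": 3})
trend = CoxPHFitter().fit(df[["py", "crc", "age", "bmi", "use_score"]],
                          duration_col="py", event_col="crc")
print(trend.summary.loc["use_score", "p"])
```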


Gerontology ◽  
2021 ◽  
pp. 1-9
Author(s):  
Feng Cheng Lin ◽  
Chih Yin Chen ◽  
Chung Wei Lin ◽  
Ming Tsang Wu ◽  
Hsuan Yu Chen ◽  
...  

Introduction: Dementia is one of the major causes of disability and dependency among older people worldwide. Alzheimer's disease (AD), the most common cause of dementia among the elderly, has a great impact on the health-care systems of developed nations. Several factors are suggestive of an increased risk of AD, including APOE-ε4, male sex, older age, diabetes mellitus, hypertension, and low social engagement. However, data on risk factors for AD progression are limited. Air pollution has been shown to be associated with increasing dementia incidence, but the relationship between air pollution and clinical AD cognitive deterioration is unclear. Methods: We conducted a case-control, city-to-city study to compare the progression of AD patients in cities with different levels of air pollution. Clinical data of a total of 704 AD patients were retrospectively collected, 584 residents of Kaohsiung and 120 residents of Pingtung, between 2002 and 2018. An annual interview was performed with each patient, and the Clinical Dementia Rating (CDR) score (0 [normal] to 3 [severe stage]) was used to evaluate cognitive deterioration. Air pollution data for Kaohsiung and Pingtung for 2002–2018 were retrieved from the Taiwan Environmental Protection Administration. The annual Pollutant Standards Index (PSI) and concentrations of particulate matter (PM10), sulfur dioxide (SO2), ozone (O3), nitrogen dioxide (NO2), and carbon monoxide (CO) were obtained. Results: The PSI was higher in Kaohsiung, and compared with Pingtung patients, Kaohsiung patients were exposed to higher average annual concentrations of CO, NO2, PM10, and SO2. AD patients living in Kaohsiung suffered faster cognitive deterioration than Pingtung patients (log-rank test: p = 0.016). In multivariate Cox proportional hazards regression analysis, higher levels of CO, NO2, PM10, and SO2 exposure were associated with an increased risk of AD cognitive deterioration. Among these air pollutants, high SO2 exposure had the greatest impact, while O3 had a neutral effect on AD cognitive deterioration. Conclusions: Air pollution is a controllable, environment-related risk factor associated with the cognitive deterioration of AD. This finding could contribute to the implementation of public intervention strategies for AD.
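
A minimal sketch, on invented data, of the multivariable Cox step described here: time to CDR worsening regressed on average annual pollutant exposures. Column names and units are assumptions, not the study's variables.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 704
df = pd.DataFrame({
    "years_to_worse_cdr": rng.exponential(4.0, n),  # time to CDR worsening
    "worsened": rng.binomial(1, 0.6, n),
    "co_ppm": rng.normal(0.6, 0.1, n),
    "no2_ppb": rng.normal(20, 4, n),
    "pm10_ugm3": rng.normal(60, 12, n),
    "so2_ppb": rng.normal(5, 1.5, n),
    "o3_ppb": rng.normal(30, 6, n),
    "age": rng.normal(78, 7, n),
})
cph = CoxPHFitter().fit(df, duration_col="years_to_worse_cdr", event_col="worsened")
print(cph.hazard_ratios_)   # HR > 1: exposure associated with faster decline
```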


Circulation ◽  
2012 ◽  
Vol 125 (suppl_10) ◽  
Author(s):  
Todd M Brown ◽  
Joshua Richman ◽  
Vera Bittner ◽  
Cora E Lewis ◽  
Jenifer Voeks ◽  
...  

Background: Some individuals classified as having metabolic syndrome (MetSyn) are centrally obese while others are not, with unclear implications for cardiovascular (CV) risk. Methods: REGARDS is following 30,239 individuals ≥45 years of age living in 48 states, recruited from 2003 to 2007. MetSyn risk factors were defined using the AHA/NHLBI/IDF harmonized criteria, with central obesity defined as waist circumference ≥88 cm in women and ≥102 cm in men. Participants with and without central obesity were stratified by whether they met >2 or ≤2 of the other 4 MetSyn criteria, resulting in 4 groups. To ascertain CV events, participants are telephoned every 6 months, with expert adjudication of potential events following national consensus recommendations and based on medical records, death certificates, and interviews with next of kin or proxies. Acute coronary heart disease (CHD) was defined as definite or probable myocardial infarction or acute CHD death. To determine the association between these 4 groups and incident acute CHD, we constructed Cox proportional hazards models in those free of CHD at baseline by race/gender group, adjusting for sociodemographic variables. Results: A total of 20,018 individuals with complete data on MetSyn components were free of baseline CHD. Mean age was 64 ± 9 years, 58% were women, and 42% were African American (AA). Over a mean follow-up of 3.4 (maximum 5.9) years, there were 442 acute CHD events. In the non-centrally obese with >2 other risk factors, risk for CHD was higher for all but AA men, though significant only for white men. In contrast, in the centrally obese with >2 other risk factors, risk was doubled for women but only non-significantly and modestly increased for men. Only AA women with central obesity and ≤2 other risk factors had increased CHD risk (Table). Conclusion: The CHD risk associated with the MetSyn varies by the presence of central obesity as well as the race and gender of the individual.
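
The four-group construction (central obesity crossed with meeting >2 of the other four criteria) is mechanical but easy to get wrong, so a sketch with hypothetical REGARDS-like fields follows. The real analysis fit separate models by race/gender group; that stratification is omitted here.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 20018
df = pd.DataFrame({
    "waist_cm": rng.normal(98, 14, n),
    "female": rng.integers(0, 2, n),
    "n_other": rng.integers(0, 5, n),   # how many of the 4 non-waist criteria met
    "years": rng.exponential(3.4, n).clip(max=5.9),
    "chd": rng.binomial(1, 0.022, n),
    "age": rng.normal(64, 9, n),
})
# Sex-specific waist cut points from the harmonized criteria.
central = np.where(df["female"] == 1, df["waist_cm"] >= 88, df["waist_cm"] >= 102)
many = (df["n_other"] > 2).to_numpy()
df["group"] = np.select(
    [~central & ~many, ~central & many, central & ~many, central & many],
    ["ref", "lean_hi_risk", "obese_lo_risk", "obese_hi_risk"])

dummies = pd.get_dummies(df["group"], dtype=float).drop(columns="ref")
model = df[["years", "chd", "age", "female"]].join(dummies)
CoxPHFitter().fit(model, duration_col="years", event_col="chd").print_summary()
```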


2012 ◽  
Vol 30 (15_suppl) ◽  
pp. e14106-e14106
Author(s):  
Susan H Foltz Boklage ◽  
Charles Kreilick ◽  
Carl V Asche ◽  
Sally Haislip ◽  
James W. Gilmore ◽  
...  

e14106 Background: Improvements in survival for advanced-stage colorectal cancer (CRC) patients who receive chemotherapy have been reported. We compared survival rates for patients with 3+ vs. <3 lines of therapy using electronic medical records of a local oncology practice in Georgia, USA. Methods: The Georgia Cancer Specialists (GCS) EMR Database (1/1/2005–7/31/2010) was used. The database contains data on patient demographics, cancer diagnostic information, chemotherapy and non-chemotherapy drugs administered, written prescriptions, chemotherapy/radiation protocols, chemotherapy protocol changes, office visit information, and hospitalizations. Patients newly diagnosed with CRC between 1/1/2005 and 6/30/2010 and treated with systemic therapy for CRC were identified. Patients were followed from initial CRC diagnosis to death, loss to follow-up, or end of study. Patients were categorized by number of lines of therapy received (1, 2, 3+) and original stage at diagnosis (IIIb/c, IV, unknown). Survival following the initial line of therapy was evaluated using Cox proportional hazards models controlling for stage at diagnosis, type of 1st-line treatment, and other patient characteristics. Results: The study included 704 patients with a median age of 63 years (range 26-85 years) at diagnosis; 49% (n=345) were female. 45% (n=317) and 42% (n=296) had stage IV and IIIb/c CRC at diagnosis, respectively. 53% (n=373) received only 1st-line treatment, 27% (n=190) received 1st- and 2nd-line treatment, and 20% (n=141) received 3rd-line treatment and beyond. The median follow-up was 431 days, and death was reported in 27% (n=190) of subjects. The multivariate Cox proportional hazards analysis indicated no statistically significant difference in survival between patients who received 2 lines of therapy and those who received 3+ lines (HR=1.42; p=0.067). Conclusions: No statistically significant association between 2 vs. 3+ total lines of therapy and survival was found in subjects diagnosed with stage IIIb/c and IV CRC. However, a trend toward improved survival was present, suggesting that some patients could benefit from the addition of a 3rd line of therapy; additional studies would be required to confirm this.
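
A minimal, hypothetical sketch of this comparison: a Cox model with a 3+-lines indicator, controlling for dummy-coded stage at diagnosis. It is an illustration on invented data, not the GCS analysis itself.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 704
df = pd.DataFrame({
    "days": rng.exponential(620.0, n),             # survival from 1st-line start
    "died": rng.binomial(1, 0.27, n),
    "lines_3plus": rng.binomial(1, 0.20, n),       # 3+ lines vs. fewer
    "age": rng.normal(63, 11, n),
    "stage": rng.choice(["IIIbc", "IV", "unknown"], n, p=[0.42, 0.45, 0.13]),
})
stage_dummies = pd.get_dummies(df["stage"], prefix="stage", dtype=float)
df = df.join(stage_dummies.drop(columns="stage_IIIbc")).drop(columns="stage")

cph = CoxPHFitter().fit(df, duration_col="days", event_col="died")
print(cph.summary.loc["lines_3plus", ["exp(coef)", "p"]])
```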


2019 ◽  
Vol 37 (15_suppl) ◽  
pp. e18250-e18250
Author(s):  
Jifang Zhou ◽  
Karen Sweiss ◽  
Pritesh Rajni Patel ◽  
Edith Nutescu ◽  
Naomi Ko ◽  
...  

e18250 Background: Adjuvant intravenous bisphosphonates (IV BP) reduce the risk of skeletal-related events (SRE) in patients with multiple myeloma (MM). We examined the effects of bisphosphonate utilization patterns (adherence, cumulative dose, and frequency) on the risk of SRE. Methods: Patients aged 65 years or older and diagnosed with first primary MM between 2001 and 2011 were identified using the Surveillance, Epidemiology and End Results (SEER)-Medicare linked database. Patients receiving at least one dose of IV BP after MM diagnosis were identified, and 5-year SRE-free survival was estimated using the Kaplan-Meier method, stratified by demographic groups and compared with the log-rank test. Cox proportional hazards models were fit to determine the association between IV BP utilization patterns and SRE after propensity score matching. We investigated the outcome of multiple recurrent SRE using the Andersen-Gill approach and estimated subdistribution hazard ratios (SHR) and 95% confidence intervals for the risk of first SRE, accounting for death as a competing risk. Results: The final cohort included 9176 MM patients with a median age of 76 years. The adjusted 5-year competing-risk SRE model showed a 48% reduction in the risk of SRE (SHR 0.52; 95% CI 0.49-0.55) with use of IV BP. In multivariable analyses accounting for competing risks, greater adherence to IV BP, higher cumulative IV BP dose, and more frequent administration were all associated with statistically significant reductions in SRE risk (see Table). Conclusions: Use of IV BP in patients with MM was associated with a significant reduction in SRE risk over the 5-year period after MM diagnosis. The effectiveness of IV BP therapy was greater with increasing cumulative dose, greater adherence, and more frequent administration. [Table: see text]
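
The recurrent-event piece can be approximated with counting-process (start, stop] records, one row per at-risk interval per patient, which is the Andersen-Gill data layout; lifelines' CoxTimeVaryingFitter fits a Cox partial likelihood on such records. This sketch uses invented rows and omits the propensity-score matching; it also does not produce the abstract's subdistribution HRs (Fine-Gray models are typically fit elsewhere, e.g., R's cmprsk, while lifelines' AalenJohansenFitter covers cumulative incidence under competing risks).

```python
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# One row per (start, stop] interval; 'sre' flags an event at 'stop'.
records = pd.DataFrame(
    [(1,   0,  140, 1, 1),     # patient 1, first SRE at day 140
     (1, 140,  900, 1, 1),     # second SRE at day 900
     (1, 900, 1500, 0, 1),     # then censored
     (2,   0,  400, 1, 0),
     (2, 400,  650, 0, 0),
     (3,   0,  300, 0, 1)],
    columns=["id", "start", "stop", "sre", "high_adherence"],
)

ctv = CoxTimeVaryingFitter()
ctv.fit(records, id_col="id", event_col="sre", start_col="start", stop_col="stop")
ctv.print_summary()   # exp(coef) ~ rate ratio of recurrent SREs by adherence
```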


2011 ◽  
Vol 29 (12) ◽  
pp. 1599-1606 ◽  
Author(s):  
Kimmie Ng ◽  
Daniel J. Sargent ◽  
Richard M. Goldberg ◽  
Jeffrey A. Meyerhardt ◽  
Erin M. Green ◽  
...  

Purpose Previous studies have suggested that higher plasma 25-hydroxyvitamin D3 [25(OH)D] levels are associated with decreased colorectal cancer risk and improved survival, but the prevalence of vitamin D deficiency in advanced colorectal cancer and its influence on outcomes are unknown. Patients and Methods We prospectively measured plasma 25(OH)D levels in 515 patients with stage IV colorectal cancer participating in a randomized trial of chemotherapy. Vitamin D deficiency was defined as 25(OH)D lower than 20 ng/mL, insufficiency as 20 to 29 ng/mL, and sufficiency as ≥ 30 ng/mL. We examined the association between baseline 25(OH)D level and selected patient characteristics. Cox proportional hazards models were used to calculate hazard ratios (HR) for death, disease progression, and tumor response, adjusted for prognostic factors. Results Among 515 eligible patients, 50% of the study population was vitamin D deficient, and 82% were vitamin D insufficient (< 30 ng/mL). Plasma 25(OH)D levels were lower in black patients compared with white patients and patients of other race (median, 10.7 v 21.1 v 19.3 ng/mL, respectively; P < .001), and in females compared with males (median, 18.3 v 21.7 ng/mL; P = .0005). Baseline plasma 25(OH)D levels were not associated with patient outcome, although given the distribution of plasma levels in this cohort, statistical power for the survival analyses was limited. Conclusion Vitamin D deficiency is highly prevalent among patients with stage IV colorectal cancer receiving first-line chemotherapy, particularly among black and female patients.
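
A sketch of the biomarker categorization and the adjusted Cox model; the cut points follow the abstract, while the data and column names are invented.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 515
df = pd.DataFrame({
    "vitd_ng_ml": rng.gamma(4.0, 5.0, n),   # plasma 25(OH)D
    "months": rng.exponential(20.0, n),
    "died": rng.binomial(1, 0.8, n),
    "age": rng.normal(61, 11, n),
})
df["vitd_cat"] = pd.cut(df["vitd_ng_ml"], bins=[0, 20, 30, np.inf], right=False,
                        labels=["deficient", "insufficient", "sufficient"])
print(df["vitd_cat"].value_counts(normalize=True))   # prevalence of each category

dummies = pd.get_dummies(df["vitd_cat"], dtype=float).drop(columns="sufficient")
cph = CoxPHFitter().fit(df[["months", "died", "age"]].join(dummies),
                        duration_col="months", event_col="died")
cph.print_summary()   # HRs for deficient/insufficient vs. sufficient
```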


2009 ◽  
Vol 24 (S1) ◽  
pp. 1-1
Author(s):  
J. DiFranza

Aims: The risk factors for trying a cigarette are well known; however, we were interested in the factors that determine which youths become addicted to nicotine once they have tried it. Method: To investigate this, we followed a cohort of 1246 students (mean baseline age 12.2 years) over 4 years. Subjects underwent 11 interviews during which we assessed 45 risk factors, measured diminished autonomy over tobacco with the Hooked On Nicotine Checklist, and evaluated tobacco dependence using the International Classification of Diseases, 10th revision. Cox proportional hazards models were used. Results: Among 217 youths who had inhaled from a cigarette, the loss of autonomy over tobacco was predicted by feeling relaxed the first time inhaling from a cigarette (adjusted hazard ratio (HR)=3.26; 95% CI, 1.95-5.46; P<.001) and depressed mood (HR=1.29; 1.09-1.54; P=.004). Tobacco dependence was predicted by feeling relaxed (HR=2.43; 1.27-4.65; P=.007), familiarity with Joe Camel (HR=2.19; 1.11-4.32; P=.02), novelty seeking (HR=1.56; 1.06-2.29; P=.02), and depressed mood (HR=1.17; 1.04-1.30; P=.007). Conclusion: Once exposure to nicotine had occurred, remarkably few risk factors for smoking consistently contributed to individual differences in susceptibility to the development of dependence. An experience of relaxation in response to the first dose of nicotine was the strongest predictor of both dependence and lost autonomy. This association was not explained by trait anxiety or many other psychosocial factors. These results are discussed in relation to the theory that addiction is initiated by the first dose of nicotine.
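
A minimal sketch of the predictor analysis: time to ICD-10 tobacco dependence regressed on candidate baseline factors, reporting HRs with 95% CIs. Variable names are illustrative stand-ins, not the study's instruments.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(6)
n = 217
df = pd.DataFrame({
    "months_to_dependence": rng.exponential(24.0, n),
    "dependent": rng.binomial(1, 0.3, n),
    "relaxed_first_inhale": rng.binomial(1, 0.25, n),
    "depressed_mood": rng.normal(0, 1, n),
    "novelty_seeking": rng.normal(0, 1, n),
})
cph = CoxPHFitter().fit(df, duration_col="months_to_dependence",
                        event_col="dependent")
print(cph.summary[["exp(coef)", "exp(coef) lower 95%",
                   "exp(coef) upper 95%", "p"]])
```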


2021 ◽  
Vol 5 (Supplement_2) ◽  
pp. 470-470
Author(s):  
Claudia Martinez ◽  
Eduardo Ortíz-Panozo ◽  
Dalia Stern ◽  
Adrián Cortés ◽  
Josiemer Mattei ◽  
...  

Abstract Objectives: To examine the relation between breakfast frequency and incidence of diabetes in middle-aged women. Methods: The Mexican Teachers' Cohort is a prospective study in women. We included 71,373 participants at baseline (2006–2008). Participants were classified according to breakfast frequency (0, 1–3, 4–6, or 7 d/wk) and meal frequency (1–2, 3–4, or ≥5 meals/d). Diabetes was self-reported. We used Cox proportional hazards models to calculate hazard ratios (HR) and 95% confidence intervals (CI) for the association between breakfast frequency and diabetes incidence. Models were adjusted for sociodemographic and lifestyle confounders that are associated with breakfast consumption and are risk factors for diabetes. Stratified analyses were performed for age, birth weight, indigenous background, and physical activity. Results: We identified 3,613 new diabetes cases during a median of 2.2 years of follow-up. The prevalence of daily breakfast consumers was 25%. After adjustment for known risk factors for diabetes, women who ate breakfast daily had a 12% lower rate of diabetes than those who ate breakfast 0 d/wk (HR = 0.88; 95% CI 0.78, 0.99; p-trend = 0.0018). Each additional day per week of breakfast consumption was associated with a lower risk of diabetes (HR = 0.98; CI 0.97, 0.99). In stratified analysis, women with indigenous background who consumed breakfast 4–6 d/wk and 7 d/wk vs. 0 d/wk showed lower risk (HR = 0.68; 95% CI 0.47, 0.98 and HR = 0.76; 95% CI 0.51, 1.15, respectively; p-interaction = 0.197). Conclusions: Daily breakfast was associated with a lower incidence of diabetes, independently of dietary and lifestyle factors. Possible effect modifiers such as ethnicity warrant more research. Daily breakfast consumption is a potential component of diabetes prevention. Funding Sources: This work is supported by the American Institute for Cancer Research (05B047) and Consejo Nacional de Ciencia y Tecnología (CONACyT) grant S0008-2009-1: 000000000115312.
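
A sketch of the per-day dose-response coding: breakfast days/week enters as a single linear term, so exp(coef) is the per-additional-day HR the abstract reports; the category-indicator models would be built analogously. Data are invented.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(7)
n = 10_000
df = pd.DataFrame({
    "years": rng.exponential(3.0, n).clip(max=4.0),   # follow-up time
    "diabetes": rng.binomial(1, 0.05, n),             # self-reported incident case
    "breakfast_d_wk": rng.choice([0, 2, 5, 7], n),    # days/week with breakfast
    "age": rng.normal(45, 7, n),
})
cph = CoxPHFitter().fit(df, duration_col="years", event_col="diabetes")
# exp(coef) for breakfast_d_wk is the HR per additional breakfast day/week.
print(cph.summary.loc["breakfast_d_wk",
                      ["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```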


2021 ◽  
Author(s):  
Irene Ruano ◽  
Valentín Pando ◽  
Felipe Bravo

Abstract Background: There is growing interest in mixed-species forests, but there is a lack of studies that analyse regeneration phases or any stage other than mature stands. Information is scarce for relatively unproductive species such as Pinus pinaster and Pinus halepensis in Mediterranean ecosystems. The objective of this study was to investigate inter- and intra-specific interactions of both species at different tree densities during the first years of establishment. Five Nelder wheel plots were planted to test densities ranging from 1,000 to 80,000 seedlings/ha and to simulate establishment sub-processes at high densities. Pinus pinaster and Pinus halepensis were mixed along the spokes to obtain three mixture levels in which 100%, 80%, or 60% of the seedlings were of the same species. Cox proportional hazards models and binomial logistic regressions were fitted to analyse seedling survival. Early growth (basal diameter and height at one and four years after plantation) was analysed by fitting linear mixed-effects models. Results: Pinus halepensis showed higher survival rates and basal diameter increments, but more time is needed to know how Pinus pinaster responds to density and mixture. Conclusions: Both competitive and facilitative seedling interactions were observed at higher densities, which facilitated seedling survival but decreased early growth.
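
An illustrative sketch (not the authors' code) of the two survival models named in the Methods: a Cox PH fit on days to mortality and a binomial logistic regression on survival at the end of the window, with density and mixture level as predictors. The linear mixed-effects growth models are omitted, and all data and variable names are invented.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from lifelines import CoxPHFitter

rng = np.random.default_rng(8)
n = 1500
df = pd.DataFrame({
    "density": rng.choice([1000, 5000, 20000, 80000], n),   # seedlings/ha
    "mixture_pct": rng.choice([100, 80, 60], n),            # % of same species
    "halepensis": rng.integers(0, 2, n),                    # species indicator
    "days": rng.exponential(900, n).clip(max=1460),         # time to death, capped
})
df["dead"] = (df["days"] < 1460).astype(int)   # alive at 4 years => censored
df["log_density"] = np.log(df["density"])

# Time-to-event view of seedling mortality.
cox_cols = ["days", "dead", "log_density", "mixture_pct", "halepensis"]
CoxPHFitter().fit(df[cox_cols], duration_col="days",
                  event_col="dead").print_summary()

# Binary view: alive vs. dead at the end of the establishment window.
df["alive"] = 1 - df["dead"]
logit = smf.logit("alive ~ log_density + mixture_pct + halepensis", data=df).fit()
print(logit.summary())
```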


2021 ◽  
Vol 8 ◽  
Author(s):  
Qiu-hong Tan ◽  
Lin Liu ◽  
Yu-qing Huang ◽  
Yu-ling Yu ◽  
Jia-yi Huang ◽  
...  

Background: Few studies have focused on the association between serum uric acid (SUA) change and ischemic stroke, and their results remain controversial. The present study aimed to investigate the relationship between change in SUA and ischemic stroke among hypertensive patients. Method: This was a retrospective cohort study. We recruited adult hypertensive patients who had two consecutive measurements of SUA levels from 2013 to 2014 and reported no history of stroke. Change in SUA was assessed as the SUA concentration measured in 2014 minus the SUA concentration in 2013. Multivariable Cox proportional hazards models were used to estimate adjusted hazard ratios (HRs) and 95% confidence intervals (CIs). Kaplan–Meier analysis and the log-rank test were performed to quantify the difference in cumulative event rates. Additionally, subgroup analysis and interaction tests were conducted to investigate heterogeneity. Results: A total of 4,628 hypertensive patients were included, and 93 cases of ischemic stroke occurred during the mean follow-up time of 3.14 years. Participants were categorized into three groups according to tertiles of SUA change [low (substantial SUA decrease): <-32.6 μmol/L; middle (stable SUA): ≥-32.6 and <40.2 μmol/L; high (substantial SUA increase): ≥40.2 μmol/L]. In the fully adjusted model, with the stable group as reference, participants in the substantial-increase group had a significantly elevated risk of ischemic stroke [HR (95% CI), 1.76 (1.01, 3.06), P = 0.0451], but for the substantial-decrease group, the hazard effect was insignificant [HR (95% CI), 1.31 (0.75, 2.28), P = 0.3353]. Age played an interactive role in the relationship between SUA change and ischemic stroke: younger participants (age <65 years) tended to have a higher risk of ischemic stroke when SUA increased substantially. Conclusion: A substantial increase in SUA was significantly associated with an elevated risk of ischemic stroke among patients with hypertension.
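
A hedged sketch of the exposure coding and the age-interaction test: tertiles of SUA change with the stable group as reference, plus a product term. The real cut points were data-derived; everything below, including field names, is invented.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(9)
n = 4628
df = pd.DataFrame({
    "sua_change": rng.normal(4, 60, n),              # 2014 minus 2013, umol/L
    "years": rng.exponential(3.14, n).clip(max=5.0),
    "stroke": rng.binomial(1, 0.02, n),
    "age_lt_65": rng.integers(0, 2, n),
})
df["tertile"] = pd.qcut(df["sua_change"], 3,
                        labels=["decrease", "stable", "increase"])
dummies = pd.get_dummies(df["tertile"], dtype=float).drop(columns="stable")
model = df[["years", "stroke", "age_lt_65"]].join(dummies)
model["increase_x_young"] = model["increase"] * model["age_lt_65"]  # interaction

cph = CoxPHFitter().fit(model, duration_col="years", event_col="stroke")
cph.print_summary()   # 'increase' is the HR vs. the stable (reference) group
```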

