Impact of clinical and pathological variables on stage I non-small cell lung cancer outcomes.

2020 ◽  
Vol 38 (15_suppl) ◽  
pp. e21071-e21071
Author(s):  
Matthew C Lee ◽  
Dimitre C Stefanov ◽  
Mallorie B Angert ◽  
Erica C Cohn ◽  
Nina Kohn ◽  
...  

e21071 Background: Stage I patients (pts) have 5-year survival ranging from 50% to 75%, suggesting heterogeneity within this stage. While the American Joint Committee on Cancer 8th edition upstages tumors with visceral pleural invasion (VPI) to IB, other histological features, namely lymphovascular invasion (LVI), micropapillary pattern (MIP), spread through air spaces (STAS) & neuroendocrine differentiation (NE), may also affect prognosis. This retrospective single-institution study evaluated the influence of these factors, along with the pt variables age, gender, smoking, Charlson comorbidity index (CCI) & chemotherapy (CT), on recurrence & mortality. Methods: 351 resected stage I cases from 2015-2019 were included. Data were summarized as means (standard deviation/SD) or percentages. Associations between variables & outcomes (measured from diagnosis until event, or last visit if no event) were investigated using univariate & multiple Cox proportional hazards models. Survival curves were compared using the log-rank test when the proportional-hazards assumption was not satisfied. All predictors were included in the multiple Cox regression models based on their clinical importance. P < 0.05 was considered statistically significant. SAS 9.4 (SAS Institute, Cary, NC) was used for the analysis. Results: Mean age was 69.62 years (SD 9.83). The majority were female (57.3%), smokers (76.9%), & had adenocarcinoma (AC) (78.6%); 39% had COPD & mean CCI was 6.3 (SD 1.74). 193 (55%) pts had lobectomy or a larger procedure while 158 (45%) had sub-lobar resection. 45 (12.8%) pts received CT. Recurrence & death occurred in 33 (9.4%) & 15 (4.3%) pts respectively. Univariate models indicated higher recurrence risk with NE (HR = 4.18, 95% CI 1.47-11.9, p = 0.0075), LVI (HR = 2.68, 95% CI 1.03-6.94, p = 0.0423), COPD (HR = 3.28, 95% CI 1.56-6.9, p = 0.0017), age (HR = 1.05, 95% CI 1.01-1.09, p = 0.0212), & CCI (HR = 1.57, 95% CI 1.35-1.83, p < .0001). CT was also associated with increased recurrence risk (HR = 8.61, 95% CI 4.28-17.33, p < .0001). The multivariable model for recurrence retained significance for CT & CCI. Age (HR = 1.07, 95% CI 1.01-1.14, p = 0.0312) & CCI (HR = 1.27, 95% CI 1.02-1.59, p = 0.0347) were associated with mortality in univariate models. Multivariable analysis for mortality was not feasible due to the small number of events. Conclusions: Histological features other than VPI may be associated with recurrence. Pts who received CT had increased recurrence, but they possibly had multiple risk factors or other adverse features not assessed here. Limitations include the retrospective design, limited sample size & small number of events.
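The univariate hazard ratios above come from fitting a Cox proportional hazards model to one predictor at a time. As a rough illustration of how such a univariate HR is estimated, here is a minimal pure-Python sketch of Newton-Raphson maximization of the Cox partial likelihood for a single binary covariate (Breslow handling of ties); the toy data are hypothetical, not the study's patient records.

```python
import math

def cox_hr_binary(times, events, x, iters=25):
    """Univariate Cox model for a binary covariate (Breslow ties).
    Returns the hazard ratio exp(beta) via Newton-Raphson."""
    beta = 0.0
    n = len(times)
    for _ in range(iters):
        score, info = 0.0, 0.0
        for i in range(n):
            if not events[i]:
                continue
            # Risk set: everyone still under observation at time t_i.
            s0 = s1 = 0.0
            for j in range(n):
                if times[j] >= times[i]:
                    w = math.exp(beta * x[j])
                    s0 += w
                    s1 += x[j] * w
            mean = s1 / s0
            score += x[i] - mean                 # score contribution
            info += mean - mean * mean           # binary-covariate variance in risk set
        beta += score / info                     # Newton step on the partial likelihood
    return math.exp(beta)

# Hypothetical data: exposed (x=1) subjects tend to fail earlier.
hr = cox_hr_binary([1, 2, 3, 4, 5, 6], [1, 1, 1, 1, 1, 1], [1, 0, 1, 0, 1, 0])
```

In practice one would use a vetted implementation (e.g., SAS PHREG as in the study, or R's `coxph`); this sketch only shows the mechanics behind a single HR.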

2020 ◽  
Vol 5 (4) ◽  
pp. 598-616 ◽  
Author(s):  
Austin C Doctor

Abstract Why do rebel organizations splinter into competing factions during civil war? To explain this outcome, I leverage variation in rebel leadership. I argue that rebel leaders draw on their pre-war experiences, i.e., their military and political experiences, to manage their organizations during conflict. These experiences produce distinct patterns of rebel management and, thus, corresponding risks of fragmentation. Empirical evidence comes from a two-stage research design and original data featuring over 200 rebel leaders from 1989 to 2014. In the first stage, I estimate the probability of group fragmentation with a series of logistic regression models. In the second stage, I use Cox proportional-hazards models to estimate leadership effects on the rate of group fragmentation. Results indicate that variation in rebel leadership corresponds with distinct risks of fragmentation. In particular, the results suggest that leaders with prior military experience are best equipped to maintain group cohesion. This study offers insight into the processes by which rebel groups splinter into armed factions. In addition, it makes an important contribution to the broader discussion of the roles of structure and agency in shaping the dynamics of civil war.
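The first stage described above estimates the probability of fragmentation with logistic regression. A minimal sketch of the technique, fit by gradient ascent with an intercept and one covariate, is below; the data are invented for illustration and are not the study's leader dataset.

```python
import math

def fit_logistic(x, y, lr=0.5, steps=2000):
    """Simple logistic regression (intercept + one covariate) by gradient ascent
    on the log-likelihood. Returns (intercept, slope)."""
    b0 = b1 = 0.0
    for _ in range(steps):
        g0 = g1 = 0.0
        for xi, yi in zip(x, y):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * xi)))  # predicted probability
            g0 += yi - p                                  # gradient wrt intercept
            g1 += (yi - p) * xi                           # gradient wrt slope
        b0 += lr * g0 / len(x)
        b1 += lr * g1 / len(x)
    return b0, b1

# Hypothetical binary covariate (e.g., some leader attribute) and outcome:
# event rate 1/4 when x=0 and 3/4 when x=1.
b0, b1 = fit_logistic([0, 0, 0, 0, 1, 1, 1, 1], [0, 0, 0, 1, 0, 1, 1, 1])
```

The fitted slope is the log odds ratio; here it should approach ln(3) − ln(1/3) ≈ 2.20.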


2007 ◽  
Vol 25 (18_suppl) ◽  
pp. 4048-4048
Author(s):  
Y. Yeh ◽  
Q. Cai ◽  
J. Chao ◽  
M. Russell

4048 Background: NCCN guidelines recommend assessment of ≥12 lymph nodes (LN) to improve accuracy in colorectal cancer (CRC) staging. Previous studies have used various cut-points to assess the relationship between the number of LN sampled and survival. Here, the association between NCCN guideline-compliant nodal sampling and survival is assessed while controlling for other risk factors. Methods: We selected 145,485 adult patients newly diagnosed with stage II or III CRC from SEER during 1990–2003. Kaplan-Meier curves were compared using the log-rank test. Cox proportional hazards models were constructed to determine the effect of sampling ≥12 LN on survival. Results: Median patient follow-up was 5.7 years. The table shows overall survival rates in CRC patients with <12 versus ≥12 LN assessed. After adjusting for age, sex, tumor size and grade, sampling ≥12 LN was independently associated with improved survival. For patients with ≥12 versus <12 LN assessed, survival increased by 13% for stage IIa [HR=0.75; 95% CI 0.72–0.78; p<.001], 16% for stage IIb [HR=0.69; 95% CI 0.67–0.71; p<.001], 12% for stage IIIb [HR=0.75; 95% CI 0.72–0.77], and 10% for stage IIIc [HR=0.85; 95% CI 0.81–0.89]. The association was not statistically significant for stage IIIa patients. Conclusion: Consistent with previous reports, this analysis found that optimal nodal sampling, specifically ≥12 LN, was associated with improved survival across stages II and III when controlling for other risk factors. Furthermore, the results underscore the need to adhere to the NCCN guidelines. The lack of a statistically significant association in stage IIIa patients may be due to the small size of that cohort. [Table: see text]
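The Kaplan-Meier comparison described above relies on the log-rank test. A minimal two-sample log-rank statistic (1 degree of freedom) can be computed as below; the survival times are invented purely for illustration.

```python
def logrank(times1, events1, times2, events2):
    """Two-sample log-rank chi-square statistic (1 df).
    events[i] = 1 for an observed event, 0 for censoring."""
    # Pool the distinct event times across both groups.
    all_times = sorted({t for t, e in zip(times1 + times2, events1 + events2) if e})
    O_minus_E = 0.0
    V = 0.0
    for t in all_times:
        n1 = sum(1 for x in times1 if x >= t)   # at risk in group 1
        n2 = sum(1 for x in times2 if x >= t)   # at risk in group 2
        d1 = sum(e for x, e in zip(times1, events1) if x == t and e)
        d2 = sum(e for x, e in zip(times2, events2) if x == t and e)
        n, d = n1 + n2, d1 + d2
        if n < 2:
            continue                             # variance undefined for n = 1
        E1 = d * n1 / n                          # expected events in group 1
        O_minus_E += d1 - E1
        V += d * (n1 / n) * (n2 / n) * (n - d) / (n - 1)
    return O_minus_E ** 2 / V

# Hypothetical groups with clearly separated survival times.
chi2 = logrank([1, 2, 3], [1, 1, 1], [4, 5, 6], [1, 1, 1])
```

A statistic above 3.84 corresponds to p < 0.05 on the chi-square distribution with 1 df.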


2019 ◽  
Vol 37 (15_suppl) ◽  
pp. e18250-e18250
Author(s):  
Jifang Zhou ◽  
Karen Sweiss ◽  
Pritesh Rajni Patel ◽  
Edith Nutescu ◽  
Naomi Ko ◽  
...  

e18250 Background: Adjuvant intravenous bisphosphonates (IV BP) reduce the risk of skeletal-related events (SRE) in patients with multiple myeloma (MM). We examined the effects of bisphosphonate utilization patterns (adherence, cumulative dose and frequency) on the risk of SRE. Methods: Patients aged 65 years or older diagnosed with first primary MM between 2001 and 2011 were identified using the Surveillance, Epidemiology and End Results (SEER)-Medicare linked database. Patients receiving at least one dose of IV BP after MM diagnosis were identified, and 5-year SRE-free survival was estimated using the Kaplan-Meier method stratified by demographic group and compared with the log-rank test. Cox proportional hazards models were fit to determine the association between IV BP utilization patterns and SRE after propensity score matching. We investigated the outcome of multiple recurrent SRE using the Andersen-Gill approach, and estimated subdistribution hazard ratios (SHR) and 95% confidence intervals for the risk of first SRE, accounting for death as a competing risk. Results: The final cohort included 9,176 MM patients with a median age of 76 years. The adjusted 5-year competing-risk SRE model showed a 48% reduction in the risk of SRE (SHR 0.52, 95% CI 0.49-0.55) with use of IV BP. In multivariable analyses taking competing risks into account, greater adherence to IV BP, higher cumulative IV BP dose and more frequent administration were all associated with statistically significant reductions in SRE risk (see Table). Conclusions: Use of IV BP in patients with MM was associated with a significant reduction in SRE risk over the 5-year period after MM diagnosis. The effectiveness of IV BP therapy was greater with higher cumulative dose, greater adherence and greater frequency of IV BP administration. [Table: see text]
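Accounting for death as a competing risk, as described above, replaces the Kaplan-Meier complement with a cumulative incidence function. A minimal Aalen-Johansen-style estimator sketch follows; the times and status codes are hypothetical, with 1 = event of interest (e.g., SRE), 2 = competing event (e.g., death) and 0 = censoring.

```python
def cumulative_incidence(times, status, cause=1):
    """Aalen-Johansen cumulative incidence for `cause`, treating other
    nonzero status codes as competing events and 0 as censoring.
    Returns [(t, CIF(t))] steps at the cause-specific event times."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    overall_surv = 1.0   # survival from *any* event (event-free survival)
    cif = 0.0
    steps = []
    i = 0
    while i < len(order):
        t = times[order[i]]
        d_cause = d_any = n_at_t = 0
        # Group all subjects tied at time t.
        while i < len(order) and times[order[i]] == t:
            s = status[order[i]]
            d_cause += (s == cause)
            d_any += (s != 0)
            n_at_t += 1
            i += 1
        if d_cause:
            # Increment: P(event-free just before t) * cause-specific hazard at t.
            cif += overall_surv * d_cause / at_risk
            steps.append((t, cif))
        if d_any:
            overall_surv *= 1.0 - d_any / at_risk
        at_risk -= n_at_t
    return steps

# Hypothetical data: SRE at t=1 and t=3, death (competing) at t=2, censored at t=4.
steps = cumulative_incidence([1, 2, 3, 4], [1, 2, 1, 0])
```

Unlike 1 − KM, this estimator never overstates the event probability when competing deaths remove subjects from risk.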


2021 ◽  
Vol 8 ◽  
Author(s):  
Qiu-hong Tan ◽  
Lin Liu ◽  
Yu-qing Huang ◽  
Yu-ling Yu ◽  
Jia-yi Huang ◽  
...  

Background: Few studies have focused on the association between change in serum uric acid (SUA) and ischemic stroke, and their results remain controversial. The present study aimed to investigate the relationship between change in SUA and ischemic stroke among hypertensive patients. Methods: This was a retrospective cohort study. We recruited adult hypertensive patients who had two consecutive measurements of SUA levels from 2013 to 2014 and reported no history of stroke. Change in SUA was assessed as the SUA concentration measured in 2014 minus the SUA concentration in 2013. Multivariable Cox proportional hazards models were used to estimate adjusted hazard ratios (HRs) and 95% confidence intervals (CIs). Kaplan-Meier analysis and the log-rank test were performed to quantify differences in cumulative event rates. Additionally, subgroup analyses and interaction tests were conducted to investigate heterogeneity. Results: A total of 4,628 hypertensive patients were included, and 93 cases of ischemic stroke occurred during the mean follow-up time of 3.14 years. Participants were categorized into three groups according to tertiles of SUA change [low (SUA decreased substantially): <-32.6 μmol/L; middle (SUA stable): ≥-32.6 μmol/L and <40.2 μmol/L; high (SUA increased substantially): ≥40.2 μmol/L]. In the fully adjusted model, with the SUA-stable group as reference, participants whose SUA increased substantially had a significantly elevated risk of ischemic stroke [HR (95% CI), 1.76 (1.01, 3.06), P = 0.0451], whereas for participants whose SUA decreased substantially the hazard estimate was not significant [HR (95% CI), 1.31 (0.75, 2.28), P = 0.3353]. Age played an interactive role in the relationship between SUA change and ischemic stroke: younger participants (age < 65 years) tended to have a higher risk of ischemic stroke when SUA increased substantially. Conclusion: A substantial increase in SUA was significantly associated with an elevated risk of ischemic stroke among patients with hypertension.
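The Kaplan-Meier analysis mentioned in the Methods above can be sketched in a few lines of pure Python; the times and event indicators below (1 = stroke, 0 = censored) are hypothetical, not the cohort's data.

```python
def kaplan_meier(times, events):
    """Return [(t, S(t))] survival-curve steps at each event time.
    events[i] = 1 for an observed event, 0 for censoring."""
    # Sort subjects by time so the risk set shrinks monotonically.
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    surv = 1.0
    steps = []
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = 0
        n_at_t = 0
        # Group ties at the same time point.
        while i < len(order) and times[order[i]] == t:
            deaths += events[order[i]]
            n_at_t += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / at_risk   # product-limit update
            steps.append((t, surv))
        at_risk -= n_at_t                    # events and censorings leave the risk set
    return steps

# Hypothetical follow-up: events at t=1, 2, 4; censoring at t=3.
steps = kaplan_meier([1, 2, 3, 4], [1, 1, 0, 1])
```

Plotting these steps per exposure group, and comparing them with a log-rank test, reproduces the kind of analysis the abstract describes.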


2021 ◽  
Author(s):  
Je Hun Song ◽  
Hyuk Huh ◽  
Eunjin Bae ◽  
Jeonghwan Lee ◽  
Jung Pyo Lee ◽  
...  

Abstract Background: Hyperhomocysteinemia (HHcy) is considered a risk factor for cardiovascular disease (CVD), including in chronic kidney disease (CKD). In this study, we investigated the association between serum homocysteine (Hcy) level and mortality according to the presence of CKD. Methods: Our study included data from 9,895 participants in the 1996–2016 National Health and Nutrition Examination Surveys (NHANES). Linked mortality data were also included, and participants were classified into four groups (quartiles) according to Hcy level. Multivariable-adjusted Cox proportional hazards models using propensity scores were used to examine dose-response associations between Hcy level and mortality. Results: Of the 9,895 participants, 1,032 (21.2%) were diagnosed with CKD. In a multivariable Cox regression analysis including all participants, Hcy level was associated with all-cause mortality compared with the 1st quartile in Model 3 (2nd quartile: hazard ratio (HR) 1.751, 95% confidence interval (CI) 1.348-2.274, p < 0.001; 3rd quartile: HR 2.220, 95% CI 1.726-2.855, p < 0.001; 4th quartile: HR 3.776, 95% CI 2.952-4.830, p < 0.001). In the non-CKD group, there was a significant association with all-cause mortality; however, this finding was not observed in the CKD group. The observed pattern was similar after propensity score matching. In the non-CKD group, overall mortality increased in proportion to Hcy concentration (2nd quartile: HR 2.195, 95% CI 1.299-3.709, p = 0.003; 3rd quartile: HR 2.607, 95% CI 1.570-4.332, p < 0.001; 4th quartile: HR 3.720, 95% CI 2.254-6.139, p < 0.001). However, the risk of all-cause mortality did not increase across quartiles of Hcy level in the CKD group. Conclusion: This study found a correlation between Hcy level and mortality only in the non-CKD group. This altered risk-factor pattern may be attributed to protein-energy wasting or the chronic inflammation that accompanies CKD.
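Classifying participants into quartiles of Hcy level, as described above, is a rank-based grouping. A minimal sketch (assuming roughly equal-sized bins, which is a simplification; real analyses usually cut at the empirical quartile boundaries):

```python
def quartile_groups(values):
    """Assign each value a quartile label 1-4 by rank (ties broken by input order)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    n = len(values)
    groups = [0] * n
    for rank, idx in enumerate(order):
        # rank * 4 // n maps ranks 0..n-1 onto bins 0..3.
        groups[idx] = min(4, rank * 4 // n + 1)
    return groups

# Hypothetical Hcy concentrations (μmol/L), not study data.
groups = quartile_groups([10, 20, 30, 40, 50, 60, 70, 80])
```

Each quartile group then enters the Cox model as a categorical covariate with the 1st quartile as reference.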


2021 ◽  
Vol 11 ◽  
Author(s):  
Chengxin Weng ◽  
Jiarong Wang ◽  
Jichun Zhao ◽  
Ding Yuan ◽  
Bin Huang ◽  
...  

Background: The appropriate surgical procedure for early-stage retroperitoneal sarcoma (RPS) is unclear. We therefore used a national database to compare the outcomes of radical and non-radical resection in patients with early-stage RPS. Methods: This retrospective study included 886 stage I RPS patients recorded in the SEER database from 2004 to 2015. Outcomes were compared using multivariable Cox proportional hazards models, and results are presented as adjusted hazard ratios (AHR) with corresponding 95% confidence intervals (95% CIs). Propensity score-matched analyses were also performed as sensitivity analyses. Results: Of the 886 stage I RPS patients, 316 underwent radical resection and 570 underwent non-radical resection, with a median follow-up of 4.58 (2.73-8.35) years. No difference was observed in overall mortality (AHR 0.84, 95% CI 0.62-1.15; P = 0.28) or RPS-specific mortality (AHR 0.88, 95% CI 0.57-1.36; P = 0.56) between the groups. Results were similar in the propensity score-matched analyses. However, subgroup analysis revealed that radical resection was associated with a significantly decreased risk of overall mortality in male patients (AHR 0.61, 95% CI 0.38-0.98; P = 0.04) and in patients who received radiotherapy (AHR 0.56, 95% CI 0.32-0.98; P = 0.04). Conclusion: Radical resection did not improve midterm survival outcomes compared with non-radical resection in the overall population of patients with early-stage RPS. However, male patients and patients who received radiotherapy might benefit from radical resection, with improved overall survival.
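The propensity score-matched analyses above pair each treated (radical-resection) patient with a control whose estimated propensity score is similar. One common approach is greedy 1:1 nearest-neighbor matching within a caliper; a minimal sketch, using made-up scores rather than anything from the study, could look like:

```python
def match_pairs(treated_ps, control_ps, caliper=0.05):
    """Greedy 1:1 nearest-neighbor matching on propensity score.
    Returns a list of (treated_index, control_index) pairs; treated units
    with no control inside the caliper are left unmatched."""
    available = dict(enumerate(control_ps))      # control index -> score
    pairs = []
    # Match hardest cases first: treated units sorted by descending score.
    for ti in sorted(range(len(treated_ps)), key=lambda i: -treated_ps[i]):
        if not available:
            break
        ci = min(available, key=lambda c: abs(available[c] - treated_ps[ti]))
        if abs(available[ci] - treated_ps[ti]) <= caliper:
            pairs.append((ti, ci))
            del available[ci]                    # each control used at most once
    return pairs

# Hypothetical propensity scores for two treated and three control patients.
pairs = match_pairs([0.8, 0.5], [0.79, 0.52, 0.1])
```

Outcome models (e.g., Cox regression) are then fit on the matched sample only, which is the sense in which matching serves as a sensitivity analysis here.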


2019 ◽  
Author(s):  
Ming-Chao Tsai ◽  
Yi-Hao Yen ◽  
Kuo-Chin Chang ◽  
Chao-Hung Hung ◽  
Chien-Hung Chen ◽  
...  

Abstract Background: Urokinase plasminogen activator (uPA) is an extracellular matrix-degrading protease that is involved in the invasiveness and progression of cancer. There is good evidence that uPA expression is a clinically relevant biomarker in some solid tumors, but its role in hepatocellular carcinoma (HCC) is uncertain. We evaluated the prognostic value of preoperative serum uPA in HCC patients receiving curative resection. Methods: Serum uPA levels were determined by enzyme-linked immunosorbent assay in 282 HCC patients who underwent complete liver resection at Kaohsiung Chang Gung Memorial Hospital. Overall survival (OS) curves were constructed using the Kaplan-Meier method and compared using the log-rank test. A Cox proportional-hazards regression model was used to identify independent prognostic factors. The median follow-up time was 52 months. Results: Patients with higher pretreatment serum uPA (≥1 ng/ml) had significantly shorter OS (p = 0.002). Patients with liver cirrhosis, hypoalbuminemia, or thrombocytopenia were significantly more likely to present with elevated uPA levels. Multivariable Cox regression analyses indicated that high pretreatment serum uPA [hazard ratio (HR) 1.848, p = 0.006], vascular invasion (HR 2.940, p < 0.001), and pathology stage III/IV (HR 3.517, p < 0.001) were independent prognostic factors for OS. In further stratified analyses, the combination of serum uPA and AFP had greater capacity to predict OS. Conclusions: We conclude that uPA is a clinically relevant biomarker in HCC patients receiving curative resection, with higher uPA expression being associated with higher mortality. This also highlights the potential utility of uPA as a therapeutic target for improved treatment strategies.


Crisis ◽  
2016 ◽  
Vol 37 (4) ◽  
pp. 281-289 ◽  
Author(s):  
Adriana Farré ◽  
Maria J. Portella ◽  
Luis De Angel ◽  
Ana Díaz ◽  
Javier de Diego-Adeliño ◽  
...  

Abstract. Background: The effectiveness of suicide intervention programs has not been assessed with experimental designs. Aim: To determine the risk of suicide reattempts in patients engaged in a secondary prevention program. Method: We included 154 patients with suicidal behavior in a quasi-experimental study with a nontreatment concurrent control group. In all, 77 patients with suicidal behavior underwent the Suicide Behavior Prevention Program (SBPP), which includes specialized early assistance over a period of 3–6 months. A matched sample of patients with suicidal behavior (n = 77) who did not undergo any specific suicide prevention program was selected as the control group. Data on sociodemographics, clinical characteristics, and suicidal behavior were collected at baseline (before the SBPP) and at 12 months. Results: After 12 months, SBPP patients showed a 67% lower relative risk of reattempt (χ2 = 11.75, p = .001, RR = 0.33, 95% CI = 0.17–0.66). Cox proportional hazards models revealed that patients under the SBPP made a new suicide attempt significantly later than control patients did (HR = 0.293, 95% CI = 0.138–0.624, p = .001). The effect was even stronger among first attempters. Limitations: Sampling was naturalistic and patients were not randomized. Conclusion: The SBPP was effective in delaying and preventing suicide reattempts at least within the first year after the index suicidal behavior. In light of our results, implementation of suicide prevention programs is strongly advisable.
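The relative risk and its confidence interval reported above follow the standard log-RR normal approximation. The sketch below illustrates the calculation; the event counts (9 of 77 reattempts under SBPP versus 27 of 77 in controls) are hypothetical values chosen only to yield an RR near 0.33 and are not taken from the paper.

```python
import math

def relative_risk(a, n1, b, n2):
    """Relative risk of event for group 1 (a/n1) vs group 2 (b/n2),
    with a 95% CI from the normal approximation on log(RR)."""
    p1, p2 = a / n1, b / n2
    rr = p1 / p2
    # Standard error of log(RR) for independent binomial proportions.
    se = math.sqrt((1 - p1) / a + (1 - p2) / b)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Hypothetical counts: 9/77 reattempts with the program, 27/77 without.
rr, lo, hi = relative_risk(9, 77, 27, 77)
```

A CI that excludes 1.0, as here, corresponds to a statistically significant difference in reattempt risk at the 5% level.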


Open Medicine ◽  
2020 ◽  
Vol 15 (1) ◽  
pp. 850-859
Author(s):  
Bing Wang ◽  
Yang Zhang

Abstract Background: As one of the most common malignant tumors worldwide, gastric carcinoma (GC) shows gradually increasing morbidity and mortality. The aim of this study was to construct a signature based on immune-relevant genes to predict the survival outcomes of GC patients using The Cancer Genome Atlas (TCGA). Methods: Univariate Cox regression analysis was used to assess the relationship between immune-relevant genes and the prognosis of patients with GC. The least absolute shrinkage and selection operator (LASSO) Cox regression model was used to select prognostic immune-relevant genes and to establish the signature for the prognostic evaluation of patients with GC. Multivariable Cox regression analysis and Kaplan–Meier survival analysis were used to assess the independent prognostic ability of the immune-relevant gene signature. Results: A total of 113 prognostic immune-relevant genes were identified using univariate Cox proportional hazards regression analysis. A signature of nine immune-relevant genes was constructed using LASSO Cox regression. The GC samples were assigned to two groups (low- and high-risk) according to the optimal cutoff value of the signature score. Compared with patients in the high-risk group, patients in the low-risk group had a significantly better prognosis in the TCGA and GSE84437 cohorts (log-rank test P < 0.001). Multivariable Cox regression analysis demonstrated that the nine-gene immune-relevant signature might serve as an independent prognostic predictor in GC. Conclusions: Our results show that the signature of nine immune-relevant genes may serve as a prognostic predictor for patients with GC, which may contribute to decision-making in personalized treatment.
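The signature score described above is a linear combination of gene expression values weighted by the LASSO Cox coefficients, with patients then split at a cutoff into low- and high-risk groups. A minimal sketch using a median split (a simplification of the study's optimal cutoff), with hypothetical coefficients and expression values:

```python
def risk_groups(expr, coefs):
    """Linear risk score per sample (sum of coefficient * expression),
    followed by a median split into 'low' and 'high' risk groups."""
    scores = [sum(c * e for c, e in zip(coefs, row)) for row in expr]
    s = sorted(scores)
    m = len(s)
    # Median of the scores (average of the middle two for even m).
    median = s[m // 2] if m % 2 else 0.5 * (s[m // 2 - 1] + s[m // 2])
    labels = ['high' if sc > median else 'low' for sc in scores]
    return labels, scores

# Hypothetical 2-gene signature over 4 samples (rows = samples, cols = genes).
labels, scores = risk_groups([[1, 0], [0, 1], [1, 1], [0, 0]], [2, 1])
```

The two resulting groups would then be compared with Kaplan-Meier curves and a log-rank test, as in the study.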


Stroke ◽  
2016 ◽  
Vol 47 (suppl_1) ◽  
Author(s):  
Erin L MacDougal ◽  
Jeffrey J Wing ◽  
William H Herman ◽  
Lewis B Morgenstern ◽  
Lynda D Lisabeth

Background and Purpose: Diabetes mellitus (DM) is a well-established risk factor for ischemic stroke (IS), but the literature is inconsistent on the effect of DM on outcomes after IS. We sought to determine whether DM increases the risk of mortality and recurrence after IS, and whether these associations are greater in Mexican Americans (MA) than in non-Hispanic whites (NHW). Methods: IS cases, all-cause mortality, and recurrent strokes were identified from the Brain Attack Surveillance in Corpus Christi (BASIC) project (2006-2012). Sociodemographic and clinical data were obtained from medical records and interviews. Cumulative mortality and stroke recurrence risks were estimated at 30 days and 1 year using Kaplan-Meier analysis and Cox proportional hazards models. Effect modification by ethnicity was examined. Results: There were 1,301 IS cases; 46% had a history of DM, the median age was 70 (IQR: 58-81), and 61% were MA. Patients with DM were younger and more likely to be MA than patients without DM. Risk of 30-day and 1-year mortality was 8.4% and 20.5% for those with DM and 9.5% and 20.8% for those without DM, respectively. Risk of 30-day and 1-year stroke recurrence was 1.2% and 7.5% for those with DM and 1.5% and 5.8% for those without DM, respectively. Unadjusted, DM was not a significant predictor of mortality or recurrence (see table). After adjustment, DM predicted mortality (30-day HR=1.58, 95% CI: 0.98-2.53; 1-year HR=1.48, 95% CI: 1.10-2.00) but not stroke recurrence (1-year HR=1.28, 95% CI: 0.78-2.08). Effect modification by ethnicity was not significant (p>0.2 for all models). Conclusions: Because patients with DM were significantly younger than patients without DM, the crude association between DM and mortality showed no difference. However, after accounting for age and other factors, patients with DM were approximately 50% more likely to die within 1 year after IS than patients without DM.

