Type I Versus Type II Endometrial Cancer: Differential Impact of Comorbidity

2018
Vol 28 (3)
pp. 586-593
Author(s):
Mette Calundann Noer
Sofie Leisby Antonsen
Bent Ottesen
Ib Jarle Christensen
Claus Høgdall

Objective: Two distinct types of endometrial carcinoma (EC), with different etiologies, tumor characteristics, and prognoses, are recognized. We investigated whether the prognostic impact of comorbidity varies between these 2 types of EC. Furthermore, we studied whether the recently developed ovarian cancer comorbidity index (OCCI) is useful for predicting survival in EC. Materials and Methods: This nationwide register-based cohort study was based on data from 6487 EC patients diagnosed in Denmark between 2005 and 2015. Patients were assigned a comorbidity index score according to the Charlson comorbidity index (CCI) and the OCCI. Kaplan-Meier survival statistics and adjusted multivariate Cox regression analyses were used to investigate the differential association between comorbidity and overall survival in type I and type II EC. Results: The distribution of comorbidities varied between the 2 EC types. A consistent association between increasing levels of comorbidity and poorer survival was observed for both types. Cox regression analyses revealed a significant interaction between cancer stage and comorbidity, indicating that the impact of comorbidity varied with stage. In contrast, the interaction between comorbidity and EC type was not significant. Both the CCI and the OCCI were useful measures of comorbidity, but the CCI was the stronger predictor in this patient population. Conclusions: Comorbidity is an important prognostic factor in type I as well as type II EC, although overall prognosis differs significantly between the 2 types. The prognostic impact of comorbidity varies with stage but not with type of EC.
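The Kaplan-Meier survival statistics used throughout these studies rest on the product-limit estimator, which can be sketched in a few lines of plain Python (an illustrative sketch only, not any study's actual analysis code):

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.

    times  -- follow-up time for each patient
    events -- 1 if the event (e.g. death) was observed, 0 if censored
    Returns a list of (time, S(t)) pairs, one per distinct event time.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        # All patients sharing this follow-up time
        tied = [e for tt, e in data if tt == t]
        deaths = sum(tied)
        if deaths:
            survival *= 1.0 - deaths / n_at_risk
            curve.append((t, survival))
        # Both events and censored observations leave the risk set
        n_at_risk -= len(tied)
        i += len(tied)
    return curve
```

Censored patients contribute to the risk set up to their last follow-up but never trigger a drop in the curve, which is what lets registry cohorts with incomplete follow-up be compared by log-rank tests.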

2021
Author(s):
Huy Gia Vuong
Hieu Trong Le
Tam N.M. Ngo
Kar-Ming Fung
James D. Battiste
...  

Abstract Introduction: H3K27M-mutated diffuse midline gliomas (H3-DMGs) are aggressive tumors with a fatal outcome. This study, integrating individual patient data (IPD) from published studies, aimed to investigate the prognostic impact of different genetic alterations on survival of these patients. Methods: We searched PubMed and Web of Science for relevant articles. Studies were included if they had available follow-up data and additional molecular characterization of H3-DMGs. For survival analysis, Kaplan-Meier analysis and Cox regression models were used, and corresponding hazard ratios (HR) and 95% confidence intervals (CI) were computed to analyze the impact of genetic events on overall survival (OS). Results: We included 30 studies with 669 H3-DMGs. TP53 mutations were the most common secondary alteration among these neoplasms. In the univariate Cox regression model, TP53 mutation was an indicator of shortened survival (HR = 1.446; 95% CI = 1.143-1.829), whereas ACVR1 (HR = 0.712; 95% CI = 0.518-0.976) and FGFR1 mutations (HR = 0.408; 95% CI = 0.208-0.799) conferred prolonged survival. In addition, ATRX loss was associated with better OS (HR = 0.620; 95% CI = 0.386-0.996). Adjusted for age, gender, tumor location, and extent of resection, the presence of TP53 mutations and the absence of ACVR1 or FGFR1 mutations remained significant poor prognostic factors. Conclusions: We outlined the prognostic importance of additional genetic alterations in H3-DMGs and recommend that these neoplasms be further segregated molecularly. This could help neuro-oncologists better risk-stratify patients and consider pertinent treatments.
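The hazard ratios and confidence intervals reported here follow the standard relationship between a Cox model's fitted log-hazard coefficient and its standard error; a minimal sketch (not from the paper, just the textbook conversion):

```python
import math

def hazard_ratio_ci(beta, se, z=1.96):
    """Hazard ratio with confidence interval from a Cox coefficient.

    beta -- fitted coefficient (log hazard ratio)
    se   -- standard error of beta
    z    -- normal quantile (1.96 for a 95% interval)
    Returns (HR, lower bound, upper bound).
    """
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)
```

For example, the reported TP53 figures (HR 1.446, 95% CI 1.143-1.829) are consistent with a coefficient of about 0.369 and a standard error of about 0.120.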


2012
Vol 30 (15_suppl)
pp. 3550-3550
Author(s):
Karsten Schulmann
Sven Koepnick
Christoph Engel
Christiane Bernhardt
Verena Steinke

3550 Background: Previous studies showed conflicting results regarding the value of ACT in MSI-H CC. A recent study reported differential benefit from 5-FU-based ACT comparing suspected sporadic vs suspected hereditary MSI-H CC. We sought to evaluate the prognostic impact of ACT in a large cohort of Lynch syndrome (LS) patients (pts) with stage II CC. Methods: To minimize selection bias, diagnoses made >2 years prior to registration in the database of the German HNPCC consortium were excluded. 278 patients (61% male, mean age 42.9 y, 13% stage IIB, 51% with MMR gene mutation) were eligible. Overall survival (OS), CC-specific survival (CSS), and disease-free survival (DFS) were analyzed using Kaplan-Meier and Cox regression analyses. Results: 5-year OS, CSS, and DFS were 95%, 95%, and 93%, respectively. Right-sided CC was independently associated with lower DFS in stage II and IIA. Increasing age was associated with lower OS, CSS, and DFS in stage IIA; however, we observed only trends in the multivariate analysis. Surgery alone (without ACT) was associated with a slightly lower OS in stage IIA (univariate HR 3.659; 95% CI 0.81-16.5; P=0.092), but not with lower DFS or CSS. Prognosis did not differ between FOLFOX and 5-FU-based ACT. Conclusions: Our data suggest that LS pts with stage II CC do not benefit from ACT. FOLFOX was not superior to 5-FU-based ACT. If our results are confirmed, LS pts with stage IIA CC should not receive ACT. The group of stage IIB CC was too small to draw definite conclusions. [Table: see text]


2020
Vol 38 (4_suppl)
pp. 682-682
Author(s):
Brian Cox
Nicholas Manguso
Humair Quadri
Jessica Crystal
Katelyn Mae Atkins
...  

682 Background: Lymph node (LN) metastases affect overall survival (OS) in pancreatic cancer (PC). However, no LN sampling threshold exists. We examined the impact of nodal sampling on OS. Methods: Patients with stage I-III PC aged ≥55 years who underwent curative resection from 2004-2016 were identified from the National Cancer Database (NCDB). After adjusting for age, gender, grade, stage, and Charlson-Deyo score, multiple binomial logistic regression analyses assessed the impact of the LN ratio (LNR) on OS. LNR was defined as the number of positive LNs divided by the number of LNs examined. Regression analyses, Cox regression, and Kaplan-Meier survival curves were used to assess how many LNs should be sampled. Results: A total of 13,673 patients, median age 69 years (range 55-90), were included. Most were Caucasian (86.6%) males with Charlson-Deyo scores ≤1 (90.3%) and moderately to poorly differentiated PC (90.1%). The median number of LNs examined was 15 (1-75), with a median of 1 positive LN (0-35). As expected, an increased number of positive LNs was associated with reduced OS, p < 0.001. After data normalization, an increasing LNR was associated with a 12-fold likelihood of death [OR 11.9, p < 0.001 (CI 6.0, 23.7)]. Subsequent regression models established evaluation of ≥16 LNs as the greatest predictor of OS. A regression model evaluating <16 vs ≥16 lymph nodes was performed to ascertain the effects of age, gender, ethnicity, grade, stage, and LNs examined on OS. The logistic regression model correctly classified 74.5% of cases with a specificity of 99.6% (p < 0.001). Examination of <16 LNs, Caucasian race, grade, stage, and higher Charlson-Deyo scores were significantly associated with decreased OS. If ≥16 LNs were examined, patients had a 1.5-fold likelihood of better OS, p < 0.001 (CI 1.4, 1.6). An adjusted Cox regression showed an increased HR of 1.2, p < 0.001 (CI 1.1, 1.2), and an unadjusted Kaplan-Meier survival curve showed that examination of ≥16 LNs was associated with an increase in OS of 2.8 months [log-rank: 32.0, p < 0.001]. Conclusions: Patients undergoing curative-intent resection for PC should have adequate nodal sampling. Stratification of patients by LNR may provide useful information on OS. Examination of ≥16 LNs impacts OS in patients with stage I-III PC.
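The LNR definition and the 16-node cutoff described above are straightforward to compute; a minimal sketch (function names are illustrative, not from the study):

```python
def lymph_node_ratio(positive_nodes, examined_nodes):
    """LNR: positive lymph nodes over lymph nodes examined."""
    if examined_nodes <= 0:
        raise ValueError("at least one node must be examined")
    if positive_nodes > examined_nodes:
        raise ValueError("positive nodes cannot exceed examined nodes")
    return positive_nodes / examined_nodes

def adequate_sampling(examined_nodes, threshold=16):
    """The cutoff identified in the study: >= 16 nodes examined."""
    return examined_nodes >= threshold
```

Note that the LNR is only interpretable when sampling is adequate: with few nodes examined, a single positive node inflates the ratio, which is part of the rationale for reporting a sampling threshold alongside it.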


2021
pp. 1-10
Author(s):
Li Chen
Weichen Zhang
Jinyun Tan
Min Hu
Weihao Shi
...  

Background: Neointimal hyperplasia (NIH) is believed to be the main reason for arteriovenous fistula (AVF) dysfunction, but other mechanisms are also recognized to be involved in the pathophysiological process. This study investigated whether different morphological types of AVF lesions are associated with the patency rate after percutaneous transluminal angioplasty (PTA). Methods: This retrospective study included 120 patients who underwent PTA for autogenous AVF dysfunction. All cases were evaluated with Doppler ultrasound (DU) before intervention and divided into 3 types: Type I (NIH type), Type II (non-NIH type), and Type III (mixed type). Prognostic and clinical data were analyzed by Kaplan-Meier analysis and the Cox proportional hazards model. Results: There was no statistical difference in baseline variables among the groups, except for lumen diameter. The primary patency rates in the Type I, Type II, and Type III groups were 78.4%, 93.2%, and 83.2% at 6 months and 59.5%, 84.7%, and 75.5% at 1 year, respectively. The secondary patency rates in the Type I, Type II, and Type III groups were 94.4%, 97.1%, and 100% at 6 months and 90.5%, 97.1%, and 94.7% at 1 year, respectively. The Kaplan-Meier curve showed that the primary and secondary patency rates of the Type I group were lower than those of the Type II group. Multivariable Cox regression analysis demonstrated that postoperative primary patency was correlated with end-to-end anastomosis (hazard ratio [HR] = 2.997, p = 0.008, 95% confidence interval [CI]: 1.328–6.764) and Type I lesion (HR = 5.395, p = 0.004, 95% CI: 1.730–16.824). Conclusions: NIH-dominant lesions of AVF evaluated by DU preoperatively were a risk factor for poor primary and secondary patency rates after PTA in hemodialysis patients.


Blood
2006
Vol 108 (11)
pp. 3064-3064
Author(s):
Julia Stumm
Jens Dreyhaupt
Martin Kornacker
Manfred Hensel
Michael Kneba
...  

Abstract Although auto-SCT has been used for the treatment of advanced FL for many years, little is known about the course of patients who relapse after this procedure. Because these patients may be candidates for aggressive salvage approaches, we sought to study the outcome of patients with FL relapsing after auto-SCT, with particular focus on factors predicting survival. Methods: Relapse cases were identified retrospectively from 244 patients autografted for FL between August 1990 and November 2002 in 3 institutions. Overall survival after relapse (OS) was calculated according to Kaplan-Meier and analyzed for the prognostic impact of pre-relapse variables as well as post-relapse salvage treatment by univariate log-rank comparisons and Cox regression analyses. Results: With a median follow-up of 88 (5–186) months post auto-SCT, 104 relapses occurred, corresponding to a 10-year relapse probability of 0.47 (95% CI 0.4–0.53). Median age of relapsed patients was 48 (22–65) years. FLIPI score at diagnosis was low in 18%, intermediate in 58%, and high in 24%. In 51%, auto-SCT had been given as part of first-line treatment, and 45% had been in complete remission at auto-SCT. Myeloablation included total body irradiation (TBI) in 57% of cases. Median time from auto-SCT to relapse was 19 (2–128) months, with only 2 relapses occurring later than 6 years post transplant. Transformed FL was present in 14% of the 87 patients with relapse histology available. Rituximab-containing salvage therapy was given to 50% of patients after relapse. With 45 (1–139) months of follow-up, median OS after relapse was 100 months. Log-rank comparisons identified auto-SCT as part of salvage treatment, time to relapse <12 months, and salvage without rituximab as factors adversely influencing OS, while all other variables listed above had no impact. Cox analysis considering sex, age, salvage auto-SCT, TBI, time to relapse, and rituximab salvage confirmed a possible adverse impact of time to relapse <12 months (hazard ratio 2.58; 95% CI 0.99–6.82; p = 0.055) but of none of the other covariates on OS. Conclusions: The prognosis of patients relapsing after auto-SCT for FL is surprisingly good. However, those whose disease recurs within the first post-transplant year tend to have a dismal outcome and might benefit from experimental salvage approaches, such as allogeneic SCT.


2020
Vol 35 (Supplement_3)
Author(s):
Mimoza Milenkova
Adrijana Spasovska Vasilova
Aleksandra Canevska
Vladimir Pushevski
Gjulsen Selim
...  

Abstract Background and Aims: Life expectancy in dialysis patients depends on age and comorbidities. Frailty in elderly patients is a state of impaired homeostasis with loss of physiologic reserve and consequently impaired responses to the burden of dialysis. In this study we assessed the impact of age, comorbidities, and frailty on dialysis patients' survival. Method: The study enrolled 162 prevalent patients on chronic hemodialysis with a mean dialysis vintage of 100 months; 55% were women and 21% had diabetes. Patients were divided into three groups by Khan comorbidity index score, with the highest score considered worst. Frailty was assessed by the presence of 3 or more symptoms (unintentional weight loss, feeling exhausted, weak grip strength, slow walking speed, and low physical activity) and expressed as an absolute number. Estimates of five-year life expectancy were assessed by the Kaplan-Meier log-rank test and a Cox regression hazard model. Results: There were 26 patients (16%) with the lowest score, 85 (52%) with a medium score, and 51 (31%) with the worst (highest) score. During the 5 years of follow-up, 69 (43%) patients died of all-cause mortality. There were no deaths in the group with the lowest score, and mortality roughly doubled from the intermediate to the worst score group (0%, 30%, and 69%, respectively). Significantly higher mean life expectancy was found in the lower Khan score groups: 60 months; 48.40 ± 18.51 months; and 32.44 ± 22.06 months (log-rank p < 0.012). Patients with the worst scores had a fourfold higher risk of death, HR 4.2 (95% CI 2.72–6.36), p=0.0001. In the multivariate model, Khan score was a more powerful predictor of mortality than frailty in the elderly, with HR 3.2 (95% CI 2.88–5.41), p=0.0001. Conclusion: Comorbidities and age outperform frailty burden as predictors of mortality in dialysis patients.


2017
Vol 35 (4_suppl)
pp. 694-694
Author(s):
Richard M. Lee-Ying
Nicholas Bosma
Patricia A. Tang

694 Background: The impact of primary tumour sidedness has recently been demonstrated in patients with metastatic colorectal cancer (mCRC). Differences between right (R)- and left (L)-sided mCRC may be due to differences in consensus molecular subtype. Clinically predictive mutations in ras (kras, nras, and braf) may also drive some of the differences in outcome. However, patients with mCRC who undergo surgical resection of colorectal liver metastases (CRLM) often have a good prognosis. The aim of this study was to assess the impact of tumour sidedness on OS after resection of CRLM. Methods: Patients who underwent resection of CRLM in the province of Alberta, Canada from 2004-2016 were identified. Tumour sidedness was determined by chart review, with R defined as cecum to transverse colon and L as splenic flexure to sigmoid. Where available, ras mutational status was collected. OS was measured from the time of CRLM resection to death or last follow-up using the Kaplan-Meier method. R and L were compared using the log-rank test and a Cox regression model. Results: 471 patients who underwent resection of CRLM for mCRC were identified, including 204 R and 267 L. Median age was 65, 63% were male, 54% had synchronous metastatic disease, and 67% had a Charlson comorbidity index of 0. All-ras wildtype was present in 22% of cases, a ras mutation was detected in 21%, and 57% were unknown at the time of analysis. Median OS was 45 months for R, compared to 72 months for L (log-rank p = 0.01). After adjusting for potential confounders with a Cox proportional hazards model, R vs L remained significant, with an HR of 1.4 (95% CI 1.0-1.9, p = 0.02). ras mutational status was also significant: ras mutant HR 2.4 (95% CI 1.7-3.3) and ras unknown HR 2.2 (95% CI 1.5-3.1) compared to ras wildtype, p < 0.01. Conclusions: Primary tumour sidedness continues to have an impact on OS in mCRC, even when disease is managed surgically with resection of CRLM. Though limited by numbers, the impact remained significant after controlling for potential confounders, including ras mutational status. Additional ras testing is underway. Further molecular classification may provide a biologic rationale for the observed differences.
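The R/L definitions stated in the Methods can be encoded directly; a sketch (intermediate segment names beyond those quoted in the abstract are assumptions for illustration):

```python
# Right: cecum to transverse colon; left: splenic flexure to sigmoid,
# per the abstract's definition.
RIGHT_SIDED = {"cecum", "ascending colon", "hepatic flexure", "transverse colon"}
LEFT_SIDED = {"splenic flexure", "descending colon", "sigmoid"}

def tumour_side(primary_site):
    """Return 'R', 'L', or None if the site falls outside either definition."""
    site = primary_site.strip().lower()
    if site in RIGHT_SIDED:
        return "R"
    if site in LEFT_SIDED:
        return "L"
    return None  # e.g. rectum, which the stated definition does not cover
```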


2020
Vol 41 (Supplement_2)
Author(s):
R. Arroyo-Espliguero
M.C. Viana-Llamas
A. Silva-Obregon
A. Estrella-Alonso
C. Marian-Crespo
...  

Abstract Background: Malnutrition and sarcopenia are common features of frailty. The prevalence of frailty among ST-segment elevation myocardial infarction (STEMI) patients is higher in women than in men. Purpose: To assess gender-based differences in the impact of the nutritional risk index (NRI) and frailty on the one-year mortality rate among STEMI patients following primary angioplasty (PA). Methods: Cohort of 321 consecutive patients (64 years [54–75]; 22.4% women) admitted to a general ICU after PA for STEMI. NRI was calculated as 1.519 × serum albumin (g/L) + 41.7 × (actual body weight [kg]/ideal weight [kg]). Vulnerable and moderate-to-severe nutritional-risk patients were those with a Clinical Frailty Scale (CFS) ≥4 and NRI <97.5, respectively. We used the Kaplan-Meier survival model. Results: Baseline and mortality variables of the 4 groups (NRI-/CFS-, NRI+/CFS-, NRI-/CFS+, and NRI+/CFS+) are depicted in the Table. The prevalence of malnutrition, frailty, or both was significantly greater in women (34.3%, 10%, and 21.4%, respectively) than in men (28.9%, 2.8%, and 6.0%, respectively; P<0.001). Women had a greater mortality rate (20.8% vs. 5.2%; OR 4.78, 95% CI 2.15–10.60, P<0.001), mainly from cardiogenic shock (P=0.003). The combination of malnutrition and frailty significantly decreased cumulative one-year survival in women (46.7% vs. 73.3% in men, P<0.001). Conclusion: Among STEMI patients undergoing PA, the prevalence of malnutrition and frailty is significantly higher in women than in men. NRI and frailty had independent and complementary prognostic impact in women with STEMI. Kaplan-Meier and Cox survival curves [Figure: see text] Funding Acknowledgement: Type of funding source: None
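The NRI formula and the two cutoffs quoted above (NRI < 97.5 for nutritional risk, CFS ≥ 4 for frailty) can be sketched as follows (function and label names are illustrative, not from the paper):

```python
def nutritional_risk_index(albumin_g_l, actual_weight_kg, ideal_weight_kg):
    """NRI = 1.519 x serum albumin (g/L) + 41.7 x (actual / ideal body weight)."""
    return 1.519 * albumin_g_l + 41.7 * (actual_weight_kg / ideal_weight_kg)

def risk_group(nri, cfs):
    """Cross nutritional risk (NRI < 97.5) with frailty (CFS >= 4),
    yielding the study's four groups."""
    nri_flag = "NRI+" if nri < 97.5 else "NRI-"
    cfs_flag = "CFS+" if cfs >= 4 else "CFS-"
    return f"{nri_flag}/{cfs_flag}"
```

For instance, a patient with albumin 40 g/L at ideal body weight scores 1.519 × 40 + 41.7 = 102.46, above the 97.5 cutoff and therefore not at nutritional risk.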


2021
pp. oemed-2020-106903
Author(s):
Julio González Martin-Moro
Marta Chamorro Gómez
Galicia Dávila Fernández
Ana Elices Apellaniz
Ana Fernández Hortelano
...  

Objectives: Reverse transcriptase PCR (RT-PCR) is considered the gold standard in diagnosing COVID-19. Infected healthcare workers do not return to work until RT-PCR has demonstrated that the virus is no longer present in the upper respiratory tract. The aim of this study was to determine the most efficient time to perform RT-PCR prior to healthcare workers' return to work. Materials and methods: This is a cohort study of healthcare workers with RT-PCR-confirmed COVID-19. Data were collected from the medical charts of healthcare workers and completed with a telephone interview. Kaplan-Meier curves were used to determine the influence of several variables on the time to RT-PCR negativisation. The impact of the variables on survival was assessed using the Breslow test. A Cox regression model was developed including the associated variables. Results: 159 subjects with a positive RT-PCR out of 374 workers with suspected COVID-19 were included. The median time to negativisation was 25 days from symptom onset (IQR 20–35 days). Presence of IgG, dyspnoea, cough, and throat pain were associated with significantly longer time to negativisation. Cox regression was used to adjust for confounding variables. Only dyspnoea and cough remained in the model as significant determinants of prolonged negativisation time, with adjusted HRs of 0.68 (0.48–0.96) for dyspnoea and 0.61 (0.42–0.88) for dry cough. Conclusions: RT-PCR during the first 3 weeks yields a high percentage of positive results. In the presence of respiratory symptoms, negativisation took nearly 1 week longer. Those who developed antibodies needed longer to become negative.


2003
Vol 10 (3)
pp. 424-432
Author(s):
Chuh K. Chong
Thien V. How
Geoffrey L. Gilling-Smith
Peter L. Harris

Purpose: To investigate the effect on intrasac pressure of stent-graft deployment within a life-size silicone rubber model of an abdominal aortic aneurysm (AAA) maintained under physiological conditions of pressure and flow. Methods: A commercial bifurcated device with the polyester fabric preclotted with gelatin was deployed in the AAA model. A pump system generated physiological flow. Mean and pulse aortic and intrasac pressures were measured simultaneously using pressure transducers. To simulate a type I endoleak, plastic tubing was placed between the aortic wall and the stent-graft at the proximal anchoring site. Type II endoleak was simulated by means of side branches with set inflow and outflow pressures and perfusion rates. Type IV endoleak was replicated by removal of gelatin from the graft fabric. Results: With no endoleak, the coated graft reduced the mean and pulse sac pressures to negligible values. When a type I endoleak was present, mean sac pressure reached a value similar to mean aortic pressure. When net flow through the sac due to a type II endoleak was present, mean sac pressure was a function of the inlet pressure, while pulse pressure in the sac was dependent on both inlet and outlet pressures. As perfusion rates increased, both mean and pulse sac pressures decreased. When there was no outflow, mean sac pressure was similar to mean aortic pressure. In the presence of both type I and type II endoleaks, mean sac pressure reached mean aortic pressure when the net perfusion rate was low. Conclusions: In vitro studies are useful in gaining an understanding of the impact of different types of endoleaks, in isolation and in combination, on intrasac pressure after aortic stent-graft deployment.

