Comparing the Characteristics and Predicting the Survival of Head and Neck Melanoma versus Body Melanoma: A Population-based Study

Author(s):  
Yuxin Ding ◽  
Runyi Jiang ◽  
Yuhong Chen ◽  
Jing Jing ◽  
Xiaoshuang Yang ◽  
...  

Abstract Background: Previous studies have reported poorer survival in head and neck melanoma (HNM) than in body melanoma (BM). Individualized tools to predict the prognosis of patients with HNM or BM remain insufficient. We aimed to compare the characteristics of HNM and BM, and to establish and validate nomograms for predicting the 3-, 5- and 10-year survival of patients with HNM or BM. Methods: We studied patients with HNM or BM from 2004 to 2015 in the Surveillance, Epidemiology, and End Results (SEER) database. The HNM group and the BM group were each randomly divided into training and validation cohorts. We used the Kaplan-Meier method for survival analysis and multivariate Cox proportional hazards models to identify independent prognostic factors. Nomograms for HNM and BM patients were developed via the rms package and evaluated by the concordance index (C-index), the area under the receiver operating characteristic (ROC) curve (AUC), and calibration plots. Results: Of the 70605 patients included, 21% (n=15071) had HNM and 79% (n=55534) had BM. The HNM group contained more older patients, male patients, and lentigo maligna melanomas, and more frequently had thicker tumors and metastases than the BM group. The 5-year cancer-specific survival (CSS) and overall survival (OS) rates were 88.1±0.3% and 74.4±0.4% in the HNM group and 92.5±0.1% and 85.8±0.2% in the BM group, respectively. Eight independent prognostic factors (age, sex, histology, thickness, ulceration, stage, metastases, and surgery) were identified and used to construct the nomograms. The performance of the nomograms was excellent: the C-indexes of the CSS predictions for HNM and BM patients were 0.839 and 0.895 in the training cohort and 0.848 and 0.888 in the validation cohort, respectively; the AUCs for the 3-, 5- and 10-year CSS of HNM were 0.871, 0.865 and 0.854 (training) and 0.881, 0.879 and 0.861 (validation), and those of BM were 0.924, 0.918 and 0.901 (training) and 0.916, 0.908 and 0.893 (validation); and the calibration plots showed good agreement. Conclusions: The characteristics of HNM and BM are heterogeneous, and we constructed and validated specific nomograms as practical prognostic tools for patients with HNM or BM.
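The modelling workflow described here (fit a multivariate Cox model on a training cohort, predict 3-, 5- and 10-year survival, and check discrimination on a held-out validation cohort) can be outlined as follows. The study used the R rms package on SEER data; the sketch below is only an analogous outline in Python with the lifelines package, and the file name and column names (time_months, melanoma_death, and so on) are hypothetical stand-ins.

```python
# Hedged sketch of the train/validate workflow, not the authors' code.
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical SEER extract: one row per patient, holding only the follow-up
# time (months), a cancer-specific death indicator, and the prognostic factors.
df = pd.read_csv("seer_hnm_cohort.csv")
df = pd.get_dummies(df, columns=["sex", "histology", "stage"], drop_first=True, dtype=int)

train = df.sample(frac=0.7, random_state=0)   # random training cohort
valid = df.drop(train.index)                  # remaining patients for validation

cph = CoxPHFitter()
cph.fit(train, duration_col="time_months", event_col="melanoma_death")

# Predicted cancer-specific survival at 3, 5 and 10 years for the validation cohort
surv = cph.predict_survival_function(valid, times=[36, 60, 120])

# Discrimination (Harrell's C-index) on the validation cohort
print(cph.score(valid, scoring_method="concordance_index"))
```

A nomogram is essentially a graphical rendering of the fitted Cox coefficients; in R, the rms functions cph(), nomogram(), and calibrate() cover the corresponding steps.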



2021 ◽  
Vol 19 (1) ◽  
Author(s):  
Zhixiu Xia ◽  
Changliang Wang ◽  
Hong Zhang

Abstract Purpose Colon cancer (CC) is a very common gastrointestinal tumor that is prone to invasion and metastasis in the late stage. This study aims to observe the expression of Na+/Ca2+ exchangers (NCXs) and analyze the correlation between NCXs and the prognosis of CC. Methods Specimens from 111 stage II–IV CC patients were collected. We used western blotting, qPCR, and immunohistochemical staining to observe the distributions and expression levels of the NCX isoforms (NCX1, NCX2, and NCX3) in CC and distal normal tissues. Cox proportional hazards models were used to assess prognostic factors for patients. Results The expression of NCXs in most tumor specimens was lower than that in normal tissues. NCX expression levels were progressively lower in tumor tissue from the primary tumor, local lymph node metastasis sites, and distant liver metastasis sites, relative to normal tissues. Kaplan-Meier survival curves showed that downregulation of any NCX isoform was closely related to a worse prognosis in advanced CC. Conclusion NCXs can be used as independent prognostic factors for CC. Our results are expected to provide new targets for the treatment of CC.
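For a single isoform, the survival comparison described here amounts to Kaplan-Meier curves for the low-expression and normal-expression groups plus a log-rank test. A minimal sketch in Python with lifelines follows; the authors' own tooling is not specified, and the file and column names are hypothetical.

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("cc_cohort.csv")            # hypothetical cohort table
low = df[df["ncx1_low"] == 1]                # tumors with downregulated NCX1
normal = df[df["ncx1_low"] == 0]

kmf = KaplanMeierFitter()
ax = kmf.fit(low["months"], low["died"], label="NCX1 low").plot_survival_function()
kmf.fit(normal["months"], normal["died"], label="NCX1 preserved").plot_survival_function(ax=ax)

# Log-rank test for a survival difference between the two expression groups
result = logrank_test(low["months"], normal["months"],
                      event_observed_A=low["died"], event_observed_B=normal["died"])
print(result.p_value)
```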


2009 ◽  
Vol 27 (15_suppl) ◽  
pp. 9081-9081
Author(s):  
T. K. Eigentler ◽  
A. Figl ◽  
D. Krex ◽  
P. Mohr ◽  
P. Kurschat ◽  
...  

9081 Background: This multicenter study aimed to identify prognostic factors in patients with brain metastases from malignant melanoma (BM-MM). Methods: In a retrospective survey at nine cancer centres of the German Cancer Society, 692 patients with BM-MM were identified during the period 1986–2007. Overall survival was analysed using the Kaplan-Meier estimator and compared by log-rank analysis. Cox proportional hazards models were used to identify prognostic factors significant for survival. Results: The median overall survival of the entire cohort was 5.0 months (95%CI: 4–5). Prognostic factors in the univariate Kaplan-Meier analysis were: Karnofsky Performance Status (≥70 vs. <70; p<0.001), number of BM-MM (single vs. multiple; p<0.001), pre-treatment level of serum LDH (normal vs. elevated; p<0.001), pre-treatment level of S100 (normal vs. elevated; p<0.001), prognostic group according to the Radiation Therapy Oncology Group classification (Class I vs. Class II vs. Class III; p=0.0485), and type of treatment applied (for the cohort with a single BM-MM only) (stereotactic radiotherapy or neurosurgical metastasectomy vs. others; p=0.036). Cox proportional hazards models identified an elevated pre-treatment level of serum LDH (HR: 1.6, 95%CI: 1.3–2.0; p=0.00013) and the number of BM-MM (HR: 1.6, 95%CI: 1.3–2.0; p=0.00011) as independent prognostic variables in the whole cohort, whereas in patients with a single BM-MM the type of treatment applied (stereotactic radiotherapy or neurosurgical metastasectomy vs. others; HR: 1.5, 95%CI: 1.1–1.9; p=0.0061) was identified as the only independent prognostic factor. Conclusions: Overall survival of patients with BM-MM mainly depends on the number of metastases and the pre-treatment level of LDH. In the case of a single brain metastasis, the application of stereotactic radiotherapy or neurosurgical metastasectomy is by far the most important factor for improving survival. No significant financial relationships to disclose.
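The multivariable step reported above reduces to fitting a Cox model on the candidate factors and reading off hazard ratios with 95% confidence intervals. A minimal sketch with lifelines, using hypothetical column names in place of the registry fields:

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("bm_mm_cohort.csv")          # hypothetical registry extract
covariates = ["ldh_elevated", "multiple_mets", "kps_lt_70", "s100_elevated"]

cph = CoxPHFitter()
cph.fit(df[["os_months", "died"] + covariates],
        duration_col="os_months", event_col="died")

# exp(coef) is the hazard ratio; the summary table also carries the 95% CI
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%", "p"]])
```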


2012 ◽  
Vol 30 (4_suppl) ◽  
pp. 284-284
Author(s):  
Mark Vikas Mishra ◽  
Colin Eamon Champ ◽  
Timothy Norman Showalter ◽  
Scott W. Keith ◽  
Pramila R. Anne ◽  
...  

284 Background: The purpose of this study is to evaluate conditional survival (CS) probabilities for patients with resected pancreatic adenocarcinoma (PC) among patients who have already survived ≥1 year(s) after diagnosis. Methods: Patients with resected PC from 1998-2008 were identified from the Surveillance, Epidemiology and End Results database. Data on patient, tumor, and treatment characteristics were extracted. Overall survival (OS) rates were calculated using the Kaplan-Meier method. A multivariate analysis (MVA) at different survival time points was performed using Cox proportional hazards models to determine independent prognostic factors and their associated all-cause mortality hazard ratios (HRs). Results: A total of 4,883 patients with resected PC were identified. Fourteen percent of patients had Stage IA/B disease, 23% Stage IIA, 53% Stage IIB, 2% Stage IIIA, and 6% unknown stage. The 1-, 3-, and 5-year survival estimates for patients at diagnosis were 67%, 29%, and 21%, respectively. One-, 3-, and 5-year survival probabilities conditional on the number of years already survived are shown in Table 1. Prognostic factors significantly correlated with improved OS at the time of diagnosis on MVA included earlier stage, younger age, later year of diagnosis, white race, female gender, and residence in a high-income district (p < 0.05). After already surviving 3 years following diagnosis, younger age was the only prognostic factor correlated with improved OS (p<0.05). Conclusions: CS estimates provide additional prognostic information that may be used to counsel PC patients on how their prognosis may change over time. Further research utilizing prospectively collected data is warranted to help determine recommended follow-up intervals and benchmarks for future clinical trials. [Table: see text]
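Conditional survival is computed from an ordinary survival estimate as CS(y | x) = S(x + y) / S(x): the probability of surviving y more years given that x years have already been survived. A minimal sketch (hypothetical file and column names) using a Kaplan-Meier fit in lifelines:

```python
import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.read_csv("resected_pc.csv")                  # hypothetical SEER extract
kmf = KaplanMeierFitter().fit(df["years"], df["died"])

def conditional_survival(kmf, already_survived, additional):
    """P(T > x + y | T > x) = S(x + y) / S(x)."""
    s = kmf.predict([already_survived, already_survived + additional])
    return s.iloc[1] / s.iloc[0]

# e.g. probability of surviving 2 more years after already surviving 3
print(conditional_survival(kmf, 3, 2))
```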


Author(s):  
Majdi Imterat ◽  
Tamar Wainstock ◽  
Eyal Sheiner ◽  
Gali Pariente

Abstract Recent evidence suggests that a long inter-pregnancy interval (IPI: the time interval between a live birth and the estimated time of conception of the subsequent pregnancy) poses a risk for adverse short-term perinatal outcomes. We aimed to study the effect of short (<6 months) and long (>60 months) IPI on the long-term cardiovascular morbidity of the offspring. A population-based cohort study was performed in which all singleton live births in parturients with at least one previous birth were included. Hospitalizations of the offspring up to 18 years of age involving cardiovascular disease were evaluated according to IPI length. An intermediate interval, between 6 and 60 months, was considered the reference. Kaplan–Meier survival curves were used to compare the cumulative morbidity incidence between the groups, and a Cox proportional hazards model was used to control for confounders. During the study period, 161,793 deliveries met the inclusion criteria. Of them, 14.1% (n = 22,851) occurred in parturients following a short IPI, 78.6% (n = 127,146) following an intermediate IPI, and 7.3% (n = 11,796) following a long IPI. Total hospitalizations of the offspring involving cardiovascular morbidity were comparable between the groups. The Kaplan–Meier survival curves demonstrated similar cumulative incidences of cardiovascular morbidity in all groups. In a Cox proportional hazards model, neither short nor long IPI appeared as an independent risk factor for later pediatric cardiovascular morbidity of the offspring (adjusted HR 0.97, 95% CI 0.80–1.18 and adjusted HR 1.01, 95% CI 0.83–1.37, for short and long IPI, respectively). In our population, extreme IPIs do not appear to impact long-term cardiovascular hospitalizations of the offspring.
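The exposure coding implied by this design is a three-level IPI factor with the intermediate interval as the reference, so the Cox model returns one adjusted hazard ratio each for short and long IPI. A minimal lifelines sketch, with hypothetical column names and only a couple of illustrative confounders:

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("offspring_cohort.csv")             # hypothetical cohort table
# Dummy-code IPI, leaving "intermediate" out as the reference category
ipi = pd.get_dummies(df["ipi_group"])[["short", "long"]].astype(int)
model_df = pd.concat(
    [df[["years_followed", "cv_hospitalization", "maternal_age", "preterm_birth"]], ipi],
    axis=1)

cph = CoxPHFitter()
cph.fit(model_df, duration_col="years_followed", event_col="cv_hospitalization")
print(cph.summary.loc[["short", "long"],
                      ["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```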


2020 ◽  
Vol 41 (Supplement_2) ◽  
Author(s):  
S Kochav ◽  
R.C Chen ◽  
J.M.D Dizon ◽  
J.A.R Reiffel

Abstract Background Theoretical concern exists regarding AV block (AVB) with class I antiarrhythmic drugs (AADs) when bundle branch block (BBB) is present. Whether this is substantiated in real-world populations is unknown. Purpose To determine the relationship between the type of AAD and the incidence of AVB in patients with preexisting BBB. Methods We retrospectively studied all patients with BBB who received class I and III AADs between 1997 and 2019 to compare the incidence of AVB. We defined the index time as the first exposure to either drug class and excluded patients with prior AVB or exposure to both classes. The time-at-risk window ended at the first outcome occurrence or when patients were no longer observed in the database. We estimated hazard ratios for incident AVB using Cox proportional hazards models with propensity score stratification, adjusting for over 32,000 covariates from the electronic health record. Kaplan-Meier methods were used to determine treatment effects over time. Results Of 40,120 individuals with BBB, 148 were exposed to a class I AAD and 2,401 to a class III AAD. Over nearly 4,200 person-years of follow-up, there were 22 and 620 outcome events in the class I and class III cohorts, respectively (Figure). In adjusted analyses, AVB risk was markedly lower in patients exposed to class I AADs compared with class III (HR 0.48 [95% CI 0.30–0.75]). Conclusion Among patients with BBB, exposure to class III AADs was strongly associated with a greater risk of incident AVB. This likely reflects differences in the natural history of patients receiving class I vs. class III AADs rather than adverse class III effects; however, the lack of acutely worse outcomes with class I AADs suggests that they may be safer in BBB than suspected. Funding Acknowledgement Type of funding source: None
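The design described here is a propensity-score-stratified Cox comparison. The real analysis adjusted for over 32,000 EHR covariates via large-scale propensity scores; the sketch below is only a schematic with a handful of hypothetical baseline columns, using scikit-learn for the propensity model and lifelines for the stratified Cox fit.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

df = pd.read_csv("bbb_cohort.csv")                      # hypothetical cohort
baseline = ["age", "female", "heart_failure", "ckd"]    # hypothetical covariates

# 1. Propensity score: probability of receiving a class I AAD given baseline covariates
ps_model = LogisticRegression(max_iter=1000).fit(df[baseline], df["class_i"])
df["ps"] = ps_model.predict_proba(df[baseline])[:, 1]

# 2. Stratify on propensity-score quintiles
df["ps_stratum"] = pd.qcut(df["ps"], q=5, labels=False)

# 3. Cox model for incident AV block, stratified by propensity-score quintile
cph = CoxPHFitter()
cph.fit(df[["time_to_avb", "avb", "class_i", "ps_stratum"]],
        duration_col="time_to_avb", event_col="avb", strata=["ps_stratum"])
print(cph.summary.loc["class_i",
                      ["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```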


Author(s):  
Joshua R Ehrlich ◽  
Bonnielin K Swenor ◽  
Yunshu Zhou ◽  
Kenneth M Langa

Abstract Background Vision impairment (VI) is associated with incident cognitive decline and dementia. However, it is not known whether VI is associated only with the transition to cognitive impairment, or whether it is also associated with later transitions to dementia. Methods We used data from the population-based Aging, Demographics and Memory Study (ADAMS) to investigate the association of visual acuity impairment (VI; defined as binocular presenting visual acuity <20/40) with transitions from cognitively normal (CN) to cognitive impairment no dementia (CIND) and from CIND to dementia. Multivariable Cox proportional hazards models and logistic regression were used to model the association of VI with cognitive transitions, adjusted for covariates. Results There were 351 participants included in this study (weighted percentages: 45% male, 64% age 70-79 years) with a mean follow-up time of 4.1 years. In a multivariable model, the hazard of dementia was elevated among those with VI (HR=1.63, 95%CI=1.04-2.58). Participants with VI had a greater hazard of transitioning from CN to CIND (HR=1.86, 95%CI=1.09-3.18). However, among those with both CIND and VI, similar percentages transitioned to dementia (48%) and remained CIND (52%); there was no significant association between VI and the transition from CIND to dementia (HR=0.94, 95%CI=0.56-1.55). Using logistic regression models, the same associations between VI and cognitive transitions were identified. Conclusions Poor vision is associated with the development of CIND. The association of VI with dementia appears to be due to the higher risk of dementia among individuals with CIND. Findings may inform the design of future interventional studies.
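The two-model check in the Methods (a Cox model for time to transition and logistic regression for the same transition treated as a binary outcome) can be sketched as follows for the CN-to-CIND transition. Column names are hypothetical, the covariate set is illustrative only, and lifelines and scikit-learn stand in for the authors' software.

```python
import pandas as pd
from lifelines import CoxPHFitter
from sklearn.linear_model import LogisticRegression

cn = pd.read_csv("adams_cn_baseline.csv")      # hypothetical: participants CN at baseline
covars = ["vi", "age", "female", "education_years"]

# Hazard of incident CIND, adjusted for covariates
cph = CoxPHFitter()
cph.fit(cn[["years_followed", "cind"] + covars],
        duration_col="years_followed", event_col="cind")
print(cph.summary.loc["vi", "exp(coef)"])       # hazard ratio for VI

# Odds of having transitioned to CIND by end of follow-up
logit = LogisticRegression(max_iter=1000).fit(cn[covars], cn["cind"])
print(logit.coef_[0])                           # log-odds coefficients (VI first)
```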


2017 ◽  
Vol 117 (06) ◽  
pp. 1072-1082 ◽  
Author(s):  
Xiaoyan Li ◽  
Steve Deitelzweig ◽  
Allison Keshishian ◽  
Melissa Hamilton ◽  
Ruslan Horblyuk ◽  
...  

Summary The ARISTOTLE trial showed a reduction in the risk of stroke/systemic embolism (SE) and major bleeding in non-valvular atrial fibrillation (NVAF) patients treated with apixaban compared with warfarin. This retrospective study used four large US claims databases (MarketScan, PharMetrics, Optum, and Humana) of NVAF patients newly initiating apixaban or warfarin from January 1, 2013 to September 30, 2015. After 1:1 warfarin-apixaban propensity score matching (PSM) within each database, the resulting patient records were pooled. Kaplan-Meier curves and Cox proportional hazards models were used to estimate the cumulative incidence and hazard ratios (HRs) of stroke/SE and major bleeding (identified using the first listed diagnosis of inpatient claims) within one year of therapy initiation. The study included a total of 76,940 patients (38,470 warfarin and 38,470 apixaban). Among the 38,470 matched pairs, 14,563 were from MarketScan, 7,683 from PharMetrics, 7,894 from Optum, and 8,330 from Humana. Baseline characteristics were balanced between the two cohorts, with a mean (standard deviation [SD]) age of 71 (12) years and a mean (SD) CHA2DS2-VASc score of 3.2 (1.7). Apixaban initiators had a significantly lower risk of stroke/SE (HR: 0.67, 95% CI: 0.59–0.76) and major bleeding (HR: 0.60, 95% CI: 0.54–0.65) than warfarin initiators. Different types of stroke/SE and major bleeding – including ischaemic stroke, haemorrhagic stroke, SE, intracranial haemorrhage, gastrointestinal bleeding, and other major bleeding – were all significantly lower for apixaban than for warfarin treatment. Subgroup analyses (apixaban dosage, age strata, CHA2DS2-VASc or HAS-BLED score strata, or dataset source) all showed consistently lower risks of stroke/SE and major bleeding associated with apixaban compared with warfarin treatment. This is the largest "real-world" study on apixaban effectiveness and safety to date, showing that apixaban initiation was associated with significant risk reductions in stroke/SE and major bleeding compared with warfarin initiation after PSM. These benefits were consistent across various high-risk subgroups and both the standard- and low-dose apixaban regimens. Note: The review process for this manuscript was fully handled by Christian Weber, Editor in Chief. Supplementary Material to this article is available online at www.thrombosis-online.com.
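The core design is new-user 1:1 propensity-score matching followed by a time-to-event comparison. The sketch below shows a simple greedy nearest-neighbour match on the propensity score and a Cox fit on the matched sample; it ignores the per-database matching and the full covariate set, and all file and column names are hypothetical.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

df = pd.read_csv("nvaf_new_users.csv")                  # hypothetical claims extract
baseline = ["age", "female", "chads_vasc", "has_bled"]  # hypothetical covariates

ps_model = LogisticRegression(max_iter=1000).fit(df[baseline], df["apixaban"])
df["ps"] = ps_model.predict_proba(df[baseline])[:, 1]

treated = df[df["apixaban"] == 1]
controls = df[df["apixaban"] == 0]

# Greedy 1:1 nearest-neighbour matching on the propensity score, without replacement
used, pairs = set(), []
for idx, ps in treated["ps"].items():
    available = controls.loc[~controls.index.isin(used), "ps"]
    match = (available - ps).abs().idxmin()
    used.add(match)
    pairs.append((idx, match))

matched = df.loc[[i for pair in pairs for i in pair]]

# Hazard ratio for stroke/SE in the matched cohort
cph = CoxPHFitter()
cph.fit(matched[["days_to_stroke_se", "stroke_se", "apixaban"]],
        duration_col="days_to_stroke_se", event_col="stroke_se")
print(cph.summary.loc["apixaban", "exp(coef)"])
```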


2021 ◽  
Vol 23 (Supplement_6) ◽  
pp. vi48-vi48
Author(s):  
James Cantrell ◽  
Pawan Acharya ◽  
Sara Vesely ◽  
Michael Confer ◽  
Ozer Algan ◽  
...  

Abstract BACKGROUND Chordomas are rare tumors arising from the embryonal notochord and presenting at the base of skull, spine, or sacrum. Pediatric chordomas (PC) comprise less than 5% of all chordomas and are more likely to be atypical or dedifferentiated. Evidence for management is limited to single-institution series with 5-year overall survival (OS) between 60-100%. METHODS Using the NCDB Participant User File, a retrospective observational cohort study was performed. The cohort was defined using the bone-soft-tissue, brain, and central nervous system databases, selecting cases with chordoma ICD-O-3 codes and age ≤ 25 years. The Kaplan-Meier method, log-rank test, and Cox proportional hazards regression were performed. RESULTS 297 patients from 2004-2017 met the inclusion criteria for descriptive analysis, with 269 cases included in the survival analysis. Mean age was 16.9 years, with 10% less than age 5. The cohort was 55% female, 8% Black, and 79% White. Primary sites included bones of the skull (70%), spine (22%), and pelvis (6%). Regarding treatment, 7% had no resection, 49% sub-total resection (STR), 33% gross-total resection (GTR), and 11% unspecified resection. 51% received radiation therapy, with 46% of those receiving proton therapy. 7% received chemotherapy. The 1-, 3-, 5-, and 10-year OS rates were 95%, 86%, 77%, and 72%, respectively. Selected prognostic factors from the univariable OS analysis included: age > 5 (HR 0.30 (95% CI 0.16-0.57) p = 0.0002), surgical resection [GTR (HR 0.28 (95% CI 0.12-0.63) p = 0.0023) and STR (HR 0.27 (95% CI 0.12-0.59) p = 0.0011)], and radiation dose ≥ 7200 cGy (HR 0.40 (95% CI 0.16-0.99) p = 0.047). CONCLUSION In the largest cohort reported for PC, the 3- and 10-year OS rates were 86% and 72%. Age, surgery, and radiation dose are important prognostic factors. A propensity score analysis to gauge the effect of treatment, tumor, and patient characteristics on OS is forthcoming.
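Reading t-year OS off a Kaplan-Meier fit and testing a factor with more than two levels (here, extent of resection) are short operations in most survival packages. A lifelines sketch with hypothetical NCDB column names:

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import multivariate_logrank_test

df = pd.read_csv("ncdb_pediatric_chordoma.csv")         # hypothetical extract
kmf = KaplanMeierFitter().fit(df["os_years"], df["died"])
print(kmf.survival_function_at_times([1, 3, 5, 10]))    # 1-, 3-, 5-, 10-year OS

# Log-rank test across resection groups (none / STR / GTR / unspecified)
result = multivariate_logrank_test(df["os_years"], df["resection"], df["died"])
print(result.p_value)
```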


Circulation ◽  
2016 ◽  
Vol 133 (suppl_1) ◽  
Author(s):  
Faye L Norby ◽  
Lindsay G Bengtson ◽  
Lin Y Chen ◽  
Richard F MacLehose ◽  
Pamela L Lutsey ◽  
...  

Background: Rivaroxaban is a novel oral anticoagulant approved in the US in 2011 for the prevention of stroke and systemic embolism in patients with non-valvular atrial fibrillation (NVAF). Information on risks and benefits among rivaroxaban users in real-world populations is limited. Methods: We used data from the US MarketScan Commercial and Medicare Supplemental databases between 2010 and 2013. We selected patients with a history of NVAF who were initiating rivaroxaban or warfarin. Rivaroxaban users were matched with up to 5 warfarin users by age, sex, database enrollment date, and drug initiation date. Ischemic stroke, intracranial bleeding (ICB), myocardial infarction (MI), and gastrointestinal (GI) bleeding outcomes were defined by ICD-9-CM codes in an inpatient claim after the drug initiation date. Cox proportional hazards models were used to assess the association between rivaroxaban vs. warfarin use and outcomes, adjusting for age, sex, and CHA2DS2-VASc score. Separate models were used to compare a) new rivaroxaban users with new warfarin users, and b) switchers from warfarin to rivaroxaban with continuous warfarin users. Results: Our analysis included 34,998 rivaroxaban users matched to 102,480 warfarin users with NVAF (39% female, mean age 71), in whom 487 ischemic strokes, 179 ICBs, 647 MIs, and 1353 GI bleeds were identified during a mean follow-up of 9 months. Associations of rivaroxaban vs. warfarin were similar in new users and switchers; therefore, we pooled both analyses. Rivaroxaban users had lower rates of ICB (hazard ratio (HR) = 0.72, 95% confidence interval (CI): 0.46–1.12) and ischemic stroke (HR = 0.88, 95% CI: 0.68–1.13), but higher rates of GI bleeding (HR = 1.15, 95% CI: 1.01–1.33) compared with warfarin users (table). Conclusion: In this large population-based study of NVAF patients, rivaroxaban users had a non-significantly lower risk of ICB and ischemic stroke than warfarin users, but a higher risk of GI bleeding. These real-world findings are comparable to results reported in published clinical trials.
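Because several outcomes are compared in the same matched cohort, the analysis amounts to one adjusted Cox model per outcome. A compact lifelines sketch (hypothetical column names and a simplified adjustment set):

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("nvaf_matched_cohort.csv")        # hypothetical matched cohort
adjust = ["age", "female", "chads_vasc"]           # simplified adjustment set

for outcome in ["ischemic_stroke", "icb", "mi", "gi_bleed"]:
    cph = CoxPHFitter()
    cph.fit(df[[f"time_{outcome}", outcome, "rivaroxaban"] + adjust],
            duration_col=f"time_{outcome}", event_col=outcome)
    hr = cph.summary.loc["rivaroxaban", "exp(coef)"]
    lo, hi = cph.summary.loc["rivaroxaban",
                             ["exp(coef) lower 95%", "exp(coef) upper 95%"]]
    print(f"{outcome}: HR {hr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```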

