CA19-9 as an independent risk factor for survival in hepatocellular carcinoma.

2013, Vol 31 (4_suppl), pp. 253-253
Author(s): Christine Cho-Shing Hsu, Abhishek Goyal, Rosa Hidalgo, Elizabeth Verna, Jean Emond, ...

253 Background: High alpha-fetoprotein (AFP) is associated with worse survival in hepatocellular carcinoma (HCC) patients. CA19-9 is a glycoprotein biomarker of biliary and pancreatic cancer. To our knowledge, no prospective study has examined CA19-9 as a prognostic biomarker in HCC. We hypothesized that it may add to our ability to risk-stratify patients. Methods: We prospectively enrolled 122 patients with newly diagnosed HCC between 10/2008 and 7/2012 and followed them until July 23, 2012. HCC was defined by AASLD criteria. CA19-9 and AFP were measured at enrollment: 50% of patients had CA19-9 >37 U/mL and 22% had CA19-9 >100. Pathology specimens were available for 50% of patients; none were consistent with mixed HCC-cholangiocarcinoma (HCC-CC) or pure CC. We evaluated the relationship between CA19-9 and overall survival using Kaplan-Meier curves and univariate and multivariate Cox proportional hazards models. Results: The mean age of the cohort was 62.1 years and 79% were male; 59% had underlying HCV and 16% had HBV. In univariate analysis, CA19-9 >100 predicted worse survival (HR=3.44, 95% CI: 1.79-6.61, p=0.0002). Other cut-points for CA19-9 yielded similar results, although not all were significant. In a multivariate model including CP score (B vs. A), Milan status (outside vs. within), and AFP (>100 vs. ≤100), CA19-9 >100 remained an independent risk factor for increased mortality (HR 3.95, 95% CI: 1.95-7.97, p=0.0001). CP score and Milan status had HRs of 2.57 (95% CI: 1.31-5.02, p=0.006) and 5.27 (95% CI: 2.05-13.57, p=0.0006), respectively. AFP was not an independent risk factor for worse survival, with an HR of 1.69 (95% CI: 0.85-3.37, p=0.14). We also examined models using different cut-points for AFP; CA19-9 added additional risk stratification in all of them. Among patients with AFP >100, median survival was 15.5 months for those with CA19-9 ≤100 versus 3.8 months for those with CA19-9 >100 (p=0.01, log-rank). Conclusions: CA19-9 is a readily available biomarker that may be a novel predictor of survival in HCC patients. In our multivariate model, CA19-9 >100 almost quadrupled mortality risk. It might be particularly useful for risk-stratifying patients with normal AFP and those with very high AFP being considered for transplant.
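The workflow described here (dichotomize a biomarker, compare Kaplan-Meier curves with a log-rank test, then check independence in a multivariate Cox model) is straightforward to reproduce. Below is a minimal sketch in Python using the lifelines library; the file name and column names (survival_months, death, ca19_9, afp, cp_score_b, outside_milan) are hypothetical stand-ins, not the authors' dataset.

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.statistics import logrank_test

# Hypothetical cohort file: one row per patient.
df = pd.read_csv("hcc_cohort.csv")

# Dichotomize the biomarkers at the cut-points used in the abstract.
df["ca19_9_high"] = (df["ca19_9"] > 100).astype(int)
df["afp_high"] = (df["afp"] > 100).astype(int)

high = df[df["ca19_9_high"] == 1]
low = df[df["ca19_9_high"] == 0]

# Kaplan-Meier curves per CA19-9 group.
kmf = KaplanMeierFitter()
ax = kmf.fit(high["survival_months"], high["death"],
             label="CA19-9 > 100").plot_survival_function()
kmf.fit(low["survival_months"], low["death"],
        label="CA19-9 <= 100").plot_survival_function(ax=ax)

# Log-rank test between the two groups.
lr = logrank_test(high["survival_months"], low["survival_months"],
                  event_observed_A=high["death"], event_observed_B=low["death"])
print(f"log-rank p = {lr.p_value:.4f}")

# Multivariate Cox model: CA19-9 adjusted for CP score, Milan status, and AFP.
cph = CoxPHFitter()
cph.fit(df[["survival_months", "death", "ca19_9_high", "afp_high",
            "cp_score_b", "outside_milan"]],
        duration_col="survival_months", event_col="death")
cph.print_summary()  # the exp(coef) column gives the adjusted hazard ratios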

2019, Vol 8 (11), pp. 1924
Author(s): Abecassis, Wainstock, Sheiner, Pariente

The aim of this study was to evaluate the perinatal outcomes and long-term offspring gastrointestinal morbidity of women with celiac disease. Perinatal outcomes, as well as long-term gastrointestinal morbidity, of offspring of mothers with and without celiac disease were assessed. The study groups were followed until 18 years of age for gastrointestinal-related morbidity. For perinatal outcomes, generalized estimating equation (GEE) models were used. A Kaplan-Meier survival curve was used to compare cumulative incidence of long-term gastrointestinal morbidity, and Cox proportional hazards models were constructed to control for confounders. During the study period, 243,682 deliveries met the inclusion criteria, of which 212 (0.08%) were to mothers with celiac disease. Using GEE models, maternal celiac disease was noted as an independent risk factor for low birth weight and cesarean delivery. Offspring born to mothers with celiac disease had higher rates of gastrointestinal-related morbidity (Kaplan-Meier log-rank test P < 0.001). Using a Cox proportional hazards model, being born to a mother with celiac disease was found to be an independent risk factor for long-term gastrointestinal morbidity of the offspring. Pregnancy in women with celiac disease is independently associated with adverse perinatal outcomes as well as a higher risk of long-term gastrointestinal morbidity in the offspring.
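GEE models are used here because one mother can contribute several deliveries, so outcomes within a mother are correlated. A hedged sketch of how such a model might be fit with statsmodels, assuming hypothetical column names (cesarean, celiac, maternal_age, parity, mother_id):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical data: one row per delivery, clustered by mother.
df = pd.read_csv("deliveries.csv")

# Logistic GEE with an exchangeable working correlation, so repeated
# deliveries by the same mother are not treated as independent.
model = smf.gee("cesarean ~ celiac + maternal_age + parity",
                groups="mother_id", data=df,
                family=sm.families.Binomial(),
                cov_struct=sm.cov_struct.Exchangeable())
result = model.fit()
print(result.summary())

# exp(coef) for 'celiac' approximates the adjusted odds ratio.
print(np.exp(result.params["celiac"]))
```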


Blood, 2009, Vol 114 (22), pp. 992-992
Author(s): Henna Malik, Lucas Wong, Sarju Waghela, Lisa Go, William Koss, ...

Abstract 992 Poster Board I-14 Background: Acute myeloid leukemia (AML) is a common disease in individuals ≥ 65 years old. Overall survival (OS) is significantly shorter in older patients compared with younger patients. Many patients do not receive chemotherapy due to age or co-morbidities. The aim of our study was to investigate the biologic characteristics of AML in the elderly and the impact of clinical variables on survival. Methods: For this single-center, retrospective study, the authors analyzed the following variables for patients ≥ 65 years old: age at diagnosis, gender, WBC, HGB, LDH, % blasts, risk factors, chemotherapy, co-morbidities, cytogenetics, and documented MDS/cancer. Statistical Analysis: All variables were summarized using descriptive statistics: mean (SD) for continuous variables and frequency (percent) for categorical variables. Kaplan-Meier survival curves were obtained, and univariate and multivariate Cox proportional hazards models were applied. A p-value of less than 0.05 indicated statistical significance. SAS 9.1.3 (SAS Institute Inc., Cary, NC) was used for data management and statistical analysis. Results: Seventy-four patients 65 or older were included in the Kaplan-Meier survival analysis. The median survival time was 3.8 months. Seventy patients had died and 4 were still alive as of 1/2009. Patients over 80 years old had the worst median survival, 0.7 months, compared with 4.0 months in the 65-70 group and 4.6 months in the 71-80 group. Univariate Cox proportional hazards models showed that WBC group (p=0.0390), LDH group (p=0.0153), and chemotherapy (p<0.0001) were significant variables for survival. The LDH group and cytogenetics were not included in the multivariate model due to many missing measurements (43%; 32 out of 74, and 27%; 17 out of 74, respectively). The final multivariate model including all significant variables revealed a significant effect of WBC group (p=0.0074) and chemotherapy (p<0.0001) on survival. Discussion and Conclusions: Prior results from clinical trials and single-center studies evaluating prognostic factors in older patients are conflicting. In our study, patients who received chemotherapy (standard or intensive) had better survival (median 5.2 months) compared with untreated patients (median 0.4 months; p<0.0001). The tendency is to exclude elderly patients from treatment because of poor performance status (PS), organ dysfunction, and co-morbid conditions. The approach of withholding chemotherapy in elderly patients is not supported by our results. To the contrary, it appears that chemotherapy should be pursued and may offer longer survival, except for elderly patients over the age of 80. A high WBC (≥ 10 × 10³/μL) at presentation had an adverse impact on survival (p=0.034). Other studies have shown mixed results with regard to survival. LDH > 300 U/L was an adverse prognostic factor for survival. A higher leukocyte count is probably representative of high tumor burden and more aggressive disease biology. Cytogenetics (with MDS and without MDS) at diagnosis was not predictive of survival, but our cytogenetic evaluation was incomplete due to missing data. Co-morbidities such as cardiovascular disease, diabetes, hepatic disease, pulmonary disease, and cancer did not impact survival. We observed an adverse impact of increasing age on survival only in patients over 80.
Some investigators reported no effects, and others showed increasing age as a poor prognostic factor for both CR and survival or for survival alone. The cause of this discrepancy is not clear. Patients > 80 years comprised 28% of our study group and exhibited the worst survival time; they may represent a different patient population with distinct biological features. We conclude that age, biological features, chemotherapy, and leukocyte count are the most important determinants of survival. Disclosures: No relevant conflicts of interest to declare.
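The two-stage modeling strategy in this abstract (univariate screening, exclusion of heavily missing covariates, then a multivariate model of the survivors) can be sketched as follows. The column names are hypothetical, and the 25% missingness threshold is an illustrative choice, not the authors':

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical extract: months of follow-up, death indicator, candidate covariates.
df = pd.read_csv("aml_elderly.csv")
candidates = ["wbc_high", "hgb", "ldh_high", "blast_pct", "chemo", "age"]

# Drop covariates with heavy missingness before multivariate modeling,
# as the abstract does for LDH (43% missing) and cytogenetics (27%).
usable = [c for c in candidates if df[c].isna().mean() < 0.25]

# Univariate screen: fit one Cox model per covariate.
significant = []
for c in usable:
    sub = df[["months", "died", c]].dropna()
    cph = CoxPHFitter().fit(sub, "months", "died")
    if cph.summary.loc[c, "p"] < 0.05:
        significant.append(c)

# Final multivariate model on the univariately significant covariates.
final = CoxPHFitter().fit(df[["months", "died"] + significant].dropna(),
                          "months", "died")
final.print_summary()
```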


2019, Vol 6 (Supplement_2), pp. S638-S639
Author(s): Jackrapong Bruminhent, Thanate Dajsakdipon, Siriorn Watcharananan

Abstract Background: Cytomegalovirus (CMV) infection is one of the leading causes of morbidity and mortality in kidney transplant (KT) recipients. We investigated the association between CMV serostatus and allograft outcome within the first year after KT. Methods: All KT recipients from 2007 to 2017 were derived from the Thai Transplant Registry. The prevalence of allograft loss and mortality within the first year after KT was estimated by Kaplan-Meier analysis. CMV serostatus of the donor (D) and the recipient (R) was assessed as a prognostic factor for allograft loss and mortality by Cox proportional hazards models. Results: During the 10-year study period, the population consisted of 4,556 KT recipients with a mean ± SD age of 43 ± 14 years; 63% were male. Fifty-two percent underwent deceased-donor KT and 58% received induction therapy. Among 3,907 evaluable patients, the CMV seroprevalence was D+/R+ (88.9%), D+/R− (6.1%), D−/R+ (2.9%), and D−/R− (1.9%). The estimated prevalence of allograft loss and mortality within the first year was 3.8% and 2.8%, respectively. In univariate analysis, CMV D+/R− was significantly associated with mortality within the first year after KT [hazard ratio (HR), 2.10; 95% confidence interval (CI), 1.18–3.75; P = 0.01], but not with allograft loss [HR, 1.51; 95% CI, 0.85–2.66; P = 0.16]. In multivariate analysis, CMV D+/R− serostatus was associated with mortality within the first year after KT [HR, 2.04; 95% CI, 1.05–3.95; P = 0.04]. Other independent prognostic factors for mortality were older recipient age, deceased-donor KT, and hemodialysis after KT (Table 1). Conclusion: In a setting where donor and recipient CMV seropositivity is predominant, CMV seromismatch still negatively affects patient survival within the first year after KT. Disclosures: All authors: No reported disclosures.
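Serostatus here is a four-level categorical factor (D+/R+, D+/R−, D−/R+, D−/R−), so a Cox model needs it dummy-coded against a reference level. An illustrative sketch with hypothetical column names, taking D+/R+ (the largest group) as the reference:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical registry extract.
df = pd.read_csv("kt_registry.csv")

# Dummy-code the four-level serostatus; drop D+/R+ so it serves as reference.
dummies = pd.get_dummies(df["serostatus"], prefix="cmv").astype(int)
dummies = dummies.drop(columns="cmv_D+/R+")

data = pd.concat([df[["months", "died", "age", "deceased_donor", "hd_after_kt"]],
                  dummies], axis=1)

cph = CoxPHFitter()
cph.fit(data, duration_col="months", event_col="died")
cph.print_summary()  # the HR for cmv_D+/R- is read against the D+/R+ reference
```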


2020, Vol 41 (Supplement_2)
Author(s): S Kochav, R.C Chen, J.M.D Dizon, J.A.R Reiffel

Abstract Background: Theoretical concern exists regarding AV block (AVB) with class I antiarrhythmic drugs (AADs) when bundle branch block (BBB) is present. Whether this is substantiated in real-world populations is unknown. Purpose: To determine the relationship between type of AAD and incidence of AVB in patients with preexisting BBB. Methods: We retrospectively studied all patients with BBB who received class I and III AADs between 1997 and 2019 to compare the incidence of AVB. We defined the index time as first exposure to either drug class and excluded patients with prior AVB or exposure to both classes. The time-at-risk window ended at the first outcome occurrence or when patients were no longer observed in the database. We estimated hazard ratios for incident AVB using Cox proportional hazards models with propensity score stratification, adjusting for over 32,000 covariates from the electronic health record. Kaplan-Meier methods were used to determine treatment effects over time. Results: Of 40,120 individuals with BBB, 148 were exposed to a class I AAD and 2,401 to a class III AAD. Over nearly 4,200 person-years of follow-up, there were 22 and 620 outcome events in the class I and class III cohorts, respectively (Figure). In adjusted analyses, AVB risk was markedly lower in patients exposed to class I AADs compared with class III (HR 0.48 [95% CI 0.30–0.75]). Conclusion: Among patients with BBB, exposure to class III AADs was strongly associated with a greater risk of incident AVB. This likely reflects differences in the natural history of patients receiving class I vs class III AADs rather than adverse class III effects; however, the lack of worse outcomes acutely with class I AADs suggests that they may be safer in BBB than suspected. Funding Acknowledgement: Type of funding source: None
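Propensity score stratification, as used here, fits the outcome model within strata of the estimated probability of treatment rather than matching patients one-to-one. A simplified sketch (quintile strata instead of the study's large-scale covariate adjustment; file and column names hypothetical):

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

# Hypothetical cohort: class_i = 1 for class I AAD, 0 for class III.
df = pd.read_csv("bbb_aad_cohort.csv")
covars = [c for c in df.columns if c not in ("days", "avb", "class_i")]

# Propensity score: probability of receiving a class I AAD given covariates.
ps_model = LogisticRegression(max_iter=1000).fit(df[covars], df["class_i"])
df["ps"] = ps_model.predict_proba(df[covars])[:, 1]

# Stratify on propensity quintiles, then fit a Cox model stratified by quintile.
df["ps_stratum"] = pd.qcut(df["ps"], 5, labels=False)
cph = CoxPHFitter()
cph.fit(df[["days", "avb", "class_i", "ps_stratum"]],
        duration_col="days", event_col="avb", strata=["ps_stratum"])
cph.print_summary()  # HR for class I vs class III exposure within strata
```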


2017, Vol 117 (06), pp. 1072-1082
Author(s): Xiaoyan Li, Steve Deitelzweig, Allison Keshishian, Melissa Hamilton, Ruslan Horblyuk, ...

Summary: The ARISTOTLE trial showed a risk reduction of stroke/systemic embolism (SE) and major bleeding in non-valvular atrial fibrillation (NVAF) patients treated with apixaban compared to warfarin. This retrospective study used four large US claims databases (MarketScan, PharMetrics, Optum, and Humana) of NVAF patients newly initiating apixaban or warfarin from January 1, 2013 to September 30, 2015. After 1:1 warfarin-apixaban propensity score matching (PSM) within each database, the resulting patient records were pooled. Kaplan-Meier curves and Cox proportional hazards models were used to estimate the cumulative incidence and hazard ratios (HRs) of stroke/SE and major bleeding (identified using the first listed diagnosis of inpatient claims) within one year of therapy initiation. The study included a total of 76,940 (38,470 warfarin and 38,470 apixaban) patients. Among the 38,470 matched pairs, 14,563 were from MarketScan, 7,683 from PharMetrics, 7,894 from Optum, and 8,330 from Humana. Baseline characteristics were balanced between the two cohorts, with a mean (standard deviation [SD]) age of 71 (12) years and a mean (SD) CHA2DS2-VASc score of 3.2 (1.7). Apixaban initiators had a significantly lower risk of stroke/SE (HR: 0.67, 95% CI: 0.59–0.76) and major bleeding (HR: 0.60, 95% CI: 0.54–0.65) than warfarin initiators. Rates of the different types of stroke/SE and major bleeding (ischaemic stroke, haemorrhagic stroke, SE, intracranial haemorrhage, gastrointestinal bleeding, and other major bleeding) were all significantly lower for apixaban compared to warfarin treatment. Subgroup analyses (apixaban dosage, age strata, CHA2DS2-VASc or HAS-BLED score strata, or dataset source) all showed consistently lower risks of stroke/SE and major bleeding associated with apixaban compared to warfarin treatment. This is the largest "real-world" study on apixaban effectiveness and safety to date, showing that apixaban initiation was associated with significant risk reductions in stroke/SE and major bleeding compared to warfarin initiation after PSM. These benefits were consistent across various high-risk subgroups and both the standard- and low-dose apixaban regimens. Note: The review process for this manuscript was fully handled by Christian Weber, Editor in Chief. Supplementary Material to this article is available online at www.thrombosis-online.com.
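1:1 propensity score matching of the kind described can be approximated with a logistic propensity model plus a nearest-neighbor search. The sketch below matches with replacement for brevity (a real PSM pipeline would match without replacement, usually with a caliper); covariate, outcome, and file names are hypothetical:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors
from lifelines import CoxPHFitter

df = pd.read_csv("nvaf_claims.csv")  # treated: 1 = apixaban, 0 = warfarin
covars = ["age", "female", "chads_vasc", "has_bled"]  # hypothetical baseline covariates

# Estimate each patient's propensity to receive apixaban.
ps_model = LogisticRegression(max_iter=1000).fit(df[covars], df["treated"])
df["ps"] = ps_model.predict_proba(df[covars])[:, 1]

treated = df[df["treated"] == 1]
control = df[df["treated"] == 0]

# Nearest-neighbor 1:1 match on the propensity score (with replacement).
nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])
matched = pd.concat([treated, control.iloc[idx.ravel()]])

# Outcome model on the matched cohort.
cph = CoxPHFitter()
cph.fit(matched[["days_to_stroke_se", "stroke_se", "treated"]],
        duration_col="days_to_stroke_se", event_col="stroke_se")
cph.print_summary()
```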


2020, Vol 41 (Supplement_2)
Author(s): T Sonoda, D Kanda, K Anzaki, R Arikawa, A Tokushige, ...

Abstract Background: In patients undergoing PCI for coronary artery disease, target lesion calcification is associated with major cardiac events. Malnutrition is an important cause of frailty and sarcopenia, which affect the prognosis of cardiovascular diseases. However, the relationship between target lesion morphology and malnutrition in patients undergoing PCI is still uncertain. Purpose: The aim of the present study was to investigate how malnutrition affects the prognosis of stable angina patients undergoing PCI and its association with target lesion morphology. Methods: The subjects were 206 consecutive stable angina patients who underwent successful PCI using second-generation drug-eluting stents and intravascular ultrasound (IVUS). The study patients were divided into two groups based on malnutrition or non-malnutrition. Nutritional status was assessed by the Geriatric Nutritional Risk Index (GNRI), and patients with GNRI <92 at admission were defined as the malnutrition group (MG). We investigated the association between malnutrition on admission and outcome, as well as target lesion morphology assessed by IVUS. Target lesions were divided into a moderate/severe calcification group and a none/mild calcification group. Results: All-cause death and MACCE (major cardiovascular and cerebrovascular events) ≤3 years after PCI occurred in 15 cases (7%) and 33 cases (16%), respectively. The MG had higher rates of all-cause death (20 vs. 6%, p=0.001) and MACCE (37 vs. 10%, p<0.001) than the non-MG. Kaplan-Meier analysis showed that survival was significantly lower in the MG than in the non-MG (p<0.001). In univariate Cox proportional hazards analysis, all-cause death was associated with age [hazard ratio (HR): 1.05, 95% confidence interval (CI): 1.01–1.10, p=0.006], hs-CRP (HR: 1.03, 95% CI: 1.03–1.12, p<0.001), hemodialysis (HR: 2.25, 95% CI: 1.08–4.68, p=0.029), left ventricular ejection fraction (LVEF) (HR: 0.97, 95% CI: 0.95–0.99, p=0.017), and malnutrition (HR: 4.38, 95% CI: 2.11–9.09, p<0.001). Similarly, Cox proportional hazards analysis revealed that age (HR: 1.04, 95% CI: 1.01–1.07, p=0.018), hs-CRP (HR: 1.08, 95% CI: 1.03–1.11, p<0.001), hemodialysis (HR: 2.68, 95% CI: 1.45–4.94, p=0.002), LVEF (HR: 0.97, 95% CI: 0.95–0.99, p=0.002), and malnutrition (HR: 4.14, 95% CI: 2.23–7.67, p<0.001) were significantly associated with MACCE. Multivariate analysis revealed that malnutrition was an independent risk factor for all-cause death (HR: 3.47, 95% CI: 1.52–7.94, p=0.003) and for MACCE (HR: 3.76, 95% CI: 1.87–7.58, p<0.001). Furthermore, the MG was significantly more likely than the non-MG to have moderate/severe calcification in target lesions assessed by IVUS (67 vs. 27%, p<0.001), regardless of hemodialysis status. Conclusions: Malnutrition was a crucial independent risk factor for adverse outcomes in stable angina patients who underwent PCI and was significantly associated with moderate/severe calcification in target lesions. Funding Acknowledgement: Type of funding source: None
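For readers unfamiliar with the GNRI cutoff used to define the malnutrition group, a small helper is sketched below. It assumes the commonly cited GNRI formula (14.89 × albumin in g/dL + 41.7 × weight/ideal weight, with the ratio capped at 1); the abstract itself does not state the formula, so treat this as background rather than the authors' definition.

```python
def gnri(albumin_g_dl: float, weight_kg: float, ideal_weight_kg: float) -> float:
    """Geriatric Nutritional Risk Index per the commonly cited formula:
    GNRI = 14.89 * albumin (g/dL) + 41.7 * (weight / ideal weight).
    The weight ratio is conventionally capped at 1 when weight exceeds ideal weight."""
    ratio = min(weight_kg / ideal_weight_kg, 1.0)
    return 14.89 * albumin_g_dl + 41.7 * ratio

# Patients with GNRI < 92 fall into the malnutrition group used in the abstract.
print(gnri(3.2, 50, 60))        # about 82.4
print(gnri(3.2, 50, 60) < 92)   # True -> malnutrition group
```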


2006, Vol 24 (18_suppl), pp. 560-560
Author(s): D. A. Patt, Z. Duan, G. Hortobagyi, S. H. Giordano

560 Background: Adjuvant chemotherapy for breast cancer is associated with the development of secondary AML, but this risk in an older population has not been previously quantified. Methods: We queried data from the Surveillance, Epidemiology, and End Results-Medicare (SEER-Medicare) database for women who were diagnosed with nonmetastatic breast cancer from 1992–1999. We compared the risk of AML in patients with and without adjuvant chemotherapy (C), and by differing C regimens. The primary endpoint was a claim with an inpatient or outpatient diagnosis of AML (ICD-9 codes 205–208). Risk of AML was estimated using the Kaplan-Meier method. Cox proportional hazards models were used to determine factors independently associated with AML. Results: 36,904 patients were included in this observational study, 4,572 who had received adjuvant C and 32,332 who had not. The median patient age was 75.3 years (range 66.0–103.3). The median follow-up was 63 months (range 13–132). Patients who received C were significantly younger, had more advanced-stage disease, and had lower comorbidity scores (p<0.001). The unadjusted risk of developing AML at 10 years after any adjuvant C for breast cancer was 1.6%, versus 1.1% for women who had not received C. The adjusted HR for AML with adjuvant C was 1.72 (1.16–2.54) compared to women who did not receive C. The HR for radiation was 1.21 (0.86–1.70). The HR was higher with increasing age, but p>0.05. An analysis was performed among women who received C. When compared to other C regimens, anthracycline-based therapy (A) conveyed a significantly higher hazard of AML (HR 2.17, 1.08–4.38), while patients who received A plus taxanes (T) did not have a significant increase in risk (HR 1.29, 0.44–3.82), nor did patients who received T with some other C (HR 1.50, 0.34–6.67). Another significant independent predictor of AML was GCSF use (HR 2.21, 1.14–4.25). In addition, increasing A dose was associated with a higher risk of AML (p<0.05). Conclusions: There is a small but real increase in AML after adjuvant chemotherapy for breast cancer in older women. The risk appears to be highest with A-based regimens, most of which also contained cyclophosphamide, and may be dose-dependent. T does not appear to increase risk. The role of GCSF should be further explored. No significant financial relationships to disclose.
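The 10-year unadjusted risks quoted here are one minus the Kaplan-Meier survival estimate at 120 months, computed per treatment group. A minimal sketch with lifelines, assuming hypothetical column names (months, aml, chemo):

```python
import pandas as pd
from lifelines import KaplanMeierFitter

# Hypothetical extract: follow-up in months, AML event indicator, chemo exposure.
df = pd.read_csv("seer_medicare_bc.csv")

risks = {}
for grp, label in [(1, "adjuvant chemo"), (0, "no chemo")]:
    sub = df[df["chemo"] == grp]
    kmf = KaplanMeierFitter().fit(sub["months"], sub["aml"], label=label)
    # Cumulative AML risk at 10 years = 1 - S(120 months).
    risks[label] = 1 - kmf.predict(120)

print(risks)  # the abstract reports roughly 1.6% vs 1.1%
```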


2015, Vol 33 (3_suppl), pp. 453-453
Author(s): Kelly Elizabeth Orwat, Samuel Lewis Cooper, Michael Ashenafi, M. Bret Anderson, Marcelo Guimaraes, ...

453 Background: Systemic therapies for unresectable liver malignancies may provide a survival benefit, but eventually prove intolerable or ineffective. TARE provides an additional liver-directed treatment option to improve local control for these patients, but there are limited data on patient factors associated with survival. Methods: All patients who received TARE at the Medical University of South Carolina from March 2006 through May 2014 were included in this analysis of overall survival (OS) and toxicity. Kaplan-Meier estimates of OS from the date of first procedure are reported. Potential prognostic factors for OS were evaluated using log-rank tests and Cox proportional hazards models. Results: In 114 patients who received TARE at our institution, median follow-up was 6.4 months [range 0-86], with the following tumor histologies: colorectal (CR) n=55, hepatocellular (HC) n=20, cholangiocarcinoma (CC) n=16, neuroendocrine (NE) n=12, breast (BR) n=6, other n=5. At least one line of systemic therapy prior to TARE was noted in 79% of patients. Median OS was 6.6 months and 1-year OS was 30.7%. The percentage of patients who died within 3 months of TARE was 46.2% for patients with albumin < 3 but only 20.3% for patients with albumin ≥ 3. Grade ≥ 2 toxicity was observed in 22 patients (19.3%), including 9 (7.9%) with Grade 3 and 1 (0.9%) with Grade 4 toxicity. A single patient with a pre-existing pulmonary arteriovenous malformation experienced Grade 3 pneumonitis that resolved with steroids. No deaths were attributed to radiation-induced liver disease. Conclusions: TARE is a relatively safe and effective treatment for unresectable intrahepatic malignancies. NE or BR histology and better hepatic synthetic function were associated with significantly better survival. Our data suggest that patients with albumin below 3 may not benefit from TARE. [Table: see text]
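Comparing OS across six histology groups calls for a multi-group log-rank test rather than a pairwise one, alongside a Cox model for factors such as albumin. An illustrative sketch with hypothetical column names:

```python
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import multivariate_logrank_test

# Hypothetical cohort: months, death indicator, histology label, albumin.
df = pd.read_csv("tare_cohort.csv")

# Log-rank test across all six histology groups at once.
result = multivariate_logrank_test(df["months"], df["histology"], df["died"])
print(f"log-rank across histologies: p = {result.p_value:.4f}")

# Cox model with albumin dichotomized at 3, mirroring the abstract's comparison.
df["alb_low"] = (df["albumin"] < 3).astype(int)
covariates = pd.concat(
    [df[["months", "died", "alb_low"]],
     pd.get_dummies(df["histology"], prefix="hist", drop_first=True).astype(int)],
    axis=1)
cph = CoxPHFitter().fit(covariates, duration_col="months", event_col="died")
cph.print_summary()
```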


2019, Vol 37 (15_suppl), pp. 4529-4529
Author(s): Bernadett Szabados, Marlon Rebelatto, Craig Barker, Alvin Milner, Arthur Lewis, ...

4529 Background: The biomarkers PD-L1, FOXP3, and CD8 have been explored in pts with advanced UC who progressed after platinum-based chemotherapy (CTx). However, their relevance earlier in the disease process is less well understood. Methods: The Phase 2/3 LaMB study (NCT00949455) compared maintenance lapatinib vs placebo after first-line (1L) platinum-based CTx in pts with HER1/HER2-overexpressing stage IV advanced UC. Pre-CTx archival samples from this study were retrospectively analyzed and included both randomized and screen failure pts. PD-L1 expression was assessed (VENTANA SP263 Assay) and categorized as high (≥25% of tumor cells [TC] and/or immune cells [IC]) or low/negative (<25% TC and IC). Overall survival (OS) and progression-free survival (PFS) were estimated via the Kaplan-Meier method; results were stratified by PD-L1 expression. The exploratory biomarkers CD8 and FOXP3 were also analyzed. The prognostic significance of the biomarkers was explored by multivariable Cox proportional hazards models and a bootstrap method for model selection. Results: Of 446 pts (232 randomized; 214 screened), 243 (54.5%) were assessed for PD-L1 expression, with 61 (25.1%) PD-L1 high and 158 (65.0%) PD-L1 low/negative. In PD-L1 high and low/negative pts, respectively, median OS (95% CI) was 12.0 (9.4–19.7) vs 12.5 months (10.4–15.5); median PFS (95% CI) was 6.5 (3.5–8.8) vs 5.0 months (4.3–6.3). PD-L1 expression was not associated with OS or PFS in univariate analysis or in a multivariate model for OS (hazard ratio [HR] for PD-L1 high vs low/negative 1.4 [95% CI, 0.8–2.3]). In a multivariate model for PFS, PD-L1 expression improved the accuracy of the model by 23% and was a significant variable (HR, 2.1 [95% CI, 1.2–3.5]). Results of analyses of CD8 and FOXP3 will also be reported. Conclusions: Overall, these data suggest a lack of association between PD-L1 expression and survival in pts receiving 1L platinum-based CTx. Mechanisms underlying the potential association of PD-L1 expression with PFS remain unclear. CD8 and FOXP3 exploratory analyses may help to elucidate these results. Clinical trial information: NCT00949455.
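The bootstrap model-selection step mentioned here can be read as a form of stability selection: refit the Cox model on resampled data and record how often each covariate remains significant. A hedged sketch of that idea; the covariate list, significance rule, and resample count are illustrative choices, not the study's actual protocol:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical analysis set with dichotomized biomarker columns.
df = pd.read_csv("lamb_biomarkers.csv")
covars = ["pd_l1_high", "cd8_high", "foxp3_high", "age", "ecog"]

# Refit the Cox model on bootstrap resamples and count how often each
# covariate is significant at p < 0.05.
rng = np.random.default_rng(0)
counts = {c: 0 for c in covars}
n_boot = 200
for _ in range(n_boot):
    boot = df.sample(n=len(df), replace=True,
                     random_state=int(rng.integers(1 << 31)))
    cph = CoxPHFitter().fit(boot[["pfs_months", "progressed"] + covars],
                            "pfs_months", "progressed")
    for c in covars:
        if cph.summary.loc[c, "p"] < 0.05:
            counts[c] += 1

for c, k in counts.items():
    print(f"{c}: selected in {k / n_boot:.0%} of bootstrap resamples")
```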


2015, Vol 2015, pp. 1-10
Author(s): Tong-min Xue, Li-de Tao, Miao Zhang, Jie Zhang, Xia Liu, ...

miRNA-20b has been shown to be aberrantly expressed in several tumor types. However, the clinical significance of miRNA-20b in the prognosis of patients with hepatocellular carcinoma (HCC) is poorly understood, and the exact role of miRNA-20b in HCC remains unclear. The aim of the present study was to investigate the association of miR-20b expression with the clinicopathological characteristics and overall survival of HCC patients, analyzed by Kaplan-Meier analysis and Cox proportional hazards regression models. Meanwhile, HIF-1α and VEGF were confirmed as targets of miR-20b. We found not only that miR-20b regulates HIF-1α and VEGF under normal conditions, but also that miR-20b is itself regulated under hypoxia. This mechanism would help tumor cells adapt to different environments, thus promoting tumor invasion and development. Overall, the study suggests that miR-20b, HIF-1α, and VEGF may serve as potential therapeutic targets for hepatocellular carcinoma.

