Response and survival rates with frontline hypomethylating agents (HMAs) in favorable risk AML.

2017 ◽  
Vol 35 (15_suppl) ◽  
pp. 7039-7039
Author(s):  
Michael Richard Grunwald ◽  
Daniel T. Peters ◽  
Myra M. Robinson ◽  
Michael Keith Allister Zimmerman ◽  
Jing Ai ◽  
...  

7039 Background: HMAs are an accepted frontline therapy for AML patients (pts) who are unfit for intensive induction therapy (IIT), particularly pts with unfavorable cytogenetics and/or p53 mutations. However, little is known about the response of favorable risk AML to HMAs. We previously reported that NPM1-mutated and/or CD34- AML status were predictors of response to HMAs. Here, we evaluated responses to frontline HMAs in AML. Methods: A total of 117 patients with de novo AML diagnosed between 7/2013 and 9/2016 were evaluated based on pt- and disease-related variables, overall response rate (ORR = CR + CR with incomplete count recovery + hematologic remission (ANC > 1000/µL, Hgb > 10 g/dL, Plts > 100,000/µL, and no circulating blasts)), and overall survival (OS). Categorical variables were compared using Fisher’s exact test. Kaplan-Meier methods were used to estimate survival outcomes, and log-rank tests compared survival between groups. Multivariable analyses were performed using Cox proportional hazards models. Results: 51 pts, considered unfit for IIT, received frontline HMAs. ORR and OS were highest in the ELN favorable risk AML pts (n = 13; ORR = 92%, p = .009; median OS = 17.5 months, p = .022). Among 41 NPM1-mutated pts, 15 received HMAs and 26 received IIT. ORRs were 73% and 84%, respectively (p = .434). No difference was found in OS distributions between the HMA and IIT groups in univariate and multivariate (adjusted for age and FLT3 status) models (p = .329 and .241, respectively). Interestingly, ORR was 100% among 9 HMA-treated pts with NPM1-mutated, CD34-, FLT3/ITD-, cytogenetically normal AML. Conclusions: HMA therapy is a highly effective frontline treatment in favorable risk AML pts considered unfit for IIT. Survival results with HMAs in NPM1-mutated AML are comparable to those of fitter pts treated with IIT. In selected favorable risk pts considered unfit for standard induction, HMAs can be a successful bridge to potentially curative therapy, including more intensive therapy or transplant. Cytogenetically normal AML with an isolated NPM1 mutation and CD34- status appears to be exceptionally responsive to frontline treatment with HMAs. Prospective validation of these findings is necessary.
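The response-rate comparisons above (e.g., ORR of 73% vs. 84% for HMA vs. IIT among NPM1-mutated pts) are the kind of categorical contrast Fisher's exact test handles. The sketch below is not the authors' code; it rebuilds an approximate 2x2 table from the reported counts purely for illustration.

```python
# Sketch only: Fisher's exact test on an approximate 2x2 response table rebuilt
# from the reported figures (15 HMA pts, ORR ~73%; 26 IIT pts, ORR ~84%).
from scipy.stats import fisher_exact

#            [responders, non-responders]
hma_arm = [11, 4]    # ~73% of 15 HMA-treated NPM1-mutated pts
iit_arm = [22, 4]    # ~84% of 26 IIT-treated NPM1-mutated pts

odds_ratio, p_value = fisher_exact([hma_arm, iit_arm])
print(f"OR = {odds_ratio:.2f}, two-sided p = {p_value:.3f}")
```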

Author(s):  
Ella Nissan ◽  
Abdulla Watad ◽  
Arnon D. Cohen ◽  
Kassem Sharif ◽  
Johnatan Nissan ◽  
...  

Polymyositis (PM) and dermatomyositis (DM) are autoimmune-mediated multisystemic myopathies, characterized mainly by proximal muscle weakness. A connection between epilepsy and PM/DM has not been reported previously. Our study aimed to evaluate this association. A case–control study was conducted, enrolling a total of 12,278 patients: 2,085 cases (17.0%) and 10,193 subjects in the control group (83.0%). Student’s t-test was used to evaluate continuous variables, while the chi-square test was applied for the distribution of categorical variables. The log-rank test, Kaplan–Meier curves and the multivariate Cox proportional hazards method were used for the survival analysis. Of the 2,085 cases, 1,475 subjects (70.7%) were diagnosed with DM and 610 patients (29.3%) with PM. Participants enrolled as cases had a significantly higher rate of epilepsy (n = 48 [2.3%]) than controls (n = 141 [1.4%], p < 0.0005). In multivariable logistic regression analysis, only PM was found to be significantly associated with epilepsy (OR 2.2 [95% CI 1.36 to 3.55], p = 0.0014), whereas a non-significant positive trend was noted for DM (OR 1.51 [95% CI 0.99 to 2.30], p = 0.0547). Our data suggest that PM is associated with a higher rate of epilepsy compared to controls. Physicians should be aware of this comorbidity in patients with immune-mediated myopathies.
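As a rough illustration of a multivariable logistic regression reporting odds ratios with 95% CIs, as used for the epilepsy association above, the sketch below fits such a model on synthetic data with hypothetical covariates (pm, dm, age, female); it is not the study's analysis.

```python
# Sketch only: multivariable logistic regression with odds ratios and 95% CIs.
# The data frame is synthetic; covariates are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "epilepsy": rng.binomial(1, 0.02, n),   # outcome indicator
    "pm":       rng.binomial(1, 0.05, n),   # polymyositis indicator
    "dm":       rng.binomial(1, 0.12, n),   # dermatomyositis indicator
    "age":      rng.normal(55, 12, n),
    "female":   rng.binomial(1, 0.6, n),
})

fit = smf.logit("epilepsy ~ pm + dm + age + female", data=df).fit(disp=False)
summary = pd.concat(
    [np.exp(fit.params).rename("OR"),
     np.exp(fit.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})],
    axis=1,
)
print(summary)
```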


Author(s):  
Claudius E. Degro ◽  
Richard Strozynski ◽  
Florian N. Loch ◽  
Christian Schineis ◽  
Fiona Speichinger ◽  
...  

Abstract Purpose Colorectal cancer has shown a remarkable shift over the last decades, with an increasing proportion of right-sided compared with left-sided tumor locations. In the current study, we aimed to characterize clinicopathological differences between right- and left-sided colon cancer (rCC and lCC) with respect to mortality and outcome predictors. Methods In total, 417 patients with colon cancer stage I–IV were analyzed in the present retrospective single-center study. Survival rates were assessed using the Kaplan–Meier method, and uni-/multivariate analyses were performed with a Cox proportional hazards regression model. Results Our study showed no significant difference in overall survival between rCC and lCC stage I–IV (p = 0.354). In the rCC cohort, multivariate analysis revealed the worst outcome for ASA (American Society of Anesthesiologists) score IV patients (hazard ratio [HR]: 16.0; 95% CI: 2.1–123.5), CEA (carcinoembryonic antigen) blood level > 100 µg/l (HR: 3.3; 95% CI: 1.2–9.0), increased lymph node ratio of 0.6–1.0 (HR: 5.3; 95% CI: 1.7–16.1), and grade 4 tumors (G4) (HR: 120.6; 95% CI: 6.7–2179.6), whereas in the lCC population, ASA score IV (HR: 8.9; 95% CI: 0.9–91.9), CEA blood level 20.1–100 µg/l (HR: 5.4; 95% CI: 2.4–12.4), conversion to laparotomy (HR: 14.1; 95% CI: 4.0–49.0), and severe surgical complications (Clavien-Dindo III–IV) (HR: 2.9; 95% CI: 1.5–5.5) were identified as predictors of diminished overall survival. Conclusion Laterality showed no significant effect on the overall prognosis of colon cancer patients. However, group differences and distinct survival predictors could be identified in rCC and lCC patients.
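A minimal sketch of a multivariable Cox proportional hazards model reporting hazard ratios and 95% CIs, analogous in structure to the rCC/lCC predictor models above; the data frame and covariate names (asa_iv, cea_high, ln_ratio_hi) are invented for illustration.

```python
# Sketch only: multivariable Cox proportional hazards regression on simulated data,
# reporting hazard ratios (exp(coef)) with 95% CIs. Covariate names are invented.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 417
df = pd.DataFrame({
    "os_months":   rng.exponential(40, n),   # follow-up time
    "death":       rng.binomial(1, 0.45, n), # event indicator (1 = died)
    "asa_iv":      rng.binomial(1, 0.05, n), # ASA score IV
    "cea_high":    rng.binomial(1, 0.10, n), # CEA > 100 ug/l
    "ln_ratio_hi": rng.binomial(1, 0.08, n), # lymph node ratio 0.6-1.0
})

cph = CoxPHFitter()
cph.fit(df, duration_col="os_months", event_col="death")
cph.print_summary()   # per-covariate HRs and 95% CIs
```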


2021 ◽  
Vol 39 (6_suppl) ◽  
pp. 59-59
Author(s):  
Umang Swami ◽  
Taylor Ryan McFarland ◽  
Benjamin Haaland ◽  
Adam Kessel ◽  
Roberto Nussenzveig ◽  
...  

59 Background: In mCSPC, baseline CTC counts have been shown to correlate with PSA responses and progression-free survival (PFS) in small studies in the context of androgen deprivation therapy (ADT) without modern intensification with docetaxel or novel hormonal therapy. A similar correlation of CTC count with PSA responses and PFS was recently reported from an ongoing phase 3 trial in the mCSPC setting (SWOG1216), without reporting the association in the context of ADT intensification. Furthermore, none of these studies correlated CTCs with overall survival (OS). Herein we evaluated whether CTCs were associated with outcomes, including OS, in a real-world mCSPC population treated with intensified as well as non-intensified ADT. Methods: Eligibility criteria: newly diagnosed mCSPC, receipt of ADT with or without intensification, and enumeration of baseline CTCs by the FDA-cleared CellSearch CTC assay. The relationship between CTC counts (categorized as 0, 1-4, and ≥5/7.5 ml) and both PFS and OS was assessed in the context of Cox proportional hazards models, both unadjusted and adjusted for age, Gleason score, PSA at ADT initiation, de novo vs. non-de novo status, and ADT intensification vs. non-intensification therapy. Results: Overall, 99 pts were identified. Baseline characteristics are summarized in the Table. In unadjusted analyses, CTC counts of ≥5 as compared to 0 were strongly associated with inferior PFS (hazard ratio [HR] 3.38, 95% CI 1.85-6.18; p < 0.001) and OS (HR 4.44, 95% CI 1.63-12.10; p = 0.004). In multivariate analyses, CTC counts of ≥5 as compared to 0 continued to be associated with inferior PFS (HR 5.49, 95% CI 2.64-11.43; p < 0.001) and OS (HR 4.00, 95% CI 1.31-12.23; p = 0.015). Within the ADT intensification subgroup, high CTC counts were also associated with poor PFS and OS. For PFS, the univariate HR for CTC ≥5 vs. 0 was 4.87 (95% CI 1.66-14.30; p = 0.004) and the multivariate HR was 7.43 (95% CI 1.92-28.82; p = 0.004). For OS, the univariate HR for CTC ≥5 vs. 0 was 15.88 (95% CI 1.93-130.58; p = 0.010) and the multivariate HR was 24.86 (95% CI 2.03-304.45; p = 0.012). Conclusions: To the best of our knowledge, this is the first study to show that high baseline CTC counts are strongly associated with inferior PFS as well as OS in pts with newly diagnosed mCSPC, even in those who received intensified ADT therapy. Identifying these pts at highest risk of progression and death can help with counseling and prognostication in clinics as well as with the design of and enrollment in future clinical trials. [Table: see text]
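The analysis structure described above (a three-level CTC exposure modeled both unadjusted and adjusted) can be sketched as follows; the simulated data and column names are placeholders, not the study dataset.

```python
# Sketch only: unadjusted and adjusted Cox models for a three-level CTC exposure
# (0 / 1-4 / >=5 per 7.5 ml), with 0 CTCs as the reference level. Data are simulated.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 99
df = pd.DataFrame({
    "pfs_months":  rng.exponential(24, n),
    "progressed":  rng.binomial(1, 0.6, n),
    "ctc_group":   rng.choice(["0", "1-4", ">=5"], n),
    "age":         rng.normal(68, 8, n),
    "de_novo":     rng.binomial(1, 0.5, n),
    "intensified": rng.binomial(1, 0.5, n),
})

# indicator coding with 0 CTCs as the reference category
df["ctc_1_4"] = (df["ctc_group"] == "1-4").astype(int)
df["ctc_ge5"] = (df["ctc_group"] == ">=5").astype(int)
data = df.drop(columns="ctc_group")

unadjusted = CoxPHFitter().fit(
    data[["pfs_months", "progressed", "ctc_1_4", "ctc_ge5"]],
    duration_col="pfs_months", event_col="progressed")
adjusted = CoxPHFitter().fit(data, duration_col="pfs_months", event_col="progressed")
print(unadjusted.hazard_ratios_, adjusted.hazard_ratios_, sep="\n\n")
```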


Author(s):  
Tzu-Wei Yang ◽  
Chi-Chih Wang ◽  
Ming-Chang Tsai ◽  
Yao-Tung Wang ◽  
Ming-Hseng Tseng ◽  
...  

The prognosis of different etiologies of liver cirrhosis (LC) is not well understood. Previous studies performed on alcoholic LC-dominated cohorts have yielded conflicting results. We aimed to compare the outcome and the effect of comorbidities on survival between alcoholic and non-alcoholic LC in a viral hepatitis-dominated LC cohort. We identified newly diagnosed alcoholic and non-alcoholic LC patients, aged ≥40 years, between 2006 and 2011, using the Longitudinal Health Insurance Database. Hazard ratios (HRs) were calculated using the Cox proportional hazards model and the Kaplan–Meier method. A total of 472 alcoholic LC and 4313 non-alcoholic LC patients were identified in our study cohort. We found that alcoholic LC patients were predominantly male (94.7% of alcoholic LC vs. 62.6% of non-alcoholic LC patients) and younger (78.8% of alcoholic LC vs. 37.4% of non-alcoholic LC patients were less than 60 years old) compared with non-alcoholic LC patients. Non-alcoholic LC patients had a higher rate of concomitant comorbidities than alcoholic LC patients (79.6% vs. 68.6%, p < 0.001). LC patients with chronic kidney disease demonstrated the highest adjusted HRs: 2.762 in alcoholic LC and 1.751 in non-alcoholic LC (both p < 0.001). In contrast, LC patients with hypertension and hyperlipidemia had a decreased risk of mortality. The six-year survival rates showed no difference between the two study groups (p = 0.312). In conclusion, alcoholic LC patients were younger and had lower rates of concomitant comorbidities compared with non-alcoholic LC patients. However, all-cause mortality was not different between alcoholic and non-alcoholic LC patients.
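A hedged sketch of the survival comparison described above: Kaplan-Meier fits per etiology, a 6-year survival read-off, and a log-rank test. The durations and event indicators are simulated; only the group sizes echo the abstract.

```python
# Sketch only: Kaplan-Meier fits per etiology, 6-year survival estimates, and a
# log-rank comparison. Times and events are simulated; group sizes echo the abstract.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(3)
t_alc, e_alc = rng.exponential(9, 472), rng.binomial(1, 0.5, 472)
t_non, e_non = rng.exponential(9, 4313), rng.binomial(1, 0.5, 4313)

km_alc = KaplanMeierFitter().fit(t_alc, e_alc, label="alcoholic LC")
km_non = KaplanMeierFitter().fit(t_non, e_non, label="non-alcoholic LC")

print("6-year survival, alcoholic LC:    ", km_alc.survival_function_at_times(6).iloc[0])
print("6-year survival, non-alcoholic LC:", km_non.survival_function_at_times(6).iloc[0])
print("log-rank p =", logrank_test(t_alc, t_non, e_alc, e_non).p_value)
```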


2019 ◽  
Vol 6 (Supplement_2) ◽  
pp. S16-S16 ◽  
Author(s):  
Muhammad R Sohail ◽  
G Ralph Corey ◽  
Bruce L Wilkoff ◽  
Jeanne Poole ◽  
Suneet Mittal ◽  
...  

Abstract Background Cardiovascular implantable electronic device (CIED) infections are associated with significant morbidity, mortality, and cost. There is limited evidence on antibiotic prophylactic strategies to prevent CIED infection. Recently, the TYRX Envelope, which elutes a combination of rifampin and minocycline for a minimum of 7 days, was shown to significantly reduce major CIED infections in the WRAP-IT trial. We sought to characterize the pathogens among patients who experienced an infection in the current era. Methods All patients undergoing CIED replacement, upgrade, revision, or de novo cardiac resynchronization therapy (CRT-D) implantation received standard-of-care antibiotic prophylaxis and were randomized 1:1 to receive the TYRX Envelope or not. The primary endpoint was major CIED infection within 12 months of the procedure. Major infection was defined as an infection resulting in (1) system extraction or revision, (2) long-term suppressive antibiotic therapy, or (3) death. Data were analyzed using the Cox proportional hazards regression model. Results A total of 6,983 patients were randomized worldwide, with 3,495 randomized to receive an envelope and 3,488 randomized to the control group. At 12 months, 25 major infections (0.7%) were observed in the envelope group and 42 major infections (1.2%) in the control group, resulting in a 40% reduction in major infections (HR: 0.60, 95% CI: 0.36–0.98, P = 0.04). Of 63 infections assayed, causative pathogens were identified in 36 infections, whereas cultures were negative in 27 cases. Staphylococcus species (n = 22) were the predominant pathogens, and a 53% reduction was observed with the use of TYRX (Figure 1). Moreover, there was only 1 CIED pocket infection with Staphylococcus species in the envelope group compared with 14 pocket infections in the control group. A comparison of the timing of infection in the envelope group showed 11 endocarditis/bacteremia infections presenting at 103 ± 84 days compared with 14 pocket infections presenting at 70 ± 78 days from the procedure. Conclusion In this large randomized trial, the use of the TYRX Envelope containing rifampin and minocycline resulted in a significant reduction of major CIED infections and was effective against staphylococcal species, which are the predominant cause of pocket infections. Disclosures All Authors: No reported disclosures.
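For orientation, the crude arithmetic behind the headline result can be reproduced directly from the abstract's counts; the snippet below simply recomputes the per-arm event proportions and the relative reduction implied by the reported HR of 0.60.

```python
# Recomputing the crude 12-month event proportions and the relative reduction
# implied by the reported hazard ratio, using only numbers given in the abstract.
envelope_events, envelope_n = 25, 3495
control_events, control_n = 42, 3488

print(f"envelope arm: {envelope_events / envelope_n:.2%} major infections")  # ~0.7%
print(f"control arm:  {control_events / control_n:.2%} major infections")    # ~1.2%

hr = 0.60  # reported Cox HR (95% CI 0.36-0.98)
print(f"relative reduction in hazard: {1 - hr:.0%}")  # 40%
```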


Blood ◽  
2009 ◽  
Vol 114 (22) ◽  
pp. 2638-2638 ◽  
Author(s):  
Mahesh Seetharam ◽  
Olga K Weinberg ◽  
Li Ren ◽  
Lisa Ma ◽  
Katie Seo ◽  
...  

Abstract 2638 Poster Board II-614 Background: The importance of cytogenetics in the prognosis of AML is now widely recognized and accepted in clinical practice. A recent study found that autosomal chromosomal monosomy predicted an adverse outcome. The goal of this study is to characterize patients with a monosomal karyotype by mutation status and clinical features. Methods: One hundred forty consecutive AML patients diagnosed at Stanford University Hospital between 2005 and 2008 with adequate material for mutation analysis were studied. Cases were classified using the 2008 WHO criteria. Diagnostic cytogenetic findings were reviewed and patients were stratified into risk groups using Southwest Oncology Group criteria. An abnormality was considered clonal when at least two metaphases had the same aberration, except for clonal monosomy, which required at least three metaphases. The karyotype analysis was based on 20 or more metaphases. All samples were tested for NPM, FLT3 (ITD and D835) and CEBPA mutations. Clinical parameters including hemogram data at the time of diagnosis were reviewed. Clinical follow-up including overall survival (OS), progression-free survival (PFS) and complete remission (CR) rates was retrospectively determined. Kaplan-Meier methods and univariate Cox proportional hazards regression analysis were used to compare the clinical data. Results: The cases included 77 males and 63 females with a median age of 58 (range 17-83). Cytogenetic risk-group stratification resulted in 14 patients with favorable, 88 with intermediate and 28 with unfavorable risk status. Loss of one or more autosomal chromosomes was present in 18/130 patients (13.8%) with available cytogenetic studies. A single autosomal monosomy was found in 5 patients, while 13 patients had two or more autosomal monosomies. The most common chromosomes lost in these 18 patients included 7 (55% of 18 cases), 5 (50%), 17 (33%), 21 (22%), 20 (22%), 22 (17%) and 18 (11%). Using the 2008 WHO criteria, there were 66 AML with myelodysplasia-related changes (AML-MRC), 55 AML not otherwise specified (AML-NOS), 14 AML with either t(8;21), inv(16) or t(15;17), and 5 therapy-related AMLs. Overall, 35 patients (25% of all patients) had an NPM1 mutation (19 of which were FLT3 mutated), 33 had a FLT3-ITD mutation (24%), 11 had FLT3-D835 (8%) and 11 had a CEBPA mutation (8%) (4 of which were FLT3 mutated). Patients with a monosomal karyotype were significantly older (83 vs. 59 years, p=0.0125) and presented with lower WBC (34 vs. 66 K/uL, p=0.0006), lower platelets (41 vs. 64 K/uL, p=0.0111), and lower blasts (38% vs. 65%, p=0.0030) as compared to the rest of the AML patients. In addition, patients with a monosomal karyotype were more frequently diagnosed with AML-MRC (16/18 vs. 48/107, p=0.0034) and exhibited a decreased frequency of NPM1 mutation (0/18 vs. 28/107, p=0.0138) and FLT3-ITD mutation (0/18 vs. 29/107, p=0.0117). Clinical outcome data showed that patients with a monosomal karyotype had significantly worse OS, PFS and CR compared to the rest of the AML patients (OS p=0.001, PFS p=0.002 and CR p=0.0262). Dividing patients by number of monosomies showed that patients with 2 or more monosomies had significantly worse OS (p=0.0001) and PFS (p=0.0045) than patients without any monosomies. However, no difference in OS or PFS was seen when comparing patients with 1 monosomy to those with 2 or more monosomies. Within the AML-MRC group, monosomal karyotype correlated with lower WBC (17 vs. 37 K/uL, p=0.0005), lower platelets (21 vs. 35 K/uL, p=0.0095), lower blasts (19% vs. 36%, p=0.0015) and shorter OS (p=0.0322) and PFS (p=0.0084). Conclusion: AML patients with a monosomal karyotype exhibit significantly worse OS, PFS and lower CR rates as compared to other AML patients. Most of these patients fall within the newly defined AML-MRC group and are characterized by a notable absence of NPM1 and FLT3-ITD mutations. Disclosures: No relevant conflicts of interest to declare.
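A minimal sketch of a multi-group log-rank comparison such as the one across monosomy categories (0, 1, ≥2) above; the cohort below is simulated and only loosely matches the reported group proportions.

```python
# Sketch only: multi-group log-rank test across monosomy categories (0, 1, >=2).
# The cohort is simulated; proportions are loosely based on the reported 18/130 split.
import numpy as np
import pandas as pd
from lifelines.statistics import multivariate_logrank_test

rng = np.random.default_rng(4)
n = 130
df = pd.DataFrame({
    "os_months":  rng.exponential(20, n),
    "died":       rng.binomial(1, 0.7, n),
    "monosomies": rng.choice(["0", "1", ">=2"], n, p=[0.86, 0.04, 0.10]),
})

result = multivariate_logrank_test(df["os_months"], df["monosomies"], df["died"])
result.print_summary()   # chi-square statistic and p-value across the three groups
```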


2007 ◽  
Vol 25 (18_suppl) ◽  
pp. 519-519 ◽  
Author(s):  
M. N. Ibrahim ◽  
Z. Abdullah ◽  
L. Healy ◽  
C. Murphy ◽  
I. Y. Yousif ◽  
...  

519 Background: Carcinoma in situ (CIS) of the breast is a precancerous lesion with the potential to progress to invasive cancer. In 2003, CIS accounted for 19% of all newly diagnosed invasive and non-invasive breast lesions combined in the United States. Current treatment options are mastectomy ± tamoxifen and breast-conserving surgery with radiotherapy ± tamoxifen. As there are no randomized comparisons of these 2 treatments, data from the Surveillance, Epidemiology, and End Results (SEER) database were used to compare their survival rates. Methods: 88,285 patients were identified with CIS from 1988-2003. Of these, 27,728 patients were treated with a total mastectomy, and 25,240 patients received breast-conserving surgery with radiotherapy. Kaplan-Meier survival analyses and Cox proportional hazards regression were used to compare overall survival and disease-specific survival at 5 and 10 years. Results: Kaplan-Meier analyses demonstrated 5-year overall survival rates for total mastectomy vs. breast-conserving surgery with radiotherapy of 95.46% vs. 97.59%, respectively (log-rank P < 0.0001). The 5-year rates for disease-specific survival were 99.16% vs. 99.72%, respectively (log-rank P < 0.0001). At 10 years, the overall survival rates had fallen to 91.96% vs. 96.09%, respectively (log-rank P < 0.0001). The 10-year disease-specific survival rates were 98.61% vs. 99.50%, respectively (log-rank P < 0.0001). Cox proportional hazards regression demonstrated a relative risk of 0.847 (95% confidence interval (CI) 0.790-0.907) and 1.110 (95% CI 0.931-1.324) for 5-year overall survival and disease-specific survival, respectively, when total mastectomy was compared with breast-conserving surgery and radiotherapy. At 10 years, the relative risks were 0.865 (95% CI 0.820-0.913) and 1.035 (95% CI 0.900-1.190) for overall survival and disease-specific survival, respectively. Conclusions: Overall, when looking at disease-specific survival rates by multivariate analysis, there does not appear to be a significant difference between total mastectomy and breast-conserving surgery with radiotherapy in the treatment of CIS. No significant financial relationships to disclose.
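Overall and disease-specific survival, as reported above, differ only in the event definition (any death versus cancer-attributed death, with other deaths censored). The sketch below illustrates that distinction on a handful of invented records; it is not SEER data.

```python
# Sketch only: overall vs. disease-specific survival differ only in the event
# definition. Six invented records; not SEER data.
import pandas as pd
from lifelines import KaplanMeierFitter

records = pd.DataFrame({
    "years":        [2.0, 5.5, 10.0, 3.1, 8.4, 10.0],
    "died":         [1,   1,   0,    1,   0,   0],  # death from any cause
    "cause_cancer": [1,   0,   0,    1,   0,   0],  # death attributed to the cancer
})

overall = KaplanMeierFitter().fit(
    records["years"], records["died"], label="overall survival")
# disease-specific: only cancer deaths count as events; other deaths are censored
specific = KaplanMeierFitter().fit(
    records["years"], records["died"] & records["cause_cancer"],
    label="disease-specific survival")

print(overall.survival_function_at_times([5, 10]))
print(specific.survival_function_at_times([5, 10]))
```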


2007 ◽  
Vol 25 (18_suppl) ◽  
pp. 4048-4048
Author(s):  
Y. Yeh ◽  
Q. Cai ◽  
J. Chao ◽  
M. Russell

4048 Background: NCCN guidelines recommend assessment of ≥12 lymph nodes (LN) to improve accuracy in colorectal cancer (CRC) staging. Previous studies have used various cut-points to assess the relationship between the number of LN sampled and survival. Here, the association between NCCN guideline-compliant nodal sampling and survival is assessed while controlling for other risk factors. Methods: We selected 145,485 adult patients newly diagnosed with stage II or III CRC from SEER during 1990–2003. Kaplan-Meier curves were compared using the log-rank test. Cox proportional hazards models were constructed to determine the effect of sampling ≥12 LN on survival. Results: Median patient follow-up was 5.7 years. The table shows overall survival rates in CRC patients with <12 versus ≥12 LN assessed. After adjusting for age, sex, tumor size and grade, sampling ≥12 LN was independently associated with improved survival. For patients with ≥12 versus <12 LN assessed, survival increased by 13% for stage IIa [HR=0.75; 95% CI 0.72–0.78; p< .001], 16% for stage IIb [HR=0.69; 95% CI 0.67–0.71; p< .001], 12% for stage IIIb [HR=0.75; 95% CI 0.72–0.77], and 10% for stage IIIc [HR=0.85; 95% CI 0.81–0.89]. The association was not statistically significant for stage IIIa patients. Conclusion: Consistent with previous reports, this analysis found that optimal nodal sampling, specifically ≥12 LN, was associated with improved survival across stages II and III when controlling for other risk factors. Furthermore, the results underscore the need for adhering to the NCCN guidelines. The lack of a statistically significant association in stage IIIa patients may be due to small cohort size. [Table: see text] [Table: see text]
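The stage-specific hazard ratios above suggest a per-stage subgroup analysis of a binary ≥12-LN sampling indicator. The sketch below shows one way such a loop could look on simulated data; it is not the SEER analysis itself.

```python
# Sketch only: one Cox model per stage subgroup for a binary >=12-LN indicator,
# echoing the stage-specific HRs above. The data frame is simulated, not SEER.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 5000
df = pd.DataFrame({
    "years":    rng.exponential(6, n),
    "died":     rng.binomial(1, 0.5, n),
    "ln_ge_12": rng.binomial(1, 0.6, n),   # >=12 lymph nodes assessed
    "age":      rng.normal(68, 10, n),
    "stage":    rng.choice(["IIa", "IIb", "IIIa", "IIIb", "IIIc"], n),
})

for stage, subset in df.groupby("stage"):
    cph = CoxPHFitter().fit(subset.drop(columns="stage"),
                            duration_col="years", event_col="died")
    print(f"stage {stage}: HR for >=12 LN = {cph.hazard_ratios_['ln_ge_12']:.2f}")
```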


2012 ◽  
Vol 30 (15_suppl) ◽  
pp. 517-517 ◽  
Author(s):  
Jason Christiansen ◽  
John M. S. Bartlett ◽  
Mark Gustavson ◽  
David Rimm ◽  
Tammy Robson ◽  
...  

517 Background: Hormone receptors, HER2 and Ki67 are residual-risk markers in early breast cancer. Combining these markers into a unified algorithm (IHC4) provides information on the residual recurrence risk of patients treated with hormone therapies. This study aimed to independently investigate the validity of the IHC4 algorithm for residual risk prediction using both conventional (DAB) IHC and quantitative immunofluorescence (QIF-AQUA). Methods: The TEAM pathology study recruited >4500 samples from patients treated in the TEAM trial. TMAs were stained for ER, PgR, HER2 and Ki67 using QIF-AQUA technology or DAB-based immunohistochemistry (DAB-IHC). Central HER2 FISH was performed. Quantitative image analysis was used to generate expression scores that were normalized to produce IHC4 algorithm scores as well as novel algorithm scores. Algorithm scores were compared with disease recurrence in univariate and multivariate Cox proportional hazards models. Results: Both the DAB-IHC and QIF-AQUA IHC4 continuous models were significant (P<0.0001) for prediction of disease recurrence, with a continuous hazard ratio (HR) of 1.011 (1.010-1.013) for QIF-AQUA IHC4 versus 1.008 (1.007-1.010) for the DAB-IHC IHC4 model using the published IHC4 algorithm (Cuzick et al. 2011). Binning the continuous model scores into 4 bins and applying Kaplan-Meier survival analysis was used to graphically illustrate these effects. De novo models for both DAB-IHC and QIF-AQUA were also significantly (P<0.0001) predictive of residual risk in early breast cancer. Additionally, all 4 models were independent predictors of recurrence (P<0.0001) alongside other recognized clinical prognostic factors in multivariate analysis. Although results from DAB-IHC and QIF-AQUA were modestly correlated, the QIF-AQUA model showed enhanced prediction of recurrence in both Cox proportional hazards modeling and C-index calculations. Conclusions: Either conventional DAB or QIF-AQUA methods of IHC provided evidence supporting the clinical utility of IHC4 algorithms in the context of the TEAM study. With careful standardization, either of these IHC4 assays should be considered for prediction of residual risk in early breast cancer.
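A loose sketch of the evaluation pattern described above: a continuous risk score in a Cox model, quartile binning for Kaplan-Meier display, and a concordance (C-)index. The ihc4_score column is a simulated stand-in, not the published IHC4 formula.

```python
# Sketch only: continuous risk-score Cox model, quartile binning for Kaplan-Meier
# display, and a C-index. The ihc4_score column is simulated, not the IHC4 formula.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter, KaplanMeierFitter
from lifelines.utils import concordance_index

rng = np.random.default_rng(6)
n = 1000
df = pd.DataFrame({
    "years":      rng.exponential(8, n),
    "recurrence": rng.binomial(1, 0.3, n),
    "ihc4_score": rng.normal(0, 30, n),   # stand-in continuous algorithm score
})

cph = CoxPHFitter().fit(df, duration_col="years", event_col="recurrence")
print(cph.hazard_ratios_)                 # HR per unit of the continuous score

# quartile bins, one Kaplan-Meier curve per bin
df["score_bin"] = pd.qcut(df["ihc4_score"], 4, labels=["Q1", "Q2", "Q3", "Q4"])
for label, grp in df.groupby("score_bin", observed=True):
    KaplanMeierFitter().fit(grp["years"], grp["recurrence"],
                            label=str(label)).plot_survival_function()

# Harrell's C-index: higher predicted risk should pair with shorter time to event
c = concordance_index(df["years"],
                      -cph.predict_partial_hazard(df[["ihc4_score"]]),
                      df["recurrence"])
print("C-index:", round(c, 3))
```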


2012 ◽  
Vol 30 (15_suppl) ◽  
pp. e14106-e14106
Author(s):  
Susan H Foltz Boklage ◽  
Charles Kreilick ◽  
Carl V Asche ◽  
Sally Haislip ◽  
James W. Gilmore ◽  
...  

e14106 Background: Improvements in survival for advanced-stage colorectal cancer (CRC) patients who receive chemotherapy have been reported. We compared survival rates for patients with 3+ vs. <3 lines of therapy using electronic medical records of a local oncology practice in Georgia, USA. Methods: The Georgia Cancer Specialist (GCS) EMR Database (1/1/2005–7/31/2010) was used. The database contains data on patient demographics, cancer diagnostic information, chemotherapy and non-chemotherapy drugs administered, written prescriptions, chemotherapy/radiation protocols, chemotherapy protocol changes, office visit information, and hospitalizations. Patients newly diagnosed with CRC between 01/01/05 and 06/31/10 and treated with systemic therapy for CRC were identified. Patients were followed from initial CRC diagnosis to death, loss to follow-up, or end of study. Patients were categorized by number of lines of therapy received (1, 2, 3+) and original stage at diagnosis (IIIb/c, IV, unknown). Survival following the initial line of therapy was evaluated using Cox proportional hazards models controlling for stage at diagnosis, type of 1st-line treatment, and other patient characteristics. Results: The study included 704 patients with a median age of 63 years (range 26-85 years) at diagnosis; 49% (n=345) were female. 45% (n=317) and 42% (n=296) had stage IV and IIIb/c CRC at diagnosis, respectively. 53% (n=373) received only 1st-line treatment, 27% (n=190) received 1st- and 2nd-line treatment, and 20% (n=141) received 3rd-line treatment and beyond. The median follow-up was 431 days, and death was reported in 27% (n=190) of subjects. The multivariate Cox proportional hazards analysis indicated that there was no statistically significant difference in survival between patients who received a 2nd line of therapy vs. 3-plus lines of therapy (HR=1.42; p=0.067). Conclusions: No statistically significant association between 2 vs. 3 or more total lines of therapy and survival was found in subjects diagnosed with stage IIIb/c and IV CRC. However, a trend toward improved survival was present, indicating that some patients could benefit from the addition of a 3rd line of therapy; additional studies would be required to confirm this.
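Before any such survival model can be fit from EMR data, follow-up time and a death indicator have to be derived from diagnosis, death, last-visit and study-end dates. The sketch below shows a typical construction on three invented records; column names are hypothetical.

```python
# Sketch only: deriving follow-up time and a death indicator from EMR-style dates
# before survival modeling. Three invented records; column names are hypothetical.
import pandas as pd

STUDY_END = pd.Timestamp("2010-07-31")

emr = pd.DataFrame({
    "diagnosis_date": pd.to_datetime(["2006-03-01", "2007-11-15", "2009-05-20"]),
    "death_date":     pd.to_datetime(["2008-01-10", None, None]),
    "last_visit":     pd.to_datetime(["2008-01-10", "2010-02-01", "2010-07-01"]),
    "lines_of_tx":    [3, 1, 2],
    "stage":          ["IV", "IIIb/c", "IV"],
})

emr["died"] = emr["death_date"].notna().astype(int)
# censor at the earlier of last contact and study end when no death is recorded
end = emr["death_date"].fillna(emr["last_visit"].clip(upper=STUDY_END))
emr["follow_up_days"] = (end - emr["diagnosis_date"]).dt.days
print(emr[["follow_up_days", "died", "lines_of_tx", "stage"]])
```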

