Blood tests to predict one- to two-year survival of patients with difficult cancers.

2021 ◽  
Vol 39 (15_suppl) ◽  
pp. e16158-e16158
Author(s):  
Robert L. De Jager ◽  
Howard Bruckner ◽  
Fred Bassali ◽  
Elisheva Dusowitz ◽  
AJ Book ◽  
...  

e16158 Background: A sequence of drug combinations produces > 1 median (M) strong 2-year (yr) survival (S) (Bruckner et al AACR '14, Anticancer Res (ACR) '16, '18, SIGO '19). Trials included high-risk patients (pts). Each initial series has 5-yr Ss, after pts were referred for hospice care. Prognostic ALAN blood tests (Ts) have been validated for stage IV (Adv) cholangiocarcinoma (CCA) (Salati et al EuJCa '18). Other Ts predict unexpected favorable (F) S of pts with gastric ca, PS 2-3 (Bruckner et al JAMA '82); but little is known about Ts for resistant (R) Ca. Methods: Planned Kaplan-Meier intent-to-treat analysis to find Ts that: expand eligibility (El) for therapy; identify biomarkers that predict therapy can prolong S; and identify new hypotheses for therapy. El pts have: R to test drugs, Pancreatic (PC), Intrahepatic bile duct, CCA, Colon (CRC), and new (N) APC. All series: -/+ high risk, -/+ aged, PS 0-2. El: Helsinki criteria - consent, recovered from severe (gr3) toxicity; able to reach office, -/+ help, and S > 6 wks. Inel: CNS involved, IV needed, F clinical factors predict 1 yr MST. Ts include A.L.A.N. scores (AS) (Salati ibid) and other blood Ts (ACR ibid, Lavin et al CTR '82). Therapy GFLIO in mg/m2: gemcitabine 500, leucovorin 180, fluorouracil 1200 (24-hr infusion), irinotecan 80 D2, oxaliplatin 40. Then, for progression (pg), add docetaxel 20-25 (except CRC: mitomycin C 4-6); at next pg add cetuximab weekly (except APC or KRAS-M), and at next pg replace cetuximab with bevacizumab 10 mg/kg (ibid ACR '16). Results: At all ages, overall (O) S is > 1 yr for RCRC and NAPC, and for sets with any 1 F or UnF T other than < 3.1 albumin (Alb) or < 2.1 lymphocyte/monocyte ratio (LMR). For CCA, 17R/16N, OMS > 2 yrs for 66% of pts and ≥ 2 yrs for all test sets except UnF (26% of pts, MS 17 mos, with low Alb).
For CRC: 50R, OMS is 16.5 mos; 42% S 2 yrs. Fav Ts: MS ≥ ~2 yrs; 39-82% of pts have FTs: neutrophil-lymphocyte ratio (NLR) < 3.1, 61% S 2 yrs, p < .02; lymphs > 1.5, 53% S 2 yrs, p < .02; AS 0, 59% S 2 yrs, p < .06; platelets < 300,000, 54% S 2 yrs, p < .06; Alb ≥ 3.5, 48% S 2 yrs, p < .11. For N-APC: 53 pts, OS is 14.5 mos and > 12 mos in sets with any 1 UnF T other than Alb or LMR. FTs: MST 16.4-18 mos; 34-77% of pts have FTs: Alb ≥ 3.5, 34% S 2 yrs, p < 0.001; WBC < 10, 29% S 2 yrs, p < .06; AS 0-2, 35% S 2 yrs, p = 2.7E-7. For R-PC: 53 pts, OS is 12 mos for 44% of pts. FTs: MST 13.6-17 mos; 21-70% of pts have FTs: Alb ≥ 3.5, 30% S 2 yrs, p = .0004; AS 0, 41% S 2 yrs, p = .0006; NLR < 3, 37% S 2 yrs, p < .02. GFLIO's gr3 induction toxicity is < 5% and reversible, with no hospitalization, neutropenic fever, or gr3 neuropathy. Conclusions: Robust Ts identify many difficult pts with median S > 1 yr and testable prospective > 2-yr rates of S. Ts warrant development: validation with GFLIO, other therapy, and other cancers; improved Ts, models for eligibility, and geriatric criteria; identification of false -/+ trials; and personalized trials to correct UnF Ts. FTs, with GFLIO, can change prognosis and practice for > 50% of pts now advised "against" any therapy due to a clinical estimate of "less than 6-10 mos to live." Clinical trial information: NCT01905150.
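The blood tests above are simple ratios and cut-offs computed from a routine CBC and chemistry panel. A minimal sketch, assuming the favorable cut-offs quoted in the abstract (Alb ≥ 3.5 g/dL, NLR < 3.1, LMR ≥ 2.1); function and argument names are illustrative:

```python
def favorable_blood_tests(albumin_g_dl, neutrophils, lymphocytes, monocytes):
    """Flag favorable values using the cut-offs quoted in the abstract:
    albumin >= 3.5 g/dL, NLR < 3.1, LMR >= 2.1 (names are illustrative)."""
    nlr = neutrophils / lymphocytes  # neutrophil:lymphocyte ratio
    lmr = lymphocytes / monocytes    # lymphocyte:monocyte ratio
    return {
        "albumin_favorable": albumin_g_dl >= 3.5,
        "nlr_favorable": nlr < 3.1,
        "lmr_favorable": lmr >= 2.1,
    }

# Example: counts in 10^3 cells/uL from a routine CBC
print(favorable_blood_tests(3.9, neutrophils=4.2, lymphocytes=1.8, monocytes=0.6))
```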

Blood ◽  
2006 ◽  
Vol 108 (11) ◽  
pp. 5285-5285
Author(s):  
Sang Kyun Sohn ◽  
YoonYoung Cho ◽  
JongGwang Kim ◽  
YeeSoo Chae

Abstract Background Reharvesting leukocytes from donors for a donor lymphocyte infusion (DLI) is inconvenient and occasionally impossible in the case of unrelated donors. The effect of a growth factor-primed DLI is known to be comparable to that of a nonprimed DLI for patients with relapsed disease. We reserved a portion of the PBSCs harvested at the time of transplantation for the purpose of a future DLI for relapsing disease. Method In total, ninety-nine patients (43 high-risk, 46 standard disease) with hematologic malignancies who were treated by allo-PBSCT were allocated on an intent-to-treat basis. A dose of CD34+ cells in the range of 2–6×10⁶/kg was transplanted, and additional PBSCs were cryopreserved. Result The PBSC harvest for transplantation allowed extra cells to be reserved for 35 (67.3%) high-risk patients and 25 (55.6%) low-risk patients. Among the 29 patients (29.9%) who relapsed after allogeneic PBSCT, 19 (65.5%) were treated with mainly cytarabine-based chemotherapy followed by cryopreserved PBSC infusion. The median dose of CD3+ and CD34+ cells for the primed DLI was 1.43×10⁸/kg and 4.75×10⁶/kg, respectively. Six (24.9%) of the 19 relapsed patients exhibited a complete response after the primed DLI, and their 1-year survival rate was 36%. New development or progression of graft-versus-host disease after the primed DLI was observed in 16 (82%) patients. Overall, survival at 1 year after the primed DLI was 21%. Conclusion The induction of a graft-versus-leukemia effect through a primed DLI, using additional PBSCs reserved at the time of transplantation, would appear to be feasible for patients with relapsed hematologic malignancies. Furthermore, this approach seems more convenient for donors.


Blood ◽  
2015 ◽  
Vol 126 (23) ◽  
pp. 3862-3862
Author(s):  
Alessia Bari ◽  
Luigi Marcheselli ◽  
Tamar Tadmor ◽  
Raffaella Marcheselli ◽  
Maria Christina Cox ◽  
...  

Abstract Background There is an increasing amount of data showing that the tumor microenvironment, host immunity and inflammatory responses play an important role in determining the clinical course and outcome in patients with malignant lymphoma. Several investigators have considered the absolute monocyte count (AMC) as a surrogate biomarker of tumor-associated macrophages within the tumor microenvironment, the absolute lymphocyte count (ALC) as an important biomarker of tumor-infiltrating lymphocytes, reflecting host immunity status, and the absolute neutrophil count (ANC) as indicative of the systemic inflammatory response to malignancy. All the above parameters have been suggested as significant prognostic factors in Hodgkin lymphoma (HL). The aim of the present retrospective study was to verify whether the neutrophil:lymphocyte ratio (NLR) can be utilized as an independent prognostic factor in a large cohort of patients with nodular sclerosis (NS) subtype HL. Patients and Methods This retrospective analysis included data from 1017 patients diagnosed with NS HL according to the WHO criteria. We reviewed the clinical and laboratory data of consecutive "therapy-naïve" patients, treated in different centers in Italy and in Israel between 1993 and 2012, after approval by local institutional review boards. Patients had received different combination chemotherapy regimens: doxorubicin, bleomycin, vinblastine, and dacarbazine (ABVD); mechlorethamine, vincristine, procarbazine, and prednisone (MOPP)/epidoxorubicin, bleomycin, and vinblastine (EBV)/lomustine (CCNU), doxorubicin, and vindesine (CAD); vinblastine, bleomycin, and methotrexate (VBM); bleomycin, etoposide, doxorubicin, cyclophosphamide, vincristine, procarbazine, and prednisone (BEACOPP); and Stanford V. The cut-off for NLR was determined from the analysis of the log(HR) as a function of NLR, using Cox cubic spline regression.
The importance of the covariate was examined using the bootstrap inclusion frequency (BIF) with the log-likelihood ratio test (cut-off of 0.05), over 1,000 resamples of a hierarchical Cox PH model in which NLR was added to the IPS. Progression-free survival (PFS) and overall survival (OS) were determined by Kaplan-Meier estimates, and risk groups were compared using the log-rank test. We also performed Cox proportional hazard analysis. The effect size of risk was reported as a hazard ratio (HR) with the associated 95% confidence interval (CI95). Results Of the 1017 patients, 990 (97%) had data on both IPS and NLR. Median age was 31 years (range 17-69) and 49% were males. The 5-yr PFS and OS after a median follow-up of 85 months (range 1-244 months) were 81% (CI95 78-84) and 91% (CI95 89-93), respectively, for all patients. The log(HR) for PFS and OS varied linearly as a function of NLR, and the cut-off was selected at 6 for both outcomes. Patients with NLR >6 had worse PFS and OS compared to those with NLR ≤6 (84% vs 75% and 92% vs 88% at 5 years, respectively; Figure 1). For PFS the HR for patients with NLR >6 was 1.65 (CI95 1.25-2.18, p<0.001), while for OS the HR was 1.82 (CI95 1.25-2.65, p=0.002). When adjusted in Cox PH regression by IPS score, NLR >6 maintained its prognostic value for both PFS (HR 1.49, CI95 1.12-1.98, p=0.006; with a BIF of 76%) and OS (HR 1.56, CI95 1.06-2.29, p=0.023; with a BIF of 64%). This was also evident for NLR in continuous form, both in PFS (HR adjusted by IPS 1.02, CI95 1.01-1.04, p=0.010) and OS (HR adjusted by IPS 1.02, CI95 1.01-1.05, p=0.039). Conclusion Although the majority of patients with HL can be cured, about 1/3 of those with advanced-stage disease relapse or progress after first-line therapy. Several approaches have been employed to recognize high-risk patients, including gene expression profiling and positron emission tomography. However, these procedures are expensive and not always easy to perform and interpret.
In conclusion, despite its retrospective nature, our study shows that NLR can reliably identify high-risk patients at the time of diagnosis. This easily obtainable, simple prognostic parameter could well be utilized to improve the discriminating power of the IPS score in patients with NS HL. Figure 1. PFS and OS by NLR < 6 or NLR ≥ 6 Disclosures No relevant conflicts of interest to declare.
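The PFS and OS figures in this and the surrounding abstracts come from Kaplan-Meier (product-limit) estimates. As a reminder of the mechanics, a self-contained sketch in pure Python with toy data (this is not the study's code):

```python
def kaplan_meier(times, events):
    """Product-limit estimate of S(t).
    times: follow-up times; events: 1 = event, 0 = censored.
    Returns a step function as [(event_time, survival_probability), ...]."""
    pairs = sorted(zip(times, events))
    at_risk = len(pairs)
    surv = 1.0
    curve = []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        deaths = removed = 0
        # group ties at the same time point
        while i < len(pairs) and pairs[i][0] == t:
            deaths += pairs[i][1]
            removed += 1
            i += 1
        if deaths:  # the curve only steps down at event times
            surv *= 1 - deaths / at_risk
            curve.append((t, surv))
        at_risk -= removed
    return curve

# Toy follow-up (months): events at 1, 2 and 4; censored at 3 and 5.
# Survival steps: 4/5 = 0.8, then 0.8 * 3/4 = 0.6, then 0.6 * 1/2 = 0.3.
print(kaplan_meier([1, 2, 3, 4, 5], [1, 1, 0, 1, 0]))
```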


2015 ◽  
Vol 33 (7_suppl) ◽  
pp. 68-68 ◽  
Author(s):  
Phuoc T. Tran ◽  
Amol Narang ◽  
Ashwin Ram ◽  
Scott P. Robertson ◽  
Pei He ◽  
...  

68 Background: In patients with localized prostate cancer undergoing radiation therapy (RT) +/- androgen deprivation therapy (ADT), an end of radiation (EOR) PSA obtained during the last week of RT may serve as an early post-treatment predictor of poor outcomes and identify patients in whom to pursue treatment intensification or novel therapies. Methods: We reviewed an IRB-monitored, prospectively acquired database of patients with prostate cancer treated with definitive RT at our institution from 1993-2007 (n=890). Patients with an available EOR PSA were divided into two cohorts and analyzed separately based on inclusion of ADT into the treatment regimen. EOR PSA thresholds of 0.5 ng/mL and 1.0 ng/mL were explored. Multivariate analysis was performed to determine prognostic factors for biochemical failure-free survival (BFFS, Phoenix criteria) and overall survival (OS). Kaplan-Meier survival curves were constructed, with stratification by EOR PSA thresholds. Results: Median age was 69 years, with an even distribution of NCCN low risk (33.5%), intermediate risk (34.0%), and high risk (32.5%) patients. Median RT dose was 7020 cGy, and 54.5% were treated with ADT. Median follow-up of the entire cohort was 11.7 yrs. EOR PSA level was available for the majority of patients (77.5%). On multivariate analysis, EOR PSA >0.5 ng/mL was significantly associated with worse BFFS (p<0.0001) and OS (p<0.0001). In the subset of patients undergoing RT with ADT for NCCN intermediate/high risk disease, 5 yr BFFS was more disparate based on an EOR PSA threshold of 0.5 ng/mL (5 yr BFFS: 87.3% vs. 41.1%, p<0.001) than on initial NCCN risk level (5 yr BFFS: 88.7% vs. 76.9%, p=0.038). In NCCN low risk patients undergoing definitive RT alone, an EOR PSA threshold of 1.0 ng/mL was significantly prognostic of outcome (5 yr BFFS: 100.0% vs. 88.6%, p=0.024).
Conclusions: For NCCN intermediate/high risk patients undergoing RT with ADT, EOR PSA >0.5 ng/mL may represent a better surrogate for poor outcomes than initial risk group. In addition, NCCN low risk patients undergoing RT alone who obtained an EOR PSA ≤1.0 ng/mL experienced excellent BFFS. Prospective evaluation of the utility of EOR PSA should be explored.


EP Europace ◽  
2021 ◽  
Vol 23 (Supplement_3) ◽  
Author(s):  
H Santos ◽  
M Santos ◽  
I Almeida ◽  
H Miranda ◽  
C Sa ◽  
...  

Abstract Funding Acknowledgements Type of funding sources: None. On behalf of the Portuguese Registry of Acute Coronary Syndromes. Background Acute coronary syndromes (ACS) are common, and several scores have been proposed to identify high-risk patients who present a worse prognosis in short- and long-term follow-up. The CHA2DS2-VASc score is used to decide on the initiation of anticoagulation therapy in atrial fibrillation (AF) patients. It is an easy and convenient score, used by physicians in clinical practice, which may be helpful in ACS for predicting high-risk patients. Objective To evaluate the CHA2DS2-VASc score as a prognostic method in ACS. Methods Multicenter retrospective study based on the Portuguese Registry of ACS between 1/10/2010 and 4/09/2019. The CHA2DS2-VASc score was tested as a predictor of AF with a receiver operating characteristic curve, and logistic regression was used to assess whether the score was a predictor of AF. According to a CHA2DS2-VASc score of 0, 1 or ≥2, a Kaplan-Meier test was performed to establish survival rates and cardiovascular admission at one year of follow-up. Results 25271 patients had ACS; 1023 patients (4.2%) presented de novo AF. The CHA2DS2-VASc score was a moderate predictor of de novo AF (area under the curve: 0.642, confidence interval (CI) 0.625-0.659), with 66.7% sensitivity and 55.1% specificity. Logistic regression revealed that the CHA2DS2-VASc score was a predictor of de novo AF in ACS (odds ratio (OR) 2.07, p < 0.001, CI 1.74-2.47). Mortality rates at one year of follow-up, while higher with higher CHA2DS2-VASc scores, did not differ significantly, p = 0.099. On the other hand, the score exhibited a significant value, p = 0.050, for re-admission for all causes, according to the classification as 0, 1 or ≥2. Re-admission for cardiovascular causes at one year of follow-up was also associated with the score classification (Kaplan-Meier test, p = 0.011).
Conclusions The CHA2DS2-VASc score was a predictor of de novo AF in ACS and can be used as a prognostic method for re-admission for all causes and, in particular, for re-admission for cardiovascular causes.
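For reference, the CHA2DS2-VASc score studied above sums the standard clinical components; a sketch using the conventional scoring (the helper name and argument names are illustrative):

```python
def cha2ds2_vasc(age, female, chf, hypertension, diabetes,
                 stroke_or_tia, vascular_disease):
    """Standard CHA2DS2-VASc components:
    CHF +1, hypertension +1, age >= 75 +2 (65-74 +1), diabetes +1,
    prior stroke/TIA +2, vascular disease +1, female sex +1."""
    score = 2 if age >= 75 else (1 if age >= 65 else 0)
    score += chf + hypertension + diabetes + vascular_disease + female
    score += 2 * stroke_or_tia
    return score

# 76-year-old woman with hypertension: 2 (age) + 1 (sex) + 1 (HTN) = 4
print(cha2ds2_vasc(76, female=True, chf=False, hypertension=True,
                   diabetes=False, stroke_or_tia=False, vascular_disease=False))
```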


2016 ◽  
Vol 34 (33) ◽  
pp. 4015-4022 ◽  
Author(s):  
Sergio Cortelazzo ◽  
Corrado Tarella ◽  
Alessandro Massimo Gianni ◽  
Marco Ladetto ◽  
Anna Maria Barbui ◽  
...  

Purpose The benefit of high-dose chemotherapy with autologous stem-cell transplantation (ASCT) as first-line treatment in patients with diffuse large B-cell lymphomas is still a matter of debate. To address this point, we designed a randomized phase III trial to compare rituximab plus cyclophosphamide, doxorubicin, vincristine, and prednisone (R-CHOP)-14 (eight cycles) with rituximab plus high-dose sequential chemotherapy (R-HDS) with ASCT. Patients and Methods From June 2005 to June 2011, 246 high-risk patients with a high-intermediate (56%) or high (44%) International Prognostic Index score were randomly assigned to the R-CHOP or R-HDS arm, and 235 were analyzed by intent to treat. The primary efficacy end point of the study was 3-year event-free survival, and results were analyzed on an intent-to-treat basis. Results Clinical response (complete response, 78% v 76%; partial response, 5% v 9%) and failures (no response, 15% v 11%; and early treatment-related mortality, 2% v 3%) were similar after R-CHOP versus R-HDS, respectively. After a median follow-up of 5 years, the 3-year event-free survival was 62% versus 65% ( P = .83). At 3 years, compared with the R-CHOP arm, the R-HDS arm had better disease-free survival (79% v 91%, respectively; P = .034), but this subsequently vanished because of late-occurring treatment-related deaths. No difference was detected in terms of progression-free survival (65% v 75%, respectively; P = .12), or overall survival (74% v 77%, respectively; P = .64). Significantly higher hematologic toxicity ( P < .001) and more infectious complications ( P < .001) were observed in the R-HDS arm. Conclusion In this study, front-line intensive R-HDS chemotherapy with ASCT did not improve the outcome of high-risk patients with diffuse large B-cell lymphomas.


CJEM ◽  
2017 ◽  
Vol 19 (S1) ◽  
pp. S62
Author(s):  
V. Thiruganasambandamoorthy ◽  
M. Sivilotti ◽  
M.A. Mukarram ◽  
C. Leafloor ◽  
K. Arcot ◽  
...  

Introduction: Concern for occult serious conditions leads to variations in ED syncope management [hospitalization, duration of ED/inpatient monitoring including Syncope Observation Units (SOU) for prolonged monitoring]. We sought to develop evidence-based recommendations for duration of ED/post-ED ECG monitoring using the Canadian Syncope Risk Score (CSRS) by assessing the time to serious adverse event (SAE) occurrence. Methods: We enrolled adults with syncope at 6 EDs and collected demographics, time of syncope and ED arrival, CSRS predictors and time of SAE. We stratified patients as per the CSRS (low, medium and high risk as ≤0, 1-3 and ≥4 respectively). 30-day adjudicated SAEs included death, myocardial infarction, arrhythmia, structural heart disease, pulmonary embolism or serious hemorrhage. We categorized arrhythmias, interventions for arrhythmias and death from unknown cause as arrhythmic SAE and the rest as non-arrhythmic SAE. We performed Kaplan-Meier analysis using time of ED registration for primary and time of syncope for secondary analyses. Results: 5,372 patients (mean age 54.3 years, 54% females, and 13.7% hospitalized) were enrolled, with 538 (10%) patients suffering SAE (0.3% died due to an unknown cause and 0.5% suffered ventricular arrhythmia). 64.8% of SAEs occurred within 6 hours of ED arrival. The probability of any SAE or arrhythmia was highest within 2 hours of ED arrival for low-risk patients (0.65% and 0.31%; dropped to 0.54% and 0.06% after 2 hours) and within 6 hours for the medium- and high-risk patients (any SAE 6.9% and 17.4%; arrhythmia 6.5% and 18.9% respectively), which also dropped after 6 hours (any SAE 0.99% and 2.92%; arrhythmia 0.78% and 3.07% respectively). For any CSRS threshold, the risk of arrhythmia was highest within the first 15 days (for CSRS ≥2 patients, 15.6% vs. 0.006%).
ED monitoring for 2 hours (low-risk) and 6 hours (medium- and high-risk) and using a CSRS ≥2 cut-off for outpatient 15-day ECG monitoring would lead to a 52% increase in arrhythmia detection. The majority (82.2%) arrived at the ED within 2 hours (median time 1.1 hours), and the secondary analysis yielded similar results. Conclusion: Our study found that 2 and 6 hours of ED monitoring for low-risk and medium/high-risk CSRS patients respectively, with 15-day outpatient ECG monitoring for CSRS ≥2 patients, will improve arrhythmia detection without the need for hospitalization or observation units.
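The study's stratification and monitoring recommendations reduce to simple threshold logic. A sketch of the bands and the suggested plan (function names are illustrative; thresholds follow the abstract):

```python
def csrs_band(score):
    # CSRS bands used in the study: low <= 0, medium 1-3, high >= 4
    if score <= 0:
        return "low"
    return "medium" if score <= 3 else "high"

def monitoring_plan(score):
    """ED monitoring duration and outpatient follow-up per the study's
    conclusions: 2 h for low risk, 6 h for medium/high risk, plus 15-day
    outpatient ECG monitoring when CSRS >= 2."""
    return {
        "risk_band": csrs_band(score),
        "ed_monitoring_hours": 2 if csrs_band(score) == "low" else 6,
        "outpatient_ecg_15d": score >= 2,
    }

print(monitoring_plan(-1))  # low risk: 2 h of ED monitoring, no 15-day ECG
print(monitoring_plan(2))   # medium risk: 6 h plus 15-day outpatient ECG
```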


Author(s):  
Ming-Hsien Tsai ◽  
Hui-Ching Chuang ◽  
Yu-Tsai Lin ◽  
Tai-Lin Huang ◽  
Fu-Min Fang ◽  
...  

Background: To assess whether adverse pathological features present at the time of salvage total laryngectomy (TL) are associated with oncologic outcome. Methods: Ninety patients with persistent/locally recurrent disease who subsequently underwent salvage TL after definitive treatment by radiation alone (RTO) or concurrent chemo-radiation (CCRT) from 2009 to 2018 were retrospectively enrolled. Kaplan–Meier methods were used to estimate overall survival (OS), disease-specific survival (DSS), and disease-free survival (DFS). Results: Lymphovascular invasion (LVI), perineural invasion, positive margin, and stage IV disease were associated with worse survival in the univariate analysis. In the multivariate analysis, the presence of LVI and positive margin were both independent negative predictors of OS (LVI: adjusted hazard ratio (aHR) = 2.537, 95% CI: 1.163–5.532, p = 0.019; positive margin: aHR = 5.68, 95% CI: 1.996–16.166, p = 0.001), DSS (LVI: aHR = 2.975, 95% CI: 1.228–7.206, p = 0.016; positive margin: aHR = 11.338, 95% CI: 2.438–52.733, p = 0.002), and DFS (LVI: aHR = 2.705, 95% CI: 1.257–5.821, p = 0.011; positive margin: aHR = 6.632, 95% CI: 2.047–21.487, p = 0.002). Conclusions: The presence of LVI and positive margin were both associated with poor OS, DSS, and DFS among patients who underwent salvage TL after failure of RTO/CCRT. The role of adjuvant therapy for high-risk patients after salvage TL to improve the chance of survival requires more investigation in the future.


2021 ◽  
Author(s):  
Weiqiang You ◽  
Zerong Cai ◽  
Nengquan Sheng ◽  
Li Yan ◽  
Huihui Wan ◽  
...  

Abstract Background Patients with stage I-III gastric cancer (GC) undergoing R0 radical resection display extremely different prognoses. How to conveniently discriminate high-risk patients with poor survival is a clinical conundrum that urgently needs to be solved. Methods Patients with stage I-III GC treated from 2010 to 2016 at Shanghai Jiao Tong University Affiliated Sixth People’s Hospital were included in our study. The associations of clinicopathological features with disease-free survival (DFS) and overall survival (OS) were examined via a Cox proportional hazard model. Nomograms were developed that systematically integrated prognosis-related features. Kaplan–Meier survival analysis was performed to compare DFS and OS among groups. The results were then externally validated by The Sixth Affiliated Hospital, Sun Yat-sen University. Results A total of 585 and 410 patients were included in the discovery and validation cohorts, respectively. Multivariable analysis demonstrated that T stage, N stage, lymphatic/vascular/nerve infiltration, and preoperative CEA and CA199 were independent prognostic factors (P < 0.05). The 3-year and 5-year calibration curves based on the nomograms showed perfect correlation between predicted and observed outcomes. Distinct differences were noticed in the survival of different risk groups (p < 0.001). Patients with a low-risk signature had a 5-year rate of 84.0% for DFS and 83.0% for OS, whereas those with a high-risk signature had the worst 5-year rates (only 35.0% for DFS and 24.0% for OS). Similar results were achieved in the validation cohort. Conclusions The signatures based on clinicopathologic features demonstrate a powerful ability to conveniently identify distinct subpopulations, which may provide significant suggestions for individual follow-up and adjuvant therapy.
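A Cox-based nomogram of this kind boils down to a weighted sum of the prognostic factors (the linear predictor), with cut-offs defining risk groups. The sketch below uses purely hypothetical coefficients and cut-offs for illustration; the study's fitted values are not given in the abstract, and only the feature list follows the text.

```python
# Hypothetical Cox coefficients -- NOT the study's fitted values.
COEF = {
    "t_stage": 0.40,           # per T-stage increment
    "n_stage": 0.50,           # per N-stage increment
    "lvn_infiltration": 0.30,  # lymphatic/vascular/nerve infiltration (0/1)
    "cea_elevated": 0.35,      # preoperative CEA above normal (0/1)
    "ca199_elevated": 0.30,    # preoperative CA19-9 above normal (0/1)
}

def linear_predictor(patient):
    """Weighted sum of prognostic features (the Cox linear predictor)."""
    return sum(COEF[k] * patient[k] for k in COEF)

def risk_group(lp, low_cut=1.0, high_cut=2.0):  # hypothetical cut-offs
    if lp < low_cut:
        return "low"
    return "intermediate" if lp < high_cut else "high"

pt = {"t_stage": 3, "n_stage": 1, "lvn_infiltration": 1,
      "cea_elevated": 0, "ca199_elevated": 1}
lp = linear_predictor(pt)  # 1.2 + 0.5 + 0.3 + 0 + 0.3 = 2.3
print(risk_group(lp))      # "high" under these placeholder values
```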


Blood ◽  
2021 ◽  
Vol 138 (Supplement 1) ◽  
pp. 4718-4718
Author(s):  
Afsaneh M. Shariatpanahi ◽  
Sarah Grasedieck ◽  
Matthew C. Jarvis ◽  
Faezeh Borzooee ◽  
Reuben S. Harris ◽  
...  

Abstract Background: The prognosis of MM is determined by affected organs, tumor burden as measured by, e.g., the International Staging System (ISS), disease biology such as cytogenetic abnormalities, and response to therapy. The outcome of high-risk MM patients classified by ISS or adverse-risk cytogenetics is not uniform, and patients show heterogeneous survival. Recent insights into the pathogenesis of MM highlighted genome/transcriptome editing as well as inflammation as drivers for the onset and progression of MM. We hypothesized that inclusion of molecular features into risk stratification could potentially resolve the challenge of accurately distinguishing between high-risk and low-risk MM patients at initial diagnosis and improve outcome. Aim: We aimed to create a simple molecular risk score to identify unrecognized patient subgroups who have been previously misclassified by current risk stratifiers. Method: The Multiple Myeloma Research Foundation CoMMpass study genomics dataset, combining mRNA-seq and clinical data from more than 700 MM patients, allowed us to evaluate the prognostic value of demographic and clinical parameters, cytogenetics, and gene expression levels of APOBEC genes as well as inflammation-modulating cytokines in MM patients. We calculated hazard ratios and Kaplan-Meier survival estimates for all extracted features. Combining clinical variables that were significantly associated with PFS and OS, we then applied machine learning approaches to identify the most accurate classification model to define a new risk score that is easy to compute and able to stratify newly diagnosed MM (NDMM) patients more accurately than cytogenetics-based classifiers. Based on a Kaplan-Meier survival curve analysis, we then evaluated the performance of our newly built EI score in sub-classifying current multiple myeloma risk strata.
Results: Based on machine learning models, we defined a weighted OS/PFS risk score (Editor-Inflammation (EI) score) based on mRNA expression of APOBEC2, APOBEC3B, IL11, TGFB1, and TGFB3, as well as β2-microglobulin and LDH serum levels. We showed that the EI score subclassified patients into high-risk, intermediate-risk, and low-risk prognostic groups and demonstrated superior performance (C-index: 0.76) compared to ISS (C-index: 0.66) and R-ISS (C-index: 0.64). We further showed that EI low-risk patients do not benefit from autograft and maintenance therapy. Re-classification of ISS (Figure 1a, b, c) and R-ISS risk groups further confirmed the superiority of the EI score. In addition, the EI score identified previously unrecognized distinct subgroups of MM patients with adverse-risk cytogenetics but good prognosis (Figure 1d, e, f). For example, the EI score subclassified del(17p) MM patients into three main risk subgroups: a super-low-risk group (none with p53 mutation) with a 5-year OS of 100%, an intermediate-risk group (30% of these patients also have p53 mutation) with a 5-year OS rate of 75%, and a very-poor-prognosis group (40% of these patients also have p53 mutation) with a 5-year OS rate of 0% (2-yr OS: 40%) (Figure 1f). In line with this, we showed that patients with del(17p) and a high EI score exhibit an enrichment of APOBEC-induced genomic mutations compared to intermediate-risk and low-risk patients, supporting the hypothesis that del(17p) along with high APOBEC expression levels activates the APOBEC mutation program and thus creates an optimal environment for tumor progression. These findings support the necessity of a prognostic score that more accurately reflects MM disease biology. Conclusion: Although MM is considered an incurable disease, improved risk stratification could help to identify previously unrecognized low- and high-risk patient subgroups that are over- or undertreated and lead to improved outcomes.
Our EI score is a simple score that is based on recent insights into MM biology and accurately identifies high-risk and low-risk newly diagnosed MM patients, as well as misclassified MM patients in different cytogenetic and ISS risk subgroups. Figure 1. Disclosures No relevant conflicts of interest to declare.
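Structurally, a score like the EI score is a weighted sum over log-scaled expression and serum values, with cut-offs defining three risk groups. The sketch below is a minimal illustration under that assumption: the feature list follows the abstract, but the weights and cut-offs are hypothetical placeholders, as the published model's values are not given here.

```python
# Hypothetical weights -- the abstract names the features but not the model.
WEIGHTS = {
    "APOBEC2": 0.6, "APOBEC3B": 0.8, "IL11": 0.5,
    "TGFB1": 0.4, "TGFB3": -0.3,  # gene expression (log-scaled)
    "B2M": 0.7, "LDH": 0.5,       # serum markers (log-scaled)
}

def ei_like_score(features):
    """Weighted linear risk score over log-scaled feature values."""
    return sum(WEIGHTS[k] * features[k] for k in WEIGHTS)

def ei_risk_group(score, low_cut=1.0, high_cut=2.5):  # hypothetical cut-offs
    if score < low_cut:
        return "low"
    return "intermediate" if score < high_cut else "high"

example = {"APOBEC2": 0.2, "APOBEC3B": 1.5, "IL11": 0.4,
           "TGFB1": 0.6, "TGFB3": 0.1, "B2M": 1.0, "LDH": 0.8}
print(ei_risk_group(ei_like_score(example)))  # "high" under these placeholders
```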


Blood ◽  
2005 ◽  
Vol 106 (11) ◽  
pp. 1152-1152
Author(s):  
Bart Barlogie ◽  
Guido Tricot ◽  
Erik Rasmussen ◽  
Elias Anaissie ◽  
Frits van Rhee ◽  
...  

Abstract Background: TT2 introduced thalidomide (T) into frontline therapy for MM in a randomized phase III trial design (ASCO 2005). In comparison to TT1, TT2 applied: (1) more intensive induction chemotherapy prior to, and newly introduced CCT after, melphalan 200 mg/sqm-based tandem autotransplants, designed to improve survival in high-risk patients with CA; (2) DEX pulsing added during the 1st year of interferon (IFN) maintenance therapy. We now report on the outcome of patients treated on the “no thalidomide” arm of TT2 (TT2-) in comparison to TT1, in order to evaluate the effect of dose intensification during induction and post-transplant therapy. Patients and Methods: 231 patients were enrolled in TT1 (median follow-up, 11 yr) and 345 in TT2- (median follow-up, 3.5 yr). Completion rates of 1st/2nd transplants were 195/165 (84%/71%) with TT1 and 292/235 (85%/68%) with TT2-. In TT2-, 64% started CCT and 36% received DEX consolidation (when platelets failed to recover or no benefit was documented from induction DCEP). TT2- and TT1 were compared in terms of pre-transplant-1 and final CR rates (intent-to-treat), and EFS/OS from treatment start, from 1st transplant, and from last (2nd or 1st) transplant. EFS and OS were examined in the context of baseline prognostic variables including CA. Results: Compared to TT1, TT2- induced similar pre-Tx1 CR rates (11% vs 12%, p=0.8) and final CR rates (41.3% vs 40.7%, p=0.9). The median onset of CR was 8.9 mo for TT2- vs 8.4 mo for TT1 (p=0.4). 5-year EFS/OS were 45%/63% with TT2- vs 28%/57% with TT1 (p<0.001/p=0.06). 4-yr post-transplant-1 EFS/OS were 48%/65% with TT2- vs 28%/56% with TT1 (p<0.001/p=0.02); 4-yr post-transplant-2 EFS/OS were 46%/64% with TT2- and 31%/50% with TT1 (both p=0.01).
Pre-study CA, high LDH (≥190 U/L), and low Hb (<10 g/dL) were independently and significantly associated with poorer post-transplant-2 EFS and OS (p<0.05); independent of these factors, TT2- improved post-transplant-2 EFS and OS compared to TT1 (p<0.001/p=0.033). In the presence of CA, TT2- with CCT vs TT2- with DEX improved 4-yr OS (measured from a 6-mo landmark post-transplant-2) from 37% (similar to the 39% with TT1 [no DEX or CCT]) to 78%, which is comparable to the 65% for TT1 without CA. Conclusion: In this historical comparison of TT1 with TT2-, the more intensive induction chemotherapy of TT2- did not improve CR. However, post-transplant CCT benefited the 1/3 of patients with CA, doubling 4-yr post-transplant OS in comparison with TT1 in this high-risk subgroup. A phase 3 randomized trial addressing the CCT concept is warranted. Figure

