Comparison of Matched/Mismatched Unrelated Donor Stem Cell Transplantation to Autologous Stem Cell Transplantation for Acute Myeloid Leukemia in First Complete Remission: A Study from the Acute Leukemia Working Party of the European Group for Blood and Marrow Transplantation (EBMT)

Blood ◽  
2015 ◽  
Vol 126 (23) ◽  
pp. 3216-3216
Author(s):  
Francesco Saraceni ◽  
Myriam Labopin ◽  
Norbert Claude Gorin ◽  
Didier Blaise ◽  
Reza Tabrizi ◽  
...  

Abstract Background. The optimal post-remission strategy for patients with acute myeloid leukemia (AML) is still a matter of debate. Allogeneic stem cell transplantation (allo-HSCT) is the most effective treatment to prevent leukemia relapse, and for patients who lack a sibling donor, transplantation from a matched or mismatched unrelated donor (URD) is usually the preferred alternative. However, increasing donor-recipient HLA mismatch, patient age and comorbidity scores lead to higher non-relapse mortality (NRM) rates; moreover, the incidence of chronic GVHD is rather high after transplantation from unrelated donors. Autologous stem cell transplantation (ASCT) has several advantages over allo-HSCT, including low NRM, no GVHD risk, fewer late effects and better quality of life. The aim of the current study was to compare the outcome of allo-HSCT from a matched (10/10 URD) or single HLA-locus mismatched (9/10 URD) unrelated donor with ASCT in patients with AML in first CR. Patients and methods. We performed a retrospective analysis of 2689 AML patients receiving 10/10 URD-HSCT (n=1260), 9/10 URD-HSCT (n=356) or ASCT (n=1073) in first CR between 2005 and 2013 and reported to the ALWP of the EBMT. Results. Median follow-up was 35, 27 and 27 months for ASCT, 10/10 and 9/10 URD, respectively (p<10^-4); median age was 48.7, 50.8 and 48.7 years, respectively (p=10^-3). Time from diagnosis to transplant was longer for URD than for ASCT (p<10^-4); compared with patients who received ASCT, patients who received URD grafts more frequently had poor-risk cytogenetics (p<10^-4), were more likely to receive a TBI-based conditioning regimen (p<10^-4) and were transplanted more recently (p<10^-4). The 2-year cumulative incidence of relapse (RI) for ASCT, 10/10 and 9/10 URD was 46.3±3%, 24.9±3% and 27.7±5%, respectively (p<10^-5), while the 2-year NRM rates were 3.1±2%, 16.4±4% and 20.5±4%, respectively (p<10^-5).
The 2-year KM estimates of leukemia-free survival (LFS) were 50.6±3% for ASCT, 58.7±3% for 10/10 URD and 51.8±6% for 9/10 URD (p=0.002), while the 2-year overall survival (OS) rates were 68.2±3%, 63.6±3% and 55.1±6%, respectively (p<10^-4). ASCT showed a significantly higher RI compared to URD independently of cytogenetic risk (good risk: p<10^-4; intermediate and poor risk: p<10^-5); accordingly, 2-year LFS was significantly better for URD compared to ASCT in all risk groups (good risk: p=0.034, intermediate risk: p=0.0007, poor risk: p=0.021). ASCT and URD showed similar OS in good and poor risk patients, while in the intermediate risk group ASCT resulted in similar OS compared to 10/10 URD and better OS compared to 9/10 URD (66.2±4% for ASCT, 65.8±5% for 10/10 URD, 55.4±7% for 9/10 URD, p=0.012) (Fig 1). Within the intermediate cytogenetic risk group, FLT3-ITD mutational status affected outcome; in patients harboring FLT3-ITD, 10/10 URD showed the best LFS and OS (LFS: 36.3±11% for ASCT, 58.4±7% for 10/10 and 34±13% for 9/10 URD, p=10^-3; OS: 51.7±12%, 62.2±7% and 41.4±14%, respectively, p=0.02). Conversely, in patients with wild-type FLT3-ITD, URD showed better LFS compared to ASCT (51.3±8% for ASCT, 66.7±7% for 10/10 URD, 64±13% for 9/10 URD, p=0.008), while no difference was observed in OS. Multivariate analysis confirmed significantly lower RI for 10/10 (HR 0.36, p<10^-5, 95% CI: 0.29-0.44) and 9/10 URD (HR 0.43, p<10^-5, 95% CI: 0.32-0.57) and higher NRM for 10/10 URD (HR 3.88, p<10^-5, 95% CI: 2.37-6.33) and 9/10 URD (HR 4.89, p<10^-5, 95% CI: 2.84-8.43) compared to ASCT. URD-SCT was associated with better LFS compared to ASCT (HR 0.57, p<10^-5, 95% CI: 0.47-0.67 for 10/10 URD; HR 0.69, p=0.002, 95% CI: 0.55-0.87 for 9/10 URD). 10/10 URD was associated with better OS compared to ASCT (HR 0.81, p=0.031, 95% CI: 0.66-0.98), but no difference in OS was observed between 9/10 URD and ASCT (HR 1.02, p=0.87, 95% CI: 0.79-1.31). Conclusion.
In AML patients lacking an HLA-matched sibling donor, URD-HSCT significantly reduces relapse risk and improves LFS. In our series, 10/10 URD showed better OS than ASCT in multivariate analysis, while the favorable impact of 9/10 URD on LFS did not translate into better OS. In intermediate-risk patients, in the absence of an HLA fully matched sibling or unrelated donor, autologous transplantation may be considered a valid option, as ASCT results appear to overlap with 10/10 URD outcomes and to provide better survival than mismatched URD. Analysis is ongoing to better define which subpopulations of patients might benefit from each approach. Figure 1. OS in patients with intermediate risk cytogenetics. Disclosures Craddock: Celgene: Consultancy, Honoraria, Research Funding; Pfizer: Speakers Bureau; Sunesis: Honoraria; Johnson and Johnson: Consultancy.
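The relapse and NRM figures above are competing-risks cumulative incidences rather than one minus a Kaplan-Meier estimate, since a patient who dies without relapse can no longer relapse. A minimal sketch of that estimator (Aalen-Johansen form) on made-up follow-up data, not the study data, is:

```python
# Cumulative incidence with competing risks: at each event time t_i,
# CIF_k(t) accumulates S(t_i-) * d_k,i / n_i, where S is the Kaplan-Meier
# estimate of being free of *any* event and d_k,i counts events of type k.

def cumulative_incidence(times, causes, k):
    """times: follow-up times; causes: 0 = censored, 1 = relapse, 2 = NRM.
    Returns a list of (time, CIF_k) steps for event type k."""
    data = sorted(zip(times, causes))
    n = len(data)          # number still at risk
    surv = 1.0             # KM estimate of being event-free, S(t-)
    cif = 0.0
    steps = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d_all = d_k = removed = 0
        while i < len(data) and data[i][0] == t:
            c = data[i][1]
            d_all += 1 if c > 0 else 0   # any event (either cause)
            d_k += 1 if c == k else 0    # events of the cause of interest
            removed += 1
            i += 1
        if d_k > 0:
            cif += surv * d_k / n        # uses S just before t
            steps.append((t, cif))
        if d_all > 0:
            surv *= 1.0 - d_all / n
        n -= removed
    return steps
```

With four illustrative patients (relapse at 1, NRM at 2, relapse at 3, censored at 4), the relapse CIF steps to 0.25 and then 0.5; at each event time the two cause-specific CIFs and the event-free probability sum to 1, which a naive per-cause 1-KM would violate.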

Blood ◽  
2013 ◽  
Vol 122 (21) ◽  
pp. 542-542
Author(s):  
Jakob Passweg ◽  
Myriam Labopin ◽  
Jan J Cornelissen ◽  
Liisa Volin ◽  
Gérard Socié ◽  
...  

Abstract Introduction In younger patients with AML who achieve CR1 with intermediate- or high-risk disease, allogeneic HSCT is the treatment of choice. Conditioning intensity varies: reduced-intensity conditioning (RIC) regimens are usually given to older patients, whereas younger patients traditionally receive myeloablative regimens (MAC). In patients aged 40-60, both types of regimen are used, with little knowledge of the factors that lead physicians to prefer one over the other. Previous studies had shown that RIC regimens were associated with somewhat higher relapse risks but lower risks of transplant-related mortality (TRM). We hypothesized that in low/intermediate-risk disease, based on cytogenetic classification, RIC is superior to MAC, whereas in high-risk leukemia MAC is superior to RIC given its higher antileukemic activity. Patients and Methods This study included 2974 of 5388 eligible patients with AML transplanted in CR1 in 2000-2011, based on the availability of cytogenetics to classify risk status at diagnosis. Only sibling or unrelated donors and marrow or peripheral blood stem cell transplants were considered. Regimens were classified as MAC (n=1638) or RIC (n=1336) based on published criteria. Median follow-up of surviving patients was 46 and 41 months, respectively. The groups differed in many variables. MAC recipients were significantly younger (37.6 vs 53.8 years), had a shorter interval from diagnosis to transplantation (143 vs 165 days), were more frequently male (53% vs 48%), less frequently had poor-risk cytogenetics (19% vs 22%), less frequently received stem cells from an unrelated donor (20% vs 33%), and more frequently had marrow as the stem cell source (36% vs 7%). The Kaplan-Meier estimator, the cumulative incidence function and Cox proportional hazards regression models were used where appropriate.
Results Table 1 shows similar overall survival (OS) and leukemia-free survival (LFS) in both groups but a lower relapse incidence (RI) and a higher transplant-related mortality (TRM) incidence in the MAC group. Acute grade II-IV GvHD was higher with MAC; the incidence of chronic GvHD did not differ significantly. In univariate analysis, overall survival was higher with RIC in cytogenetically good-risk AML (55±5% vs 77±7%, MAC vs RIC) but not in intermediate-risk (61±1% vs 62±2%) or poor-risk AML (42±3% vs 40±3%). Relapse incidence was lower with MAC in poor-risk AML (36±3% vs 51±3%) and intermediate-risk AML (21±1% vs 30±1%) but not in good-risk AML (19±4% vs 13±5%). TRM was higher with MAC than RIC in all three cytogenetic risk groups. Multivariate analysis confirmed a significant LFS and OS advantage of RIC in good-risk but not in intermediate- and poor-risk leukemia. Conclusions In patients aged 40-60, MAC conditioning has no advantage over RIC conditioning, despite RIC transplant recipients generally being in a poorer risk category. We confirm lower relapse rates but higher TRM risks with MAC compared to RIC. We fail to show superiority of MAC in patients with high-risk cytogenetics, but there appears to be an advantage for RIC over MAC in the small cohort of patients with good-risk leukemia. Disclosures: Kuball: Miltenyi: GMP product development Other.
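The Kaplan-Meier estimates used above are product-limit estimates: at each event time, the survival probability is multiplied by the fraction of at-risk patients who survive that time. A minimal self-contained sketch, on made-up follow-up times rather than trial data, is:

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) estimate of the survival function.

    times: follow-up times (e.g. months); events: 1 = event observed,
    0 = censored. Returns a list of (event_time, survival) steps,
    S(t) = S(t-) * (1 - d_t / n_t) at each event time t."""
    data = sorted(zip(times, events))
    n = len(data)          # number still at risk
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = 0              # events at time t
        removed = 0        # events + censorings leaving the risk set at t
        while i < len(data) and data[i][0] == t:
            d += data[i][1]
            removed += 1
            i += 1
        if d > 0:
            surv *= 1.0 - d / n
            curve.append((t, surv))
        n -= removed       # censored subjects count as at risk at t, then drop out
    return curve
```

For six patients with follow-up [3, 6, 6, 9, 12, 15] months and events [1, 1, 0, 1, 0, 0], the curve steps through 5/6 at t=3, 2/3 at t=6 and 4/9 at t=9; censored patients only shrink the risk set.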


2008 ◽  
Vol 26 (36) ◽  
pp. 5980-5987 ◽  
Author(s):  
Franck Morschhauser ◽  
Pauline Brice ◽  
Christophe Fermé ◽  
Marine Diviné ◽  
Gilles Salles ◽  
...  

Purpose A prospective multicenter trial evaluated a risk-adapted salvage treatment with single or tandem autologous stem-cell transplantation (ASCT) for 245 Hodgkin's lymphoma (HL) patients who experienced treatment failure with first-line therapy. Patients and Methods Poor-risk patients (150 with primary refractory disease or ≥ two of the following risk factors at first relapse: time to relapse < 12 months, stage III or IV at relapse, and relapse within previously irradiated sites) or intermediate-risk patients (95 with one risk factor at relapse) were eligible for tandem or single ASCT, respectively. Results Among poor-risk patients, 105 (70%), including 30 of 55 with cytoreductive chemotherapy-resistant disease, received tandem ASCT, whereas 92 intermediate-risk patients (97%) received single ASCT. According to intent-to-treat analysis, the 5-year freedom from second failure and overall survival (OS) estimates were 73% and 85%, respectively, for the intermediate-risk group and 46% and 57%, respectively, for the poor-risk group. Outcomes were similar for primary refractory and poor-risk/relapsed HL. For patients with chemotherapy-resistant disease, the 46% 5-year OS rate achieved with tandem ASCT compares favorably with the previously reported 30%. Outcomes for partial and complete responders to cytoreduction receiving tandem ASCT did not differ significantly and were better than those previously reported for partial responders receiving single ASCT, but not superior to those reported for complete responders receiving single ASCT. Six poor-risk patients (4%) died from toxicity. Conclusion Single ASCT is appropriate for intermediate-risk patients. For poor-risk patients, our results suggest a benefit of tandem ASCT for half of the patients with chemotherapy-resistant disease and for partial responders, but not for complete responders to cytoreductive chemotherapy.


Blood ◽  
2006 ◽  
Vol 107 (10) ◽  
pp. 3832-3840 ◽  
Author(s):  
Nicolas Mounier ◽  
Michele Spina ◽  
Jean Gabarre ◽  
Martine Raphael ◽  
Giuliano Rizzardini ◽  
...  

We aimed to compare AIDS risk-adapted intensive chemotherapy in AIDS-related lymphoma (ARL) patients before and after the advent of highly active antiretroviral therapy (HAART). A total of 485 patients aged from 18 to 67 years were randomly assigned to chemotherapy after stratification according to an HIV score based on performance status, prior AIDS, and CD4+ cell counts below 0.10 × 10^9/L (100/mm3). A total of 218 good-risk patients (HIV score 0) received ACVBP (doxorubicin, cyclophosphamide, vindesine, bleomycin, and prednisolone) or CHOP (doxorubicin, cyclophosphamide, vincristine, and prednisolone); 177 intermediate-risk patients (HIV score 1), CHOP or low-dose CHOP (Ld-CHOP); and 90 poor-risk patients (HIV score 2-3), Ld-CHOP or VS (vincristine and steroid). The 5-year overall survival (OS) in the good-risk group was 51% for ACVBP versus 47% for CHOP (P = .85); in the intermediate-risk group, 28% for CHOP versus 24% for Ld-CHOP (P = .19); and in the poor-risk group, 11% for Ld-CHOP versus 3% for VS (P = .14). The time-dependent Cox model demonstrated that the only significant factors for OS were HAART (relative risk [RR] 1.6, P < .001), HIV score (RR 1.7, P < .001), and the International Prognostic Index (IPI) score (RR 1.5, P < .001), but not the chemotherapy regimen. Our findings indicate that in ARL patients, survival is affected by HIV score, IPI score, and HAART, but not by the intensity of the CHOP-based chemotherapy.


Blood ◽  
2009 ◽  
Vol 114 (22) ◽  
pp. 2840-2840
Author(s):  
Ian H Gabriel ◽  
Ruhena Sargent ◽  
Hugues de Lavallade ◽  
Richard Szydlo ◽  
Jane Apperley ◽  
...  

Abstract 2840 Poster Board II-816 Multiple myeloma (MM) remains incurable, with a median survival of 3-4 years. Despite high-dose therapy and autologous stem cell transplant (ASCT), most patients relapse, with a median progression-free survival (PFS) of 2.5-4 years and overall survival (OS) of 4-5 years. Although allogeneic SCT (allo-SCT) is potentially curative due to a graft-versus-myeloma effect, its applicability is significantly limited by high transplant-related mortality (TRM). Therefore, the identification of additional independent biological predictors of outcome is required in order to tailor therapy to disease. Natural killer (NK) cells provide first-line defence against tumors. NK cells have been shown to recognize and kill myeloma cells in both the allogeneic and autologous settings, and donor NK genotype has been shown to influence leukemia-free survival following allo-SCT. The aim of this study was to investigate the impact of KIR genotype on event-free survival (EFS) and OS following ASCT for MM. We performed KIR genotyping on 190 patients with MM receiving a first autologous transplant. KIR genotype and haplotype frequencies were comparable to those published for normal controls. Factors found on univariate analysis to be associated with a shorter EFS included haplotype Bx (containing at least 1 of the KIR B haplotype-defining loci: KIR2DL5, 2DS1, 2DS2, 2DS3, 2DS5 or 3DS1) (median 547 vs 656 days, P = 0.036), ≥3 activating KIR genes (median 547 vs 615 days, P = 0.046), and the presence of the activating KIR genes KIR2DS1 and KIR3DS1 (median 456 vs 589 days and 464 vs 619 days; P = 0.045 and 0.01, respectively). Disease status at ASCT was the most highly predictive factor for EFS. In patients with good risk disease (CR or PR at ASCT), KIR3DS1 status was highly predictive of EFS (464 days (341-586) vs 731 days (599-862), P = 0.003) and OS (807 days (713-901) vs 967 days (925-1009), P = 0.023). KIR3DS1 was not predictive in patients with poor risk disease (P = 0.36).
Of note, KIR3DS1+ve patients were equally represented in the good risk (CR and PR) and poor risk (refractory or relapsed) groups at ASCT (around 30% in both groups). Notably, the median EFS for KIR3DS1+ good risk patients was not significantly different from that of poor risk disease patients (P = 0.061). ASCT outcome was then determined according to 3 main groups based on disease status and KIR3DS1 status; A: good risk, KIR3DS1-ve; B: good risk, KIR3DS1+ve; and C: poor risk (KIR3DS1+ve or -ve). The relative risk (RR) of relapse or death was 1.0, 1.9 (P=0.002, 95% CI 1.3-3.1) and 3.0 (P=0.0001, 95% CI 1.9-4.8), respectively. By multivariate analysis, after adjusting for the presence of adverse cytogenetics, serum albumin and β2m, the KIR3DS1 status and grouping remained highly predictive of poor EFS, with RRs of 1.0, 2.7 (P=0.021, 95% CI 1.2-6.2) and 5.3 (P<0.0001, 95% CI 2.4-11.7), respectively. The prognostic value of KIR3DS1, however, was greatest in patients in whom Bw4, the ligand for the corresponding inhibitory KIR3DL1, was missing. KIR3DS1+ KIR3DL1+ HLA-Bw4-negative patients had a significantly reduced median EFS of 400 days (315-495) vs 615 days (545-684) for all other patients (P=0.048). Again, this was most striking in good risk patients. Patients with the genotype KIR3DS1+ KIR3DL1+ HLA-Bw4-ve had a significantly shorter EFS of 372 days, compared to 509 days in KIR3DS1+ KIR3DL1+ HLA-Bw4+ patients and 793 days in KIR3DS1-negative individuals (P=0.004). In conclusion, our data from 190 patients with MM suggest that KIR3DS1, a gene previously linked to an increased risk of progression to invasive cervical carcinoma, independently predicts poor EFS and OS following ASCT. A significant proportion (30%) of patients who are defined as good risk at ASCT (CR and PR) are KIR3DS1+ve and have an EFS that is not significantly different from that of patients with refractory/relapsed disease at ASCT.
This effect of KIR3DS1 is more significant in the absence of HLA-Bw4, the ligand for the inhibitory receptor KIR3DL1. The mechanism for this effect is unclear, and we are currently performing functional studies to further understand these findings. Disclosures: Apperley: Novartis: Consultancy, Honoraria. Marin: Novartis: Consultancy, Research Funding.


Blood ◽  
2005 ◽  
Vol 106 (11) ◽  
pp. 171-171 ◽  
Author(s):  
Brenda Gibson ◽  
Ian Hann ◽  
David Webb ◽  
Siebold De Graaf ◽  
Richard Stevens (deceased) ◽  
...  

Abstract Between 1988 and 2002, 868 children (0-15 years) were entered into the MRC AML 10 (1988-95, n=341) and AML 12 (1995-2002, n=527) trials. Children were allocated to one of three MRC risk groups: good risk - patients with t(8;21), inv(16) or t(15;17), irrespective of bone marrow status after course 1 or the presence of other genetic abnormalities; standard risk - patients with neither favourable nor adverse cytogenetics and not more than 15% blasts in the bone marrow after course 1; poor risk - patients with more than 15% blasts in the bone marrow after course 1 or with adverse abnormalities of -5, -7, del(5q), abn(3q) or a complex karyotype (≥5 abnormalities), and without favourable genetic abnormalities. Outcome from CR - death in CR (DCR), relapse risk (RR), disease-free survival (DFS) and survival from CR (OSCR) - was analysed by MRC risk group.

                      AML 10                  AML 12
Risk group      Good  Standard  Poor    Good  Standard  Poor
DCR (8 yr %)      9      13      11       7       5      10
RR (8 yr %)      35      40      66      21      37      53
DFS (8 yr %)     59      52      31      74      60      42
OSCR (4 yr %)    81      60      39      88      75      49
OSCR (8 yr %)    78      57      37      84      72      49

In AML 10 all patients were eligible for SCT with a histocompatible sibling donor, but unrelated donor transplantation was not part of the protocol. Because of their favourable outcome in AML 10, good risk children were not eligible for SCT in AML 12, and part way through the trial a similar approach was adopted for standard risk patients, whilst SCT continued to be recommended for poor risk patients given their inferior outcome. Both sibling and unrelated donor transplantation were permitted. In AML 10 and AML 12, 38 of 139 (27%) poor risk children underwent SCT - 17 sibling allografts, 11 unrelated donor allografts and 10 autografts. The procedural mortality was 6%, 55% and 0%, respectively.
Mantel-Byar analysis (to account for time to SCT) comparing transplanted with non-transplanted poor risk children showed no evidence of a reduction in relapse risk (HR 1.02, 95% CI 0.58-1.79, p=0.9), or of a disease-free survival (HR 1.47, 95% CI 0.87-2.50, p=0.16) or survival benefit (HR 1.64, CI 0.94-2.85, p=0.08, against SCT), either overall or for any type of SCT. Survival at 8 years from SCT was 41% for sibling allografts, 18% for unrelated donor allografts and 60% for autografts.

Outcome by treatment - AML 10 & AML 12:

                 Poor Risk (all)   Poor Risk (censored at SCT)
DCR (8 yr %)           11                      5
RR (8 yr %)            58                     55
DFS (8 yr %)           37                     43
OSCR (4 yr %)          45                     50
OSCR (8 yr %)          44                     50

Outcome is relative and, whilst poor risk children still do worse than good and standard risk patients, the outcome for poor risk children has improved. Survival at 8 years from CR of about 50% for poor risk children in AML 12 (with no deaths beyond 4 years, suggesting that most of those who survive are cured) raises the question of whether any children with AML should be transplanted in 1st CR, given the mortality and morbidity of the procedure. The high mortality associated with unrelated donor transplantation requires further investigation.


Blood ◽  
2009 ◽  
Vol 114 (22) ◽  
pp. 3923-3923
Author(s):  
Jin Takeuchi ◽  
Atsuko Hojo

Abstract 3923 Poster Board III-859 Introduction Wide use of rituximab, cyclophosphamide, doxorubicin, vincristine, and prednisone (R-CHOP) has improved the clinical outcome for elderly patients with DLBCL; however, a higher prevalence of coexisting disorders remains a problem. The correlation between these comorbidities and prognosis has not yet been well investigated. Patients and methods We retrospectively analyzed all patients over 65 years old who had been newly diagnosed with DLBCL at our institution from 2001 to 2008. To assess their comorbid medical status, we calculated the Charlson Comorbidity Index (CCI) for each patient, excluding the primary disease. Prognostic factors were identified by Cox proportional hazards regression models. We classified patients into a low CCI group (CCI 0-1) and a high CCI group (CCI 2 or higher). Kaplan-Meier curves for each group were compared by log-rank test. Results A total of 80 patients were enrolled in this analysis. The median age was 73 (range 66-90) and the median observation period was 28 months (range 4-90 months). 62 patients (77.5%) were treated with R-CHOP, 15 (18.6%) underwent some other regimen, and 3 (3.8%) were given best supportive care only. According to the revised International Prognostic Index (r-IPI), 43 patients were in the good risk group and the others were in the poor risk group. The estimated 3-year overall survival (OS) rates for these groups were 90% and 45% (p<0.0001). As for CCI, 14 patients (17.5%) were assigned to the high CCI group. Multivariate analysis revealed that high CCI was associated with worse OS, independent of r-IPI (hazard ratio (HR) 3.20, 95% confidence interval (CI) 1.28-7.41, p=0.0145). Among r-IPI poor risk patients, the high CCI group was inferior to the low CCI group for the 3-year OS rate (14% vs 56%, p=0.0358), whereas this difference was not significant among r-IPI good risk patients (69% vs 94%, p=0.0617).
Conclusions Among elderly patients with DLBCL, a high CCI is independently associated with poor survival. Patients with both a poor r-IPI and a high CCI may need distinct treatment strategies. Disclosures: No relevant conflicts of interest to declare.
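The CCI used above is a weighted sum over a patient's comorbid conditions, dichotomized here at a cut-off of 2. A hedged sketch of that scoring is below; the weight table is a representative subset of the commonly cited original Charlson weights (not the full index), and, as in the abstract, the primary lymphoma diagnosis is assumed to be excluded from scoring:

```python
# Illustrative Charlson Comorbidity Index (CCI) scoring. The weights below
# follow the commonly cited original Charlson weights for a SUBSET of
# conditions only; a real implementation would carry the full condition list.

CCI_WEIGHTS = {
    "myocardial_infarction": 1,
    "congestive_heart_failure": 1,
    "chronic_pulmonary_disease": 1,
    "diabetes": 1,
    "hemiplegia": 2,
    "moderate_severe_renal_disease": 2,
    "moderate_severe_liver_disease": 3,
    "metastatic_solid_tumor": 6,
    "aids": 6,
}

def cci_score(comorbidities):
    """Sum the weights of a patient's comorbid conditions
    (the primary disease is excluded before calling this)."""
    return sum(CCI_WEIGHTS[c] for c in comorbidities)

def cci_group(score):
    """Dichotomize as in the abstract: low = CCI 0-1, high = CCI >= 2."""
    return "high" if score >= 2 else "low"
```

A patient with diabetes alone scores 1 and lands in the low-CCI group; adding moderate/severe renal disease raises the score to 3 and moves the patient to the high-CCI group.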


Blood ◽  
2010 ◽  
Vol 116 (21) ◽  
pp. 3409-3409
Author(s):  
Massimo Breccia ◽  
Roberto Latagliata ◽  
Fabio Stagno ◽  
Antonella Gozzini ◽  
Elisabetta Abruzzese ◽  
...  

Abstract 3409 A score aiming at the early identification of CML patients showing sensitivity to second-generation TKIs was proposed by the Hammersmith group. The score was created by analyzing 80 patients and was based on 3 prognostic factors: previous cytogenetic response to imatinib, Sokal risk and recurrent neutropenia during imatinib. Subsequently, the score was validated in a small series of 28 patients. The aim of our study was to confirm the validity of this score and to establish its strength in a large group of CML patients resistant to imatinib and treated with second-generation TKIs. One hundred twenty-seven patients were collected from 6 different Italian hematologic centers. There were 66 males and 61 females; median age was 54 years (range 25-80). Twenty-seven patients received interferon before imatinib. Thirty patients had primary resistance, whereas 97 patients received a second-generation TKI after acquired resistance to imatinib. Application of the Hammersmith score was possible in 118 patients with available data: 52 patients were identified as good risk, 27 patients as intermediate risk and 38 patients as poor risk. The 1-year cumulative incidence of complete cytogenetic response (CCR) was 73% in good risk patients, 40% in intermediate risk patients and 23% in poor risk patients (p=0.0001). Similarly, the cumulative incidence of major molecular response (MMR) was 52% in the good risk, 28% in the intermediate risk and 13% in the poor risk category (p=0.001). In the evaluation of event-free survival (EFS), events were defined as loss of hematologic or cytogenetic response, disease progression, death from any cause, or toxicity: the estimated 2-year EFS was 89% in the good risk, 70% in the intermediate risk and 55% in the poor risk group (p=0.0001).
Progression-free survival (PFS) was defined as survival without evidence of accelerated or blastic phase: the estimated 2-year PFS was 97% in the good risk, 93% in the intermediate risk and 87% in the poor risk category (p=0.05). The Kaplan-Meier estimated 2-year overall survival (OS) was 100% in the good risk, 93% in the intermediate risk and 82% in the poor risk category (p=0.001). In conclusion, as suggested by Milojkovic et al, some prognostic factors assessed before starting second-generation TKIs might predict cytogenetic response and outcome. As far as we know, the so-called Hammersmith score had not yet been validated in a large series of patients: we demonstrated that this score is able to discriminate patients at high risk of failure and consequent progression before treatment with second-generation TKIs. Disclosures: No relevant conflicts of interest to declare.


Blood ◽  
2011 ◽  
Vol 118 (21) ◽  
pp. 493-493
Author(s):  
Christoph Röllig ◽  
Martin Bornhäuser ◽  
Christian Thiede ◽  
Michael Kramer ◽  
Anthony Ho ◽  
...  

Abstract 493 Background: According to retrospective analyses, the presence of a mutated Nucleophosmin-1 gene (NPM1+) in acute myeloid leukemia (AML) is associated with a favorable prognosis, particularly in the absence of an FLT3-ITD mutation (FLT3-ITD-). Therefore, AML with NPM1+/FLT3-ITD- and a normal karyotype has been classified as favorable risk in current prognostic classifications. In order to assess its predictive value with regard to allogeneic stem cell transplantation (allo SCT), we compared the clinical course of 309 NPM1+ AML patients eligible for allo SCT in a donor versus no-donor analysis. Patients and Methods: Patients diagnosed with AML, aged 18-60 years, and treated in the AML 2003 trial of the Study Alliance Leukemia (SAL) were analyzed. According to the risk-adapted treatment strategy of the trial, cytogenetically intermediate-risk (IR) and adverse-risk (AR) patients were to receive an allo SCT as consolidation treatment if an HLA-identical sibling donor (IR) or an HLA-matched related or unrelated donor (AR) was available. Patients with no available donor received high-dose cytarabine-based consolidation or autologous SCT. In order to avoid the selection bias of an as-treated analysis of transplanted versus non-transplanted patients, we compared relapse-free survival (RFS) and overall survival (OS) depending on the availability of a suitable donor in a donor versus no-donor analysis. Survival analyses were performed using the Kaplan-Meier method, with log-rank tests for significance testing. Cox regression models and Wald tests were used for multivariate analyses of the influence of potential prognostic variables on the outcomes. Results: Of 1182 patients enrolled in the AML 2003 trial between December 2003 and November 2009, 375 were NPM1+ (32%), and 309 patients were eligible for the donor vs. no-donor analysis.
Their median age was 49 years; 304 patients had an intermediate-risk karyotype according to MRC criteria (98%), among them 277 patients with a normal karyotype (90%). The FLT3-ITD mutation was present in 144 patients (37%). A donor was identified for 77 patients (25%), of whom 57 actually received allo SCT as first consolidation (74%). The no-donor group consisted of 232 patients. Age, disease status, cytogenetic profile, and FLT3-ITD incidence were equally distributed between the two groups. Median follow-up was 41 months (3.4 years). The 3-year RFS in the donor and no-donor groups was 72% (95% CI 61%-82%) and 47% (95% CI 40%-55%), respectively (p=0.007). The OS rates in the donor and no-donor groups were 70% (95% CI 59%-81%) versus 60% (95% CI 54%-67%) after 3 years, and 70% (95% CI 59%-81%) versus 53% (95% CI 45%-61%) after 5 years (p=0.138). In multivariate analyses, the presence of a donor as a prerequisite for allo SCT retained its statistically significant favorable influence on RFS (HR=0.56) even after adjustment for established risk factors such as FLT3-ITD, cytogenetic risk, WBC, LDH, age, and disease status. In patients with a normal karyotype and NPM1+/FLT3-ITD- (n=152), the 3-year RFS in the donor and no-donor groups was 87% (95% CI 77%-97%) and 53% (95% CI 42%-63%), respectively (p=0.001). Conclusions: According to our results, allo SCT leads to significantly prolonged RFS in NPM1+ AML patients, with a pronounced effect even in NPM1+/FLT3-ITD- patients. The absence of a statistically significant difference in OS is most likely due to the fact that relapsed NPM1+ patients responded well to salvage treatment, particularly to allo SCT from an unrelated donor. Our data suggest that patients with NPM1+ AML who have a well-matched donor benefit from allo SCT in first remission.
This hypothesis is currently being tested prospectively in a randomized controlled trial (“ETAL-1”, NCT01246752) evaluating allo SCT in all intermediate-risk AML patients with a well-matched sibling or unrelated donor identified until the achievement of first CR. Disclosures: No relevant conflicts of interest to declare.


Blood ◽  
2012 ◽  
Vol 120 (21) ◽  
pp. 231-231
Author(s):  
Nigel H Russell ◽  
Robert K. Hills ◽  
Lars Kjeldsen ◽  
John L Yin ◽  
Charles Craddock ◽  
...  

Abstract 231 Reduced-intensity conditioning (RIC) offers older patients a feasible route to an allogeneic stem cell transplant and the potential to benefit from a graft-versus-leukaemia effect. However, the relative clinical benefit in AML is less clear. Since our previous experience did not show an overall survival advantage from myeloablative transplantation in patients >40 years, we examined the impact of RIC allograft in 1st CR on the outcome of patients aged 40-70 years treated within the UK NCRI AML15 (2002-2009) and AML16 (2006-2012) trials, compared to chemotherapy. Methods: Both trials offered the option of RIC transplant in CR1 for patients who were not good risk. A total of 2454 patients aged 40 to 70 years entered CR (AML15: 1580; AML16: 874), of whom 407 received a RIC transplant (292/1580 in AML15 and 115/874 in AML16). Matched sibling transplants were given in 229 and MUDs in 178. The cytogenetic risk groups were 258 intermediate, 59 adverse and 90 not known. Follow-up is complete to 1st January 2012. Comparisons of transplant versus no transplant were carried out using Mantel-Byar analysis to allow for time to transplant, with patients censored at the time of a non-RIC allo transplant. Data from the two trials were pooled and split by age. Results: OS for the 255 patients <60 yrs was significantly superior with transplant than without (53% vs 41%, HR 0.79 (0.66-0.96), p=0.02). There was clear benefit in the 164 intermediate-risk patients (59% vs 44%, HR 0.67 (0.53-0.86), p=0.0008), with less evidence for the 40 who had adverse risk (16% vs 10%, HR 0.89 (0.58-1.35), p=0.6). In the 152 patients 60+ yrs the overall benefit was not significant (37% vs 24%, HR 0.85 (0.68-1.06), p=0.2); there was a non-significant trend for benefit in the 94 intermediate-risk patients (43% vs 26%, HR 0.73 (0.49-1.09), p=0.3), and clearer benefit in the 19 patients in the adverse group (16% vs 3%, HR 0.57 (0.36-0.91), p=0.01).
Considering the types of transplant, in the <60 group the survival benefit was restricted to sibling RIC (sibling 61%; MUD 35%; no transplant 41%), and in the 60+ group a similar trend of borderline significance was seen (sibling 49%; MUD 28%; no transplant 24%). In analysis by cytogenetic group, with the exception of patients over 60 yrs with an adverse karyotype, sibling allograft gave consistently better survival. Conclusion: This pooled analysis shows that RIC allo SCT in AML 1st CR improves the survival of patients with AML aged 40-60, but possibly only if a sibling donor is used. There was less benefit (sibling or MUD) for adverse-risk patients. For patients aged >60 years the overall benefit was less clear, but there was a similar trend for benefit in intermediate-risk patients. However, given the lack of statistical heterogeneity, our data do not exclude a benefit for patients with adverse-risk cytogenetics or for those undergoing SCT from an unrelated donor. This observation runs counter to what we observe in patients <40 yrs, where the benefit is limited to adverse-risk patients. Disclosures: No relevant conflicts of interest to declare.


2009 ◽  
Vol 27 (15_suppl) ◽  
pp. 7015-7015
Author(s):  
F. P. Santos ◽  
W. Qiao ◽  
J. E. Cortes ◽  
D. Jones ◽  
F. Ravandi ◽  
...  

7015 Background: Mutations of the FLT3 gene (in particular internal tandem duplications, ITD) are common in normal-karyotype AML (NK-AML) and are associated with shorter relapse-free and overall survival (OS). The frequency of FLT3 mutations is lower in other cytogenetic subgroups, and the impact on outcome is unclear. Methods: The records of patients (pts) with newly diagnosed AML (from 2003 to 2007) were reviewed. Pts were divided among three cytogenetic subgroups: good risk (t(8;21), inv(16)/t(16;16)), intermediate risk (diploid, -Y) and poor risk (-5, -7, 11q abnormalities). FLT3 ITD and tyrosine kinase domain (TKD) mutations were determined on baseline DNA samples by a PCR-based method with 1% sensitivity. Since the frequencies of FLT3 mutations were lower in the good- and poor-risk subgroups, ITD/TKD mutations were considered together in the analysis, while in the intermediate-risk group they were analyzed separately. Survival curves stratified by FLT3 mutation were estimated by Kaplan-Meier plots and compared by log-rank test. A Cox model was fit for OS, and non-significant variables were eliminated in a step-down fashion with a p-value cut-off of p = .10. Results: A total of 481 pts were included (65 pts good risk, 272 pts intermediate risk and 144 pts poor risk). The prevalence of FLT3 mutations is shown in the Table. No difference was found in median OS between FLT3-mutated and FLT3-wild-type pts in the good-risk group (not reached (NR) vs NR, P = 0.57) or in the poor-risk group (55 vs 24 weeks, P = 0.44). In the intermediate-risk group, OS was worse in FLT3-ITD-positive pts (33 vs 89 weeks, P < 0.0001) but not in FLT3-TKD-positive pts (77 vs 70 weeks, P = 0.89). In the Cox model, FLT3 mutations were prognostic for OS only in intermediate-risk pts with FLT3-ITD (HR 2.63, P < 0.0001). Conclusions: In our cohort of pts, FLT3 mutations did not have a prognostic impact in AML with good- and poor-risk karyotypes. [Table: see text] No significant financial relationships to disclose.
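The log-rank comparison of stratified survival curves used above aggregates, over the distinct event times, the observed minus expected events in one group. A generic two-group sketch on synthetic data (not the study's analysis) is below; the p-value uses the chi-square distribution with 1 degree of freedom via math.erfc:

```python
import math

def logrank(times1, events1, times2, events2):
    """Two-group log-rank test. times*: follow-up times; events*: 1 = event,
    0 = censored. Returns (chi_square_statistic, p_value), 1 df."""
    pooled = ([(t, e, 0) for t, e in zip(times1, events1)]
              + [(t, e, 1) for t, e in zip(times2, events2)])
    event_times = sorted({t for t, e, _ in pooled if e == 1})
    o_minus_e = 0.0   # observed minus expected events in group 1
    var = 0.0         # hypergeometric variance of group-1 events
    for t in event_times:
        n = sum(1 for tt, _, _ in pooled if tt >= t)                # at risk overall
        n1 = sum(1 for tt, _, g in pooled if tt >= t and g == 0)    # at risk, group 1
        d = sum(1 for tt, e, _ in pooled if tt == t and e == 1)     # events at t
        d1 = sum(1 for tt, e, g in pooled if tt == t and e == 1 and g == 0)
        o_minus_e += d1 - d * n1 / n
        if n > 1:
            var += d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
    chi2 = o_minus_e ** 2 / var if var > 0 else 0.0
    # For chi-square with 1 df, P(X > x) = erfc(sqrt(x / 2)).
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p
```

Two identical groups give a statistic of 0 (p = 1), while two well-separated groups of early versus late events give a large statistic and a small p-value; the quadratic per-time scans keep the sketch short but would be replaced by a single sorted sweep for large cohorts.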

