Transfusion Practices in Myelodysplastic Syndromes: Preliminary Results of An Epidemiologic and Economical Study

Blood ◽  
2011 ◽  
Vol 118 (21) ◽  
pp. 4336-4336
Author(s):  
Matthieu Filloux ◽  
Adrien Chauchet ◽  
Yvan Beaussant ◽  
Chrystelle Vidal ◽  
Franck Leroux ◽  
...  

Abstract 4336 Background Despite the improvements and promise of novel agents (demethylating and erythropoiesis-stimulating agents), red blood cell (RBC) and platelet transfusions remain frequent and essential in the management of myelodysplastic syndromes (MDS). Reducing patients’ dependence on transfusion is a major outcome, linked with disease prognosis, patients’ quality of life and economic issues. Few data in the literature have described transfusion practices and requirements in MDS patients. We report here an epidemiological study of the transfusion practices in our center and an economic evaluation of direct transfusion costs. Design and Methods We conducted a retrospective, descriptive study including all new patients diagnosed with MDS in the department of haematology of Besançon between 2006 and 2009. Patients were classified as high-risk (HR) MDS when IPSS was ≥ 1.5, and as low-risk (LR) when IPSS was ≤ 1. We compared HR and LR groups and four age categories (≤68; [68–76]; [76–83] and >83 years) according to transfusion data. The economic study compared direct costs of RBC transfusions and erythropoiesis-stimulating agents for LR and HR patients, including the costs of hospital stays. Materials, transport, medical examinations, doctors’ and nurses’ wages and iron chelation were not considered in this analysis. T-test and χ2-test were used for comparisons. Results 205 patients were analysed; median age at diagnosis was 74.3 years (Table 1), with a predominance of men (sex ratio 1.33). IPSS score was available for 75% of patients (n=154): 111 LR patients and 43 HR patients. Median follow-up was 32 months [10–57]. Twenty-three patients (11%) developed a secondary acute leukaemia and 11 (5%) received allogeneic stem cell transplantation. At diagnosis, hemoglobin (Hb) level was not significantly different between HR and LR patients. Platelet count was lower in HR than in LR patients (109 vs. 178 G/L respectively, p<0.0001).
60.5% of patients (n=124/205) received labile blood products during follow-up, more frequently RBC than platelets (87.5% vs. 12.5% respectively). The mean Hb threshold at transfusion was 8.1 g/dl, with no significant difference between age groups or IPSS groups; age did not influence transfusion requirement (Table 2). In comparison to HR patients, LR patients were less often transfused (55 vs. 79%, p<0.006, Table 2) and had longer mean intervals between transfusions (32.5 vs. 16.9 days, p<0.001). Furthermore, a progressive shortening of transfusion intervals was observed in both groups over time; this progression was faster in HR patients. The anti-erythrocyte immunization rate, excluding anti-RH or anti-Kell, was 6.7%. Economic analysis showed that annual costs of RBC transfusion were 11,409 euros in LR patients and 21,945 euros in HR patients, versus 11,492 euros for EPO. Conclusion In our study, neither transfusion requirement nor transfusion threshold was correlated with age, whereas both were affected by IPSS. Furthermore, age did not appear to be a predictive factor for transfusion dependence. Despite its limits, the economic study reveals that the annual costs of EPO and of transfusion are similar in low-risk patients. Updated data will be presented on EPO and 5-azacytidine use in our cohort to assess their impact on transfusion practices. Disclosures: No relevant conflicts of interest to declare.
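The cost gap between the risk groups can be sanity-checked from the reported transfusion intervals alone. In the sketch below, only the mean intervals (32.5 days LR, 16.9 days HR) come from the abstract; units per episode and cost per RBC unit are hypothetical placeholders, not figures from the study.

```python
# Sanity check of the direct-cost comparison reported above. Only the
# mean transfusion intervals come from the abstract; the dosing and cost
# parameters of annual_rbc_cost are hypothetical placeholders.

def transfusions_per_year(mean_interval_days: float) -> float:
    """Expected transfusion episodes per year from the mean interval."""
    return 365.0 / mean_interval_days

def annual_rbc_cost(mean_interval_days: float,
                    units_per_episode: float,
                    cost_per_unit_eur: float) -> float:
    """Annual direct RBC cost for one transfusion-dependent patient."""
    return (transfusions_per_year(mean_interval_days)
            * units_per_episode * cost_per_unit_eur)

lr_episodes = transfusions_per_year(32.5)  # low risk: ~11 episodes/year
hr_episodes = transfusions_per_year(16.9)  # high risk: ~22 episodes/year

# The interval ratio alone (~1.9x) accounts for most of the reported
# cost gap between HR and LR patients (21,945 vs 11,409 euros).
ratio = hr_episodes / lr_episodes
```

The roughly twofold difference in episode frequency lines up with the roughly twofold difference in annual RBC cost the authors report.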

Blood ◽  
2021 ◽  
Vol 138 (Supplement 1) ◽  
pp. 3061-3061
Author(s):  
Diego Adrianzen Herrera ◽  
Andrew D Sparks ◽  
Insu Koh ◽  
Neil A. Zakai

Abstract Introduction: The treatment of myelodysplastic syndromes (MDS) changed after the approval of lenalidomide in 2005, and 2 hypomethylating agents (HMA): azacitidine in 2004 and decitabine in 2006. Erythropoiesis-stimulating agents (ESA) and reduced-intensity conditioning stem cell transplantation (RIC-SCT) also became more widely utilized around that period. Analyses evaluating potential overall survival (OS) improvements in MDS since 2007 across different cancer registries have shown conflicting results. 'Real-world' survival trends are important to measure the true impact of these treatments outside clinical trials, especially as MDS treatment innovations may not reach all patients. We assessed the temporal change in OS and cancer-specific survival (CSS) of MDS patients in the US over the decade before (2001 - 2006) and after modern treatment options were established (2007 - 2016). Methods: Adult subjects diagnosed with MDS between 2001 and 2016 were identified in Surveillance, Epidemiology, and End Results (SEER), and categorized into 2 groups based on year of diagnosis: 2001 - 2006 (cohort 1) and 2007 - 2016 (cohort 2). MDS histologic risk was classified into low, intermediate, and high, using International Classification of Diseases for Oncology 3rd Edition (ICD-O-3) codes. Cause of death (COD) reported to State registries (SEER recode) was used to determine CSS, defined as death from MDS or acute myeloid leukemia. Only cases with histologic confirmation of ICD-O-3 codes and complete follow-up records were analyzed. Kaplan-Meier estimation was used to summarize the unadjusted OS distribution. To assess the association of cohort 2 with OS and CSS gains, follow-up duration was restricted to a maximum of 5 years in both cohorts and survival analysis was performed using a multivariable Cox proportional hazards regression model adjusting for age, sex, race, ethnicity, MDS risk, and geographic location.
Results: We included a total of 44,217 patients with MDS, 13,633 (30.8%) in cohort 1 (2001 - 2006) and 30,584 (69.2%) in cohort 2 (modern treatment era). Subjects in cohort 2 were slightly older (mean age 73.8 vs 73.2 years, p<0.001) and included more males (58.6% vs 56.3%, p<0.001). Cohort 1 included more subjects with low MDS histologic risk (27.8% vs 17.2%) and fewer subjects with high MDS risk (18.4% vs 20.5%) (Table 1). Median OS for low, intermediate, and high risk MDS was 44 months (95%CI, 41.5 - 46.5), 27 months (95%CI, 25.7 - 28.3) and 10 months (95%CI, 9.3 - 10.7) in cohort 1, and 48 months (95%CI 45.9 - 50.1), 26 months (95%CI 25.3 - 26.7) and 11 months (95%CI 10.6 - 11.4) in cohort 2, but these differences were not significant. In the multivariable model adjusted for age, sex, race, ethnicity, MDS risk, and geographic location, cohort 2 was associated with a significantly lower hazard of overall death compared to cohort 1, HR for OS 0.97 (95%CI, 0.95 - 0.99, p<0.001). Similarly, the modern era of treatment was associated with lower cancer-specific death compared to cohort 1, HR for CSS 0.93 (95%CI, 0.89 - 0.93, p=0.038). MDS histologic risk was the strongest factor associated with higher risk of overall and cancer-specific death. Other factors significantly associated with worse OS and CSS were advancing age, male sex, Hispanic ethnicity and unmarried status (Table 2). When the analysis was restricted to patients with high-risk MDS, cohort 2 was associated with a lower hazard of cancer-specific death, HR for CSS 0.90 (95%CI, 0.84 - 0.94, p=0.006), but no significant difference in overall death, HR for OS 0.96 (95%CI, 0.91 - 1.02, p=0.17). Discussion: In a SEER analysis, we found that the modern paradigm of MDS treatment, including access to lenalidomide, HMA, ESA and RIC-SCT, is associated with only modest survival gains at the population level.
Across all MDS risk groups, improvements in cancer-specific death were larger than those seen for overall death and, in high-risk MDS, a significant gain in CSS did not translate into longer OS. These data suggest a need to target non-cancer excess mortality in patients with MDS, who usually present with a high comorbidity burden. Strategies to optimize medical conditions coexisting with MDS and better supportive care may help consolidate the gains associated with currently available MDS-directed therapies. Figure 1. Disclosures No relevant conflicts of interest to declare.
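The Kaplan-Meier estimation used above to summarize unadjusted OS can be sketched in a few lines. This is a minimal illustrative implementation with toy data, not the authors' code; a real analysis would use a survival package (e.g. lifelines, or R's survival).

```python
# Minimal Kaplan-Meier product-limit estimator (illustrative sketch).

def kaplan_meier(times, events):
    """times: follow-up (e.g. months); events: 1 = death, 0 = censored.
    Returns [(t, S(t))] at each distinct time where a death occurred."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = 0
        n_at_t = 0
        # group all subjects sharing the same follow-up time
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            n_at_t += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / at_risk   # product-limit step
            curve.append((t, surv))
        at_risk -= n_at_t
    return curve

# Toy cohort: deaths at 10 and 27 months, censoring at 20 and 50 months.
curve = kaplan_meier([10, 20, 27, 50], [1, 0, 1, 0])
# After the first death S = 3/4; after the second S = 3/4 * (1 - 1/2) = 3/8.
```

Censored subjects drop out of the risk set without reducing the survival estimate, which is why the second death (1 of 2 still at risk) halves S rather than dropping it by a quarter.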


Blood ◽  
2021 ◽  
Vol 138 (Supplement 1) ◽  
pp. 4675-4675
Author(s):  
Christina K Ferrone ◽  
Amy JM McNaughton ◽  
Iran Rashedi ◽  
Hubert Tsui ◽  
Michael J Rauh

Abstract The recognition of MDS is challenging in early stages, where diagnosis may rely solely upon morphological criteria for dysplasia, a non-specific finding prone to inter-observer variation. Patients with equivocal bone marrow (BM) findings may be discharged from Hematology clinics and lost to follow-up, or subjected to serial, invasive BM investigations and diagnostic delays. We therefore aimed to demonstrate the importance of hematopathologist-triaged, targeted NGS in identifying clonal cytopenias of undetermined significance (CCUS) in cases where MDS diagnostic criteria are not met based on morphology or cytogenetic analysis. We explored this using three REB-approved cohorts. Our first cohort was retrospective, with BM samples ranging from 2010-14, involving cases that were previously suspicious for, but not diagnostic of, MDS. This included 70 patients from Sunnybrook (SHSC) and Kingston Health Sciences Centres (KHSC): 16 age-matched controls (8 negative lymphoma staging, 8 non-MDS cytopenias); 18 suspicious for MDS; 20 MDS; and 16 MDS/MPN. DNA was extracted and NGS was performed using our custom 48-gene Ion Torrent AmpliSeq myeloid panel (ThermoFisher). We identified suspected mutations in 2/16 (13%) controls (i.e. CHIP), 12/18 (67%) suspicious cases, 17/20 (85%) MDS cases, and 16/16 (100%) MDS/MPN cases. The mean and median number of mutations per suspicious patient (respectively 0.89 and 1; most commonly in SF3B1, TET2, RUNX1, and ASXL1) were lower than in MDS (1.85 and 2; p=0.011) and MDS/MPN (3.13 and 3; p<0.0001). There was a significant difference in the average variant allele frequency (VAF) per patient (among those with ≥1 mutation) between the control and suspicious groups (p=0.022); however, there were no significant differences in the average VAF between suspicious, MDS, and MDS/MPN cases. Furthermore, of the 16 patients with BM suspicious for MDS, 7 went on to develop MDS. 4 of these patients had at least 1 clinically relevant somatic variant, while 3 had none.
Of those with at least 1 variant, 3 had IPSS-level cytopenias at the time, indicating that had their mutational status been known at the time of their assessment, they would have been diagnosed with the provisional CCUS entity (while the rest would be classified as CHIP). To supplement these findings, we are amassing a prospective cohort of cases at SHSC where patients have either idiopathic cytopenias (ICUS) or confirmed MDS diagnoses with one or more previously non-diagnostic BMs. To date, we have performed sequencing for 36 of these patients, including 23 ICUS and 13 diagnosed MDS cases. Of the ICUS cases, 10 (44%) had at least 1 variant (mean # variants/patient = 1, mean variant allele frequency (VAF) = 34.0%), consistent with CCUS, while 12/13 (92%) of MDS patients had at least 1 variant (mean # variants/patient = 2, mean VAF = 42.3%). These findings are consistent with CCUS being common in suspicious MDS cases, with similar clonal size but lower mutational burden than diagnosed MDS. In addition to these preliminary findings, 15/36 patients have serial samples that we are currently processing for NGS (among other cases we are accruing to present at the ASH meeting). By exploring serial cases with molecular results pre- and post-MDS diagnosis, we aim to further elucidate which features of CCUS may predict progression to MDS. Finally, we assessed clonality in cases suspicious for myeloid malignancy in our existing prospective myeloid NGS cohort at KHSC (Ferrone et al, JMD 2021). In this cohort of 168 patients, focusing on cytopenias yet to be diagnosed, 71 patients had suspected MDS, MPN, or MDS/MPN prior to NGS (completed using the Oncomine Myeloid Assay; ThermoFisher). 36/71 (51%) were found to have variants indicating clonality.
This facilitated diagnoses of either myeloid malignancies or pre-malignant states, including nine ICUS cases in which the variants identified were non-diagnostic of MDS (mainly in TET2) but indicative of CHIP (n=2) or CCUS (n=7). Furthermore, for the limited number with available follow-up data, we found no significant difference in survival between individuals with low-grade MDS (n=10) and CCUS (n=6) (p=0.457). This evidence is in keeping with recent findings that the clinical features of CCUS may be consistent with low-risk MDS, emphasizing the importance of closely monitoring these patients, and even the possibility of assessing and treating them similarly to those with low-risk MDS. Disclosures No relevant conflicts of interest to declare.
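The working definitions that run through this abstract (CHIP, ICUS, CCUS, MDS) can be stated as a simple decision rule. The sketch below is a deliberate simplification for illustration, not a clinical algorithm; real classification also weighs VAF thresholds, morphology and cytogenetics.

```python
# Simplified decision rule for the clonal-cytopenia entities discussed
# above: a somatic myeloid variant without cytopenia is CHIP; cytopenia
# without a variant (and without diagnostic morphology) is ICUS; both
# together are CCUS; full diagnostic criteria met means MDS.

def classify(has_somatic_variant: bool,
             has_cytopenia: bool,
             meets_mds_criteria: bool) -> str:
    if meets_mds_criteria:
        return "MDS"
    if has_somatic_variant and has_cytopenia:
        return "CCUS"
    if has_somatic_variant:
        return "CHIP"
    if has_cytopenia:
        return "ICUS"
    return "no clonal cytopenia entity"

# Example: the suspicious patients with a variant plus IPSS-level
# cytopenias would have been labeled CCUS at first assessment,
# while those with a variant but no cytopenia fall under CHIP.
label = classify(has_somatic_variant=True, has_cytopenia=True,
                 meets_mds_criteria=False)
```

This mirrors the abstract's point: had NGS results been available at the first assessment, the variant-plus-cytopenia cases would have carried the provisional CCUS label rather than being discharged or re-biopsied.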


Blood ◽  
2013 ◽  
Vol 122 (21) ◽  
pp. 259-259 ◽  
Author(s):  
Valeria Visconte ◽  
Heesun J. Rogers ◽  
Ali Tabarroki ◽  
Li Zhang ◽  
Yvonne Parker ◽  
...  

Abstract The link between SF3B1 mutation and the ring sideroblast (RS) phenotype in myelodysplastic syndromes (MDS) was solidified by the identification of RS in Sf3b1 heterozygous (Sf3b1+/-) mice. The identification of SF3B1 mutations in refractory anemia with RS (RARS) and RARS with thrombocytosis (RARS-T) showed the importance of RNA splicing in MDS biology. Furthermore, it opened the possibility of targeted therapy using spliceosome inhibitors in RARS/-T. However, many questions remain unanswered in linking SF3B1 dysfunction to MDS biology, such as the downstream targets of this gene. The identification of a robust murine model is essential to study a specific molecularly defined disease phenotype and develop targeted therapies. We identified occasional RS in the bone marrow (BM) of Sf3b1+/- mice, a finding rarely seen in current mouse models of MDS (Beachy SH, Hematol Oncol Clin North Am, 2010). However, aside from RS in the BM, no other MDS features were found. Sf3b1+/- mice were originally engineered as a means to study the interaction between polycomb (PcG) genes and other protein complexes (Isono K, Genes Dev, 2005). Homozygous Sf3b1-/- mice died at the pre-implantation embryo stage, while Sf3b1+/- mice appear healthy. Several tools have been tested to model MDS in genetically engineered mice targeting key genes in MDS. However, an ideal mouse model resembling distinct morphologic MDS subtypes is still lacking. To define a mouse model useful for preclinical therapeutic studies, we evaluated the hematologic features of Sf3b1+/- and Sf3b1+/+ mice during long-term follow-up. Five Sf3b1+/- and 5 Sf3b1+/+ mice were followed until 12 months of age. Blood was drawn from the retro-orbital vein every month starting from 6 months of age.
Using a two-sample Wilcoxon test, we compared standard hematologic parameters, finding significant differences over time between Sf3b1+/- and Sf3b1+/+: hemoglobin (g/dL) 6.9 ±0.73 vs 10.0 ±1.6 (P=0.008), red blood cells (M/uL) 8.3±0.5 vs 5.9±1.0 (P=0.008), platelets (K/uL) 731±105 vs 579±93 (P=0.008), and mean corpuscular volume (fL) 47.8±1.5 vs 45.1±1.1 (P=0.032). We did not detect any significant difference in other parameters, although lymphocytes were more represented than neutrophils, eosinophils and monocytes in Sf3b1+/- vs Sf3b1+/+ (6.3 K/uL ±3.1 vs 5.8 ±1.8; P=1). Analysis of the BM showed no difference in cell number between Sf3b1+/- (n=7) and Sf3b1+/+ (n=7) (44.1±9.1 vs 43.2±11; P=0.62). However, distinct dyserythropoiesis, such as nuclear budding or irregular nuclei on Wright-Giemsa staining, and occasional RS on Prussian blue staining were noted in Sf3b1+/- mice and were not present in Sf3b1+/+. In support of the iron overload seen in SF3B1 mutant patients (pts), a similar observation was made in Sf3b1+/- mice by light microscopy and rhodamine-based flow cytometry to quantify mitochondrial iron (Visconte, Abstract #64897). We also characterized the transcriptomes of Sf3b1+/- and Sf3b1+/+. Total RNA was isolated from the BM of age/gender-matched mice, polyA cDNA was prepared from 3 µg of RNA, and mouse RNA sequencing was run on an Illumina HiSeq2000. 200 exons were found to be differentially used in Sf3b1+/- vs Sf3b1+/+. Chromosome 1 contains the highest number of genes with at least 1 exon alternatively used, similar to what we observed in SF3B1 mutant pts. In total, 22 genes showed stronger differential expression in Sf3b1+/- vs Sf3b1+/+. Sf3b1 was down-regulated as expected (MFC: 0.74) in Sf3b1+/-. Studies in Sf3b1+/- mice show that Sf3b1 protein physically interacts with Class II PcG proteins (PRC1), which are relevant in MDS.
When we interrogated PcG genes and others, we found lower mRNA levels of ezh2 (MFC: 0.06), npm1 and trp53 (MFC: 0.01 and 0.28), and no difference in asxl1 and runx1 (MFC: 1.22 and 1.1) in Sf3b1+/- vs Sf3b1+/+. Jak2, dock8, and uhrf2 showed significantly (P=.0003) higher expression in Sf3b1+/-. MDS is a heterogeneous disease characterized by genetic and non-genetic causes. Introduction of secondary events implicated in MDS pathogenesis could modify the phenotype of Sf3b1+/- mice. In sum, after 6 months of follow-up Sf3b1+/- mice developed macrocytic anemia, thrombocytosis, RS and dyserythropoiesis akin to human RARS/-T. Furthermore, transcriptome analysis shows exon usage/gene expression changes similar to human SF3B1 mutants, supporting the view that Sf3b1+/- mice can serve as a model for studying the biology of human low-risk MDS, specifically RARS/-T. Disclosures: No relevant conflicts of interest to declare.
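The two-sample Wilcoxon test used for the blood counts reduces to the Mann-Whitney U statistic, sketched below with midranks for ties (illustrative stdlib code, not the authors' analysis). Worth noting: with 5 mice per group, complete separation of the two samples gives U = 0 and an exact two-sided p of 2/252 ≈ 0.008, which is exactly the p-value recurring in the hemoglobin, RBC and platelet comparisons above.

```python
# Mann-Whitney U statistic (equivalent to the two-sample Wilcoxon
# rank-sum test), with midranks for tied values.

def mann_whitney_u(x, y):
    """Return U for sample x: rank-sum of x minus its minimum possible
    rank-sum. Ties get the average (mid) rank."""
    pooled = sorted([(v, 0) for v in x] + [(v, 1) for v in y])
    n = len(pooled)
    rank_x = 0.0
    i = 0
    while i < n:
        j = i
        while j < n and pooled[j][0] == pooled[i][0]:
            j += 1                       # group of tied values: ranks i+1..j
        midrank = (i + 1 + j) / 2.0      # average of ranks i+1 .. j
        for k in range(i, j):
            if pooled[k][1] == 0:        # value belongs to sample x
                rank_x += midrank
        i = j
    return rank_x - len(x) * (len(x) + 1) / 2.0

# Complete separation (every x below every y) gives U = 0,
# the most extreme value attainable.
u = mann_whitney_u([1, 2, 3, 4, 5], [6, 7, 8, 9, 10])
```

With n = m = 5 there are C(10,5) = 252 equally likely rank assignments under the null, so the two-sided p for U = 0 is 2/252 ≈ 0.0079, i.e. the reported P=0.008.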


Author(s):  
Florin Eggmann ◽  
Thomas J. W. Gasser ◽  
Hanjo Hecker ◽  
Mauro Amato ◽  
Roland Weiger ◽  
...  

Abstract Objectives This study aimed to retrospectively evaluate clinical and radiographic outcomes of partial pulpotomy performed in permanent teeth with carious pulp exposure. Materials and methods Records of patients undergoing treatment at an undergraduate dental clinic between 2010 and 2019 were screened for partial pulpotomies in teeth with a presumptive diagnosis of normal pulp or reversible pulpitis. The follow-up had to be ≥ 1 year. Patient data were retrieved and analyzed using Mantel-Cox chi-square tests and Kaplan–Meier statistics. The level of significance was set at α = 0.05. Results Partial pulpotomy was performed in 111 cases, of which 64 (58%) fulfilled the eligibility criteria. At the time of partial pulpotomy, the mean age was 37.3 (± 13.5) years (age range 18–85). The mean observation period was 3.1 (± 2.0) years. Two early failures (3.1%) and five late failures (7.7%) were recorded. The overall success rate of maintaining pulp vitality was 89.1%, with 98.4% tooth survival. The cumulative pulp survival rates of partial pulpotomy in patients aged < 30 years, between 30 and 40 years, and > 40 years were 100%, 75.5%, and 90.5%, respectively, with no significant difference between the age groups (p = 0.225). At follow-up, narrowing of the pulp canal space and tooth discoloration were observed in 10.9% and 3.1% of cases, respectively. Conclusions Across age groups, partial pulpotomy achieved favorable short- and medium-term outcomes in teeth with carious pulp exposure. Clinical relevance Provided adequate case selection, partial pulpotomy is a viable operative approach for treating permanent teeth with deep carious lesions irrespective of patient age.
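The headline percentages follow directly from the reported counts, as the arithmetic below shows. One caveat: the 63-of-64 tooth-retention count is inferred from the 98.4% survival figure, not stated explicitly in the abstract.

```python
# Reproducing the success-rate arithmetic from the reported counts:
# 64 eligible cases, 2 early + 5 late failures of pulp vitality.

eligible = 64
early_failures = 2    # 2/64 = 3.1%, as reported
late_failures = 5

# 57/64 = 0.8906 -> the reported 89.1% overall pulp-vitality success rate
pulp_success = (eligible - early_failures - late_failures) / eligible

# 98.4% tooth survival corresponds to 63 of 64 teeth retained
# (inferred from the percentage, not stated directly in the abstract).
tooth_survival = 63 / eligible
```

Note that pulp-vitality failure (7 cases) does not equal tooth loss (1 tooth): most failed pulpotomies were presumably salvaged, e.g. by root canal treatment, which is why the two rates differ.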


2020 ◽  
Vol 2020 ◽  
pp. 1-8
Author(s):  
C. Q. Hoang ◽  
H. D. Nguyen ◽  
N. X. Ho ◽  
T. H. T. Vu ◽  
T. T. M. Pham ◽  
...  

Background. Scarce information exists about immunity to hand, foot, and mouth disease (HFMD) among household contacts of index cases in Vietnam and what that means for reducing ongoing HFMD transmission in the community. Methods. We analyzed neutralizing antibodies (NT) and the incidence of enterovirus (EV) infection among household contacts of index cases in a province where HFMD remains endemic. Throat swabs and 2 mL blood samples from household contacts were collected at enrollment and during and after 2 weeks of follow-up. Results. The incidence of EV-A71 infection among household contacts was 40/84 (47.6%, 95% CI: 36.9-58.3%), compared with 106/336 (31.5%, 95% CI: 26.6-36.5%) for CV-A6 and 36/107 (33.6%, 95% CI: 24.7-42.6%) for CV-A16. The incidence of CV-A6 infection was fairly constant across ages; in contrast, EV-A71 and CV-A16 showed some variation across ages. At baseline, higher geometric mean titers (GMT) of EV-A71, CV-A6, and CV-A16 antibodies were found in the 25-34-year age group (range 216.3 to 305.0) compared to the other age groups. There was a statistically significant difference in GMT values of CV-A6 and CV-A16 between those who did and did not have an infection among households with an index case of these serotypes. Conclusions. Our results indicate that adults were becoming infected with HFMD viruses and could be contributing to transmission. There is, therefore, a need to consider the household setting as an additional target for HFMD intervention programs.
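The reported confidence intervals are reproducible with a standard Wald 95% CI for a proportion, and the geometric mean titer is the exponential of the mean log titer. The sketch below reproduces the EV-A71 interval (40/84); the titer list in the GMT example is illustrative, not study data.

```python
import math

# Wald 95% CI for a binomial proportion, and geometric mean titer (GMT).

def wald_ci(k: int, n: int, z: float = 1.96):
    """Point estimate with Wald 95% CI for k successes out of n."""
    p = k / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p - half, p + half

# EV-A71 incidence among household contacts, reported as
# 40/84 (47.6%, 95% CI: 36.9-58.3%):
lo, hi = wald_ci(40, 84)

def gmt(titers):
    """Geometric mean titer: exp of the mean log titer."""
    return math.exp(sum(math.log(t) for t in titers) / len(titers))

# Illustrative only -- titers rise in 2-fold dilution steps, which is
# why the geometric (not arithmetic) mean is the conventional summary.
example = gmt([4, 16, 64])   # = (4 * 16 * 64) ** (1/3) = 16
```

The Wald interval matches the published 36.9-58.3% to one decimal, suggesting this is the method the authors used (an exact or Wilson interval would differ slightly).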


Blood ◽  
2019 ◽  
Vol 134 (Supplement_1) ◽  
pp. 4056-4056
Author(s):  
Michelle Janania Martinez ◽  
Prathibha Surapaneni ◽  
Juan F Garza ◽  
Tyler W Snedden ◽  
Snegha Ananth ◽  
...  

BACKGROUND It is estimated that 8110 persons will be diagnosed with Hodgkin Lymphoma (HL) in the US during 2019, but the advent of new treatment options has increased the cure rate to at least 80%. It has been reported that the rates of HL are lower in the adolescent and young adult (AYA) Hispanic population but significantly higher in the Hispanic population older than 65. The relative survival estimates are reported to be similar between AYA Hispanics (HI) and non-Hispanics (NH), but for ages 65-84, HI have a significantly higher mortality rate. Pediatric studies have suggested that ethnicity plays a role in outcomes in patients with HL, but there are limited data in the adult population. There is an unmet need in the field for data on underrepresented ethnic minorities to be carefully considered and compared with existing data. Therefore, our study aims to compare survival outcomes in Hispanics vs non-Hispanics with HL who were treated at the only NCI-designated cancer center of South Texas. To our knowledge this is the largest cohort of HL patients from a single academic institution that serves primarily Hispanics. METHODS We located and retrospectively analyzed a total of 616 patients with a diagnosis of lymphoma (HL and NHL) by International Classification of Diseases (ICD) codes and identified 116 cases of HL; all patients received care at UT Health San Antonio between 2008 and 2018. Key variables for each patient included age, gender, race/ethnicity, comorbidities, insurance status, stage, BM and extranodal involvement, treatment received, outcome at 3 and 5 years, and vitality status in 2018. Continuously distributed outcomes were summarized with the mean and standard deviation, and categorical outcomes were summarized with frequencies and percentages.
The significance of variation in the mean with disease category was assessed with one-way ANOVA, and the significance of associations between categorical outcomes was assessed with Pearson's chi-square or Fisher's exact test as appropriate. Multivariate logistic regression was used to model binary outcomes in terms of covariates and indicators of disease. All statistical testing was two-sided with a significance level of 5%. R was used throughout. The study was approved by the local Institutional Review Board. The findings will be made available to patients, funders and the medical community through traditional publishing and social media. RESULTS We identified 116 patients with HL, of which 73 were HI (63%), 43 NH (36%) and 1 not specified (1%). With regard to race, 92% identified as Caucasian, 4% as African American, 3% as other and 1% as Asian. The median age at diagnosis was 37.4 years (SD 15.13). There were 49 females (42%) and 67 males (58%). The most common funding source was commercial insurance, N=54 (47%), followed by a hospital payment plan, N=30 (26%), Medicare, N=16 (14%), unfunded, N=13 (11%) and Medicaid, N=3 (2%). The most prevalent comorbidities were HTN, N=28 (24%), and diabetes mellitus, N=23 (20%); 50% of patients had no comorbidities (N=63). At diagnosis, an ECOG of 0-1 was seen in 108 patients (93%); 8 were stage I (7%), 39 stage II (33%), 32 stage III (28%), and 37 stage IV (32%). EBV was positive in 26 patients (22%). There were 15 patients who were HIV positive (13%), 54% with a CD4 count <200, and 12 (75%) on antiretroviral therapy at diagnosis. Median PFS was 853.85 days (SD 912.92). We excluded patients who were lost to follow-up or had not reached 3/5 years. At 3-year follow-up there was: complete response in 37 HI (74%) vs 22 NH (92%); disease progression in 8 (16%) vs 0 (0%); death in 5 (10%) vs 2 (8%), respectively (p-value = 0.094).
At 5-year follow-up there was: complete response in 30 HI (77%) vs 17 NH (90%); progressive disease in 2 (5%) vs 0 (0%); death in 7 (18%) vs 2 (11%), respectively (p-value = 0.619). At the end of 2018, 41 HI (84%) were alive compared to 22 NH (88%) (p-value = 0.74). CONCLUSION Within the limitations of sample size, our study demonstrates that in the prevalently Hispanic population of our institution, HI patients with HL have no statistically significant difference in outcome when compared to NH patients. Disclosures No relevant conflicts of interest to declare.
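The Pearson chi-square test the authors apply to categorical outcomes has a compact 2x2 form. The alive/dead counts below are back-calculated from the reported percentages (41 of ~49 HI alive, 22 of ~25 NH alive), so the statistic is approximate and illustrative only; the authors may also have used Fisher's exact test here.

```python
# Pearson chi-square statistic for a 2x2 contingency table.

def chi_square_2x2(table):
    """table = [[a, b], [c, d]]; returns the Pearson X^2 statistic
    using the shortcut  X^2 = n(ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d))."""
    (a, b), (c, d) = table
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Vital status at end of 2018 by ethnicity (counts back-calculated
# from the reported 84% vs 88% alive; hypothetical denominators):
x2 = chi_square_2x2([[41, 8], [22, 3]])   # alive/dead: HI row, NH row
# X^2 is well under the 3.84 critical value (df=1, alpha=0.05),
# consistent with the non-significant difference reported above.
```

An identical-proportions table gives X^2 = 0, the no-association baseline.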


Blood ◽  
2006 ◽  
Vol 108 (11) ◽  
pp. 3389-3389 ◽  
Author(s):  
John D. Shaughnessy ◽  
Jeffrey Haessler ◽  
Jerry Zeldis ◽  
Yongsheng Huang ◽  
Fenghuang Zhan ◽  
...  

Abstract Background: THAL, whose activity in MM was discovered in the setting of advanced and refractory disease in the late 1990s (Singhal, NEJM, 2000), has become standard front-line therapy in combination with dexamethasone (DEX). In a randomized phase III tandem transplant trial, TT2, a higher complete response (CR) rate and longer event-free survival (EFS) had been observed on the THAL arm (Barlogie, NEJM, 2006). The similar overall survival (OS) on the THAL and control arms had been attributed to the routine use of THAL as salvage therapy for patients randomized to the No-THAL arm and the shorter post-relapse OS among patients randomized to the THAL arm. Patients and Methods: With a median follow-up on TT2 of 53mo, 107 patients have relapsed and 219 have died. Subset analyses were performed to determine whether THAL confers an OS advantage in any subgroup of patients. Results: 6-yr EFS and OS rates are 48%/63% on the THAL arm and 38%/58% on the control arm (p=0.01/0.67). Post-relapse OS is now similar, with median durations of 5.3mo/4.3mo on the control/THAL arms (p=0.11). According to multivariate analyses of 11 standard prognostic factors, EFS was shorter among patients treated without THAL, in the presence of cytogenetic abnormalities (CA), B2M and LDH elevations and low albumin, whereas CR was favorable; OS was inferior with CA, high LDH, low albumin and in patients not receiving a 2nd transplant or not achieving CR. Randomization to THAL was beneficial only in the >2 risk factor group: 6-yr OS was 47% in 31 patients on THAL and 12% in 31 control patients (Figure 1, p=0.01). When examined in the context of GEP (70-gene model-based high versus low risk groups) and interphase FISH data (amp1q21), available in 260 patients, the 57 with GEP low risk and absence of amp1q21 receiving THAL had a 5-yr OS of 90% compared to 74% among 73 controls (p=0.13).
Conclusion: With longer follow-up of 53mo on TT2, EFS remains superior among patients randomized to THAL; post-relapse survival is no longer inferior among those randomized to THAL; THAL benefited a high-risk subgroup with >2 standard risk factors, whereas no significant difference has yet emerged among genetically defined subgroups. Figure


Blood ◽  
2008 ◽  
Vol 112 (11) ◽  
pp. 2113-2113 ◽  
Author(s):  
Susan Branford ◽  
Rebecca Lawrence ◽  
Andrew Grigg ◽  
John Francis Seymour ◽  
Anthony Schwarer ◽  
...  

Abstract A major molecular response (MMR) by 12 or 18 months (m) of standard-dose imatinib for patients (pts) with newly diagnosed chronic phase CML is associated with a low risk of progression to accelerated phase or blast crisis. Phase II/III trials suggest that MMR may be achieved earlier with higher doses of imatinib. We determined whether the timing of MMR affects the long-term stability of response with regard to the acquisition of BCR-ABL mutations and/or loss of MMR (collectively defined as an "event") for pts with up to 8 years of follow-up since commencing first-line imatinib. All pts treated with 400 to 600mg of first-line imatinib who were monitored regularly at our institution for BCR-ABL levels by real-time quantitative PCR and mutation analysis by direct sequencing were evaluated: 181 pts were followed for a median of 45m (range (r) 3–96m). The event rate was compared for pts dependent on the time to MMR (≤0.1% IS (international scale)) in 6m intervals to 18m of imatinib. The events for pts with undetectable BCR-ABL (complete molecular response, CMR) were also determined. Strict sensitivity criteria were used for CMR: undetectable BCR-ABL where the sensitivity of analysis indicated BCR-ABL was <0.003% IS (equivalent to at least 4.5 log below the standardized baseline), confirmed on a subsequent analysis. Loss of MMR was defined as a confirmed >2-fold rise from nadir to a level >0.1% IS in pts who maintained imatinib dose. 144/181 pts (80%) achieved MMR at a median of 12m (r 3–53m). Consistent with other studies, maintaining a higher dose of imatinib in the first 6m of therapy was associated with a significantly higher frequency of pts achieving MMR by 6m: 118 pts received an average dose of <600mg in the first 6m and 18/118 (15%) achieved MMR by 6m, whereas 63 pts received an average dose of 600mg in the first 6m and 23/63 (37%) achieved MMR by 6m, P=0.002. Mutations were detected in 14/181 pts (8%) at a median of 9m (r 3–42m).
An event occurred in 8 pts with MMR at a median of 36m (r 12–57m) after commencing imatinib, including one patient who had achieved CMR. Mutations were found in 4 pts, and 3/4 lost MMR. The remaining 4 lost MMR without a mutation. The one patient with a mutation who did not lose MMR had a 3-fold rise in BCR-ABL at the time of mutation detection and responded to a higher imatinib dose. The other pts with mutations had therapeutic intervention upon cytogenetic relapse (2) or loss of MMR (1). The 4 pts with loss of MMR and no mutation had accelerated phase (1), cytogenetic relapse (2), and one maintained CCR with 3m of follow-up. The median fold rise in BCR-ABL upon loss of MMR was 26 (r 4–220). The probability of an event if MMR was achieved by a) 6m was 0% (n=41 evaluable pts), b) >6 to 12m was 12% (n=40) and c) >12 to 18m was 19% (n=33). The median follow-up since MMR was achieved was not significantly different between the groups: 49m (r 3–87m), 38m (r 6–87m), 40m (r 9–78m), respectively, P=0.5. The risk of an event for pts with MMR achieved by 6m was significantly lower than in pts with MMR achieved by >6 to 18m, P=0.04. CMR occurred in 55 pts who were followed for a median of 24m (r 3–55m) after its attainment. Only 1 event occurred in these 55 pts, at 6m after CMR was achieved and 57m after commencing imatinib. This patient had maintained MMR for 45m but loss of a major cytogenetic response occurred 6m after loss of MMR. There was a significant difference in the probability of CMR by 60m of imatinib dependent on the time to MMR, P<0.0001 (Figure). All pts failed to achieve CMR by 60m if not in MMR at 18m, whereas the actuarial rate of CMR at 60m was 93% in those with MMR by 6m. The initial slope of BCR-ABL decline correlated strongly with the decline over the longer term.
The mean time to CMR after attainment of MMR was significantly shorter for pts with MMR by 6m compared to those with MMR at >6 to 12m and >12 to 18m: 24m vs 37m vs 42m, respectively, P=0.001. This suggests the rate of BCR-ABL reduction below the level of MMR was faster in pts with MMR by 6m, which may be clinically beneficial as none of these pts had a subsequent event. Based on these findings we propose that inducing earlier molecular responses with higher-dose imatinib or more potent kinase inhibitors may lead to more durable and deeper responses. It remains possible, however, that early molecular response reflects a more biologically favourable disease rather than being the direct cause of a more durable response. Finally, CMR was associated with an extremely low risk of events, making it an appropriate next target of therapy after MMR is achieved. Figure
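The response thresholds above are all defined on the international scale (IS), where the standardized baseline is 100%: MMR is BCR-ABL ≤0.1% IS (a 3-log reduction), the strict CMR criterion of <0.003% IS corresponds to at least a ~4.5-log reduction, and loss of MMR is a confirmed >2-fold rise from nadir to >0.1% IS. A minimal Python sketch of these definitions (function names are illustrative, not from the abstract):

```python
import math

def log_reduction(bcr_abl_pct_is: float) -> float:
    """Log10 reduction below the standardized IS baseline (100%)."""
    return math.log10(100.0 / bcr_abl_pct_is)

def is_mmr(bcr_abl_pct_is: float) -> bool:
    """Major molecular response: BCR-ABL <= 0.1% IS (3-log reduction)."""
    return bcr_abl_pct_is <= 0.1

def is_cmr(detectable: bool, assay_sensitivity_pct_is: float) -> bool:
    """CMR per the abstract's strict criterion: undetectable BCR-ABL
    with assay sensitivity <0.003% IS (>= ~4.5 logs below baseline)."""
    return (not detectable) and assay_sensitivity_pct_is < 0.003

def lost_mmr(nadir_pct_is: float, current_pct_is: float) -> bool:
    """Loss of MMR: confirmed >2-fold rise from nadir to a level >0.1% IS."""
    return current_pct_is > 0.1 and current_pct_is > 2 * nadir_pct_is

# MMR is a 3-log reduction; the CMR sensitivity bound is ~4.52 logs
print(round(log_reduction(0.1), 2))    # 3.0
print(round(log_reduction(0.003), 2))  # 4.52
```

Note that a rise to, say, 0.12% IS from a nadir of 0.08% would not count as loss of MMR under this definition, because the rise is less than 2-fold.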


Blood ◽  
2010 ◽  
Vol 116 (21) ◽  
pp. 368-368 ◽  
Author(s):  
Elias J. Anaissie ◽  
Frits van Rhee ◽  
Antje Hoering ◽  
Sarah Waheed ◽  
Yazan Alsayed ◽  
...  

Abstract Abstract 368 Background: TT3, incorporating bortezomib and thalidomide into induction prior to and consolidation after melphalan 200mg/m2-based transplants, with 3-year maintenance comprising VTD (year 1) and TD (years 2+3) in TT3A and VRD for 3 years in TT3B, resulted in a high CR rate of ∼60% and, in the 85% of patients with GEP-defined low-risk MM, 5-yr OS/EFS of 80%/78%; the 5-year CR duration estimate was 88%. Patients and Methods: Phase III trial TT4 for low-risk MM randomized patients between standard (S) and light (L) arms. TT4-L applied 1 instead of 2 cycles of induction therapy with M-VTD-PACE prior to, and 1 instead of 2 cycles of consolidation with dose-reduced VTD-PACE after, tandem transplantation. M-VTD-PACE comprised melphalan, bortezomib, thalidomide, dexamethasone and 4-day continuous infusions of cisplatin, doxorubicin, cyclophosphamide and etoposide. TT4-S applied standard single-dose melphalan 200mg/m2, while TT4-L used a 4-day fractionated schedule of melphalan 50mg/m2 on days 1–4. VRD maintenance for 3 years was identical in both arms. Here we report, for both TT4 arms combined, on grade >2 mucosal toxicities, applying CTCAE version 3.0, and on efficacy (CR, EFS, OS) in relation to TT3 in low-risk MM. At the time of analysis, median follow-up on TT4 is 10.7 months and on TT3A/B 62.3/33.4 months. To facilitate comparisons between trials with different follow-up times, TT3 data were backdated to a follow-up time comparable to TT4 as of this reporting time. Results: Baseline characteristics were similar in TT3 (n=364) and TT4 (n=165) in terms of B2M (both ≥3.5mg/L and >5.5mg/L) and elevated levels of CRP, creatinine, and LDH. Presence of cytogenetic abnormalities (CA) overall and in terms of CA13/hypodiploidy was similar in both. Fewer TT4 patients had ISS-1 (31% v 43%, P=0.010) and more had hemoglobin <10g/dL (35% v 26%, P=0.029). 
While neither trial had GEP-defined high-risk in the 70-gene model (R70), the more recently validated R80 distribution showed 7% high-risk in TT4 v 3% in TT3 (P=0.031). DelTP53 was more prevalent in TT4 than TT3 (39% v 10%, P<0.001), and MY favorable subgroup designation pertained to 3% in TT4 v 12% in TT3 (P=0.002). Toxicities are reported per protocol phase. During induction (TT4, n=160; TT3, n=364), grade >2 mucosal toxicities included colitis in 0%/1% (P=0.32), esophagitis/dysphagia in 0%/1% (P=0.33), GI mucositis, NOS in 1%/1% (P=0.99) and stomatitis/pharyngitis in 0%/1% (P=0.99). With transplant-1 (TT4, n=139; TT3, n=344), grade >2 mucosal toxicities included colitis in 3%/1% (P=0.24), esophagitis/dysphagia in 1%/5% (P=0.03), gastritis in 1%/0% (P=0.29), GI mucositis, NOS in 1%/2% (P=0.73) and stomatitis/pharyngitis in 0%/5% (P=0.008); with transplant-2 (TT4, n=105; TT3, n=294), grade >2 mucosal toxicities included colitis in 4%/3% (P=0.77), esophagitis/dysphagia in 0%/2% (P=0.20), GI mucositis, NOS in 2%/3% (P=0.99) and stomatitis/pharyngitis in 0%/1% (P=0.58). With consolidation (TT4, n=85; TT3, n=280), grade >2 mucosal toxicities included colitis in 0%/3% (P=0.36) and GI mucositis, NOS in 0%/1% (P=0.99). Timing of onset and final levels of CR differed substantially between TT4 and TT3 in favor of TT4 (P=0.006); no differences were observed in OS (P=0.36), EFS (P=0.66), and CR duration (P=0.12). Conclusion: TT4 (both arms combined) provided, despite higher proportions of patients with unfavorable characteristics than in TT3, a superior CR rate and comparable survival outcomes to TT3's low-risk population. GI toxicities were reduced in TT4 v TT3. Results of the individual TT4 arms will be presented. Disclosures: No relevant conflicts of interest to declare.
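The abstract does not state which test produced these P values, but for sparse 2×2 comparisons such as the 0% v 5% stomatitis/pharyngitis rates at transplant-1, a two-sided Fisher exact test is the standard choice. A stdlib-only sketch (the function name and table layout are my own, not from the trial report):

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact P for the 2x2 table [[a, b], [c, d]]:
    sum the hypergeometric probabilities of every table with the same
    margins that is no more likely than the observed one."""
    n = a + b + c + d
    row1, col1 = a + b, a + c

    def prob(x):
        # probability that the top-left cell equals x, with margins fixed
        return comb(row1, x) * comb(n - row1, col1 - x) / comb(n, col1)

    p_obs = prob(a)
    lo = max(0, col1 - (n - row1))  # smallest feasible top-left cell
    hi = min(row1, col1)           # largest feasible top-left cell
    return sum(prob(x) for x in range(lo, hi + 1) if prob(x) <= p_obs + 1e-12)

# Fisher's classic "lady tasting tea" table reproduces the textbook value
print(round(fisher_exact_two_sided(3, 1, 1, 3), 4))  # 0.4857
```

With counts reconstructed from the reported rounded percentages (e.g. roughly 0/139 events in TT4 v 17/344 in TT3 for stomatitis/pharyngitis at transplant-1), such a test gives a P value below 0.05, consistent in direction with the reported P=0.008; the exact value depends on the unrounded event counts.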


Blood ◽  
2010 ◽  
Vol 116 (21) ◽  
pp. 4338-4338
Author(s):  
Stefania Paolini ◽  
Sarah Parisi ◽  
Ilaria Iacobucci ◽  
Cristina Papayannidis ◽  
Maria Chiara Abbenante ◽  
...  

Abstract Abstract 4338 Background. Acute lymphoblastic leukemia (ALL) presents with different outcomes in children and adults, with event-free-survival (EFS) rates of 70–80% and 30–40% at 5 years, respectively. This reflects both a different disease biology and different therapeutic approaches. Recently, results apparently improved in young adults/adolescents aged 15–21 years with de novo ALL when treated with pediatric intensive regimens rather than with typical adult regimens. Similarly, clinical studies are ongoing in older patients, therapy-related toxicity seeming the limiting issue. Aims. We report a single-centre experience on adult ALL patients treated with an intensive pediatric-inspired schedule, designed to assess its tolerability and efficacy. Methods. From November 2007 to June 2010, seventeen ALL patients (M/F=12/5) were treated at our Center according to a modified AIEOP LAL2000 regimen. Treatment consisted of 7-day steroid pre-treatment and a four-drug, 78-day induction (phase IA and phase IB), after which high-risk (HR) patients were treated with three polychemotherapy blocks, while intermediate-risk (IR) and standard-risk (SR) patients went on to 8-week consolidation and subsequent delayed intensification. Allo-SCT was planned for all patients with an HLA-matched donor, as an alternative to 2-year maintenance therapy. Median age was 31 years (range, 17–47). According to cytogenetics, steroid response and minimal residual disease, patients were classified into HR (n=7), IR (n=6) and SR (n=4). Results. 15/17 patients completed induction phase IA, two discontinuing for toxicity (grade IV infection and intestinal occlusion). Twelve (71%) obtained a complete remission (CR); three were refractory. However, one of the latter subsequently achieved CR after the polychemotherapy blocks, for an overall response rate of 76% (13/17). Eleven patients then completed the 28-day induction phase IB. One patient is ongoing. Median induction duration was 92 days (range 82–136). 
Delays were mostly due to extra-hematological toxicity, the commonest being gastrointestinal (n=12), infective (n=7) and thrombotic (n=3). Delays accrued in both induction phases without significant difference between phase IA (median 18.5 days, range 4–37) and phase IB (median 17 days, range 9–66), despite a higher absolute number of moderate-severe AEs in phase IA than in phase IB (12 vs 5). After induction, 4/12 patients have already received consolidation therapy; 2/4 then received allo-SCT. The median duration of consolidation was 51 days (range 22–94). Conversely, 6/12 patients received polychemotherapy blocks, one patient went directly on to allo-SCT and the remaining one is ongoing. After the polychemotherapy blocks, five of six patients received allo-SCT. The median CR duration was 13 months (range 1+–42+); two patients relapsed, both after allo-SCT. With a median follow-up of 11 months (range 2–43), 11/17 (65%) patients are alive, 9 in CR (5 having undergone allo-SCT). Six patients died: three in CR from infectious complications and three from relapsed/refractory disease. Conclusions. Though in a small series, pediatric-like intensive chemotherapy seemed feasible in adult ALL. Extra-hematological toxicity, however, caused significant treatment delays during induction. Finally, the overall outcome appeared promising, though longer follow-up and larger populations are needed to draw definitive conclusions. Acknowledgments. BolognAIL, European LeukemiaNet, AIRC, Fondazione Del Monte di Bologna e Ravenna, FIRB 2006, PRIN 2008, Ateneo RFO, Project of Integrated Program (PIO), Programma di Ricerca Regione – Università 2007–2009. Disclosures: No relevant conflicts of interest to declare.

