Scoring System To Predict the Risk of Early Death during Induction Chemotherapy for Primary Acute Myeloid Leukemia (AML).

Blood ◽  
2006 ◽  
Vol 108 (11) ◽  
pp. 1972-1972
Author(s):  
I. Sánchez-Ortega ◽  
D. Valcárcel ◽  
S. Brunet ◽  
Jordi Esteve ◽  
J. Berlanga ◽  
...  

Abstract Complications during induction chemotherapy (CT) for AML cause mortality in 5–20% of cases. It is therefore relevant to define the risk of early death in this setting: in patients with a high probability of dying from toxicity, intensified supportive measures and/or alternative antineoplastic approaches could be appropriate. The aim of this study was to identify the features associated with lethal complications during induction CT for AML. We defined early deaths (ED) as those occurring within 42 days of the start of induction in the absence of evident leukemia. We analyzed all consecutive patients diagnosed with AML between June 1998 and February 2006 in 20 Spanish hospitals. These cases were treated according to two multicenter trials, CETLAM 99 (n=326) and CETLAM 2003 (n=248). Both schemes included idarubicin 12 mg/m² on days 1, 3 and 5, etoposide 100 mg/m² on days 1–3, and cytarabine 500 mg/m²/12 hours on days 1, 3, 5 and 7 as front-line treatment. In CETLAM 2003, G-CSF (150 mg/m² days 0–7) was added as priming therapy. The series included 574 patients, 248 (43%) female, with a median age of 48 years. The creatinine level was elevated (>1.2 mg/dL) in 11% of patients. The overall mortality rate during induction (ED) was 12% (n=69). 335 (58%) patients achieved a complete remission with a single course of CT, 108 (19%) a partial remission, and 62 (11%) were refractory. The most common causes of death were infection (n=28, 46%), bleeding (n=7, 11%), pulmonary failure not due to infection (n=6, 10%), and multiorgan failure (n=4, 7%). Univariate analysis showed that age older than 50 years, male gender, M4 or M5 FAB subtype, leukocyte count higher than 100×10⁹/L, marrow blasts >70%, and creatinine level above 1.2 mg/dL were associated with more frequent ED.
In multivariate analysis, elevated creatinine level [hazard ratio (HR) 2.6 (1.3–5.2); P=0.009], leukocytosis >100×10⁹/L [HR 2.3 (1.1–4.6); P=0.021] and age >50 years [HR 2.1 (1.4–3.9); P=0.018] were independent risk factors. The remaining parameters, including among others the use of G-CSF, gender, marrow blasts, treatment protocol, cytogenetics at diagnosis and FLT-3 mutational status, were not associated with a higher incidence of ED. Taking into account the three significant variables identified as risk factors, we developed a scoring system to predict the probability of ED during induction. The probabilities of ED in the low-risk (no risk factors, n=212), intermediate-risk (1 risk factor, n=239) and high-risk (2 or 3 risk factors, n=57) categories were 3%, 11% and 28%, respectively (P<0.001). The HRs of ED for patients in the intermediate- and high-risk groups were 3.6 (1.5–8.1; P=0.003) and 7.9 (3.1–19.7; P<0.001), respectively, compared with the low-risk group. In conclusion, impaired kidney function, advanced age and hyperleukocytosis are the relevant variables increasing mortality during induction CT for AML. With these three factors it is possible to define a scoring system that predicts the probability of ED. A different approach for patients in the high-risk group should be investigated.
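The proposed score is a simple count of the three independent risk factors. A minimal Python sketch (function and parameter names are illustrative; thresholds and category rates are taken from the abstract, and this is not a fitted model):

```python
def ed_risk_group(creatinine_mg_dl, wbc_x10e9_per_l, age_years):
    """Count the three independent risk factors from the multivariate
    model and map the total to the reported risk categories."""
    n_factors = sum([
        creatinine_mg_dl > 1.2,   # impaired kidney function
        wbc_x10e9_per_l > 100,    # hyperleukocytosis
        age_years > 50,           # advanced age
    ])
    if n_factors == 0:
        return "low"              # observed ED rate ~3%
    if n_factors == 1:
        return "intermediate"     # observed ED rate ~11%
    return "high"                 # observed ED rate ~28% (2 or 3 factors)
```

For example, a 60-year-old with creatinine 1.5 mg/dL and WBC 150×10⁹/L carries all three factors and falls in the high-risk group.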

2020 ◽  
Vol 11 ◽  
pp. 215013272098129
Author(s):  
Lauren Oshman ◽  
Amanda Caplan ◽  
Raabiah Ali ◽  
Lavisha Singh ◽  
Rabeeya Khalid ◽  
...  

Introduction: The CDC and Illinois Department of Public Health disseminated risk factor criteria for COVID-19 testing early in the pandemic. The objective of this study is to assess the effectiveness of risk-stratifying patients for COVID-19 testing and to identify which risk factors and which other clinical variables were associated with SARS-CoV-2 PCR test positivity. Methods: We conducted an observational cohort study on a sample of symptomatic patients evaluated in an immediate care setting. A risk assessment questionnaire was administered to every patient before clinician evaluation. High-risk patients received a SARS-CoV-2 test, and low-risk patients were evaluated by a clinician and selectively tested based on clinician judgment. Multivariate analyses tested whether risk factors and additional variables were associated with test positivity. Results: The adjusted odds of testing positive were associated with COVID-19-positive or suspect close contact (aOR 1.56, 95% CI 1.15-2.10), large gathering attendance with a COVID-19-positive individual (aOR 1.92, 95% CI 1.10-3.34), and, with the largest effect size, decreased taste/smell (aOR 2.83, 95% CI 2.01-3.99). Testing positive was associated with ages 45-64 and ≥65 (aOR 1.75, 95% CI 1.25-2.44, and aOR 2.78, 95% CI 1.49-5.16), systolic blood pressures ≤120 (aOR 1.64, 95% CI 1.20-2.24), and, with the largest effect size, temperatures ≥99.0°F (aOR 3.06, 95% CI 2.23-4.20). The rate of positive SARS-CoV-2 tests was similar between high-risk and low-risk patients (225 [22.2%] vs 50 [19.8%]; P = .41). Discussion: The risk assessment questionnaire was not effective at stratifying patients for testing. Although individual risk factors were associated with SARS-CoV-2 test positivity, the low-risk group had positivity rates similar to the high-risk group.
Our observations underscore the need for clinicians to develop clinical experience and share best practices and for systems and payors to support policies, funding, and resources to test all symptomatic patients.


2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Sandra Chamat-Hedemand ◽  
Niels Eske Bruun ◽  
Lauge Østergaard ◽  
Magnus Arpi ◽  
Emil Fosbøl ◽  
...  

Abstract Background Infective endocarditis (IE) is diagnosed in 7–8% of streptococcal bloodstream infections (BSIs), yet it is unclear when to perform transthoracic (TTE) and transoesophageal echocardiography (TOE) for the different streptococcal species. The aim of this sub-study was to propose a flowchart for the use of echocardiography in streptococcal BSIs. Methods In a population-based setup, we investigated all patients admitted with streptococcal BSIs and crosslinked data with nationwide registries to identify comorbidities and concomitant hospitalization with IE. Streptococcal species were divided into four groups based on the crude risk of being diagnosed with IE (low-risk <3%, moderate-risk 3–10%, high-risk 10–30% and very high-risk >30%). Based on the number of positive blood culture (BC) bottles and IE risk factors (prosthetic valve, previous IE, native valve disease, and cardiac device), we further stratified cases according to the probability of a concomitant IE diagnosis to create a flowchart suggesting TTE plus TOE (IE >10%), TTE (IE 3–10%), or "wait & see" (IE <3%). Results We included 6393 cases with streptococcal BSIs (mean age 68.1 years [SD 16.2], 52.8% men). For BSIs with low-risk streptococci (S. pneumoniae, S. pyogenes, S. intermedius), echocardiography is not initially recommended unless the patient has ≥3 positive BC bottles and an IE risk factor. For moderate-risk streptococci (S. agalactiae, S. anginosus, S. constellatus, S. dysgalactiae, S. salivarius, S. thermophilus), a "wait & see" strategy is suggested when neither a risk factor nor ≥3 positive BC bottles are present, a TTE is recommended when either is present, and TTE plus TOE are recommended when both are present. High-risk streptococci (S. mitis/oralis, S. parasanguinis, G. adiacens) are directed to a TTE when neither a risk factor nor ≥3 positive BC bottles are present, and to TTE plus TOE when either is present. Very high-risk streptococci (S. gordonii, S. gallolyticus, S. mutans, S. sanguinis) are guided directly to TTE plus TOE due to a high baseline IE prevalence. Conclusion In addition to the clinical picture, this flowchart based on streptococcal species, number of positive blood culture bottles, and risk factors can help guide the use of echocardiography in streptococcal bloodstream infections. Since systematic echocardiography results were not available in the registry data, the findings should be confirmed prospectively with the use of systematic echocardiography.
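The proposed flowchart can be sketched as a small decision function, assuming the species group has already been assigned. Names are illustrative; for the low-risk branch where both criteria are met, the abstract does not state which modality follows, so that return value is an assumption:

```python
def echo_recommendation(species_risk_group, positive_bc_bottles, has_ie_risk_factor):
    """Echocardiography suggestion per the proposed flowchart.

    species_risk_group: 'low', 'moderate', 'high' or 'very high' (by species).
    IE risk factors: prosthetic valve, previous IE, native valve disease,
    or a cardiac device.
    """
    # 0, 1 or 2 positive criteria
    findings = int(positive_bc_bottles >= 3) + int(bool(has_ie_risk_factor))
    if species_risk_group == "very high":
        return "TTE + TOE"                      # always, high baseline IE prevalence
    if species_risk_group == "high":
        return "TTE + TOE" if findings >= 1 else "TTE"
    if species_risk_group == "moderate":
        return ("wait & see", "TTE", "TTE + TOE")[findings]
    # Low-risk species: no initial echo unless BOTH criteria are met;
    # the abstract does not specify the modality in that case (assumption).
    return "echocardiography indicated" if findings == 2 else "wait & see"
```

For example, S. mitis/oralis (high-risk group) with 4 positive bottles and no risk factor maps to TTE plus TOE.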


2012 ◽  
Vol 22 (8) ◽  
pp. 1389-1397 ◽  
Author(s):  
Seiji Mabuchi ◽  
Mika Okazawa ◽  
Yasuto Kinose ◽  
Koji Matsuo ◽  
Masateru Fujiwara ◽  
...  

Objectives To evaluate the significance of adenosquamous carcinoma (ASC) compared with adenocarcinoma (AC) in the survival of surgically treated early-stage cervical cancer. Methods We retrospectively reviewed the medical records of 163 patients with International Federation of Gynecology and Obstetrics stage IA2 to stage IIB cervical cancer who had been treated with radical hysterectomy with or without adjuvant radiotherapy between January 1998 and December 2008. The patients were classified according to (1) histological subtype (ASC group or AC group) and (2) pathological risk factors (low-risk or intermediate/high-risk group). Survival was evaluated using the Kaplan-Meier method and compared using the log-rank test. Multivariate analysis of progression-free survival (PFS) was performed using the Cox proportional hazards regression model to investigate the prognostic significance of histological subtype. Results Clinicopathological characteristics were similar between the ASC and AC histology groups. Patients with ASC histology displayed a PFS rate similar to that of patients with AC histology in both the low-risk and intermediate/high-risk groups. Neither the recurrence rate nor the pattern of recurrence differed between the ASC group and the AC group. Univariate analysis revealed that patients with pelvic lymph node metastasis and parametrial invasion had significantly shorter PFS than those without these risk factors. Conclusions The characteristics of the patients and tumors, as well as the survival outcomes, of ASC were comparable to those of AC of the early-stage uterine cervix treated with radical hysterectomy. Our results in part support that the management of ASC could be the same as that of AC of the uterine cervix.


2019 ◽  
Author(s):  
Junxiong Yin ◽  
Chuanyong Yu ◽  
Hongxing Liu ◽  
Mingyang Du ◽  
Feng Sun ◽  
...  

Abstract Objective: To establish a predictive model of vulnerable (unstable) carotid plaque through systematic screening of a population at high risk of stroke. Patients and methods: We studied community residents who participated in screening for the high-risk stroke population by the China National Stroke Screening and Prevention Project (CNSSPP). A total of 19 risk factors were analyzed. Individuals were randomly divided into a derivation set and a validation set. According to carotid ultrasonography, patients in the derivation set were divided into an unstable plaque group and a non-unstable plaque group. Univariate and multivariable logistic regression analyses were performed for the risk factors, and a predictive scoring system was established from the model coefficients. The AUC values of both the derivation and validation sets were used to verify the effectiveness of the model. Results: A total of 2841 patients at high risk of stroke were enrolled in this study; unstable plaque was found in 266 (9.4%). According to the results of Doppler ultrasound, the derivation set was divided into an unstable plaque group (174 cases) and a non-unstable plaque group (1720 cases). The independent risk factors for unstable carotid plaque were: male sex (OR 1.966, 95% CI 1.406-2.749), older age (50-59, OR 6.012, 95% CI 1.410-25.629; 60-69, OR 13.915, 95% CI 3.381-57.267; ≥70, OR 31.267, 95% CI 7.472-130.83), being married (OR 1.780, 95% CI 1.186-2.672), LDL-C (OR 2.015, 95% CI 1.443-2.814), and HDL-C (OR 2.130, 95% CI 1.360-3.338). A predictive scoring system was created, with scores ranging from 0 to 10 and a cut-off value of 6.5. The AUC values of the derivation and validation sets were 0.738 and 0.737. Conclusion: For individuals at high risk of stroke, we provide a model that can distinguish those with a high probability of having unstable carotid plaque. When a resident's predictive model score exceeds 6.5, the likelihood of unstable carotid plaque is high, and carotid Doppler ultrasound should be performed promptly. This model can be helpful in the primary prevention of stroke.


2017 ◽  
Vol 58 (1) ◽  
pp. 16-24
Author(s):  
Insook Kim ◽  
Seonae Won ◽  
Mijin Lee ◽  
Won Lee

The aim of this study was to identify risk factors through an analysis of seven medical malpractice judgments related to fall injuries. The risk factors were analysed using a framework that approaches falls from a systems perspective and comprises people, organisational and environmental factors, with each factor comprising subfactors. The risk factors found in each of the seven judgments were aggregated into one framework. The risk factors related to patients (i.e. the people factor) were age, pain, related disease, activities and functional status, urination state, cognitive function impairment, past history of falls, blood transfusion, sleep endoscopy state and uncooperative attitude. The risk factors related to the medical staff and caregivers (i.e. the people factor) were observation negligence, absence of fall prevention activities and negligence in managing the high-risk group for falls. Organisational risk factors were a lack of workforce, a lack of training, neglect of the management of the high-risk group, neglect of the management of caregivers and the absence of a fall prevention procedure. Regarding the environment, the risk factors were found to be the emergency room, chairs without a backrest and the examination table. Identifying risk factors is essential for preventing fall accidents, since falls are preventable patient-safety incidents. Falls do not happen as a result of a single risk factor. Therefore, a systems approach is effective for identifying risk factors, especially organisational and environmental factors.


Author(s):  
Halley Ruppel ◽  
Vincent X. Liu ◽  
Neeru R. Gupta ◽  
Lauren Soltesz ◽  
Gabriel J. Escobar

Abstract Objective This study aimed to evaluate the performance of the California Maternal Quality Care Collaborative (CMQCC) admission risk criteria for stratifying postpartum hemorrhage risk in a large obstetrics population. Study Design Using detailed electronic health record data, we classified 261,964 delivery hospitalizations from Kaiser Permanente Northern California hospitals between 2010 and 2017 into high-, medium-, and low-risk groups based on CMQCC criteria. We used logistic regression to assess associations between CMQCC risk groups and postpartum hemorrhage using two different postpartum hemorrhage definitions, standard postpartum hemorrhage (blood loss ≥1,000 mL) and severe postpartum hemorrhage (based on transfusion, laboratory, and blood loss data). Among the low-risk group, we also evaluated associations between additional present-on-admission factors and severe postpartum hemorrhage. Results Using the standard definition, postpartum hemorrhage occurred in approximately 5% of hospitalizations (n = 13,479), with a rate of 3.2, 10.5, and 10.2% in the low-, medium-, and high-risk groups. Severe postpartum hemorrhage occurred in 824 hospitalizations (0.3%), with a rate of 0.2, 0.5, and 1.3% in the low-, medium-, and high-risk groups. For either definition, the odds of postpartum hemorrhage were significantly higher in medium- and high-risk groups compared with the low-risk group. Over 40% of postpartum hemorrhages occurred in hospitalizations that were classified as low risk. Among the low-risk group, risk factors including hypertension and diabetes were associated with higher odds of severe postpartum hemorrhage. Conclusion We found that the CMQCC admission risk assessment criteria stratified women by increasing rates of severe postpartum hemorrhage in our sample, which enables early preparation for many postpartum hemorrhages. However, the CMQCC risk factors missed a substantial proportion of postpartum hemorrhages. 
Efforts to improve postpartum hemorrhage risk assessment using present-on-admission risk factors should consider inclusion of other nonobstetrical factors.


Blood ◽  
2006 ◽  
Vol 108 (11) ◽  
pp. 3599-3599
Author(s):  
Naseema Gangat ◽  
Alexandra Wolanskyj ◽  
Rebecca F. McClure ◽  
Chin Y. Li ◽  
Susan M. Schwager ◽  
...  

Abstract Background It is widely recognized that advanced age and prior thrombosis predict recurrent thrombosis in essential thrombocythemia (ET) and are used to risk-stratify patients. However, the paucity of large sample sizes and long-term follow-up has limited the development of similar prognostic models for survival and leukemic transformation (LT). Methods Data were abstracted from the medical records of a consecutive cohort of patients with WHO-defined ET seen at the Mayo Clinic. Cox proportional hazards regression was used to determine the impact of clinical and laboratory variables on survival and LT. Overall survival and leukemia-free survival were estimated by Kaplan-Meier plots. Results (i) Patient characteristics and outcome: The study cohort included 605 patients, of whom 399 (66%) were female (median age, 57 years; range 5–91). Median follow-up was 84 months (range 0–424). During this period, 155 patients (26%) died, and LT was documented in 20 patients (3.3%), occurring at a median of 138 months (range 23–422) from ET diagnosis. (ii) Prognostic variables for overall survival: Univariate analysis of parameters at diagnosis identified age ≥60 years, hemoglobin less than normal (defined as <12 g/dL in females and <13.5 g/dL in males), leukocyte count ≥15×10⁹/L, tobacco use, diabetes mellitus, thrombosis, male sex, and the absence of microvascular symptoms as predictors of inferior survival. All of the above except the last two (i.e. male sex and the absence of microvascular symptoms) sustained their prognostic significance on multivariate analysis. Based on the first three prognostic variables (age, hemoglobin level, and leukocyte count), we constructed a prognostic model for survival: low-risk (none of the risk factors), intermediate-risk (1 of 3 risk factors), and high-risk (≥2 risk factors). The respective median survivals were 278, 200, and 111 months (p<0.0001; Figure 1). (iii) Prognostic variables for leukemic transformation: On univariate analysis of parameters at ET diagnosis, LT was significantly associated with platelet count ≥1000×10⁹/L, hemoglobin less than normal, and exposure to P-32. However, on multivariate analysis, only hemoglobin less than normal and platelet count ≥1000×10⁹/L maintained independent prognostic value. Accordingly, we used these two variables to construct a prognostic model for LT: low-risk (none of the risk factors), intermediate-risk (1 risk factor), and high-risk (both risk factors). Only 1 of 239 patients (0.4%) in the low-risk group vs. 14 of 289 (4.8%) in the intermediate-risk and 5 of 77 (6.5%) in the high-risk group underwent LT (p=0.0009; Figure 2). Conclusion The current study provides clinician-friendly prognostic models for both survival and LT in ET.
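Both prognostic models reduce to counting the stated risk factors at diagnosis. A hedged Python sketch (names are illustrative; cut-offs are those reported in the abstract):

```python
def et_survival_risk(age_years, hgb_g_dl, wbc_x10e9_per_l, is_female):
    """Survival model: low = 0 factors, intermediate = 1, high = >= 2."""
    hgb_below_normal = hgb_g_dl < (12.0 if is_female else 13.5)  # sex-specific
    n = sum([age_years >= 60, hgb_below_normal, wbc_x10e9_per_l >= 15])
    return ("low", "intermediate", "high", "high")[n]

def et_leukemic_transformation_risk(hgb_below_normal, platelets_x10e9_per_l):
    """LT model: low = 0 factors, intermediate = 1, high = both."""
    n = int(bool(hgb_below_normal)) + int(platelets_x10e9_per_l >= 1000)
    return ("low", "intermediate", "high")[n]
```

For instance, a 65-year-old woman with hemoglobin 11 g/dL and leukocytes 20×10⁹/L carries all three survival risk factors and is high-risk for survival.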


Blood ◽  
2008 ◽  
Vol 112 (11) ◽  
pp. 2686-2686 ◽  
Author(s):  
David P. Steensma ◽  
Curtis A Hanson ◽  
Ayalew Tefferi

Abstract Background: The 2001 WHO classification of myeloid neoplasms distinguished two forms of MDS associated with ≥15% ring sideroblasts and <5% marrow blasts: refractory cytopenia with multilineage dysplasia and ring sideroblasts (RCMD-RS) vs. refractory anemia with ring sideroblasts (RARS, erythroid-restricted dysplasia). However, the real prognostic value of separating RCMD-RS from RCMD with <15% ring sideroblasts and from RARS is uncertain, and the WHO has proposed merging RCMD-RS and RCMD in the 2008 classification revision. Furthermore, the WHO-based Prognostic Scoring System (WPSS), proposed by Malcovati and colleagues in 2005 as a dynamic system that overcomes some of the limitations of the 1997 International Prognostic Scoring System (IPSS), has undergone limited independent external validation to date, and its applicability to sideroblastic MDS in particular is unclear. We assessed the validity of the 2008 WHO reclassification and the WPSS for MDS cases associated with ≥15% ring sideroblasts and a normal blast proportion. Methods: We reviewed WPSS and IPSS component parameters at diagnosis and the clinical outcomes of 465 patients (68% males, median age 72) evaluated at our institution over a 13-year period: 140 with RARS, 114 with RCMD-RS, and 211 with RCMD. Patients were assigned a WPSS score and risk category (very low-risk group = 0 points; low = 1; intermediate = 2; high = 3 or 4) by summing 3 subscores: 2001 WHO classification (0 for RARS, 1 point for RCMD or RCMD-RS), IPSS cytogenetic risk group (0 = good, 1 = indeterminate, 2 = poor), and red cell transfusion dependence (0 = no, 1 = yes). Survival was assessed by Kaplan-Meier estimates, and prognostic factors were examined by proportional hazards analysis. Results: The median time until death or last follow-up was 26 months, and 70% of patients were known to have died. The median survival by WHO MDS subtype was 75 months for RARS, 25 months for RCMD-RS, and 26 months for RCMD (log-rank p<0.0001 for RARS vs. either RCMD-RS or RCMD; p=0.60 for RCMD vs. RCMD-RS). Both the WPSS and IPSS predicted overall survival in patients with ring sideroblasts. Median survival for the patients grouped by WPSS risk category was 89 months for very low risk (n=95), 41 for low risk (n=198), 31 for intermediate risk (n=82), and 11 for high risk (n=91) (p<0.0001, except for low risk vs. intermediate risk, p=0.31). (Very high-risk WPSS scores cannot be achieved without excess marrow blasts, and such patients were excluded from this analysis.) Median survival by IPSS was 73 months for low-risk, 33 months for intermediate-1, and 8 months for intermediate-2 (p<0.0001). The IPSS's predictive power was unchanged whether patients with secondary MDS were included or excluded (the IPSS was based on a review of 816 patients with apparently de novo MDS). Conclusions: These data support the WHO's proposal to merge RCMD and RCMD-RS, and suggest that the adverse prognostic significance of multilineage dysplasia renders the presence of ring sideroblasts unimportant. The WPSS is a valid prognostic tool in patients with MDS associated with ring sideroblasts, but in this subgroup both the WPSS and IPSS stratify patients into 3 risk groups, and the WPSS does not offer additional value over the IPSS.
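The WPSS assignment used in this study is the sum of three subscores. A minimal sketch (function name is illustrative; subscore values and category cut-offs follow the Methods above, restricted to the blast-normal subtypes analyzed here):

```python
def wpss_for_sideroblastic_mds(who_subtype, cytogenetic_risk, transfusion_dependent):
    """Sum the three WPSS subscores as applied in this study and map the
    total to a risk category. Very high risk requires excess marrow blasts
    and is therefore unreachable for these subtypes."""
    who_points = {"RARS": 0, "RCMD": 1, "RCMD-RS": 1}[who_subtype]
    cyto_points = {"good": 0, "indeterminate": 1, "poor": 2}[cytogenetic_risk]
    score = who_points + cyto_points + int(bool(transfusion_dependent))
    category = {0: "very low", 1: "low", 2: "intermediate",
                3: "high", 4: "high"}[score]
    return score, category
```

For example, a transfusion-dependent RCMD-RS patient with poor-risk cytogenetics reaches the maximum score of 4 (high risk), while a transfusion-independent RARS patient with good-risk cytogenetics scores 0 (very low risk).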


Blood ◽  
2008 ◽  
Vol 112 (11) ◽  
pp. 735-735
Author(s):  
Alex Klimowicz ◽  
Paola Neri ◽  
Adnan Mansoor ◽  
Anthony Magliocco ◽  
Douglas A. Stewart ◽  
...  

Abstract Background: Autologous stem cell transplantation (ASCT) has dramatically improved the survival of myeloma patients; however, this approach has significant toxicities, and nearly 25% of MM patients progress within one year of their transplant. While gene expression profiling-based (GEP) molecular classification has permitted the identification of unresponsive high-risk patients, these approaches have proven too costly and complex to translate into clinical practice. Less expensive and more readily available methods are needed clinically to identify, at the time of diagnosis, MM patients who may benefit from more aggressive or experimental therapies. While protein-based tissue arrays offer such an alternative, biases introduced by "observer-dependent" scoring methods have limited their wide applicability. Methods: We designed a simplified, fully automated and quantitative protein expression-based classification system to accurately predict survival post-ASCT in a cost-effective and "observer-independent" manner. We constructed tissue microarrays using diagnostic bone marrow biopsies of 82 newly diagnosed MM patients uniformly treated with a dexamethasone-based induction regimen and frontline ASCT. Using the HistoRx PM-2000 quantitative immunohistochemistry platform, coupled with the AQUA analysis software, we examined the expression of the following proteins: FGFR3, which is associated with t(4;14); cyclin B2 and Ki-67, which are associated with cellular proliferation; TACI, which is associated with maf deregulation; and phospho-Y705 STAT3 and p65NF-κB, which are associated with myeloma cell growth and survival. For FGFR3, patients were divided into FGFR3-positive and -negative groups based on hierarchical clustering of their AQUA scores. For all other proteins examined, the top quartiles or quintiles of patients by AQUA score were classified as high-expression groups. Based on the univariate analysis, patients were further classified as "High Risk" MM if they had been identified as high expressers of TACI, p65NF-κB or FGFR3. The Kaplan-Meier method was used to estimate time to progression (TTP) and overall survival (OS). Multivariate analysis was performed using the Cox regression method. Results: 82 patients were included in this study. In univariate analysis, FGFR3 and p65NF-κB expression were associated with significantly shorter TTP (p=0.018 and p=0.009) but not OS (p=0.365 and p=0.104). TACI expression levels predicted worse OS (p=0.039) but not TTP (p=0.384). High expression of Ki-67 or phospho-Y705 STAT3 did not affect survival. Of the 82 cases, 67 were included in the multivariate analysis because they had AQUA scores available for all markers: 26 (38.8%) were considered High Risk by their AQUA scores and had significantly shorter TTP (p=0.014) and OS (p=0.006) compared with the Low Risk group. The median TTP for the Low and High Risk groups was 2.9 years and 1.9 years, respectively. The 5-year estimates of OS were 60.6% for the High Risk group versus 83.5% for the Low Risk group. Multivariate analysis was performed using del13q and our risk-group classification as variables. Both our risk-group classification and del13q were independent predictors of TTP, conferring 2.4- and 2.3-fold greater risks of relapse, respectively. Our risk-group classification was the only independent predictor of OS, with the High Risk group having a 5.9-fold greater risk of death. Conclusions: We have found that the expression of FGFR3, TACI, and p65NF-κB, in an automated and fully quantitative tissue-based array, is a powerful predictor of survival post-ASCT in MM and eliminates the "observer-dependent" bias of scoring TMAs. A validation of this "High Risk" TMA-based signature is currently underway in larger, independent cohorts.


Blood ◽  
2011 ◽  
Vol 118 (21) ◽  
pp. 3811-3811
Author(s):  
Drorit Merkel ◽  
Kalman Filanovsky ◽  
Ariel Aviv ◽  
Moshe E. Gatt ◽  
Yair Herishanu ◽  
...  

Abstract 3811 Background: Azacitidine is an effective therapy for high-risk myelodysplastic syndrome (MDS). Neutropenic fever is a common, life-threatening complication during azacitidine therapy, yet predicting it is challenging. Despite a number of large-scale prospective studies, there are no established indications for primary or secondary prophylactic antibiotics or for the use of granulocyte colony-stimulating factor (G-CSF) (Pierre Fenaux et al., Leukemia Research 2010). We used a retrospective survey of 98 high-risk MDS and AML patients treated with azacitidine to develop a model predicting infection during each cycle of azacitidine therapy. Methods: We retrospectively studied 82 high-risk MDS and 16 AML patients treated with 456 azacitidine cycles between September 2008 and July 2011 at 11 institutions in Israel. Complete blood count, creatinine and liver enzymes were documented prior to the initiation of each cycle. Results: The patients' median age was 71 (range 27–92), and 57 (58%) were male. Poor-risk cytogenetic abnormalities were detected in 30.8% (25 of the 82 patients with available cytogenetics), and 65 (67%) were transfusion dependent. The median interval between the initial diagnosis and the initiation of azacitidine therapy was 187 days (range 4 days – 18 years). Azacitidine was administered as first-line therapy in 24 (24%) of patients, 37 (38%) had failed growth factors, 5 (5%) were relapsing after allogeneic transplantation, and 32 (33%) had received other chemotherapies prior to azacitidine therapy. Dose and schedule data were available for 98% (446/456) of cycles. The prevalence of 7-day cycles of 75 mg/m², 5-day cycles of 75 mg/m², and attenuated doses was 50.4%, 30% and 16.9%, respectively. Adverse events were obtained from patients' charts. Thirteen major bleeding and 78 infection episodes (2.85% and 16.9% of all cycles) were recorded. Due to the low number of bleeding events, we focused on factors predicting infection episodes. Infection rates of 22.7%, 14.2% and 6.9% corresponded to the azacitidine dose (75 mg/m² for 7 days, 75 mg/m² for 5 days, and lower doses, respectively). Excluding 87 cycles of doses lower than 75 mg/m² for 5 days, predictors of infection were evaluated in 369 cycles. Nine parameters were included in the final analysis: age, sex, cytogenetics, transfusion dependence prior to the first cycle, time from diagnosis to the first cycle, azacitidine dose, and neutrophil, platelet and creatinine values prior to each cycle. The odds ratio for infection associated with the neutrophil count was higher than that for the ANC, so we used neutrophil counts as the predictor. For each cycle we considered a full 7-day vs. a 5-day schedule, neutrophils above or below 500 cells/mcL, platelets above or below 20,000 cells/mcL, and the creatinine level prior to the first day of the cycle. In univariate analysis, neutrophils below 500, platelets below 20,000, creatinine level, azacitidine dose and transfusion dependence were correlated with infection. In multivariate analysis (Table 1), transfusion dependence and platelets lower than 20,000 were the only significant parameters. The risk of infection was higher when a full seven-day cycle was administered, but this did not reach statistical significance (p=0.07). Conclusions: Transfusion dependence prior to the first cycle and platelets lower than 20,000 prior to each cycle are the main significant risk factors for infection during azacitidine therapy. Neutropenia and age are known risk factors for infection in general but were not significant in our study. We assume that in high-risk MDS patients, where most patients are old and neutropenic, thrombocytopenia is a surrogate marker of disease status that makes the patient more prone to infection. Physicians should therefore consider these two parameters prior to every azacitidine cycle when weighing concurrent prophylactic antibiotics, G-CSF, or a reduced azacitidine dose. Our findings should be confirmed in a larger sample set but may pave the road for prospective studies of infection prophylaxis during azacitidine therapy. Disclosures: No relevant conflicts of interest to declare.

