Impact of 100-Day Survival on Outcome of Myeloablative Unrelated Bone Marrow Transplantation at an Early Stage of Leukemias.

Blood ◽  
2006 ◽  
Vol 108 (11) ◽  
pp. 5400-5400
Author(s):  
Akira Hiraoka

There is a growing demand for quality improvement in hematopoietic stem cell transplantation, as in other fields of patient management. Indicators are necessary for quality control, which ultimately leads to a continuous quality improvement process. We analyzed the impact of 100-day survival on outcome in leukemia patients receiving a myeloablative unrelated BMT at an early stage of leukemia. Data on 1,203 patients who received a myeloablative unrelated BMT in the first CR of acute leukemia or the first chronic phase of CML between 1993 and 2001 were retrieved from the Japan Marrow Donor Program Registry database. Forty hospitals performed more than 10 BMTs during this period. A significantly lower 100-day survival than the average of all patients (n=822) was found in six hospitals (group 1); the remaining 34 hospitals formed group 2. Overall survival (OS) at 1 year was 0.56 in group 1 (n=125) and 0.75 in group 2 (n=697) (p<0.001) (Figure). The 100-day survival was 0.66 (83/125) in group 1 and 0.90 (625/697) in group 2 (p<0.001). OS at 1 year for survivors beyond 100 days was 0.828 in group 1 and 0.832 in group 2 (p=0.9) (Figure). After adjustment for age (≤15, 16–39, ≥40), type of leukemia (CML, ALL, ANLL), and group (1, 2) using a Cox regression model, patients in group 1 had a higher risk of death than patients in group 2 (hazard ratio 2.14; 95% CI 1.58–2.90). In summary, among hospitals that performed more than 10 BMTs during the study period, 6 hospitals showed a significantly lower 100-day survival than the average of all patients; OS at 1 year for survivors beyond 100 days in these 6 hospitals was similar to that in the remaining 34 hospitals; and after adjustment for age, type of leukemia, and hospital group using the Cox regression model, patients in these 6 hospitals had a higher risk of death than patients in the remaining 34 hospitals. 
Thus, 100-day survival might serve as a quality-control indicator in leukemia patients who receive a myeloablative unrelated BMT at an early stage of leukemia.
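The landmark comparison above (1-year OS among 100-day survivors) follows directly from the definition of conditional survival, S(t | t0) = S(t)/S(t0). A minimal sketch of that arithmetic (illustrative only, not the registry's analysis code; the inputs merely approximate the reported proportions):

```python
def conditional_survival(s_t: float, s_landmark: float) -> float:
    """Survival to time t conditional on surviving to a landmark time t0:
    S(t | t0) = S(t) / S(t0)."""
    if not 0 < s_landmark <= 1 or not 0 <= s_t <= s_landmark:
        raise ValueError("need 0 <= S(t) <= S(t0) <= 1")
    return s_t / s_landmark

# Group 2 as reported: 1-year OS 0.75, 100-day survival 0.90
print(round(conditional_survival(0.75, 0.90), 3))  # → 0.833
```

With the reported group 2 values this gives about 0.83, consistent with the 0.832 observed among 100-day survivors; the same calculation for group 1 (0.56/0.66 ≈ 0.85) is likewise close to the reported 0.828, the residual gap reflecting that the published figures come from the full time-to-event data rather than two point estimates.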

2021 ◽  
Vol 8 (3) ◽  
pp. 159-170
Author(s):  
Paweł Korczyc ◽  
Jędrzej Chrzanowski ◽  
Arkadiusz Stasiak ◽  
Joanna Stasiak ◽  
Andrzej Bissinger ◽  
...  

Aim: Our study aimed to identify the clinical variables associated with long-term mortality after MI and to construct a simple, easy-to-use model for predicting 5-year mortality after MI in clinical practice. Material and Methods: This is a prospective 5-year observational study of MI patients admitted to the Department of Cardiology at the Copernicus Memorial Hospital in Lodz in 2010 and 2011. The data were collected during hospitalization and again after 1 and 5 years. A multi-factor, multi-level Cox regression model was constructed to investigate the impact of clinical factors on long-term survival. Results: 92 patients (39 STEMI, 53 NSTEMI) were included in the study and their data were used to construct a Cox regression model with satisfactory fit (R²=0.7945). Factors associated with an increase in 5-year risk were age (HR=1.06, 95%CI: 1.01-1.11), SYNTAX score (HR=1.05, 95%CI: 1.02-1.08), WBC level (HR=1.16, 95%CI: 1.08-1.26), and glycemia at enrollment (HR=1.01, 95%CI: 1.01-1.01). Higher HDL at enrollment was associated with a decrease in 5-year risk (HR=0.97, 95%CI: 0.93-0.99). Conclusion: The model we created is a valuable tool that is useful and easy to employ in everyday practice for assessing the 5-year prognosis of patients after MI. What is new: The study presents a new model for predicting 5-year mortality after myocardial infarction. The model is based on simple clinical parameters and may be applied in everyday practice.
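Under a Cox model, per-unit hazard ratios like those reported above combine multiplicatively into a relative-risk score via the linear predictor: relative hazard = exp(Σ ln(HR_i)·(x_i − ref_i)). The sketch below is a hypothetical illustration using the reported HRs; the reference and patient values are invented for the example and are not from the study:

```python
import math

# Reported per-unit hazard ratios (see abstract); units per year/point/unit.
HAZARD_RATIOS = {
    "age": 1.06,
    "syntax": 1.05,
    "wbc": 1.16,
    "glycemia": 1.01,
    "hdl": 0.97,  # protective: HR < 1
}

def relative_hazard(patient: dict, reference: dict) -> float:
    """Hazard of `patient` relative to `reference` under a Cox model:
    exp(sum of beta_i * (x_i - ref_i)) with beta_i = ln(HR_i)."""
    lp = sum(math.log(HAZARD_RATIOS[k]) * (patient[k] - reference[k])
             for k in HAZARD_RATIOS)
    return math.exp(lp)

# Invented example values, purely for illustration:
ref = {"age": 60, "syntax": 10, "wbc": 8, "glycemia": 100, "hdl": 50}
pt = {"age": 70, "syntax": 20, "wbc": 12, "glycemia": 140, "hdl": 40}
print(round(relative_hazard(pt, ref), 2))
```

This is the arithmetic behind any "simple clinical score" built from a fitted Cox model; the actual model's baseline hazard and covariate coding are not reproduced here.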


2009 ◽  
Vol 27 (15_suppl) ◽  
pp. 8080-8080
Author(s):  
L. E. Raez ◽  
T. Koru-Sengul ◽  
G. Allen ◽  
J. Clarke ◽  
E. S. Santos ◽  
...  

8080 Background: There are differences in treatment outcomes of non-small cell lung cancer (NSCLC) patients (pts) between non-Hispanic whites (NHW) and African Americans (AA). Little is known regarding the outcomes of Hispanics (H). Methods: Registry data on 2,696 pts with NSCLC treated during 1999–2006 were obtained. The objective of the study was to evaluate differences in NSCLC survival across ethnicities. The chi-square test was used to compare the distribution of tumor stage. Survival curves were compared using the log-rank test for each tumor stage. Adjusted hazard ratios (AHR) and 95% confidence intervals (95% CI) were reported based on a multivariate Cox regression model for overall survival (OS) with adjustment for gender, age at diagnosis, and race. Results: Most pts had stage III/IV disease at diagnosis; the majority of AA and H presented at an advanced stage compared with NHW. Significantly higher proportions of AA and H were diagnosed with stage IV compared to NHW (Table). Mean age at diagnosis was 62 yrs (AA 58, H 60, and NHW 66 yrs) and differed significantly among the 3 groups (one-way ANOVA, p<0.0001). AA and H had significantly shorter stage-specific median survival for early-stage disease compared with NHW (Table). In pts with advanced stages the pattern was similar: AA and H had a significantly shorter median survival than NHW. In early-stage pts, significant predictors of OS from the multivariate Cox regression model were female gender (AHR=0.65; p<0.001), AA (NHW as the referent group; AHR=2.67; p<0.0001), and H (NHW as the referent group; AHR=2.01; p<0.0001). In late-stage pts, significant predictors of OS were female gender (AHR=0.79; p=0.0002), AA (NHW as the referent group; AHR=1.53; p<0.0001), and H (NHW as the referent group; AHR=1.28; p=0.0006). 
Conclusions: NHW pts had better OS than H and AA; we will evaluate whether gene expression profiles or presence of EGFR overexpression have an impact on racial/ethnic disparities in the outcome of NSCLC. [Table: see text] No significant financial relationships to disclose.
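The stage-specific survival comparisons above rely on the log-rank test. A compact, dependency-free sketch of the two-group statistic on made-up data (toy code, not the registry analysis):

```python
def logrank_statistic(times1, events1, times2, events2):
    """Two-group log-rank chi-square statistic (1 df).
    times*: follow-up times; events*: 1 = death, 0 = censored.
    Assumes each group contributes at least one event."""
    data = [(t, e, 0) for t, e in zip(times1, events1)] + \
           [(t, e, 1) for t, e in zip(times2, events2)]
    event_times = sorted({t for t, e, _ in data if e == 1})
    o_minus_e = 0.0
    var = 0.0
    for t in event_times:
        n = sum(1 for ti, _, _ in data if ti >= t)              # at risk, total
        n1 = sum(1 for ti, _, g in data if ti >= t and g == 0)  # at risk, group 1
        d = sum(1 for ti, e, _ in data if ti == t and e == 1)   # deaths at t
        d1 = sum(1 for ti, e, g in data if ti == t and e == 1 and g == 0)
        o_minus_e += d1 - d * n1 / n       # observed minus expected in group 1
        if n > 1:                          # hypergeometric variance term
            var += d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
    return o_minus_e ** 2 / var
```

The statistic is referred to a chi-square distribution with 1 degree of freedom; values above 3.84 correspond to p < 0.05.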


BMJ Open ◽  
2022 ◽  
Vol 12 (1) ◽  
pp. e054069
Author(s):  
Marianna Meschiari ◽  
Alessandro Cozzi-Lepri ◽  
Roberto Tonelli ◽  
Erica Bacca ◽  
Marianna Menozzi ◽  
...  

Objective: The first COVID-19 epidemic wave spanned February–May 2020. Since 1 October 2020, Italy, like many other European countries, has faced a second wave. The aim of this analysis was to compare 28-day mortality between the two waves among hospitalised COVID-19 patients. Design: Observational cohort study. Standard survival analysis was performed to compare all-cause mortality within 28 days of hospital admission in the two waves. Kaplan-Meier curves as well as Cox regression model analysis were used. The effect of wave on risk of death was expressed by means of HRs with 95% CIs. A sensitivity analysis around the impact of the circulating variant as a potential unmeasured confounder was performed. Setting: University Hospital of Modena, Italy. Patients admitted to the hospital for severe COVID-19 pneumonia during the first (22 February–31 May 2020) and second (1 October–31 December 2020) waves were included. Results: During the two study periods, a total of 1472 patients with severe COVID-19 pneumonia were admitted to our hospital, 449 during the first wave and 1023 during the second. Median age was 70 years (IQR 56–80), 37% were women, 49% had PaO2/FiO2 <250 mm Hg, 82% had ≥1 comorbidity, and median duration of symptoms was 6 days. The 28-day mortality rate was 20.0% (95% CI 16.3 to 23.7) during the first wave vs 14.2% (95% CI 12.0 to 16.3) in the second (log-rank test p value=0.03). After including key predictors of death in the multivariable Cox regression model, the data still strongly suggested a lower 28-day mortality rate in the second wave (aHR=0.64, 95% CI 0.45 to 0.90, p value=0.01). Conclusions: In our hospitalised patients with severe COVID-19 pneumonia, 28-day mortality appeared to be reduced by 36% during the second wave as compared with the first. Further studies are needed to identify factors that may have contributed to this improved survival.
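The Kaplan-Meier curves used above are product-limit estimates: at each event time, survival is multiplied by (1 − deaths/at-risk), with censored patients leaving the risk set without contributing a factor. A minimal, dependency-free sketch on toy data (illustrative, not the study code):

```python
from itertools import groupby

def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimate.
    events: 1 = death, 0 = censored.
    Returns the step function as a list of (event_time, S(t)) pairs."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    s = 1.0
    curve = []
    for t, grp in groupby(data, key=lambda x: x[0]):
        grp = list(grp)
        deaths = sum(e for _, e in grp)
        if deaths:
            s *= 1 - deaths / at_risk
            curve.append((t, s))
        at_risk -= len(grp)   # deaths and censorings both leave the risk set
    return curve

print(kaplan_meier([1, 2, 3, 4], [1, 1, 0, 1]))
```

The log-rank test then compares two such curves, and the Cox model adds covariate adjustment on top of the same risk-set bookkeeping.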


2019 ◽  
Author(s):  
Huamao Ye ◽  
Xiang Feng ◽  
Yang Wang ◽  
Rui Chen ◽  
Meimian Hua ◽  
...  

Abstract Background: The effect of diagnostic ureteroscopy (DURS) on intravesical recurrence (IVR) after radical nephroureterectomy (RNU) is controversial. To investigate the impact of DURS, we carried out this single-center retrospective study applying propensity-score matching (PSM) and a Cox regression model. Patients and Methods: The data of 160 patients with pTa-pT3 upper tract urothelial carcinoma (UTUC) were analyzed. Eighty-six patients underwent DURS (DURS group) and 74 did not (control group). The DURS group was further divided into a synchronous DURS group (DURS followed by immediate RNU, n=45) and a non-synchronous DURS group (DURS followed by delayed RNU, n=41). Baseline confounders were corrected by PSM. The impact of DURS on IVR was assessed by Kaplan-Meier analysis in the PSM cohort and by a Cox regression model in the full data set. Results: The median follow-up time was 40.4 months. There was no difference in 3-year IVR-free survival (IVRFS) between the DURS group and the control group (72.6% vs. 65.3%, p=0.263). In subgroup analysis, the 3-year IVR-free survival of the non-synchronous DURS group (51.4%) was significantly lower than that of the synchronous DURS group (78.3%) or the control group (72.6%) (p=0.027). Further Cox regression analysis showed that non-synchronous DURS (HR 1.481, 95% CI 1.031-2.127, p=0.034) was an independent risk factor for postoperative IVR. Conclusions: Non-synchronous DURS is not recommended for the diagnosis and preoperative evaluation of UTUC, because it could raise the risk of IVR after RNU. For UTUC patients in need of DURS, synchronous DURS could be a safer choice than non-synchronous DURS in terms of lowering IVR risk.
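Propensity-score matching, as applied above, pairs each DURS patient with a control of similar propensity score so that baseline confounders are balanced. A toy greedy 1:1 nearest-neighbor matcher with a caliper, one common variant (the study's actual matching algorithm and caliper are not specified, so this is an assumption):

```python
def greedy_match(treated, control, caliper=0.05):
    """Greedy 1:1 nearest-neighbor matching on propensity score.
    treated/control: dicts mapping patient id -> propensity score.
    A treated patient is matched only if the closest remaining control
    lies within `caliper`; matched controls are used at most once."""
    pairs = []
    available = dict(control)
    for t_id, t_score in sorted(treated.items(), key=lambda kv: kv[1]):
        if not available:
            break
        c_id = min(available, key=lambda c: abs(available[c] - t_score))
        if abs(available[c_id] - t_score) <= caliper:
            pairs.append((t_id, c_id))
            del available[c_id]
    return pairs

print(greedy_match({"t1": 0.30, "t2": 0.50},
                   {"c1": 0.31, "c2": 0.52, "c3": 0.90}))
```

After matching, outcome comparisons (here, Kaplan-Meier curves of IVR-free survival) are run on the matched cohort only.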


Blood ◽  
2006 ◽  
Vol 108 (11) ◽  
pp. 4703-4703
Author(s):  
Stefano Sacchi ◽  
Samantha Pozzi ◽  
Luigi Marcheselli ◽  
Alessia Bari ◽  
Stefano Luminari ◽  
...  

Abstract Some data suggest that there was no improvement in the survival of FL Pts over the last three decades of the 20th century. However, that review ended in 1992, before the introduction of R treatment. More recently reported data show that evolving chemotherapies, including the incorporation of R, have led to improved outcomes. Between 1994 and 2004, 344 Pts with FL were enrolled in different GISL trials. For the purpose of this study we considered 270 Pts with similar characteristics enrolled in trials that did or did not include R. The first group comprises 176 naive Pts treated with Anthracycline plus Fludarabine containing regimens (Cohort #1: 125 Pts) or plus R (Cohort #2: 51 Pts). The second group comprises 99 relapsed Pts treated with Anthracycline plus Fludarabine containing regimens (Cohort #3: 40 Pts) or plus R (Cohort #4: 59 Pts). To evaluate the impact of incorporating R into front-line and salvage therapies we assessed OS, FFS, TTF, and SAR in these different cohorts of Pts. Descriptive analysis of prognostic features showed differences in distribution among groups. To compensate for these variations we also performed Cox regression analysis. Previously untreated patients: In groups #1 and #2, which enrolled Pts with clinical stage IIB, III, and IV, FFS and OS according to treatment did not show any statistical difference. Univariate analysis of baseline clinical features showed an impact on OS and FFS for clinical stage, LDH level, involvement of more than 4 nodal sites, and presence of extranodal involvement. The prevalence of these characteristics was higher in group #2 than in group #1. Thus the FFS of group #2 vs. group #1 was adjusted for variation in prognostic features by Cox regression analysis, which showed a 40% reduction in failure hazard ratio (HR) in Pts who received R. 
Because of the difference in follow-up (FU) (49 months in Cohort #1 vs 21 months in Cohort #2), to evaluate differences in OS we used the exact log-rank test for unequal FU. So far, a trend exists toward better OS in R-treated patients, although the difference is not statistically significant. Relapsed patients: Clinical characteristics were similar in the two cohorts of Pts. TTF was better in R-treated Pts and the difference was statistically significant (66% vs. 53% at 3 yrs, p=0.023). The analysis of SAR demonstrated a better result for the R cohort, with a statistically significant difference (88% vs. 68% at 3 yrs, p=0.022). OS according to treatment protocol showed an advantage for patients in the R cohort, and the difference was statistically significant (92% vs. 70% at 5 yrs, p=0.004). Conclusion: In naïve patients our retrospective analysis showed a reduction in HR for FFS and a trend toward better OS in R-treated Pts. In relapsed Pts all outcome parameters (OS, TTF, and SAR) showed significant improvement in the cohort treated with R. Although any comparison between nonrandomized groups may be subject to differences in observed and unobserved prognostic features, we believe that improvements have occurred in the management of FL Pts with the introduction of chemotherapy combined with R.


PeerJ ◽  
2019 ◽  
Vol 7 ◽  
pp. e6350 ◽  
Author(s):  
Jianfei Fu ◽  
Hang Ruan ◽  
Hongjuan Zheng ◽  
Cheng Cai ◽  
Shishi Zhou ◽  
...  

Objective This study was performed to identify a reasonable cutoff age for defining older patients with colorectal cancer (CRC) and to examine whether old age was related to increased colorectal cancer-specific death (CSD) and poor colorectal cancer-specific survival (CSS). Methods A total of 76,858 eligible patients from the Surveillance, Epidemiology, and End Results (SEER) database were included in this study. The Cox proportional hazards regression model and the Chow test were used to determine a suitable cutoff age for defining the older group. Furthermore, a propensity score matching analysis was performed to adjust for heterogeneity between groups. A competing risk regression model was used to explore the impact of age on CSD and non-colorectal cancer-specific death (non-CSD). Kaplan–Meier survival curves were plotted to compare CSS between groups. Also, a Cox regression model was used to validate the results. External validation was performed on data from 1998 to 2003 retrieved from the SEER database. Results Based on a cutoff age of 70 years, the examined cohort of patients was classified into a younger group (n = 51,915, <70 years old) and an older group (n = 24,943, ≥70 years old). Compared with younger patients, older patients were more likely to have fewer lymph nodes sampled and were less likely to receive chemotherapy and radiotherapy. When adjusted for other covariates, age-dependent differences in 5-year CSD and 5-year non-CSD were significant between the younger and older groups (15.84% vs 22.42%, P < 0.001; 5.21% vs 14.21%, P < 0.001). Also, an age of ≥70 years remained associated with worse CSS compared with the younger group (subdistribution hazard ratio 1.51; 95% confidence interval (CI) [1.45–1.57], P < 0.001). The Cox regression model, as a sensitivity analysis, gave a similar result. External validation also supported an age of 70 years as a suitable cutoff, and the older group was associated with reduced CSS and increased CSD. 
Conclusions An age of 70 years is a suitable cutoff for defining elderly CRC. Elderly CRC was associated not only with increased non-CSD but also with increased CSD. Further research is needed to provide evidence of whether elderly CRC cases should receive more intensive treatment where possible.
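The competing-risk model above estimates cumulative incidence rather than 1 − Kaplan-Meier, because deaths from other causes remove patients from the risk of cancer-specific death and naive KM would overstate CSD. A bare-bones Aalen-Johansen-style estimator on toy data (illustrative sketch, not the study's implementation):

```python
from itertools import groupby

def cumulative_incidence(times, causes, cause=1):
    """Cumulative incidence function for one cause in the presence of
    competing risks (Aalen-Johansen estimator).
    causes: 0 = censored, 1, 2, ... = distinct event causes."""
    data = sorted(zip(times, causes))
    at_risk = len(data)
    overall_surv = 1.0   # survival from all causes combined, just before t
    cif = 0.0
    curve = []
    for t, grp in groupby(data, key=lambda x: x[0]):
        grp = list(grp)
        d_all = sum(1 for _, c in grp if c > 0)
        d_cause = sum(1 for _, c in grp if c == cause)
        if d_cause:
            # increment uses overall survival just BEFORE time t
            cif += overall_surv * d_cause / at_risk
            curve.append((t, cif))
        if d_all:
            overall_surv *= 1 - d_all / at_risk
        at_risk -= len(grp)
    return curve
```

Summed over all causes, the cumulative incidence functions add up to 1 − (all-cause KM), which is the consistency check that 1 − KM per cause fails when competing events are frequent, as with non-CSD in older patients.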


2020 ◽  
Author(s):  
Vincenzo De Marzo ◽  
Antonio Di Biagio ◽  
Roberta Della Bona ◽  
Antonio Vena ◽  
Eleonora Arboscello ◽  
...  

Abstract Background: Increases in cardiac troponin (cTn) in coronavirus disease 2019 (COVID-19) have been associated with worse prognosis. Nonetheless, data on the significance of cTn in elderly subjects with COVID-19 are lacking. Methods: From a registry of consecutive patients with COVID-19 admitted to a hub hospital in Italy from 25/02/2020 to 03/07/2020, we selected those ≥60 years old with cTnI measured within 3 days of the molecular diagnosis of SARS-CoV-2 infection. When available, a second cTnI value within 48 hours was also extracted. The relationship between increased cTnI and all-cause in-hospital mortality was evaluated by a Cox regression model and restricted cubic spline functions with three knots. Results: Of 343 included patients (median age 75.0 (68.0-83.0) years, 34.7% men), 88 (25.7%) had cTnI above the upper reference limit (0.046 µg/L). Patients with increased cTnI had more comorbidities, greater impairment of respiratory exchange, and higher inflammatory markers on admission than those with normal cTnI. Furthermore, they died more often (73.9% vs. 37.3%, p<0.001) over 15 (6-25) days of hospitalization. The association of elevated cTnI with mortality was confirmed by the adjusted Cox regression model (HR: 1.61, 95%CI: 1.06-2.52, p=0.039) and was linear up to 0.3 µg/L, with a subsequent plateau. Of 191 (55.7%) patients with a second cTnI measurement, 49 (25.7%) had an increasing trend, which was not associated with mortality (univariate HR 1.39, 95%CI 0.87-2.22, p=0.265). Conclusions: In elderly COVID-19 patients, an initial increase in cTn is common and predicts a higher risk of death. Serial cTn testing may not confer additional prognostic information.
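A restricted cubic spline with three knots, as used above, adds a single nonlinear term to the linear cTnI effect and is constrained to be linear beyond the outer knots, which is what allows a dose-response curve to rise and then plateau. A sketch of the standard (Harrell-style) nonlinear basis term; the knot locations in the test are illustrative assumptions, not the study's:

```python
def rcs_basis(x, knots):
    """Nonlinear basis term of a restricted cubic spline with 3 knots
    (Harrell's parameterization, scaled by (t3 - t1)^2). Together with
    x itself, this gives the two regressors entering the Cox model."""
    t1, t2, t3 = knots
    plus = lambda v: max(v, 0.0) ** 3   # truncated cubic (v)+^3
    term = (plus(x - t1)
            - plus(x - t2) * (t3 - t1) / (t3 - t2)
            + plus(x - t3) * (t2 - t1) / (t3 - t2))
    return term / (t3 - t1) ** 2
```

The two subtraction terms are chosen so the cubic and quadratic coefficients cancel for x beyond the last knot, leaving the fitted curve linear in both tails.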


2012 ◽  
Vol 30 (15_suppl) ◽  
pp. e14163-e14163
Author(s):  
Ikenna Osuorji ◽  
Greg Dyson ◽  
Durga Yerasuri ◽  
Philip Agop Philip ◽  
Anthony Frank Shields ◽  
...  

e14163 Background: Metastatic colorectal disease is generally incurable and treatment is palliative, with the intent of balancing toxicity against quality of life. The COIN 3 trial showed that pre-chemotherapy platelet counts >400,000 per μL were associated with poor survival when intermittent chemotherapy was used, whereas patients with lower platelet counts showed no significant difference between the intermittent and continuous chemotherapy arms. Methods: We retrospectively reviewed 775 stage IV colorectal cancer patients at Karmanos Cancer Center over a 10-year period to determine whether a high platelet count was associated with poor outcome irrespective of treatment. Our analysis included 480 patients with adenocarcinoma who had not received chemotherapy prior to referral and for whom information on baseline platelet count, race, and age was available. We also analyzed the impact of race, age, and bevacizumab use. We used a Cox regression model for analysis. Results: Among the patients, 48.3% were African American (AA) and 51.7% were Caucasian (C); 34.4% had PLT >400,000 per μL. For those with lower platelet counts, median survival was 26.2 months in the C group and 14.1 months in the AA group. Patients with platelets above 400,000 had a median survival of 15.2 months for C and 12.6 months for AA. Cox regression analysis showed the following hazard ratios for death: 1.16 (1.07-1.26), p<0.001, for age (per 10 yrs); 1.60 (1.31-1.94), p<0.001, for race [AA versus C (ref)]; and 1.35 (1.10-1.65), p=0.004, for platelet count [>400,000 versus lower]. In subset analysis, 296 (61.7%) patients who received chemotherapy had data on bevacizumab (B) use. Among the 31.7% who received B, median survival was 25.6 months compared to 14.1 months in the no-B arm. A Cox regression model using B as a stratification variable showed that the impact of race {hazard ratio = 1.32 (1.02-1.69), p=0.03} and platelet count {hazard ratio = 1.27 (0.97-1.65), p=0.08} was much smaller. 
Conclusions: A pre-chemotherapy platelet count <400,000, C race, and younger age are associated with improved survival. Use of bevacizumab may mitigate the impact of these factors.


2020 ◽  
Author(s):  
Karine A Al Feghali ◽  
Samantha M Buszek ◽  
Hesham Elhalawani ◽  
Neil Chevli ◽  
Pamela K Allen ◽  
...  

Abstract Background This retrospective study investigated the impact of Karnofsky performance status (KPS), in addition to age, on the management and outcomes of elderly patients with glioblastoma (GBM). Methods The National Cancer Database was queried between 2004 and 2015 for GBM patients aged 60 years and older. Three age groups were created: 60 to 69, 70 to 79, and 80 years and older, along with 4 age/KPS groups: "age ≥ 60/KPS < 70" (group 1), "age 60 to 69/KPS ≥ 70" (group 2), "age 70 to 79/KPS ≥ 70" (group 3), and "age ≥ 80/KPS ≥ 70" (group 4). Multivariable (MVA) modeling with Cox regression determined predictors of overall survival (OS), and estimated average treatment effects analysis was performed. Results A total of 48,540 patients with a median age of 70 years (range, 60-90 years) at diagnosis and a median follow-up of 6.8 months (range, 0-151 months) were included. Median survival was 5.0, 15.2, 9.6, and 6.8 months in groups 1, 2, 3, and 4, respectively (P < .001). On treatment effects analysis, all groups survived longer with combined chemotherapy (ChT) and radiation therapy (RT), except group 1, which survived longer with ChT alone (P < .001). RT alone was associated with the worst OS in all groups (P < .01). Across all groups, predictors of worse OS on MVA were older age, lower KPS, White race, higher comorbidity score, worse socioeconomic status, community treatment, tumor multifocality, subtotal resection, and no adjuvant treatment (all P < .01). Conclusions In elderly patients with newly diagnosed GBM, those with good KPS fared best with combined ChT and RT across all age groups. Performance status is a key prognostic factor that should be considered in management decisions for these patients.


Blood ◽  
2020 ◽  
Vol 135 (16) ◽  
pp. 1386-1395 ◽  
Author(s):  
Johannes Schetelig ◽  
Henning Baldauf ◽  
Falk Heidenreich ◽  
Carolin Massalski ◽  
Sandra Frank ◽  
...  

Abstract Several studies suggest that harnessing natural killer (NK) cell reactivity mediated through killer cell immunoglobulin-like receptors (KIRs) could reduce the risk of relapse after allogeneic hematopoietic cell transplantation. Based on one promising model, information on KIR2DS1 and KIR3DL1 and their cognate ligands can be used to classify donors as KIR-advantageous or KIR-disadvantageous. This study was aimed at externally validating this model in unrelated donor hematopoietic cell transplantation. The impact of the predictor on overall survival (OS) and relapse incidence was tested in a Cox regression model adjusted for patient age, a modified disease risk index, Karnofsky performance status, donor age, HLA match, sex match, cytomegalovirus match, conditioning intensity, type of T-cell depletion, and graft type. Data from 2222 patients with acute myeloid leukemia or myelodysplastic syndrome were analyzed. KIR genes were typed by using high-resolution amplicon-based next-generation sequencing. In univariable analyses and subgroup analyses, OS and the cumulative incidence of relapse of patients with a KIR-advantageous donor were comparable to patients with a KIR-disadvantageous donor. The adjusted hazard ratio from the multivariable Cox regression model was 0.99 (Wald test, P = .93) for OS and 1.04 (Wald test, P = .78) for relapse incidence. We also tested the impact of activating donor KIR2DS1 and inhibition by KIR3DL1 separately but found no significant impact on OS and the risk of relapse. Thus, our study shows that the proposed model does not universally predict NK-mediated disease control. Deeper knowledge of NK-mediated alloreactivity is necessary to predict its contribution to graft-versus-leukemia reactions and to eventually use KIR genotype information for donor selection.
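The Wald tests quoted above divide the log hazard ratio by its standard error; when a publication reports only the HR and its 95% CI, the standard error can be recovered from the CI width and the p-value reconstructed. A generic helper for that arithmetic (hedged sketch of standard statistical practice, not this study's code; the example values in the test are invented):

```python
import math

def wald_p_from_ci(hr, lower, upper):
    """Two-sided Wald p-value for a hazard ratio, recovering the
    standard error of log(HR) from its 95% CI:
        se = (ln(upper) - ln(lower)) / (2 * 1.96)
    then z = |ln(HR)| / se and p = 2 * P(Z > z)."""
    se = (math.log(upper) - math.log(lower)) / (2 * 1.959964)
    z = abs(math.log(hr)) / se
    # standard normal two-sided tail via the complementary error function
    return math.erfc(z / math.sqrt(2))
```

This kind of check is useful when externally validating published models, as done here, since it lets reviewers confirm that reported HRs, CIs, and p-values are mutually consistent.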

