QOL-49. THE IMPACT OF OTOTOXICITY AND VISUAL IMPAIRMENT ON EDUCATION IN CHILDREN TREATED FOR CNS TUMOURS

2020 ◽  
Vol 22 (Supplement_3) ◽  
pp. iii440-iii440
Author(s):  
Harriet Dulson ◽  
Rachel McAndrew ◽  
Mark Brougham

Abstract INTRODUCTION Children treated for CNS tumours experience a very high burden of adverse effects. Platinum-based chemotherapy and cranial radiotherapy can cause ototoxicity, which may be particularly problematic in patients who have impaired vision and cognition as a result of their tumour and associated treatment. This study assessed the prevalence of impaired hearing and vision and how these may impact upon education. METHODS 53 patients diagnosed with solid tumours in Edinburgh, UK between August 2013 and August 2018 were included in the study. Patients were split into three groups according to treatment received: Group 1 – cisplatin-based chemotherapy and cranial radiotherapy; Group 2 – platinum-based chemotherapy, no cranial radiotherapy; Group 3 – benign brain tumours treated with surgery only. Data were collected retrospectively from patient notes. RESULTS Overall, 69.5% of those treated with platinum-based chemotherapy experienced ototoxicity as assessed by Brock grading, and 5.9% of patients had reduced visual acuity. Patients in Group 1 had the highest prevalence of both. 44.4% of patients in Group 1 needed increased educational support following treatment, either with extra support in the classroom or because they were unable to continue in mainstream school; 12.5% of Group 2 patients and 31.3% of Group 3 patients required such support. CONCLUSIONS Children with CNS tumours frequently require support for future education, but those treated with both platinum-based chemotherapy and cranial radiotherapy are at particular risk, which may be compounded by co-existent ototoxicity and visual impairment. It is essential to provide appropriate support for this patient cohort in order to maximise their educational potential.
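
As a rough illustration of how the group-wise percentages above could be tabulated from patient notes, here is a minimal pandas sketch; the dataset and the column names (`group`, `ototoxic`, `needs_support`) are hypothetical, not from the study.

```python
import pandas as pd

# Hypothetical patient-level records; column names are illustrative only.
df = pd.DataFrame({
    "group":         [1, 1, 2, 3, 2, 1, 3],                        # treatment group
    "ototoxic":      [True, True, False, False, True, False, False],
    "needs_support": [True, False, False, True, False, True, True],
})

# Per-group prevalence (%) of ototoxicity and of increased educational support,
# the quantities reported in the abstract.
summary = df.groupby("group")[["ototoxic", "needs_support"]].mean() * 100
print(summary.round(1))
```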

2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Yu Liu ◽  
Jing Li ◽  
Wanyu Zhang ◽  
Yihong Guo

Abstract Oestradiol (E2), an important hormone in follicular development and endometrial receptivity, is closely related to clinical outcomes of fresh in vitro fertilization-embryo transfer (IVF-ET) cycles. A supraphysiologic E2 level is inevitable during controlled ovarian hyperstimulation (COH), and its effect on the outcome of IVF-ET is controversial. The aim of this retrospective study was to evaluate the association between elevated serum oestradiol (E2) levels on the day of human chorionic gonadotrophin (hCG) administration and neonatal birthweight after IVF-ET cycles. The data of 3659 infertile patients with fresh IVF-ET cycles were analysed retrospectively between August 2009 and February 2017 in the First Hospital of Zhengzhou University. Patients were categorized by serum E2 levels on the day of hCG administration into six groups: group 1 (serum E2 ≤ 1000 pg/mL, n = 230), group 2 (1001–2000 pg/mL, n = 524), group 3 (2001–3000 pg/mL, n = 783), group 4 (3001–4000 pg/mL, n = 721), group 5 (4001–5000 pg/mL, n = 548), and group 6 (> 5000 pg/mL, n = 852). Univariate linear regression was used to evaluate the independent correlation between each factor and the outcome index, and multiple logistic regression was used to adjust for confounding factors. The low birthweight (LBW) rates were 3.0% (group 1), 2.9% (group 2), 1.9% (group 3), 2.9% (group 4), 2.9% (group 5), and 2.0% (group 6), respectively (P = 0.629); there were no statistically significant differences in the incidence of neonatal LBW among the six groups. We did not detect an association between peak serum E2 level during ovarian stimulation and neonatal birthweight after IVF-ET, and no association was found between higher E2 levels and increased LBW risk. Our observations suggest that the hyper-oestrogenic milieu during COH does not seem to have adverse effects on the birthweight of offspring after IVF. Although this study provides some reference, obstetric-related factors were not included for historical reasons; the impact of the high-oestrogen environment during COH on the birthweight of IVF offspring still needs future research.
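
The grouping and adjustment steps described above can be sketched in a few lines of Python. This is not the authors' code: the column names (`e2_pg_ml`, `lbw`) and the synthetic data are assumptions, and a real analysis would add the confounders as extra covariates.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic stand-in for the patient table; columns are hypothetical.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "e2_pg_ml": rng.uniform(300, 8000, 3659),   # E2 on the day of hCG
    "lbw": rng.binomial(1, 0.025, 3659),        # low birthweight, 0/1
})

# Six groups with the study's cut-points: <=1000, 1001-2000, ..., >5000 pg/mL.
bins = [0, 1000, 2000, 3000, 4000, 5000, np.inf]
df["e2_group"] = pd.cut(df["e2_pg_ml"], bins=bins, labels=range(1, 7))

# Logistic regression of LBW on E2 group (group 1 as the reference category).
X = sm.add_constant(pd.get_dummies(df["e2_group"], prefix="g", drop_first=True).astype(float))
fit = sm.Logit(df["lbw"], X).fit(disp=0)
print(np.exp(fit.params))   # odds ratios vs. group 1
```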


2020 ◽  
Vol 79 (Suppl 1) ◽  
pp. 340.2-341
Author(s):  
V. Orefice ◽  
F. Ceccarelli ◽  
C. Barbati ◽  
R. Lucchetti ◽  
G. Olivieri ◽  
...  

Background: Systemic lupus erythematosus (SLE) is an autoimmune disease mainly affecting women of childbearing age. The interplay between genetic and environmental factors may contribute to disease pathogenesis1. To date, no robust data are available about the possible contribution of diet in SLE. Caffeine, one of the most widely consumed products in the world, seems to interact with multiple components of the immune system by acting as a non-specific phosphodiesterase inhibitor2. In vitro, dose-dependent treatment with caffeine seems to down-regulate mRNA levels of key inflammation-related genes and similarly reduce levels of different pro-inflammatory cytokines3. Objectives: We evaluated the impact of caffeine consumption on SLE-related disease phenotype and activity, in terms of clinimetric assessment and cytokine levels. Methods: We performed a cross-sectional study, enrolling consecutive patients and recording their clinical and laboratory data. Disease activity was assessed by the SLE Disease Activity Index 2000 (SLEDAI-2K)4. Caffeine intake was evaluated by a 7-day food frequency questionnaire, including all the main sources of caffeine. As previously reported, patients were divided into four groups according to daily caffeine intake: <29.1 mg/day (group 1), 29.2–153.7 mg/day (group 2), 153.8–376.5 mg/day (group 3) and >376.6 mg/day (group 4)5. After questionnaire completion, blood samples were collected from each patient and cytokine levels were measured by Bio-Plex assays (IL-6, IL-10, IL-17, IL-27, IFN-γ, IFN-α and BLyS). Results: We enrolled 89 SLE patients (F/M 87/2, median age 46 years, IQR 14; median disease duration 144 months, IQR 150). The median intake of caffeine was 195 mg/day (IQR 160.5). At enrollment, 8 patients (8.9%) reported a caffeine intake <29.1 mg/day (group 1), 27 patients (30.3%) between 29.2 and 153.7 mg/day (group 2), 45 patients (51%) between 153.8 and 376.5 mg/day (group 3) and 9 patients (10.1%) >376.6 mg/day (group 4). A negative correlation between caffeine intake and disease activity, evaluated with SLEDAI-2K, was observed (p=0.01, r=-0.26). Comparing the four groups, a significantly higher prevalence of lupus nephritis, neuropsychiatric involvement, haematological manifestations, hypocomplementemia and anti-dsDNA positivity was observed in patients with lower caffeine intake (figure 1 A-E). Furthermore, patients with lower caffeine intake showed significantly more frequent use of glucocorticoids [group 4: 22.2%, versus group 1 (50.0%, p=0.0001), group 2 (55.5%, p=0.0001) and group 3 (40.0%, p=0.009)]. Moving on to the cytokine analysis, a negative correlation between daily caffeine consumption and serum IFN-γ level was found (p=0.03, r=-0.2) (figure 2A); furthermore, patients with higher caffeine intake showed significantly lower levels of IFN-α (p=0.02, figure 2B), IL-17 (p=0.01, figure 2C) and IL-6 (p=0.003, figure 2D). Conclusion: This is the first report demonstrating the impact of caffeine on SLE disease activity status, as demonstrated by the inverse correlation between its intake and both SLEDAI-2K values and cytokine levels. Moreover, in our cohort, patients with lower caffeine consumption seem to have a more severe disease phenotype, especially in terms of renal and neuropsychiatric involvement.
Our results suggest a possible immunoregulatory, dose-dependent effect of caffeine, through the modulation of serum cytokine levels, as already suggested by in vitro analysis. References: [1] Kaul et al. Nat Rev Dis Primers 2016. [2] Aronsen et al. Eur J Pharmacol 2014. [3] Iris et al. Clin Immunol 2018. [4] Gladman et al. J Rheumatol 2002. [5] Mikuls et al. Arthritis Rheum 2002. Disclosure of Interests: Valeria Orefice: None declared. Fulvia Ceccarelli: None declared. Cristiana Barbati: None declared. Ramona Lucchetti: None declared. Giulio Olivieri: None declared. Enrica Cipriano: None declared. Francesco Natalucci: None declared. Carlo Perricone: None declared. Francesca Romana Spinelli: Grant/research support from Pfizer; Consultant of Novartis, Gilead, Lilly, Sanofi, Celgene; Speakers bureau: Lilly. Cristiano Alessandri: Grant/research support from Pfizer. Guido Valesini: None declared. Fabrizio Conti: Speakers bureau: BMS, Lilly, Abbvie, Pfizer, Sanofi.
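
The headline correlation (r=-0.26, p=0.01 between caffeine intake and SLEDAI-2K) can be reproduced in form, though not on the actual data, with a short sketch. The abstract does not state which correlation coefficient was used; Pearson's r is assumed here, and the data below are synthetic.

```python
import numpy as np
from scipy.stats import pearsonr

# Synthetic stand-ins for the 89 patients' caffeine intake (mg/day) and SLEDAI-2K.
rng = np.random.default_rng(1)
caffeine = rng.uniform(0, 500, 89)
sledai = np.clip(8 - 0.01 * caffeine + rng.normal(0, 2, 89), 0, None)

r, p = pearsonr(caffeine, sledai)
print(f"r = {r:.2f}, p = {p:.3f}")

# The study's intake bands (mg/day): <29.1, 29.2-153.7, 153.8-376.5, >376.6.
group = np.digitize(caffeine, bins=[29.15, 153.75, 376.55]) + 1   # groups 1..4
```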


Circulation ◽  
2014 ◽  
Vol 130 (suppl_2) ◽  
Author(s):  
Janet W Elcano ◽  
Hui Nam Pak

Background: The incidence of atrial fibrillation (AF) is increasing in the elderly population; however, there is a paucity of data on safety outcomes in this patient subgroup, so we sought to investigate the impact of age on the safety of catheter ablation for AF. Methods and Results: We included 1,293 patients (75% male) enrolled in the Yonsei AF Ablation Cohort database in Seoul, South Korea, from March 2009 to November 2013. We divided the patients into 4 groups according to age (Group 1, aged 17-49, N=295; Group 2, 50-59, N=421; Group 3, 60-69, N=408; and Group 4, ≥70, N=169) and evaluated the incidence of procedure-related complications. No procedure-related death occurred in this study. There was a trend of increasing incidence of procedure-related complications with age (Group 1 = 3.7%; Group 2 = 4.0%; Group 3 = 6.6%; Group 4 = 7.1%; p=0.15). There were 28 cases (2.2%) of major complications (Group 1 = 1.7%, Group 2 = 1.9%, Group 3 = 2%, Group 4 = 4.1%), tamponade being the most common. Major complications in Group 4 included tamponade (4 cases), phrenic nerve palsy (1 case), atrioesophageal fistula (1 case) and 3rd-degree AV block (1 patient). Multivariate regression analysis showed that ablation time (odds ratio (OR) 1.2, confidence interval (CI) 1.0-1.017, p=0.017), procedure time (OR 1.008, CI 1.0-1.15, p=0.04), decreasing eGFR (OR 1.013, CI 1.002-1.026, p=0.018), coronary artery disease (CAD) (OR 1.847, CI 1.003-3.524, p=0.04) and age (OR 1.028, CI 1.003-1.055, p=0.03) were associated with increased adjusted risk of total complications. Predictors of major complications included age (OR 1.044, CI 1.003-1.086, p=0.02) and ablation time (OR 1.009, CI 0.999-1.000, p=0.033). Conclusion: Our data suggest that the incidence of procedural complications in RFA of AF increases with age. Ablation time and age are independent predictors of major complications.
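
The abstract reports its multivariate results as per-unit odds ratios with 95% confidence intervals. A hedged sketch of how such figures are typically extracted from a fitted logistic model follows; the data are synthetic and only two of the five covariates are included.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic cohort; 'age' (years) and 'ablation_min' are illustrative covariates.
rng = np.random.default_rng(2)
n = 1293
df = pd.DataFrame({
    "age": rng.integers(17, 85, n).astype(float),
    "ablation_min": rng.uniform(30, 180, n),
})
p = 1 / (1 + np.exp(4 - 0.03 * (df["age"] - 60) - 0.01 * (df["ablation_min"] - 90)))
df["complication"] = rng.binomial(1, p)

fit = sm.Logit(df["complication"], sm.add_constant(df[["age", "ablation_min"]])).fit(disp=0)

# Per-unit odds ratios with 95% CIs, the form used in the abstract.
or_ci = np.exp(pd.concat([fit.params, fit.conf_int()], axis=1))
or_ci.columns = ["OR", "2.5%", "97.5%"]
print(or_ci.round(3))
```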


Author(s):  
Kirsten E Lyke ◽  
Alexandra Singer ◽  
Andrea A Berry ◽  
Sharina Reyes ◽  
Sumana Chakravarty ◽  
...  

Abstract Background A live-attenuated Plasmodium falciparum (Pf) sporozoite (SPZ) vaccine (PfSPZ Vaccine) has shown up to 100% protection against controlled human malaria infection (CHMI) using homologous parasites (same Pf strain as in the vaccine). Using a more stringent CHMI, with heterologous parasites (different Pf strain), we assessed the impact of higher PfSPZ doses, a novel multi-dose prime regimen, and a delayed vaccine boost upon vaccine efficacy (VE). Methods Four groups of 15 healthy, malaria-naïve adults were immunized. Group (Grp) 1 received five doses of 4.5×10⁵ PfSPZ (days 1, 3, 5, 7; week 16). Grps 2, 3 and 4 received three doses (weeks 0, 8, 16), with Grp 2 receiving 9.0×10⁵/dose, Grp 3 receiving 18.0×10⁵/dose, and Grp 4 receiving 27.0×10⁵ for dose 1 and 9.0×10⁵ for doses 2 and 3. VE was assessed by heterologous CHMI after 12 or 24 weeks. Volunteers not protected at 12 weeks were boosted prior to repeat CHMI at 24 weeks. Results At 12-week CHMI, 6/15 (40%) in Group 1 (P=0.04) and 3/15 (20%) in Group 2 vs. 0/8 controls remained aparasitemic. At 24-week CHMI, 3/13 (23%) in Group 3 and 3/14 (21%) in Group 4 vs. 0/8 controls remained aparasitemic (Groups 2-4, VE not significant). Post-boost, 9/14 (64%) vs. 0/8 controls remained aparasitemic (3/6 Group 1, P=0.025; 6/8 Group 2, P=0.002). Conclusions Four stacked priming injections (multi-dose priming) showed 40% VE against heterologous CHMI, while dose escalation of PfSPZ using single-dose priming was not significantly protective. Boosting unprotected subjects improved VE at 24 weeks to 64%.
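
The protection comparisons (e.g. 6/15 vaccinees vs. 0/8 controls aparasitemic, P=0.04) are proportion comparisons on small groups. The abstract does not name the test or its sidedness; a one-sided Fisher's exact test on the reported counts is a plausible reconstruction and is sketched here.

```python
from scipy.stats import fisher_exact

# Group 1 vs. controls at the 12-week CHMI.
table = [[6, 9],    # vaccinated: aparasitemic, infected
         [0, 8]]    # controls:   aparasitemic, infected
_, p = fisher_exact(table, alternative="greater")
print(f"one-sided p = {p:.3f}")   # close to, but not necessarily equal to, the reported P=0.04
```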


Blood ◽  
2007 ◽  
Vol 110 (11) ◽  
pp. 830-830
Author(s):  
J. Alejandro Madrigal ◽  
Neema P. Mayor ◽  
Hazael Maldonado-Torres ◽  
Bronwen E. Shaw ◽  
Steven G.E. Marsh

Abstract Haematopoietic Stem Cell Transplantation (HSCT) using volunteer Unrelated Donors (UD) has become an important and viable option in the treatment of Acute Leukaemia (AL). While matching donors and recipients usually refers to five of the classical HLA genes (HLA-A, -B, -C, -DRB1 and -DQB1), the impact of a sixth gene, HLA-DPB1, on the outcome of UD-HSCT is increasingly emerging. We have previously shown an increased risk of relapse with HLA-DPB1 matching and, independently, with NOD2/CARD15 genotype. In light of these data, we have analysed a larger UD-HSCT cohort in order to establish the impact on transplant outcome when both HLA-DPB1 matching status and NOD2/CARD15 genotype are considered. HLA typing and NOD2/CARD15 genotyping were performed on 304 AL patients and their Anthony Nolan Trust volunteer unrelated donors. Transplants occurred between 1996 and 2005 at UK transplant centres. Diagnoses were ALL (47%) and AML (53%). 67% of the cohort were a 10/10 HLA match, with 16% also being matched for HLA-DPB1. Myeloablative conditioning regimens were used in 74% of transplants. T-cell depletion was included in 82% of conditioning protocols. Bone marrow was used in 72% of transplants, with the remaining 28% using peripheral blood stem cells. Two forms of post-transplant immunosuppression predominated: Cyclosporine A and Methotrexate (47%) and Cyclosporine A alone (38%). Previous studies on a subgroup of this cohort showed that HLA-DPB1 matching and NOD2/CARD15 SNPs independently caused an increase in disease relapse. Consequently, the cohort was grouped into three categories to reflect this risk: group 1 (HLA-DPB1 matched, NOD2/CARD15 SNP; n=24), group 2 (HLA-DPB1 matched, NOD2/CARD15 wild-type (WT), or HLA-DPB1 mismatched, NOD2/CARD15 SNP; n=112) and group 3 (HLA-DPB1 mismatched, NOD2/CARD15 WT; n=168). There was a significant difference in disease relapse between the three groups (1 year: group 1, 68%; group 2, 48%; group 3, 30%; p=0.0038). This finding persisted in multivariate analysis, where being in either group 2 or group 3 was protective against relapse as compared to group 1 (RR 0.321; 95% CI 0.167–0.616; p=0.001 and RR 0.478; 95% CI 0.244–0.934; p=0.031, respectively). In the group with the highest relapse risk (group 1), this resulted in a decrease in Overall Survival (OS) (33% vs 54% in group 3; RR 0.617; 95% CI 0.359–1.060; p=0.080). The best OS was seen in the group with the lowest risk of relapse (group 3). Here, in addition to low relapse, there was increased acute and chronic Graft-versus-Host Disease (aGvHD and cGvHD) (p=0.0019 and p=0.0058 respectively). In this cohort, cGvHD (in its limited form) was associated with a significantly lower incidence of relapse (p=0.0066) and better OS (p<0.0001). In concordance with our previous findings, being HLA-DPB1 matched and having NOD2/CARD15 SNPs predicts the worst outcome, with a significant increase in relapse and reduced OS. Conversely, the ideal pairing would be HLA-DPB1 mismatched and NOD2/CARD15 WT. These data suggest that prospectively typing AL patients for HLA-DPB1 and NOD2/CARD15 SNPs will allow prediction of disease relapse, aGvHD and cGvHD, and in addition will allow the effects of being HLA-DPB1 matched or having a NOD2/CARD15 SNP to be offset by selecting a suitable, lower-risk donor.
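
The relative risks quoted above (e.g. RR 0.321 for group 2 vs. group 1) come from a multivariate time-to-event model. A minimal Cox proportional-hazards sketch with lifelines follows; the toy data and the dummy-coding of groups 2 and 3 against group 1 are assumptions for illustration, not the authors' analysis.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Toy patient-level data: months to relapse, relapse indicator, and group dummies
# (rows with grp2 = grp3 = 0 belong to group 1, the reference).
df = pd.DataFrame({
    "months":  [6, 12, 24, 3, 36, 9, 18, 30, 15, 48],
    "relapse": [1, 1, 0, 1, 0, 1, 0, 0, 1, 0],
    "grp2":    [0, 1, 1, 0, 0, 1, 0, 1, 0, 1],
    "grp3":    [1, 0, 0, 0, 1, 0, 1, 0, 0, 0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="relapse")
cph.print_summary()   # the exp(coef) column corresponds to the RRs in the abstract
```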


Blood ◽  
2009 ◽  
Vol 114 (22) ◽  
pp. 1223-1223
Author(s):  
Alessandro Corso ◽  
Silvia Mangiacavalli ◽  
Luciana Barbarano ◽  
Annalisa Citro ◽  
Paola Brasca ◽  
...  

Abstract 1223 Poster Board I-245 Introduction This study aimed at evaluating the impact of three different pre-transplant therapies on the outcome of patients (pts) eligible for high-dose therapy. Methods Two hundred and sixty-eight newly diagnosed MM pts aged ≤65 years, Durie-Salmon stage III, II, or I in progression, were consecutively enrolled from 2000 to 2007 in three different protocols with three different pre-transplant therapies. Group 1 (145 pts): 3 pulse-VAD cycles; Group 2 (67 pts): 3 pulse-VAD cycles plus 3 Thal-Dex cycles (thalidomide 100 mg/day orally at bedtime, continuously for 3 months; oral dexamethasone 20 mg on days 1-4 and 14-17 every 28 days); Group 3 (57 pts): 4 Vel-Dex courses (bortezomib 1.3 mg/m2 i.v. on days 1, 4, 8, 11; oral dexamethasone 40 mg on days 1-4 and 8-11 every 3 weeks). After induction, all pts received two DCEP-short cycles as mobilization (oral dexamethasone 40 mg/day on days 1-4 + cyclophosphamide 700 mg/m2/day i.v., etoposide 100 mg/m2/day i.v., cisplatin 25 mg/m2/day for 2 days) with peripheral blood stem-cell (PBSC) collection prompted by G-CSF, followed by one or two transplants (Tx) with melphalan 200 mg/m2 as the conditioning regimen. Response was defined according to IMWG uniform criteria; pts were considered responsive when achieving at least a PR. Results Pts in the three groups were similar for age, gender, Ig type and ISS stage. A significantly higher percentage of Durie-Salmon stage III was found in group 3 (83% vs 68% in group 1 and 67% in group 2, p=0.0002). The median follow-up was 46 (1-150) months for group 1, 43 (1-68) months for group 2, and 29.7 (1-79) months for group 3. At the time of this analysis, 51%, 65% and 90% of transplanted pts in the three groups respectively were still alive, and progression after transplant was registered in 84%, 80% and 50% respectively. Patient flow before Tx was similar (p=0.45): 19% in group 1, 27% in group 2, 23% in group 3. In group 1, 2% of pts went off-study after VAD and 17% after the mobilization phase. In group 2, patient flow was equally distributed: 7% after pulse-VAD, 10% after Thal-Dex, 9% after DCEP. In group 3, 12% of pts went off-study after Vel-Dex and 11% after DCEP. Table 1 summarizes responses. In group 3 (Vel-Dex), response was better at all protocol phases with respect to groups 1 and 2 (p<0.00001): the number of responsive pts progressively increased from 87% after Vel-Dex (CR 31%) to 96% after transplant (CR 38%). Response rates of group 1 and group 2 patients were not significantly different after induction (p=0.6), after DCEP (p=0.5), or after Tx (p=0.65). On an intention-to-treat basis, Vel-Dex induction produced a better, although not significant, PFS (34.6 months vs 29 in group 1 and 26.8 in group 2, p=0.56). OS was not statistically different among the three groups, even though the different follow-up could affect the analysis (median OS 110 months in group 1, 66 months in group 2, and not reached in group 3, p=0.37). In multivariate analysis, PFS was improved only by the achievement of CR (p=0.001); no significant difference was observed between VGPR and PR (p=0.43). Conclusion In this study only CR, not VGPR, impacted outcome. Vel-Dex, which produced a significantly higher CR rate after Tx (38%), seems to improve the survival of MM patients who are candidates for high-dose therapy with respect to conventional pre-transplant strategies. Disclosures Morra: Roche.
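
The cross-arm response comparisons (e.g. p<0.00001 for Vel-Dex vs. the other inductions) are contingency-table tests. A small sketch with invented counts, chosen only to mirror the reported response rates, is shown below.

```python
from scipy.stats import chi2_contingency

# Invented responder / non-responder counts per arm; the abstract reports
# percentages, not raw counts, so these figures are illustrative only.
table = [[112, 33],   # group 1: VAD
         [ 52, 15],   # group 2: VAD + Thal-Dex
         [ 55,  2]]   # group 3: Vel-Dex
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.4f}")
```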


Blood ◽  
2008 ◽  
Vol 112 (11) ◽  
pp. 54-54 ◽  
Author(s):  
Yves Beguin ◽  
Johan Maertens ◽  
Bernard De Prijck ◽  
Rik Schots ◽  
Pascale Frere ◽  
...  

Abstract Background: We previously reported a retrospective study suggesting that erythropoietin therapy starting on day 28 after autologous HCT was highly effective at improving Hb levels and that i.v. iron might improve response in patients with low transferrin saturation (Baron et al., Clin Cancer Res 2003). This prompted us to conduct a multicenter prospective randomized study analyzing the impact of darbepoetin alfa, with or without i.v. iron, on erythroid recovery after autologous HCT. Patients and Methods: 127 autologous HCT recipients with lymphoid malignancies were randomized 1:2:2 between no treatment (group 1, n=25), darbepoetin alfa (Aranesp®) 300 μg QOW starting on day 28 after HCT for a total of 7 doses (group 2, n=52), or the same regimen of darbepoetin alfa plus i.v. iron sucrose (Venofer®) 200 mg on days 28, 42 and 56 after HCT (group 3, n=50). Once the target Hb (13 g/dL) was attained, the dose of darbepoetin alfa was reduced to 150 μg, while it was withheld when Hb was ≥ 14 g/dL. Primary endpoints included the proportion of complete correctors (i.e. patients reaching Hb ≥ 13 g/dL) before day 126 post-transplant and the median time to achieve Hb correction in each arm. Data were analyzed following the intention-to-treat principle. The proportion of complete correctors by day 126 in each group was compared using Fisher's exact test, and median times to reach 13 g/dL in each group were compared using the logrank test. Results: The proportion of complete correctors was 24% in group 1, 81% in group 2 (P<0.001 compared with group 1), and 92% in group 3 (P<0.001 compared to group 1, and P=0.099 compared to group 2). Median time to achieve Hb ≥ 13 g/dL was not reached in group 1, 42 days in group 2 (P<0.001 compared to group 1), and 32 days in group 3 (P<0.001 compared to group 1 and P=0.127 compared to group 2) (Fig 1A). Hb evolution in each group is shown in Fig 1B. Mean ± standard deviation total doses of darbepoetin alfa administered were 1,445 ± 489 μg in group 2 vs 1,272 ± 443 μg in group 3 (P=0.06). Ferritin levels at the end of the study were 477 ± 597, 393 ± 599 and 479 ± 488 in groups 1, 2 and 3, respectively (NS). Eight patients (2 in group 1, 4 in group 2, and 2 in group 3) required red blood cell transfusions on study, including 4 patients following early disease progression. There was no difference in rates of thrombo-embolic events or other complications among the groups. Quality-of-life data will be presented. Conclusions: This is the first prospective randomized trial demonstrating that darbepoetin alfa is safe and highly effective at ensuring full erythroid reconstitution after autologous HCT when started on day 28 post-transplant. I.v. iron sucrose tended (not statistically significantly) to further hasten erythroid recovery, with a lower dose of darbepoetin alfa required. Future studies in this setting should aim at further investigating the impact i.v. iron might have on improving response in patients with low transferrin saturation. Figure 1.
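
The stated analysis compares median times to Hb correction with a logrank test, censoring patients who never reach 13 g/dL. A minimal lifelines sketch on synthetic data follows; the exponential times and the day-126 censoring rule are assumptions consistent with the endpoint definition above.

```python
import numpy as np
from lifelines.statistics import logrank_test

# Synthetic days-to-correction; patients not reaching Hb >= 13 g/dL by day 126
# are censored at day 126.
rng = np.random.default_rng(3)
t_darbe = np.minimum(rng.exponential(60, 52), 126)   # group 2: darbepoetin alone
t_iron = np.minimum(rng.exponential(45, 50), 126)    # group 3: darbepoetin + i.v. iron
e_darbe, e_iron = t_darbe < 126, t_iron < 126        # event observed vs. censored

res = logrank_test(t_darbe, t_iron, event_observed_A=e_darbe, event_observed_B=e_iron)
print(f"logrank p = {res.p_value:.3f}")
```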


Blood ◽  
2012 ◽  
Vol 120 (21) ◽  
pp. 1925-1925
Author(s):  
Pere Barba ◽  
David Valcarcel ◽  
Lucía López-Corral ◽  
Francesc fernandez-Aviles ◽  
Rodrigo Martino ◽  
...  

Abstract 1925 In recent years, several pre-transplant models have been developed to predict outcome after hematopoietic cell transplantation (HCT) through the selection of the best candidates and conditioning regimens. Two models are the most popular, one on each side of the Atlantic: the HCT Comorbidity Index (HCT-CI) and the European Blood and Marrow Transplantation (EBMT) score. Their predictive capacity has been demonstrated in several studies. Since these models focus on different pre-HCT characteristics (the HCT-CI on comorbidities and the EBMT score on more classical risk factors), we hypothesized that combining the two could improve their individual predictive capacity. To that end, we retrospectively analyzed pre-HCT characteristics of all consecutive patients receiving a reduced-toxicity allogeneic HCT (allo-HCT) in 4 Spanish centers from 1999–2008. The HCT-CI and EBMT scores were calculated as originally defined. Patients were then classified according to the HCT-CI into its original categories and, according to the EBMT score, into groups relative to the median score of the whole cohort. Multivariate analyses including pre-HCT characteristics were performed using Cox proportional hazards models, taking into account the competing-risk structure. The predictive capacity of each model was calculated using the c-statistic. Patients were included in the same protocol of reduced-toxicity allo-HCT with fludarabine-based conditioning in combination with melphalan (70–140 mg/m2) or busulfan (8–10 mg/kg). The median follow-up for survivors was 51 months (range 3–123). A total of 442 recipients (80% transplanted from HLA-identical siblings) were included. The most frequent diseases were acute leukemia/MDS (n=156, 35%) and non-Hodgkin lymphoma/chronic lymphocytic leukemia (n=125, 28%). The HCT-CI score distribution was: score 0 (n=87, 20%), score 1–2 (n=130, 29%) and score ≥3 (n=225, 51%), while for the EBMT score it was 0–2 (n=62, 14%), 3–4 (n=194, 44%) and >4 (n=187, 42%). The probabilities of 100-day non-relapse mortality (NRM), 4y-NRM and 4y-overall survival (OS) for the whole cohort were 12% (95%CI 11–14), 35% (95%CI 33–38) and 45% (95%CI 48–50), respectively. In the multivariate analysis, the HCT-CI had an impact on 4y-NRM (score 0: HR 1.0; scores 1–2: HR 1.6 [95%CI 0.9–3], p=0.09; scores ≥3: HR 2.3 [95%CI 1.3–3.8], p=0.003) and 4y-OS (score 0: HR of death 1.0; scores 1–2: HR 1.3 [95%CI 0.8–2], p=0.2; scores >2: HR 1.9 [95%CI 1.3–2.8], p=0.002) while the EBMT score did not (p=0.4 and p=0.5, respectively). Using the two models, patients were classified into 3 groups: low HCT-CI (0–2) and low EBMT score (<4) (Group 1), high HCT-CI or high EBMT score (Group 2), and both high HCT-CI and high EBMT score (Group 3). The HRs for 4y-NRM were: group 1, HR 1.0; group 2, HR 1.1 [95%CI 0.6–2], p=0.7; group 3, HR 1.8 [95%CI 1–3], p=0.04; and for 4y-OS: group 1, HR 1.0; group 2, HR 1 [95%CI 0.6–1.5], p=0.8; group 3, HR 1.6 [95%CI 1–2.3], p=0.04. Regarding predictive capacity, the HCT-CI alone captured 58% (95%CI 53–62), the EBMT score 54% (95%CI 51–58), and the combination of the two models 57% (95%CI 53–61) of the patients. Finally, the impact of the EBMT score was explored within each HCT-CI group. In patients with HCT-CI scores of 0 and 1–2, the EBMT score did not have an impact on NRM or OS.
In the cohort with high HCT-CI score (>2), patients with a low EBMT score showed a trend toward a lower risk of NRM (HR 0.6 [95%CI 0.3–1], p=0.08), with a risk similar to that of patients with HCT-CI 1–2 (Figure 1). In conclusion, high HCT-CI scores, but not high EBMT scores, are associated with worse outcome in patients undergoing reduced-toxicity allo-HCT. The addition of the EBMT score contributes little to the HCT-CI, except perhaps for patients with more numerous and severe comorbidities. Figure 1. Probability of NRM according to the HCT-CI for all patients and according to the EBMT score in the 225 patients with HCT-CI >2 (MVA). Disclosures: No relevant conflicts of interest to declare.
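
The c-statistic used to compare the models is a rank-based concordance measure: the probability that, of two comparable patients, the one with the worse score has the earlier event (0.5 = chance, 1.0 = perfect). A sketch with lifelines on synthetic data follows; the score, times and event flags are all invented.

```python
import numpy as np
from lifelines.utils import concordance_index

# Synthetic cohort: a risk score (e.g. HCT-CI-like), survival times that shorten
# as the score rises, and an event indicator.
rng = np.random.default_rng(4)
score = rng.normal(size=442)
months = rng.exponential(48 / np.exp(0.3 * score))
event = rng.binomial(1, 0.6, 442).astype(bool)

# concordance_index expects scores where higher = longer survival, so negate.
c = concordance_index(months, -score, event)
print(f"c-statistic = {c:.2f}")
```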


2019 ◽  
Vol 37 (15_suppl) ◽  
pp. 5071-5071
Author(s):  
Iris Yeong- Fung Sheng ◽  
Yu-Wei Chen ◽  
Moshe Chaim Ornstein ◽  
Timothy D. Gilligan ◽  
Brian I. Rini ◽  
...  

5071 Background: Prostate-specific antigen (PSA) screening has been controversial, given unrefined screening guidelines leading to overdiagnosis and overtreatment of “indolent” PCa. In 2008, the USPSTF recommended against PSA screening for men aged ≥75 and in 2012 broadened this recommendation to include all men. The impact of these changes is unstudied. We hypothesized that these screening changes could delay the diagnosis of advanced PCa. Methods: The Surveillance, Epidemiology, and End Results (SEER) Program was used to identify men (age 55-69) diagnosed with PCa between 2004-2015. PCa stage was categorized as nodal (N1M0) and metastatic (NxM1). Trend analysis was stratified by year of diagnosis: 2004-2008 (group 1), 2009-2012 (group 2), and 2013-2015 (group 3). Using group 2 as the reference, multivariable logistic regression was used to identify predictors of N1M0 and NxM1 in each group. Results: From 2004-2015, there were 603,323 eligible men diagnosed with PCa (group 1: 262,240 men; group 2: 210,045 men; group 3: 131,038 men). In group 1, 1.4% had N1M0 and 2.8% had NxM1. In group 2, 1.6% had N1M0 and 3.7% had NxM1. In group 3, 1.4% had N1M0 and 6.1% had NxM1. The adjusted odds ratio (AOR) of N1M0 was 0.78 (95%CI 0.74-0.82; p<0.0001) in group 1 and 1.71 (95%CI 1.63-1.80; p<0.0001) in group 3. Similar AOR trends were seen in NxM1 (group 1: 0.71, 95%CI 0.68-0.73, p<0.0001 vs. group 3: 1.70, 95%CI 1.63-1.75, p<0.0001). (Table) Subset analysis of non-eligible patients (age ≥70 or <55) showed a similar stage migration. Conclusions: With each USPSTF recommendation, there have been significantly more diagnoses of advanced PCa, suggesting stage migration. The sequelae of having advanced PCa include more aggressive treatments, increased financial burden, and reduced quality of life. Future population studies are warranted to investigate whether the updated 2018 USPSTF recommendation now encapsulates the best target population. [Table: see text]
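
As a sanity check on the reported effect size, the unadjusted odds ratio for metastatic disease in group 3 vs. group 2 can be recovered from the proportions given above; it lands close to the adjusted value of 1.70 (the AOR additionally controls for covariates).

```python
# NxM1 rates from the abstract: 3.7% in group 2 (reference), 6.1% in group 3.
p2, p3 = 0.037, 0.061
or_unadj = (p3 / (1 - p3)) / (p2 / (1 - p2))
print(f"unadjusted OR = {or_unadj:.2f}")   # ~1.69
```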


2010 ◽  
Vol 26 (11) ◽  
pp. 2039-2049 ◽  
Author(s):  
Carlos Augusto Monteiro ◽  
Renata Bertazzi Levy ◽  
Rafael Moreira Claro ◽  
Inês Rugani Ribeiro de Castro ◽  
Geoffrey Cannon

This paper describes a new food classification which assigns foodstuffs according to the extent and purpose of the industrial processing applied to them. Three main groups are defined: unprocessed or minimally processed foods (group 1), processed culinary and food industry ingredients (group 2), and ultra-processed food products (group 3). The use of this classification is illustrated by applying it to data collected in the Brazilian Household Budget Survey, which was conducted in 2002/2003 on a probabilistic sample of 48,470 Brazilian households. The average daily food availability was 1,792 kcal/person, of which 42.5% came from group 1 (mostly rice and beans and meat and milk), 37.5% from group 2 (mostly vegetable oils, sugar, and flours), and 20% from group 3 (mostly breads, biscuits, sweets, soft drinks, and sausages). The share of group 3 foods increased with income, and represented almost one third of all calories in higher-income households. The impact of the replacement of group 1 foods and group 2 ingredients by group 3 products on the overall quality of the diet, eating patterns and health is discussed.
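
Computing the group shares reported above from record-level data is a one-line aggregation; a minimal pandas sketch with invented kcal figures follows.

```python
import pandas as pd

# Invented per-record kcal availability tagged with the food group (1, 2 or 3).
df = pd.DataFrame({
    "group": [1, 2, 3, 1, 2, 3],
    "kcal":  [800, 650, 350, 700, 690, 400],
})
share = df.groupby("group")["kcal"].sum() / df["kcal"].sum() * 100
print(share.round(1))   # % of total daily energy from each food group
```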

