Abstract 15948: Impact of Age on Safety of Radiofrequency Ablation in Patients With Atrial Fibrillation

Circulation ◽  
2014 ◽  
Vol 130 (suppl_2) ◽  
Author(s):  
Janet W Elcano ◽  
Hui Nam Pak

Background: The incidence of atrial fibrillation (AF) is increasing in the elderly population; however, there is a paucity of data on safety outcomes in this patient subgroup. We therefore sought to investigate the impact of age on the safety of catheter ablation for AF. Methods and Results: We included 1,293 patients (75% male) enrolled in the Yonsei AF Ablation Cohort database in Seoul, South Korea, from March 2009 to November 2013. We divided the patients into 4 groups according to age (Group 1, aged 17-49, N=295; Group 2, 50-59, N=421; Group 3, 60-69, N=408; and Group 4, ≥70, N=169) and evaluated the incidence of procedure-related complications. No procedure-related death occurred in this study. There was a trend of increasing incidence of procedure-related complications with age: Group 1, 3.7%; Group 2, 4.0%; Group 3, 6.6%; and Group 4, 7.1% (p=0.15). There were 28 cases (2.2%) of major complications (Group 1, 1.7%; Group 2, 1.9%; Group 3, 2%; Group 4, 4.1%), tamponade being the most common. Major complications in Group 4 included tamponade (4 cases), phrenic nerve palsy (1 case), atrioesophageal fistula (1 case) and 3rd-degree AV block (1 patient). Multivariate regression analysis showed that ablation time (odds ratio (OR) 1.2, confidence interval (CI) 1.0-1.017, p=0.017), procedure time (OR 1.008, CI 1.0-1.15, p=0.04), decreasing eGFR (OR 1.013, CI 1.002-1.026, p=0.018), coronary artery disease (CAD) (OR 1.847, CI 1.003-3.524, p=0.04) and age (OR 1.028, CI 1.003-1.055, p=0.03) were associated with increased adjusted risk of total complications. Predictors of major complications included age (OR 1.044, CI 1.003-1.086, p=0.02) and ablation time (OR 1.009, CI 0.999-1.000, p=0.033). Conclusion: Our data suggest that the incidence of procedural complications in RFA of AF increases with age. Ablation time and age are independent predictors of major complications.
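The adjusted odds ratios above come from multivariate logistic regression; the standard conversion from a fitted coefficient β and its standard error to an OR with a 95% CI is exp(β) and exp(β ± 1.96·SE). A minimal sketch of that arithmetic (the β and SE values below are illustrative, not taken from the study's data):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient and its standard error
    into an odds ratio with a 95% confidence interval."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Illustrative values only: a coefficient of 0.0276 per year of age
# corresponds to an OR of about 1.028 per year.
or_, lo, hi = odds_ratio_ci(0.0276, 0.0129)
print(f"OR {or_:.3f}, 95% CI {lo:.3f}-{hi:.3f}")
```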

2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Yu Liu ◽  
Jing Li ◽  
Wanyu Zhang ◽  
Yihong Guo

Abstract Oestradiol, an important hormone in follicular development and endometrial receptivity, is closely related to clinical outcomes of fresh in vitro fertilization-embryo transfer (IVF-ET) cycles. A supraphysiologic E2 level is inevitable during controlled ovarian hyper-stimulation (COH), and its effect on the outcome of IVF-ET is controversial. The aim of this retrospective study was to evaluate the association between elevated serum oestradiol (E2) levels on the day of human chorionic gonadotrophin (hCG) administration and neonatal birthweight after IVF-ET cycles. The data of 3659 infertile patients with fresh IVF-ET cycles were analysed retrospectively between August 2009 and February 2017 at the First Hospital of Zhengzhou University. Patients were categorized by serum E2 levels on the day of hCG administration into six groups: group 1 (serum E2 levels ≤ 1000 pg/mL, n = 230), group 2 (serum E2 levels between 1001 and 2000 pg/mL, n = 524), group 3 (serum E2 levels between 2001 and 3000 pg/mL, n = 783), group 4 (serum E2 levels between 3001 and 4000 pg/mL, n = 721), group 5 (serum E2 levels between 4001 and 5000 pg/mL, n = 548), and group 6 (serum E2 levels > 5000 pg/mL, n = 852). Univariate linear regression was used to evaluate the independent correlation between each factor and outcome index. Multiple logistic regression was used to adjust for confounding factors. The LBW rates were as follows: 3.0% (group 1), 2.9% (group 2), 1.9% (group 3), 2.9% (group 4), 2.9% (group 5), and 2.0% (group 6), respectively (P = 0.629). There were no statistically significant differences in the incidence of neonatal LBW among the six groups. We did not detect an association between peak serum E2 level during ovarian stimulation and neonatal birthweight after IVF-ET, and no association was found between higher E2 levels and increased LBW risk. Our observations suggest that the hyper-oestrogenic milieu during COS does not seem to have adverse effects on the birthweight of offspring after IVF. Although this study provides some reference, obstetric-related factors were not included for historical reasons. The impact of the high oestrogen environment during COS on the birthweight of IVF offspring still needs future research.
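The six-way grouping by peak E2 is a simple threshold lookup; a sketch using the cut-offs stated in the abstract (function name and representation are my own):

```python
from bisect import bisect_left

# Upper bounds (pg/mL) for groups 1-5, as described in the abstract;
# anything above 5000 pg/mL falls in group 6.
E2_BOUNDS = [1000, 2000, 3000, 4000, 5000]

def e2_group(e2_pg_ml):
    """Assign a patient to group 1-6 by serum E2 on the day of hCG."""
    return bisect_left(E2_BOUNDS, e2_pg_ml) + 1

print(e2_group(1000), e2_group(1001), e2_group(5001))  # 1 2 6
```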


2020 ◽  
Vol 79 (Suppl 1) ◽  
pp. 340.2-341
Author(s):  
V. Orefice ◽  
F. Ceccarelli ◽  
C. Barbati ◽  
R. Lucchetti ◽  
G. Olivieri ◽  
...  

Background: Systemic lupus erythematosus (SLE) is an autoimmune disease mainly affecting women of childbearing age. The interplay between genetic and environmental factors may contribute to disease pathogenesis1. To date, no robust data are available about the possible contribution of diet in SLE. Caffeine, one of the most widely consumed products in the world, seems to interact with multiple components of the immune system by acting as a non-specific phosphodiesterase inhibitor2. In vitro, dose-dependent treatment with caffeine seems to down-regulate mRNA levels of key inflammation-related genes and similarly reduce levels of different pro-inflammatory cytokines3. Objectives: We evaluated the impact of caffeine consumption on SLE-related disease phenotype and activity, in terms of clinimetric assessment and cytokine levels. Methods: We performed a cross-sectional study, enrolling consecutive patients and recording their clinical and laboratory data. Disease activity was assessed by the SLE Disease Activity Index 2000 (SLEDAI-2K)4. Caffeine intake was evaluated by a 7-day food frequency questionnaire, including all the main sources of caffeine. As previously reported, patients were divided into four groups according to daily caffeine intake: <29.1 mg/day (group 1), 29.2-153.7 mg/day (group 2), 153.8-376.5 mg/day (group 3) and >376.6 mg/day (group 4)5. After questionnaire completion, blood samples were collected from each patient to assess cytokine levels, measured with a Bio-Plex assay panel covering IL-6, IL-10, IL-17, IL-27, IFN-γ, IFN-α and BLyS. Results: We enrolled 89 SLE patients (F/M 87/2, median age 46 years, IQR 14; median disease duration 144 months, IQR 150). The median intake of caffeine was 195 mg/day (IQR 160.5). 
At the time of enrollment, 8 patients (8.9%) reported a caffeine intake <29.1 mg/day (group 1), 27 patients (30.3%) between 29.2 and 153.7 mg/day (group 2), 45 patients (51%) between 153.8 and 376.5 mg/day (group 3) and 9 patients (10.1%) >376.6 mg/day (group 4). A negative correlation between caffeine intake and disease activity, evaluated with SLEDAI-2K, was observed (p=0.01, r=-0.26). Comparing the four groups, a significantly higher prevalence of lupus nephritis, neuropsychiatric involvement, haematological manifestations, hypocomplementemia and anti-dsDNA positivity was observed in patients with lower caffeine intake (figure 1 A-E). Furthermore, patients with lower caffeine intake showed significantly more frequent use of glucocorticoids [group 4: 22.2%, versus group 1 (50.0%, p=0.0001), group 2 (55.5%, p=0.0001) and group 3 (40.0%, p=0.009)]. Moving to the cytokine analysis, a negative correlation between daily caffeine consumption and serum levels of IFN-γ was found (p=0.03, r=-0.2) (figure 2A); furthermore, patients with higher caffeine intake showed significantly lower levels of IFN-α (p=0.02, figure 2B), IL-17 (p=0.01, figure 2C) and IL-6 (p=0.003, figure 2D). Conclusion: This is the first report demonstrating the impact of caffeine on SLE disease activity status, as demonstrated by the inverse correlation between its intake and both SLEDAI-2K values and cytokine levels. Moreover, in our cohort, patients with lower caffeine consumption seem to have a more severe disease phenotype, especially in terms of renal and neuropsychiatric involvement. Our results suggest a possible dose-dependent immunoregulatory effect of caffeine, through the modulation of serum cytokine levels, as already suggested by in vitro analysis. References: [1] Kaul et al, Nat. Rev. Dis. Prim. 2016; [2] Aronsen et al, Eur J Pharm 2014; [3] Iris et al, Clin Immun 2018; [4] Gladman et al, J Rheumatol 2002; [5] Mikuls et al, Arthritis Rheum 2002.
Disclosure of Interests: Valeria Orefice: None declared, Fulvia Ceccarelli: None declared, Cristiana Barbati: None declared, Ramona Lucchetti: None declared, Giulio Olivieri: None declared, Enrica Cipriano: None declared, Francesco Natalucci: None declared, Carlo Perricone: None declared, Francesca Romana Spinelli Grant/research support from: Pfizer, Consultant of: Novartis, Gilead, Lilly, Sanofi, Celgene, Speakers bureau: Lilly, Cristiano Alessandri Grant/research support from: Pfizer, Guido Valesini: None declared, Fabrizio Conti Speakers bureau: BMS, Lilly, Abbvie, Pfizer, Sanofi


Author(s):  
Kirsten E Lyke ◽  
Alexandra Singer ◽  
Andrea A Berry ◽  
Sharina Reyes ◽  
Sumana Chakravarty ◽  
...  

Abstract Background A live-attenuated Plasmodium falciparum (Pf) sporozoite (SPZ) vaccine (PfSPZ Vaccine) has shown up to 100% protection against controlled human malaria infection (CHMI) using homologous parasites (same Pf strain as in the vaccine). Using a more stringent CHMI, with heterologous parasites (different Pf strain), we assessed the impact of higher PfSPZ doses, a novel multi-dose prime regimen, and a delayed vaccine boost upon vaccine efficacy (VE). Methods Four groups of 15 healthy, malaria-naïve adults were immunized. Group (Grp) 1 received five doses of 4.5×10⁵ PfSPZ (days 1, 3, 5, 7; week 16). Grps 2, 3 and 4 received three doses (weeks 0, 8, 16), with Grp 2 receiving 9.0×10⁵/dose, Grp 3 receiving 18.0×10⁵/dose, and Grp 4 receiving 27.0×10⁵ for dose 1 and 9.0×10⁵ for doses 2 and 3. VE was assessed by heterologous CHMI after 12 or 24 weeks. Volunteers not protected at 12 weeks were boosted prior to repeat CHMI at 24 weeks. Results At 12-week CHMI, 6/15 (40%) of Group 1 (P=0.04) and 3/15 (20%) of Group 2 vs. 0/8 controls remained aparasitemic. At 24-week CHMI, 3/13 (23%) of Group 3 and 3/14 (21%) of Group 4 vs. 0/8 controls remained aparasitemic (Groups 2-4, VE not significant). Post-boost, 9/14 (64%) vs. 0/8 controls remained aparasitemic (3/6 in Group 1, P=0.025; 6/8 in Group 2, P=0.002). Conclusions Four stacked priming injections (multi-dose priming) showed 40% VE against heterologous CHMI, while dose escalation of PfSPZ using single-dose priming was not significantly protective. Boosting unprotected subjects improved VE at 24 weeks to 64%.
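The vaccine efficacy figures above can be read as 1 minus the relative risk of infection, where "protected" means aparasitemic after CHMI. A sketch of that arithmetic using the 12-week Group 1 numbers from the abstract (6/15 protected, so 9/15 infected, vs. 8/8 infected controls):

```python
def vaccine_efficacy(infected_vax, n_vax, infected_ctrl, n_ctrl):
    """VE = 1 - (attack rate in vaccinees) / (attack rate in controls)."""
    return 1 - (infected_vax / n_vax) / (infected_ctrl / n_ctrl)

# Group 1 at 12-week CHMI: 9/15 vaccinees infected, 8/8 controls infected.
ve = vaccine_efficacy(9, 15, 8, 8)
print(f"VE = {ve:.0%}")  # VE = 40%
```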


Circulation ◽  
2020 ◽  
Vol 142 (Suppl_3) ◽  
Author(s):  
Andrew S Wilson ◽  
Kelvin N Bush

Introduction: The efficacy and safety of pulmonary vein isolation (PVI) for atrial fibrillation (AF) in the active duty (AD) military population has not been previously reported. Hypothesis: We postulate that PVI is an efficacious and safe treatment for young AD service members with AF. Methods: AD military personnel with AF who underwent PVI from 2004 to 2019 were retrospectively analyzed in four age groups (group 1, n=26, 18 to 27 years; group 2, n=38, 28 to 37 years; group 3, n=28, 38 to 49 years; group 4, n=12, ≥50 years). Primary endpoints were (1) PVI procedural efficacy, defined as no or rare AF recurrence (<6 episodes) 12 months after the last PVI with or without antiarrhythmic drugs (AAD), and (2) procedure-related adverse events and complications. Results: 104 personnel (mean age 35.6±9 years, 84.6% paroxysmal AF, mean LVEF 60.2±6%, 19.2% maintained on AAD after PVI) underwent 142 PVI procedures with a mean follow-up of 55.8±47 months. Procedural efficacy was attained in 96.2% of group 1, 78.9% of group 2, 75.0% of group 3, and 66.7% of group 4 (P=0.004, Figure 1). Freedom from AF was reached in 80.3% of group 1, 55.3% of group 2, 46.4% of group 3, and 41.7% of group 4 (P=0.02). AADs were maintained in 11.5% of group 1, 21.0% of group 2, 14.3% of group 3, and 41.7% of group 4 (P=0.144), and there was no difference in AF recurrence rates between those with AADs and those without (P=0.091). LVEF <50% trended towards being a significant predictor of AF recurrence (OR, 7; 95% CI, 0.75-65; P=0.051). Complications occurred in only 4 (3.8%) cases (pulmonary vein stenosis, cardiac tamponade, arteriovenous fistula), with no complications in the youngest group. Conclusions: This study suggests that PVI is an effective and safe therapy for younger military personnel with AF desiring to decrease their individual AF burden.


2020 ◽  
Vol 7 ◽  
Author(s):  
Lei Guo ◽  
Huaiyu Ding ◽  
Haichen Lv ◽  
Xiaoyan Zhang ◽  
Lei Zhong ◽  
...  

Background: The number of coronary chronic total occlusion (CTO) patients with renal insufficiency is large, and limited data are available on the impact of renal insufficiency on long-term clinical outcomes in CTO patients. We aimed to investigate clinical outcomes of CTO percutaneous coronary intervention (PCI) vs. medical therapy (MT) in CTO patients according to baseline renal function. Methods: Of the study population of 2,497 patients, 1,220 underwent CTO PCI and 1,277 received MT. Patients were divided into four groups based on renal function: group 1 [estimated glomerular filtration rate (eGFR) ≥ 90 ml/min/1.73 m2], group 2 (60 ≤ eGFR < 90 ml/min/1.73 m2), group 3 (30 ≤ eGFR < 60 ml/min/1.73 m2), and group 4 (eGFR < 30 ml/min/1.73 m2). Major adverse cardiac events (MACE) were the primary endpoint. Results: Median follow-up was 2.6 years. With declining renal function, MACE (p < 0.001) and cardiac death (p < 0.001) increased. In group 1 and group 2, MACE occurred less frequently in patients with CTO PCI than in patients in the MT group (15.6% vs. 22.8%, p < 0.001; 15.6% vs. 26.5%, p < 0.001; respectively). However, there was no significant difference in MACE between CTO PCI and MT in group 3 (21.1% vs. 28.7%, p = 0.211) or group 4 (28.6% vs. 50.0%, p = 0.289). MACE was significantly reduced for patients who received successful CTO PCI compared to patients with MT in group 1 and group 2 (16.7% vs. 22.8%, p = 0.006; 16.3% vs. 26.5%, p = 0.003; respectively). eGFR < 30 ml/min/1.73 m2, age, male gender, diabetes mellitus, heart failure, multivessel disease, and MT were identified as independent predictors of MACE in patients with CTOs. Conclusions: Renal impairment is associated with MACE in patients with CTOs. For treatment of CTO, compared with MT alone, CTO PCI may reduce the risk of MACE in patients without chronic kidney disease (CKD); 
however, no such reduction was observed among patients with CKD. Similar beneficial effects were observed in patients without CKD who underwent successful CTO procedures.
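Within each renal-function group, the abstract compares MACE frequency between CTO PCI and MT, which is a 2×2 categorical comparison. One common choice for such comparisons is a two-sided Fisher exact test (my choice for illustration; the abstract does not state which test was used). A self-contained sketch:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]],
    e.g. a = MACE with PCI, b = no MACE with PCI, c/d likewise for MT."""
    row1, col1, n = a + b, a + c, a + b + c + d

    def p_table(x):
        # Hypergeometric probability of x events in the first row.
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)

    p_obs = p_table(a)
    lo = max(0, row1 - (n - col1))
    hi = min(row1, col1)
    # Sum the probabilities of all tables at least as extreme as observed.
    return sum(p for p in (p_table(x) for x in range(lo, hi + 1))
               if p <= p_obs * (1 + 1e-9))
```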


2021 ◽  
Vol 17 (1) ◽  
Author(s):  
Kate Averay ◽  
Gaby van Galen ◽  
Michael Ward ◽  
Denis Verwilghen

Abstract Background Equine small intestinal resection and anastomosis is a procedure where optimizing speed, without compromising integrity, is advantageous. A range of different needle holders is available, but little is published on the impact surgical instrumentation has on surgical technique in veterinary medicine. The objectives of this study were to investigate whether needle holder type influences anastomosis construction time and anastomosis bursting pressure, and whether bursting pressure is influenced by construction time. Single-layer end-to-end jejunojejunal anastomoses were performed on jejunal segments harvested from equine cadavers. These segments were randomly allocated to four groups: three based on the needle holder used, 16.5 cm Frimand (Group 1), 16 cm Mayo-Hegar (Group 2) or 20.5 cm Mayo-Hegar (Group 3), and one control group without anastomoses (Group 4). Anastomosis construction time was recorded. Bursting pressure was determined by pumping green-coloured fluid progressively into the lumen whilst recording intraluminal pressures; the maximum pressure reached prior to failure was recorded as the bursting pressure. Construction times and bursting pressures were compared between needle holders, and the correlation between bursting pressure and construction time was estimated. Results Construction times were not statistically different between groups (P = 0.784). Segments from Group 2 and Group 3 burst at statistically significantly lower pressures than those from Group 4 (P = 0.031 and P = 0.001, respectively). Group 4 and Group 1 were not different (P = 0.125). The mean bursting pressure was highest in Group 4 (189 ± 61.9 mmHg), followed by Group 1 (166 ± 31 mmHg) and Group 2 (156 ± 42 mmHg), with Group 3 (139 ± 34 mmHg) having the lowest mean bursting pressure. Anastomosis construction time and bursting pressure were not correlated (P = 0.792). 
Conclusions The tested needle holders had a significant effect on bursting pressure, but not on anastomosis construction time. In an experimental setting, the Frimand needle holder produced anastomoses with higher bursting pressures. Further studies are required to determine clinical implications.


2021 ◽  
Vol 2 (7) ◽  
pp. 567-573
Author(s):  
Ogechukwu K Uche ◽  
Esiri F Ohiambe ◽  
Fabian C Amechina

Aim: There are conflicting reports on the safety profile of nanoparticles on biological cells. This study evaluated the impact of nanosilver on hemocompatibility in salt-loaded rats. Materials and Methods: Sprague-Dawley rats [(inbred) (120-140 g)] randomly divided into 4 groups (n = 6) were studied. Group 1 (control) received normal rat chow and tap water; Group 2 received rat chow containing 8% NaCl [salt-loaded rats (SLRs)]; Group 3 received rat chow + nanosilver solution (NS) 0.18 mL of 10 ppm/kg/day; Group 4 comprised SLRs + NS. After 6 weeks of oral gavage treatment, blood pressure (BP) and heart rate (HR) were measured by pressure transducer via cannulation of the left common carotid artery following anaesthesia with urethane. HR was computed as the number of arterial pulses per 60 seconds. 5 mL of blood was collected for WBC, platelet, RBC, PCV, HB, MCH, MCHC and MCV analyses using an automated haematology analyser, and osmotic fragility reactivity was assessed with a standard spectrophotometer at 540 nm wavelength. Results: Exposure of normotensive rats to nanosilver resulted in a significantly lower RBC level compared with control, whereas the RBC level in salt-loaded co-treated nanosilver (SCNS) rats was comparable with the SLRs. The trend was the same for HB, PCV, MCH and MCHC. Nanosilver induced leukopenia in normotensive rats compared with control and prevented WBC elevation in SCNS rats. Platelets significantly increased in nanosilver-treated normotensive rats (NTNRs) compared with control and decreased in SCNS rats. Osmotic burst resistance increased in NTNRs and decreased in cells from treated groups. Conclusion: Chronic exposure of salt-loaded rats to nanosilver alters haematological parameters, which may worsen circulatory function and activate risk factors for cardiovascular disorders.


Blood ◽  
2004 ◽  
Vol 104 (11) ◽  
pp. 1821-1821 ◽  
Author(s):  
Mauricette Michallet ◽  
Quoc-Hung Le ◽  
Anne-Sophie Michallet ◽  
Franck E. Nicolini ◽  
Anne Thiebaut ◽  
...  

Abstract Allogeneic hematopoietic stem cell (HSC) transplantation after a reduced intensity conditioning regimen (RICT) is increasingly used worldwide. Chimerism evaluation by short tandem repeat analysis is fundamental in this strategy to document donor cell engraftment and to indicate DLI after transplant. The significance and impact on transplant outcome of the kinetics of conversion to a total donor profile have until now been unknown. We performed a retrospective analysis in 85 patients [53 males and 32 females, median age = 49 years (18–65)] who underwent RICT. The principal aim of our study was to analyze the impact of the delay of conversion to total donor chimerism on overall and event-free survival (OS and EFS) within 90 days after transplant without any donor lymphocyte infusion (DLI) intervention. Diagnoses before transplantation were acute leukemia (n = 18), myelodysplasia (n = 6), chronic myeloid leukemia (n = 5), chronic lymphoid leukemia (n = 7), non-Hodgkin lymphoma (n = 10), Hodgkin disease (n = 7), multiple myeloma (n = 21), aplastic anemia (n = 2), and solid tumor (n = 9). Before RICT, 36 patients had already been transplanted; 18 patients were in complete remission (CR), 33 in partial response (PR) and 34 had evolutive disease (ED). As conditioning regimen, 32 received TBI (from 2 to 6 Gy) associated with either fludarabine or cyclophosphamide, 44 fludarabine+busulfan+anti-thymocyte globulins (ATG), 6 cyclophosphamide+ATG and 3 idarubicin+cytarabine+fludarabine. As HSC source, 26 patients received bone marrow [median nucleated cells = 2.6×10⁸/kg (1.29–5.8)] and 59 peripheral blood stem cells [median CD34+ cells = 5.82×10⁶/kg (2.3–5.8)] from 78 HLA-identical sibling donors and 7 HLA-identical unrelated donors. After transplant, 49 patients developed acute GvHD (18 grade I, 17 grade II, 6 grade III, 8 grade IV) and 33 patients chronic GvHD (22 limited, 11 extensive). 
At the time of the last follow-up, 43 patients had relapsed after transplant, 35 were alive and 50 had died (27 from relapse and 23 from transplant-related toxicity). Regarding chimerism post-transplant, we divided the population into 4 groups: group 1, who converted to total donor in less than 30 days (n = 44); group 2, who converted to total donor in more than 30 days (from 55 to 90 days) (n = 11); group 3, who never converted to total donor (n = 21); and group 4, who rapidly converted to total donor but returned during evolution after transplant to mixed chimerism (n = 10). The probabilities of OS and EFS at 2 years were 38% (95%CI 37–52) and 20.8% (13–34), respectively. OS and EFS at 2 years were 37.4% (95%CI 24–58) and 13.2% (95%CI 5.2–33.8) in group 1, 77% (95%CI 53.5–100) and 56.6% (95%CI 31.3–100) in group 2, 35.6% (95%CI 18.9–67) and 19.4% (95%CI 6.6–56.6) in group 3, and 0% and not reached in group 4. In conclusion, chimerism kinetics seem to have an impact on transplant outcome. A slow conversion to a total donor profile seems beneficial, suggesting the importance of a transient mixed chimerism status that may permit tolerance.
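The 2-year OS and EFS probabilities above are typically Kaplan-Meier estimates, which multiply, at each event time, the fraction of at-risk patients who survive that time. A minimal pure-Python sketch with made-up follow-up data (not the cohort's):

```python
def kaplan_meier(times, events):
    """Return [(time, survival)] for right-censored data.
    events[i] is True for a death/event, False for censoring at times[i]."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data[i:] if tt == t and e)
        n_at_t = sum(1 for tt, _ in data[i:] if tt == t)
        if deaths:
            surv *= (at_risk - deaths) / at_risk
            curve.append((t, surv))
        at_risk -= n_at_t
        i += n_at_t
    return curve

# Hypothetical follow-up in months (True = died, False = censored):
curve = kaplan_meier([6, 12, 12, 18, 24, 24],
                     [True, True, False, False, True, False])
```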


Blood ◽  
2016 ◽  
Vol 128 (22) ◽  
pp. 981-981
Author(s):  
Yazid Belkacemi ◽  
Myriam Labopin ◽  
Sebastian Giebel ◽  
Gokoulakrichenane Loganadane ◽  
Leszek Miszyk ◽  
...  

Abstract Introduction Total-body irradiation (TBI) has a historically established role in preparative regimens used before allogeneic transplant in both acute lymphoblastic leukemia (ALL) and acute myeloid leukemia (AML). The most popular myeloablative conditioning consists of 12 Gy TBI administered in 6 fractions (2 Gy twice daily for 3 days) in combination with cyclophosphamide. This schedule of treatment delivery is, however, time-consuming. With limited availability of irradiation equipment, many departments in the world limit the number of patients receiving TBI for non-medical reasons. One possible solution, therefore, is to reduce the number of fractions for a similar effective dose. The aim of the SARASIN study was to analyze the impact of fractionation on the outcome of patients undergoing allotransplant for ALL and AML. Patients and methods We retrospectively compared myeloablative TBI regimens of 3126 patients registered in the EBMT database transplanted between 2000 and 2014 for ALL (n=1783) or AML (n=1343). Pre-transplant chemotherapy consisted mainly of cyclophosphamide (Cy) in 92% and 97% of ALL and AML patients, respectively. TBI was delivered as either 12 Gy in 6 fractions (group 1; ALL, n=1362 and AML, n=857), single-dose TBI (STBI) (group 2; ALL, n=54 and AML, n=79), 9-12 Gy in 2 fractions (group 3; ALL, n=173 and AML, n=256), or 12 Gy in 3-4 fractions (group 4; ALL, n=194 and AML, n=151). The majority of ALL (70%-79%) and AML (57%-79%) patients were grafted in first complete remission (CR1). The rate of transplants from unrelated donors was higher in ALL (24%-50%) than in AML (20%-37%), with rates of non-in vitro T-cell depletion ranging from 25% to 96% across the 4 TBI groups. Graft-versus-host disease (GvHD) prevention consisted mainly (75% to 89%) of cyclosporine and methotrexate. Results The median follow-up was 61 (1-87) months and 85 (1-192) months in the ALL and AML patients, respectively. 
At 5 years, leukemia-free survival (LFS), overall survival (OS), relapse incidence (RI) and non-relapse mortality (NRM) were 46.5% (44.1%-49%), 50.4% (47.9%-52.9%), 29% (26.7%-31.1%) and 24.5% (22.5%-26.6%) in ALL, and 45.7% (43%-48.5%), 48% (45.3%-51%), 30.4% (28%-33%) and 23.8% (21.5%-26.2%) in AML, respectively. LFS at 5 years in AML and ALL patients was, respectively: 48% and 45%, 32% and 45%, 45% and 53%, and 42% and 50% in the 4 TBI groups (p=0.082 for AML and p=0.32 for ALL). Additionally, for both AML and ALL, no statistically significant difference was found between the 4 TBI groups for OS (p=0.82 in ALL; p=0.11 in AML), RI (p=0.29 in ALL; p=0.23 in AML) or NRM (p=0.58 in ALL; p=0.12 in AML). In multivariate analyses comparing the different schedules to the standard 12 Gy in 6 fractions (group 1 vs group 2; group 1 vs group 3; group 1 vs group 4), fractionation was not found to be an independent prognostic factor in either ALL or AML patients for LFS, OS, RI or NRM. Conclusion The SARASIN study showed that, using a TBI dose of 12 Gy before allogeneic transplantation, fractionation has no impact on relapse or survival in either ALL or AML patients. Furthermore, reducing the number of fractions, even at this rather high total-body radiation dose level, is not associated with an increased risk of NRM. Altogether, our data suggest that 12 Gy could be delivered safely in fewer than 6 fractions. Late-effects analyses are ongoing. Our findings are not only of considerable practical importance for radiotherapy departments but may also increase TBI availability as pre-transplantation conditioning for leukemic patients undergoing allogeneic transplantation. Disclosures Michallet: Astellas Pharma: Consultancy, Honoraria; Bristol-Myers Squibb: Consultancy, Honoraria, Research Funding; Pfizer: Consultancy, Honoraria; Novartis: Consultancy, Honoraria; MSD: Consultancy, Honoraria; Genzyme: Consultancy, Honoraria.


2021 ◽  
Vol 10 (4) ◽  
pp. 154-161
Author(s):  
Allico Mousso Jean Maurel ◽  
Agbo Adouko Edith ◽  
Séri Kipré Laurent ◽  
Boyvin Lydie ◽  
Kouamé Christophe ◽  
...  

The objective of this work was to study the impact of food diversification based on sweet potato, soybean, and cowpea on the prognostic inflammatory and nutritional index (PINI) in school-aged children in the Nawa region. This study took place from October 2017 to May 2018 among 240 pupils aged 6 to 12, divided into four groups of 60. Four types of meals were proposed: rice with tomato soup and fish (group 1), sweet potato porridge enriched with green soybeans (group 2), sweet potato porridge enriched with white cowpea (group 3), or sweet potato porridge accompanied by white cowpea with green soybeans (group 4). Blood samples were taken three times: before the meals began (phase 0), at the end of the first trimester (phase 1), and at the end of the second trimester (phase 2). Blood assays for C-reactive protein (CRP), orosomucoid, albumin, and prealbumin were performed using a COBAS c311 analyzer, and PINI was calculated. Groups 3 and 4 showed a slight increase in albumin values (42.24 ± 0.95 g/L and 41.51 ± 1.71 g/L, respectively) compared to group 1. CRP decreased from phase 1 for group 1 (2.06 ± 0.26 mg/L) and group 4 (2.38 ± 0.36 mg/L). Orosomucoid increased insignificantly (p > 0.05) in group 3 (0.74 ± 0.04 g/L) and group 4 (0.71 ± 0.04 g/L). PINI was reduced by 0.37 (group 1), 0.36 (group 2), 0.46 (group 3) and 0.44 (group 4). Food diversification based on sweet potato and white cowpea had a positive impact on PINI in more than 80% of pupils.
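PINI is conventionally computed (Ingenbleek's formula; the abstract does not restate it) as the product of the two inflammation markers divided by the product of the two nutrition markers. A sketch under that assumption, using the commonly cited unit conventions, with illustrative values rather than the study's measurements:

```python
def pini(crp_mg_l, orosomucoid_mg_l, albumin_g_l, prealbumin_mg_l):
    """Prognostic Inflammatory and Nutritional Index.
    Commonly cited convention (verify against a clinical reference):
    CRP and orosomucoid in mg/L, albumin in g/L, prealbumin in mg/L."""
    return (crp_mg_l * orosomucoid_mg_l) / (albumin_g_l * prealbumin_mg_l)

# Illustrative values only: CRP 2 mg/L, orosomucoid 700 mg/L
# (= 0.70 g/L), albumin 42 g/L, prealbumin 250 mg/L.
print(round(pini(2, 700, 42, 250), 2))
```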

