Intermittent hypoxia and respiratory patterns during sleep of preterm infants aged 3 to 18 months residing at high altitude

SLEEP ◽  
2021 ◽  
Author(s):  
Elida Duenas-Meza ◽  
María Isabel Escamilla-Gil ◽  
María Angelica Bazurto-Zapata ◽  
Elizabeth Caparo ◽  
Miguel Suarez Cuartas ◽  
...  

Abstract Study Objectives The aim of this study was to determine the impact of apneas on oxygen saturation and the presence of intermittent hypoxia during sleep in preterm infants (PTIs) born at high altitude, and to compare them with full-term infants (FTIs) at the same altitude. Methods PTIs and FTIs aged 3 to 18 months were included. They were divided into three age groups: 3–4 months (Group 1), 6–7 months (Group 2), and 10–18 months (Group 3). Polysomnography parameters and oxygenation indices were evaluated. Intermittent hypoxia was defined as brief, repetitive cycles of decreased oxygen saturation. The Kruskal-Wallis test for multiple comparisons and the t-test or Mann–Whitney U-test were used. Results 127 PTIs and 175 FTIs were included. Total apnea-hypopnea index (AHI) was higher in PTIs than FTIs in all age groups (Group 1: 33.5/h vs. 12.8/h, p = 0.042; Group 2: 27.0/h vs. 7.4/h, p < 0.001; and Group 3: 11.6/h vs. 3.1/h, p < 0.001). In Group 3, central AHI (8.0/h vs. 2.3/h, p < 0.001) and obstructive AHI (1.8/h vs. 0.6/h, p < 0.008) were higher in PTIs than FTIs. T90 (7.0% vs. 0.5%, p < 0.001) and the oxygen desaturation index (39.8/h vs. 11.3/h, p < 0.001) were higher in PTIs than FTIs, and nadir SpO2 (70.0% vs. 80.0%, p < 0.001) was lower in PTIs. Conclusion At high altitude, compared with FTIs, PTIs have a higher rate of respiratory events, greater desaturation, and delayed resolution of these conditions, suggesting the persistence of intermittent hypoxia during the first 18 months of life. This indicates the need to follow these infants for timely diagnosis and treatment of respiratory disturbances during sleep.
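The group comparisons above rely on rank-based tests. As a minimal illustration of that analysis step, the sketch below applies a Mann-Whitney U test and a Kruskal-Wallis test to synthetic AHI values; the numbers, group sizes, and distributions are placeholders, not the study data.

```python
# Synthetic example only: AHI values (events/h) drawn from gamma distributions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
ahi_pti = rng.gamma(shape=2.0, scale=6.0, size=40)   # hypothetical preterm infants
ahi_fti = rng.gamma(shape=2.0, scale=2.0, size=55)   # hypothetical full-term infants

# Two-group comparison (PTI vs. FTI within one age group)
u_stat, p_value = stats.mannwhitneyu(ahi_pti, ahi_fti, alternative="two-sided")
print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_value:.4f}")

# Multiple-group comparison (e.g. three age groups within one cohort)
g1, g2, g3 = ahi_pti[:13], ahi_pti[13:26], ahi_pti[26:]
h_stat, p_kw = stats.kruskal(g1, g2, g3)
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_kw:.4f}")
```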

2020 ◽  
pp. 1-9
Author(s):  
Ralph T. Schär ◽  
Shpend Tashi ◽  
Mattia Branca ◽  
Nicole Söll ◽  
Debora Cipriani ◽  
...  

OBJECTIVE With global aging, elective craniotomies are increasingly being performed in elderly patients. There is a paucity of prospective studies evaluating the impact of these procedures on the geriatric population. The goal of this study was to assess the safety of elective craniotomies for elderly patients in modern neurosurgery. METHODS For this cohort study, adult patients who underwent elective craniotomies between November 1, 2011, and October 31, 2018, were allocated to 3 age groups (group 1, < 65 years [n = 1008]; group 2, ≥ 65 to < 75 years [n = 315]; and group 3, ≥ 75 years [n = 129]). The primary outcome was 30-day mortality after craniotomy. Secondary outcomes included the rate of delayed extubation (> 1 hour), need for emergency head CT scan and reoperation within 48 hours after surgery, length of postoperative intensive or intermediate care unit stay, hospital length of stay (LOS), and rate of discharge to home. Comparisons adjusted for American Society of Anesthesiologists Physical Status (ASA PS) class, estimated blood loss, and duration of surgery were performed using multiple logistic regression. For significant differences, a post hoc analysis was performed. RESULTS In total, 1452 patients (mean age 55.4 ± 14.7 years) were included. The overall mortality rate was 0.55% (n = 8), with no significant differences between groups (group 1: 0.5% [95% binomial CI 0.2%, 1.2%]; group 2: 0.3% [95% binomial CI 0.0%, 1.7%]; group 3: 1.6% [95% binomial CI 0.2%, 5.5%]). Deceased patients had a significantly higher ASA PS class (2.88 ± 0.35 vs 2.42 ± 0.62; difference 0.46 [95% CI 0.03, 0.89]; p = 0.036) and greater estimated blood loss (1444 ± 1973 ml vs 436 ± 545 ml [95% CI 618, 1398]; p < 0.001). Significant differences were found in the rate of postoperative head CT scans (group 1: 6.65% [n = 67], group 2: 7.30% [n = 23], group 3: 15.50% [n = 20]; p = 0.006), LOS (group 1: median 5 days [IQR 4; 7 days], group 2: 5 days [IQR 4; 7 days], and group 3: 7 days [IQR 5; 9 days]; p = 0.001), and rate of discharge to home (group 1: 79.0% [n = 796], group 2: 72.0% [n = 227], and group 3: 44.2% [n = 57]; p < 0.001). CONCLUSIONS Mortality following elective craniotomy was low in all age groups. Today, elective craniotomy for well-selected patients is safe, including for elderly patients. However, elderly patients more often require discharge to other hospitals and postacute care facilities after elective craniotomy. Clinical trial registration no.: NCT01987648 (clinicaltrials.gov).
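The mortality estimates above are reported with exact binomial confidence intervals, and group comparisons were adjusted with multiple logistic regression. The sketch below shows one way such an analysis could be set up; the event count mirrors the abstract, but the covariates are simulated and the variable names are assumptions, not the study dataset.

```python
# Event count from the abstract; covariates are simulated placeholders.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.proportion import proportion_confint

deaths, n = 8, 1452
lo, hi = proportion_confint(deaths, n, alpha=0.05, method="beta")  # Clopper-Pearson
print(f"30-day mortality {deaths / n:.2%} (95% CI {lo:.2%} to {hi:.2%})")

# Adjusted comparison: logistic model of death on ASA class, blood loss, duration
rng = np.random.default_rng(1)
asa = rng.integers(1, 4, size=n)              # simulated ASA PS class
ebl = rng.gamma(2.0, 250.0, size=n)           # simulated estimated blood loss, ml
duration = rng.gamma(3.0, 60.0, size=n)       # simulated surgery duration, min
died = np.zeros(n)
died[:deaths] = 1                             # arbitrary assignment for illustration
X = sm.add_constant(np.column_stack([asa, ebl, duration]))
fit = sm.Logit(died, X).fit(disp=False)
print(fit.params)                             # log-odds coefficients
```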


2020 ◽  
Vol 22 (Supplement_3) ◽  
pp. iii440-iii440
Author(s):  
Harriet Dulson ◽  
Rachel McAndrew ◽  
Mark Brougham

Abstract INTRODUCTION Children treated for CNS tumours experience a very high burden of adverse effects. Platinum-based chemotherapy and cranial radiotherapy can cause ototoxicity, which may be particularly problematic in patients who have impaired vision and cognition as a result of their tumour and associated treatment. This study assessed the prevalence of impaired hearing and vision and how these may impact upon education. METHODS 53 patients diagnosed with solid tumours in Edinburgh, UK between August 2013 and 2018 were included in the study. Patients were split into three groups according to treatment received: Group 1, cisplatin-based chemotherapy and cranial radiotherapy; Group 2, platinum-based chemotherapy without cranial radiotherapy; Group 3, benign brain tumours treated with surgery only. Data were collected retrospectively from patient notes. RESULTS Overall, 69.5% of those treated with platinum-based chemotherapy experienced ototoxicity as assessed by Brock grading, and 5.9% of patients had reduced visual acuity. Patients in Group 1 had the highest prevalence of both. 44.4% of patients in Group 1 needed increased educational support following treatment, either extra support in the classroom or being unable to continue in mainstream school. 12.5% of Group 2 patients required such support, as did 31.3% of those in Group 3. CONCLUSIONS Children with CNS tumours frequently require support for future education, but those treated with both platinum-based chemotherapy and cranial radiotherapy are at particular risk, which may be compounded by co-existent ototoxicity and visual impairment. It is essential to provide appropriate support for this patient cohort in order to maximise their educational potential.


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Yu Liu ◽  
Jing Li ◽  
Wanyu Zhang ◽  
Yihong Guo

Abstract Oestradiol (E2), an important hormone in follicular development and endometrial receptivity, is closely related to clinical outcomes of fresh in vitro fertilization-embryo transfer (IVF-ET) cycles. A supraphysiologic E2 level is inevitable during controlled ovarian hyper-stimulation (COH), and its effect on the outcome of IVF-ET is controversial. The aim of this retrospective study was to evaluate the association between elevated serum E2 levels on the day of human chorionic gonadotrophin (hCG) administration and neonatal birthweight after IVF-ET cycles. The data of 3659 infertile patients with fresh IVF-ET cycles treated between August 2009 and February 2017 at the First Hospital of Zhengzhou University were analysed retrospectively. Patients were categorized by serum E2 level on the day of hCG administration into six groups: group 1 (serum E2 ≤ 1000 pg/mL, n = 230), group 2 (1001-2000 pg/mL, n = 524), group 3 (2001-3000 pg/mL, n = 783), group 4 (3001-4000 pg/mL, n = 721), group 5 (4001-5000 pg/mL, n = 548), and group 6 (> 5000 pg/mL, n = 852). Univariate linear regression was used to evaluate the independent correlation between each factor and the outcome index. Multiple logistic regression was used to adjust for confounding factors. The low birthweight (LBW) rates were 3.0% (group 1), 2.9% (group 2), 1.9% (group 3), 2.9% (group 4), 2.9% (group 5), and 2.0% (group 6), respectively (P = 0.629). There were no statistically significant differences in the incidence of neonatal LBW among the six groups. We did not detect an association between peak serum E2 level during ovarian stimulation and neonatal birthweight after IVF-ET. The results of this retrospective cohort study showed that peak serum E2 levels during ovarian stimulation were not associated with birthweight in IVF cycles. In addition, no association was found between higher E2 levels and increased LBW risk. Our observations suggest that the hyper-oestrogenic milieu during COH does not seem to have adverse effects on the birthweight of offspring after IVF. Although this study provides some reference, obstetric-related factors were not included for historical reasons. The impact of a high oestrogen environment during COH on the birthweight of IVF offspring still needs future research.
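The analysis groups patients by trigger-day E2 level and then adjusts the LBW comparison with multiple logistic regression. Below is a hedged sketch of that workflow on synthetic data; the cut-points follow the abstract, while the covariates (maternal age, BMI) and all values are illustrative assumptions.

```python
# Synthetic dataset; column names and covariates are illustrative assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 3659
df = pd.DataFrame({
    "e2_hcg_day": rng.uniform(300, 8000, n),   # pg/mL on the hCG trigger day
    "maternal_age": rng.normal(31, 4, n),
    "bmi": rng.normal(22, 3, n),
    "lbw": rng.binomial(1, 0.025, n),          # low birthweight indicator
})

# Categorize by E2 level using the abstract's cut-points
bins = [0, 1000, 2000, 3000, 4000, 5000, np.inf]
df["e2_group"] = pd.cut(df["e2_hcg_day"], bins=bins,
                        labels=[f"group{i}" for i in range(1, 7)])

# Multiple logistic regression: LBW vs. E2 group, adjusted for age and BMI
fit = smf.logit("lbw ~ C(e2_group) + maternal_age + bmi", data=df).fit(disp=False)
print(np.exp(fit.params))                      # adjusted odds ratios
```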


2020 ◽  
Vol 79 (Suppl 1) ◽  
pp. 340.2-341
Author(s):  
V. Orefice ◽  
F. Ceccarelli ◽  
C. Barbati ◽  
R. Lucchetti ◽  
G. Olivieri ◽  
...  

Background: Systemic lupus erythematosus (SLE) is an autoimmune disease mainly affecting women of childbearing age. The interplay between genetic and environmental factors may contribute to disease pathogenesis [1]. To date, no robust data are available on the possible contribution of diet to SLE. Caffeine, one of the most widely consumed products in the world, seems to interact with multiple components of the immune system by acting as a non-specific phosphodiesterase inhibitor [2]. In vitro, dose-dependent treatment with caffeine seems to down-regulate mRNA levels of key inflammation-related genes and similarly reduce levels of different pro-inflammatory cytokines [3]. Objectives: We evaluated the impact of caffeine consumption on SLE-related disease phenotype and activity, in terms of clinimetric assessment and cytokine levels. Methods: We performed a cross-sectional study, enrolling consecutive patients and recording their clinical and laboratory data. Disease activity was assessed by the SLE Disease Activity Index 2000 (SLEDAI-2K) [4]. Caffeine intake was evaluated by a 7-day food frequency questionnaire covering all the main sources of caffeine. As previously reported, patients were divided into four groups according to daily caffeine intake: <29.1 mg/day (group 1), 29.2-153.7 mg/day (group 2), 153.8-376.5 mg/day (group 3), and >376.6 mg/day (group 4) [5]. After questionnaire completion, blood samples were collected from each patient to assess cytokine levels, measured with a Bio-Plex assay panel (IL-6, IL-10, IL-17, IL-27, IFN-γ, IFN-α and BLyS). Results: We enrolled 89 SLE patients (F/M 87/2, median age 46 years, IQR 14; median disease duration 144 months, IQR 150). The median caffeine intake was 195 mg/day (IQR 160.5). At enrollment, 8 patients (8.9%) reported a caffeine intake <29.1 mg/day (group 1), 27 patients (30.3%) between 29.2 and 153.7 mg/day (group 2), 45 patients (51%) between 153.8 and 376.5 mg/day (group 3), and 9 patients (10.1%) >376.6 mg/day (group 4). A negative correlation between caffeine intake and disease activity, evaluated with SLEDAI-2K, was observed (p=0.01, r=-0.26). Comparing the four groups, a significantly higher prevalence of lupus nephritis, neuropsychiatric involvement, haematological manifestations, hypocomplementemia and anti-dsDNA positivity was observed in patients with lower caffeine intake (figure 1 A-E). Furthermore, patients with lower caffeine intake used glucocorticoids significantly more frequently [group 4: 22.2% versus group 1 (50.0%, p=0.0001), group 2 (55.5%, p=0.0001), and group 3 (40.0%, p=0.009)]. In the cytokine analysis, a negative correlation between daily caffeine consumption and serum IFN-γ level was found (p=0.03, r=-0.2) (figure 2A); furthermore, patients with higher caffeine intake showed significantly lower levels of IFN-α (p=0.02, figure 2B), IL-17 (p=0.01, figure 2C) and IL-6 (p=0.003, figure 2D). Conclusion: This is the first report demonstrating the impact of caffeine on SLE disease activity status, as demonstrated by the inverse correlation between its intake and both SLEDAI-2K values and cytokine levels. Moreover, in our cohort, patients with lower caffeine consumption seem to have a more severe disease phenotype, especially in terms of renal and neuropsychiatric involvement. Our results suggest a possible immunoregulatory, dose-dependent effect of caffeine through the modulation of serum cytokine levels, as already suggested by in vitro analysis. References: [1] Kaul et al. Nat Rev Dis Prim 2016; [2] Aronsen et al. Europ J Pharm 2014; [3] Iris et al. Clin Immun 2018; [4] Gladman et al. J Rheumatol 2002; [5] Mikuls et al. Arth Rheum 2002. Disclosure of Interests: Valeria Orefice: None declared; Fulvia Ceccarelli: None declared; Cristiana Barbati: None declared; Ramona Lucchetti: None declared; Giulio Olivieri: None declared; Enrica Cipriano: None declared; Francesco Natalucci: None declared; Carlo Perricone: None declared; Francesca Romana Spinelli: Grant/research support from Pfizer, Consultant of Novartis, Gilead, Lilly, Sanofi, Celgene, Speakers bureau Lilly; Cristiano Alessandri: Grant/research support from Pfizer; Guido Valesini: None declared; Fabrizio Conti: Speakers bureau BMS, Lilly, Abbvie, Pfizer, Sanofi
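The abstract reports a correlation coefficient between caffeine intake and SLEDAI-2K without naming the method. The sketch below uses Spearman's rank correlation as one reasonable choice for skewed intake data; the values are synthetic and the choice of test is an assumption, not the authors' documented analysis.

```python
# Synthetic values; Spearman's rank correlation is an assumed choice of test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
caffeine_mg_day = rng.gamma(2.5, 90.0, size=89)                    # daily intake
sledai_2k = np.clip(8 - 0.01 * caffeine_mg_day + rng.normal(0, 2, 89), 0, None)

rho, p = stats.spearmanr(caffeine_mg_day, sledai_2k)
print(f"rho = {rho:.2f}, p = {p:.3f}")
```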


2000 ◽  
Vol 93 (3) ◽  
pp. 662-669 ◽  
Author(s):  
Tomiei Kazama ◽  
Ken Takeuchi ◽  
Kazuyuki Ikeda ◽  
Takehiko Ikeda ◽  
Mutsuhito Kikura ◽  
...  

Background Suitable propofol plasma concentrations during gastroscopy have not been determined for suppressing somatic and hemodynamic responses in different age groups. Methods Propofol sedation at target plasma concentrations from 0.5 to 4.0 microgram/ml was performed randomly in three groups of patients (23 per group) undergoing elective outpatient gastroscopy: ages 17-49 yr (group 1), 50-69 yr (group 2), and 70-89 yr (group 3). The plasma propofol concentrations at which 50% of patients did not respond to different stimuli were determined by logistic regression: verbal command (Cp50ls), somatic response to gastroscopy (Cp50endo), and gag response to gastroscopy (Cp50gag). Hemodynamic responses were also investigated in the different age groups. Results Cp50ls concentrations were 2.23 microgram/ml (group 1), 1.75 microgram/ml (group 2), and 1.40 microgram/ml (group 3). The Cp50endo values in groups 1 and 2 were 2.87 and 2.34 microgram/ml, respectively, which were significantly higher than their respective Cp50ls values. The Cp50endo value in group 3 was 1.64 microgram/ml, which was close to its Cp50ls value. Because of a high degree of interpatient variability, Cp50gag could not be defined. The systolic blood pressure response decreased with increasing propofol concentrations. Conclusions The authors determined the propofol concentration necessary for gastroscopy and showed that increasing age reduces it. The propofol concentration that suppresses the somatic response induces loss of consciousness in almost all young patients.
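A Cp50 derived by logistic regression is the concentration at which the fitted probability of no response crosses 50%, i.e. -intercept/slope on the log-odds scale. The sketch below illustrates that derivation on synthetic responder data; the simulated Cp50 and slope values are arbitrary placeholders, not the study's estimates.

```python
# Synthetic responder data; the true Cp50 and slope below are arbitrary.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
conc = np.repeat(np.arange(0.5, 4.5, 0.5), 8)        # target concentrations, µg/ml
true_cp50, slope = 2.2, 2.5
p_no_response = 1 / (1 + np.exp(-slope * (conc - true_cp50)))
no_response = rng.binomial(1, p_no_response)          # 1 = no response to stimulus

X = sm.add_constant(conc)
fit = sm.Logit(no_response, X).fit(disp=False)
b0, b1 = fit.params
print(f"Estimated Cp50 = {-b0 / b1:.2f} µg/ml")       # probability-0.5 crossing
```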


Circulation ◽  
2014 ◽  
Vol 130 (suppl_2) ◽  
Author(s):  
Janet W Elcano ◽  
Hui Nam Pak

Background: The incidence of atrial fibrillation (AF) is increasing in the elderly population; however, there is a paucity of data on safety outcomes in this patient subgroup. We therefore sought to investigate the impact of age on the safety of catheter ablation for AF. Methods and Results: We included 1,293 patients (75% male) enrolled in the Yonsei AF Ablation Cohort database in Seoul, South Korea, from March 2009 to November 2013. We divided the patients into 4 groups according to age (Group 1, aged 17-49, N=295; Group 2, 50-59, N=421; Group 3, 60-69, N=408; and Group 4, ≥70, N=169) and evaluated the incidence of procedure-related complications. No procedure-related death occurred in this study. There was a trend of increasing incidence of procedure-related complications with age: Group 1, 3.7%; Group 2, 4.0%; Group 3, 6.6%; and Group 4, 7.1% (p = 0.15). There were 28 cases (2.2%) of major complications (Group 1, 1.7%; Group 2, 1.9%; Group 3, 2%; Group 4, 4.1%), tamponade being the most common. Major complications in Group 4 included tamponade (4 cases), phrenic nerve palsy (1 case), atrioesophageal fistula (1 case), and third-degree AV block (1 case). Multivariate regression analysis showed that ablation time (odds ratio (OR) 1.2, confidence interval (CI) 1.0-1.017, p=0.017), procedure time (OR 1.008, CI 1.0-1.15, p=0.04), decreasing eGFR (OR 1.013, CI 1.002-1.026, p=0.018), coronary artery disease (CAD) (OR 1.847, CI 1.003-3.524, p=0.04), and age (OR 1.028, CI 1.003-1.055, p=0.03) were associated with increased adjusted risk of total complications. Predictors of major complications included age (OR 1.044, CI 1.003-1.086, p=0.02) and ablation time (OR 1.009, CI 0.999-1.000, p=0.033). Conclusion: Our data suggest that the incidence of procedural complications in radiofrequency ablation of AF increases with age. Ablation time and age are independent predictors of major complications.
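The adjusted risks above come from a multivariate logistic regression reporting odds ratios with confidence intervals. The sketch below shows the general structure of such a model on simulated data; the predictor names follow the abstract, but every value and the assumed outcome-generating step are placeholders.

```python
# Simulated cohort; predictor names follow the abstract, values are placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 1293
df = pd.DataFrame({
    "age": rng.normal(58, 11, n),
    "ablation_min": rng.normal(80, 25, n),
    "procedure_min": rng.normal(180, 40, n),
    "egfr": rng.normal(85, 20, n),
    "cad": rng.binomial(1, 0.1, n),
})
# Generate a synthetic complication outcome loosely tied to the predictors
logit_p = -4 + 0.03 * df["age"] + 0.01 * df["ablation_min"] - 0.01 * df["egfr"]
df["complication"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

fit = smf.logit("complication ~ age + ablation_min + procedure_min + egfr + cad",
                data=df).fit(disp=False)
or_table = np.exp(pd.concat([fit.params, fit.conf_int()], axis=1))
or_table.columns = ["OR", "2.5%", "97.5%"]
print(or_table)                                  # odds ratios with 95% CIs
```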


Author(s):  
Soo Hyun Park ◽  
Ji Young Min ◽  
Won Cul Cha ◽  
Ik Joon Jo ◽  
Taerim Kim

Understanding age-specific injury patterns allows the continued improvement of prevention strategies. This is a retrospective study analyzing the Korea Emergency Department-Based Injury In-depth Surveillance data for patients aged ≤19 years between January 2011 and December 2017. We focused on changes in the modes of injury, severity, and prevention potential by dividing the patients into four age groups: group 1 (0–4 years), group 2 (5–9 years), group 3 (10–14 years), and group 4 (15–19 years). The most common mode of injury in the younger age groups (1 and 2) was a fall or slip. In the older age groups (3 and 4), most injuries were collisions, unintentional and intentional combined. Traumatic brain injuries (2.1%), intensive care unit admissions (1.8%), and overall deaths (0.4%) were highest in group 4. The proportions of severe and critical injury (EMR-ISS ≥ 25) were 7.5% in group 4, 3.2% in group 3, 2.5% in group 1, and 1% in group 2. This study presents a comprehensive trend of injuries in the pediatric population in South Korea. Our results suggest the importance of designing specific injury-prevention strategies for targeted groups, circumstances, and situations.


2020 ◽  
Vol 8 (6) ◽  
pp. 232596712092793
Author(s):  
Christopher Antonacci ◽  
Thomas R. Atlee ◽  
Peter N. Chalmers ◽  
Christopher Hadley ◽  
Meghan E. Bishop ◽  
...  

Background: Pitching velocity is one of the most important metrics used to evaluate a baseball pitcher’s effectiveness. The relationship between age and pitching velocity after a lighter ball baseball training program has not been determined. Purpose/Hypothesis: The purpose of this study was to examine the relationship between age and pitching velocity after a lighter ball baseball training program. We hypothesized that pitching velocity would significantly increase in all adolescent age groups after a lighter baseball training program, without a significant difference in magnitude of increase based on age. Study Design: Cohort study; Level of evidence, 2. Methods: Baseball pitchers aged 10 to 17 years who completed a 15-week training program focused on pitching mechanics and velocity improvement were included in this study. Pitchers were split into 3 groups based on age (group 1, 10-12 years; group 2, 13-14 years; group 3, 15-17 years), and each group trained independently. Pitch velocity was assessed at 4 time points (sessions 3, 10, 17, and 25). Mean, maximum, and mean change in pitch velocity between sessions were compared by age group. Results: A total of 32 male baseball pitchers were included in the analysis. Mean/maximum velocity increased in all 3 age groups: 3.4/4.8 mph in group 1, 5.3/5.5 mph in group 2, and 5.3/5.2 mph in group 3. While mean percentage change in pitch velocity increased in all 3 age groups (group 1, 6.5%; group 2, 8.3%; group 3, 7.6%), the magnitude of change was not significantly different among age groups. Program session number had a significant effect on mean and maximum velocity, with higher mean and maximum velocity seen at later sessions in the training program (P = .018). There was no interaction between age and program session within either mean or maximum velocity (P = .316 and .572, respectively). Conclusion: Age had no significant effect on the magnitude of increase in maximum or mean baseball pitch velocity during a velocity and mechanics training program in adolescent males.
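The reported session effect and the absence of an age-by-session interaction can be framed as a two-factor model. The sketch below fits an ordinary two-way ANOVA with an interaction term on synthetic velocities; the original study's exact model (likely accounting for repeated measures within pitchers) is not specified here, so this is an assumption-laden illustration only.

```python
# Synthetic velocities; an ordinary two-way ANOVA stands in for the study model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(6)
rows = []
for pid in range(32):                                  # 32 pitchers, as in the abstract
    group = ["10-12", "13-14", "15-17"][pid % 3]
    base = {"10-12": 55.0, "13-14": 63.0, "15-17": 72.0}[group]
    for session in (3, 10, 17, 25):                    # assessment sessions
        rows.append({"age_group": group, "session": session,
                     "velocity": base + 0.2 * session + rng.normal(0, 2)})
df = pd.DataFrame(rows)

# Main effects of age group and session, plus their interaction
fit = smf.ols("velocity ~ C(age_group) * session", data=df).fit()
print(anova_lm(fit, typ=2))
```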


Author(s):  
Kirsten E Lyke ◽  
Alexandra Singer ◽  
Andrea A Berry ◽  
Sharina Reyes ◽  
Sumana Chakravarty ◽  
...  

Abstract Background A live-attenuated Plasmodium falciparum (Pf) sporozoite (SPZ) vaccine (PfSPZ Vaccine) has shown up to 100% protection against controlled human malaria infection (CHMI) using homologous parasites (the same Pf strain as in the vaccine). Using a more stringent CHMI with heterologous parasites (a different Pf strain), we assessed the impact of higher PfSPZ doses, a novel multi-dose priming regimen, and a delayed vaccine boost on vaccine efficacy (VE). Methods Four groups of 15 healthy, malaria-naïve adults were immunized. Group (Grp) 1 received five doses of 4.5×10⁵ PfSPZ (days 1, 3, 5, 7; week 16). Grps 2, 3 and 4 received three doses (weeks 0, 8, 16), with Grp 2 receiving 9.0×10⁵ PfSPZ/dose, Grp 3 receiving 18.0×10⁵/dose, and Grp 4 receiving 27.0×10⁵ for dose 1 and 9.0×10⁵ for doses 2 and 3. VE was assessed by heterologous CHMI after 12 or 24 weeks. Volunteers not protected at 12 weeks were boosted prior to repeat CHMI at 24 weeks. Results At the 12-week CHMI, 6/15 (40%) in Group 1 (P=0.04) and 3/15 (20%) in Group 2 remained aparasitemic, vs. 0/8 controls. At the 24-week CHMI, 3/13 (23%) in Group 3 and 3/14 (21%) in Group 4 remained aparasitemic, vs. 0/8 controls (Groups 2-4, VE not significant). Post-boost, 9/14 (64%) remained aparasitemic vs. 0/8 controls (3/6 in Group 1, P=0.025; 6/8 in Group 2, P=0.002). Conclusions Four stacked priming injections (multi-dose priming) showed 40% VE against heterologous CHMI, while dose escalation of PfSPZ using single-dose priming was not significantly protective. Boosting unprotected subjects improved VE at 24 weeks to 64%.
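Protection is compared as the proportion of volunteers remaining aparasitemic versus infectivity controls. The sketch below applies Fisher's exact test to the Group 1 counts from the abstract and computes a crude vaccine efficacy; the abstract does not state which test produced its P values, so the choice of test is an assumption.

```python
# Counts from the abstract (Group 1 at the 12-week CHMI); test choice is assumed.
from scipy.stats import fisher_exact

protected_vaccinees, n_vaccinees = 6, 15
protected_controls, n_controls = 0, 8

table = [[protected_vaccinees, n_vaccinees - protected_vaccinees],
         [protected_controls, n_controls - protected_controls]]
odds_ratio, p = fisher_exact(table, alternative="two-sided")
print(f"Fisher exact p = {p:.3f}")

# Crude vaccine efficacy: 1 - (attack rate in vaccinees / attack rate in controls)
ar_vaccinees = (n_vaccinees - protected_vaccinees) / n_vaccinees
ar_controls = (n_controls - protected_controls) / n_controls
print(f"VE = {1 - ar_vaccinees / ar_controls:.0%}")    # 40% for these counts
```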


Blood ◽  
2007 ◽  
Vol 110 (11) ◽  
pp. 830-830
Author(s):  
J. Alejandro Madrigal ◽  
Neema P. Mayor ◽  
Hazael Maldonado-Torres ◽  
Bronwen E. Shaw ◽  
Steven G.E. Marsh

Abstract Haematopoietic Stem Cell Transplantation (HSCT) using volunteer Unrelated Donors (UD) has become an important and viable option in the treatment of Acute Leukaemia (AL). While matching donors and recipients usually refers to five of the classical HLA genes (HLA-A, -B, -C, -DRB1 and -DQB1), the impact of a sixth gene, HLA-DPB1, on the outcome of UD-HSCT is increasingly emerging. We have previously shown an increased risk of relapse with HLA-DPB1 matching and, independently, with NOD2/CARD15 genotype. In light of these data, we have analysed a larger UD-HSCT cohort in order to establish the impact on transplant outcome when both HLA-DPB1 matching status and NOD2/CARD15 genotype are considered. HLA typing and NOD2/CARD15 genotyping were performed on 304 AL patients and their Anthony Nolan Trust volunteer unrelated donors. Transplants occurred between 1996 and 2005 at UK transplant centres. Diagnoses were ALL (47%) and AML (53%). 67% of the cohort were a 10/10 HLA match, with 16% also being matched for HLA-DPB1. Myeloablative conditioning regimens were used in 74% of transplants. T-cell depletion was included in 82% of conditioning protocols. Bone marrow was used in 72% of transplants, with the remaining 28% using peripheral blood stem cells. Two forms of post-transplant immunosuppression predominated: Cyclosporine A and Methotrexate (47%) and Cyclosporine A alone (38%). Previous studies on a subgroup of this cohort showed that HLA-DPB1 matching and NOD2/CARD15 SNPs were independently associated with an increase in disease relapse. Consequently, the cohort was divided into three groups to reflect this risk: group 1 (HLA-DPB1 matched; NOD2/CARD15 SNP, n=24), group 2 (HLA-DPB1 matched; NOD2/CARD15 wild-type (WT), or HLA-DPB1 mismatched; NOD2/CARD15 SNP, n=112) and group 3 (HLA-DPB1 mismatched; NOD2/CARD15 WT, n=168). There was a significant difference in disease relapse between the three groups (1 year: group 1, 68%; group 2, 48%; group 3, 30%; p=0.0038). This finding persisted in multivariate analysis, where being in either group 2 or group 3 was protective against relapse compared to group 1 (RR 0.321; 95% CI 0.167–0.616; p=0.001 and RR 0.478; 95% CI 0.244–0.934; p=0.031, respectively). In the group with the highest relapse risk (group 1), this resulted in a decrease in Overall Survival (OS) (33% vs 54% in group 3; RR 0.617; 95% CI 0.359–1.060; p=0.080). The best OS was seen in the group with the lowest risk of relapse (group 3). Here, in addition to low relapse, there was increased acute and chronic Graft-versus-Host Disease (GvHD) (p=0.0019 and p=0.0058, respectively). In this cohort, cGvHD (in its limited form) was associated with a significantly lower incidence of relapse (p=0.0066) and better OS (p<0.0001). In concordance with our previous findings, being HLA-DPB1 matched and having NOD2/CARD15 SNPs predicts the worst outcome, with a significant increase in relapse and reduced OS. Conversely, the ideal pairing would be HLA-DPB1 mismatched and NOD2/CARD15 WT. These data suggest that prospectively typing AL patients for HLA-DPB1 and NOD2/CARD15 SNPs will allow prediction of disease relapse, aGvHD and cGvHD, and will additionally allow the effect of being HLA-DPB1 matched or carrying a NOD2/CARD15 SNP to be offset by selecting a suitable, lower-risk donor.
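The risk stratification combines two binary factors, HLA-DPB1 match status and NOD2/CARD15 genotype, into three groups. The sketch below reproduces only that grouping logic on synthetic rows; the genotype frequencies and column names are assumptions, not the cohort data.

```python
# Synthetic rows; genotype frequencies and column names are assumptions.
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
df = pd.DataFrame({
    "dpb1_matched": rng.binomial(1, 0.45, 304).astype(bool),
    "nod2_snp": rng.binomial(1, 0.40, 304).astype(bool),
})

def risk_group(row):
    if row.dpb1_matched and row.nod2_snp:
        return "group 1"   # DPB1 matched + NOD2/CARD15 SNP: highest relapse risk
    if row.dpb1_matched or row.nod2_snp:
        return "group 2"   # exactly one of the two risk factors
    return "group 3"       # DPB1 mismatched + wild-type: lowest relapse risk

df["risk_group"] = df.apply(risk_group, axis=1)
print(df["risk_group"].value_counts())
```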

