Long Term Study of the Impact of Quantitative Molecular Monitoring of Bcr-Abl Transcripts on the Risk of Relapse of CML After Allogeneic HSCT.

Blood, 2010, Vol 116 (21), pp. 1287-1287
Author(s):  
Mario Arpinati
Marilina Amabile
Maria Teresa Bochicchio
Angela Poerio
Giuseppe Bandini
...  

Abstract 1287

Background: Monitoring of minimal residual disease through RT-PCR analysis of bcr-abl transcripts allows early detection of CML relapse after allogeneic HSCT. However, the introduction of more sensitive techniques, such as quantitative PCR (Q-PCR), may result in decreased specificity, leading to false positive results.

Patients and methods: In this study we reviewed the results of molecular analysis of bcr-abl transcripts in all patients with p210+ CML who underwent allogeneic HSCT from 1983 through 2007. Q-PCR analysis was started in 2002. Of 189 patients, 87 had available Q-PCR data; of these, 63 patients with at least three separate Q-PCR determinations were included in the study. Median time to the first Q-PCR analysis was 2196 days (range 35-7823). Median age was 36 years (range 13-56); 62% received a transplant from a related donor and 38% from an unrelated donor, 62% received bone marrow grafts, and 32% were in accelerated phase (AP). Patients with at least one positive Q-PCR value (measured as a bcr-abl/abl ratio > 0.001) were classified as Major Molecular Remission (MMR) patients. An event was defined as one or more consecutive positive results.

Results: 60/63 patients are alive after a median follow-up of 3693 days (range 898-9405). Six have relapsed 2142 (range 1419-3746) days after transplant. Fifty-two (83%) patients had at least one positive result (28 with a value > 0.01, 24 with a value < 0.01), whereas 11 (17%) had persistently undetectable transcripts. In MMR patients, 94 events occurred: 29 patients had only one event, while 6 had more than three events. In 10 patients the event occurred within 1 year after transplant, whereas in 28 it occurred more than 5 years after transplant. 6/52 MMR patients relapsed, compared with 0/11 patients with persistently undetectable transcripts (p=0.19). Relapse did not correlate with the Q-PCR value, the number of events or the time to the event. Finally, of 46 MMR patients who did not relapse, 35 had undetectable transcripts at last follow-up. A positive Q-PCR result had low specificity (19%) and low positive predictive value (12%) for predicting relapse after allogeneic HSC transplantation.

Conclusion: Our data confirm that the detection of low levels of bcr-abl transcripts by Q-PCR has poor accuracy in predicting relapse and should not be considered the sole indication to start treatment. Fluctuation of bcr-abl transcript levels appears to be common as late as more than 10 years after transplant, possibly suggesting the long-term persistence of CML stem cells.

Disclosures: Rosti: Novartis: Consultancy, Honoraria; BMS: Consultancy, Honoraria. Martinelli: Novartis: Consultancy, Honoraria; BMS: Consultancy, Honoraria; Pfizer: Consultancy.
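As a quick check, the reported 19% specificity and 12% positive predictive value follow directly from the counts given above (52 patients with at least one positive result, of whom 6 relapsed; 11 persistently negative, none relapsed). A minimal sketch in Python, not the authors' analysis code:

```python
# Recovering the reported specificity and PPV of a positive Q-PCR result for
# predicting relapse from the 2x2 counts stated in the abstract.
# "Test positive" = at least one detectable Q-PCR result; outcome = relapse.

relapsed_pos = 6           # positive Q-PCR patients who relapsed (true positives)
relapsed_neg = 0           # persistently undetectable patients who relapsed (false negatives)
no_relapse_pos = 52 - 6    # positive Q-PCR, no relapse (false positives) = 46
no_relapse_neg = 11 - 0    # persistently undetectable, no relapse (true negatives) = 11

specificity = no_relapse_neg / (no_relapse_neg + no_relapse_pos)   # 11 / 57 ~ 0.19
ppv = relapsed_pos / (relapsed_pos + no_relapse_pos)               # 6 / 52 ~ 0.12

print(f"specificity ≈ {specificity:.0%}, PPV ≈ {ppv:.0%}")         # ~19%, ~12%
```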

Heart, 2019, Vol 106 (4), pp. 299-306
Author(s):  
Tsukasa Kamakura
Tetsuji Shinohara
Kenji Yodogawa
Nobuyuki Murakoshi
Hiroshi Morita
...  

Objective: Limited data are currently available regarding the long-term prognosis of patients with J-wave syndrome (JWS). The aim of this study was to investigate the long-term prognosis of patients with JWS and identify predictors of the recurrence of ventricular fibrillation (VF).

Methods: This was a multicentre retrospective study (seven Japanese hospitals) involving 134 patients with JWS (Brugada syndrome (BrS): 85; early repolarisation syndrome (ERS): 49) treated with an implantable cardioverter defibrillator. All patients had a history of VF. All patients with ERS underwent drug provocation testing with standard and high intercostal ECG recordings to rule out BrS. The impact of global J waves (type 1 ECG or anterior J waves and inferolateral J waves in two or more leads) on the prognosis was evaluated.

Results: During the 91±66 months of the follow-up period, 52 (39%) patients (BrS: 37; ERS: 15) experienced recurrence of VF. Patients with BrS and ERS with global J waves showed a significantly higher incidence of VF recurrence than those without (BrS: log-rank, p=0.014; ERS: log-rank, p=0.0009). The presence of global J waves was a predictor of VF recurrence in patients with JWS (HR: 2.16, 95% CI 1.21 to 3.91, p=0.0095), while previously reported high-risk electrocardiographic parameters (high-amplitude J waves ≥0.2 mV and J waves associated with a horizontal or descending ST segment) were not predictive of VF recurrence.

Conclusions: This multicentre long-term study showed that the presence of global J waves was associated with a higher incidence of VF recurrence in patients with JWS.
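For readers who want to run this kind of time-to-event comparison on their own data, below is a minimal sketch of a Kaplan-Meier estimate and log-rank test using the Python lifelines library; the DataFrame and column names are hypothetical illustrations, not the study dataset.

```python
# Sketch: comparing VF-recurrence-free survival between patients with and
# without global J waves, in the spirit of the log-rank comparisons reported.
# Toy data; column names are illustrative only.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.DataFrame({
    "months":         [12, 48, 91, 150, 30, 200, 75, 110],  # follow-up duration
    "vf_recurrence":  [1,  0,  1,  0,   1,  0,   1,  0],    # 1 = VF recurred
    "global_j_waves": [1,  0,  1,  0,   1,  1,   0,  0],    # exposure of interest
})

with_j = df[df.global_j_waves == 1]
without_j = df[df.global_j_waves == 0]

# Kaplan-Meier estimate for the exposed group
kmf = KaplanMeierFitter()
kmf.fit(with_j.months, event_observed=with_j.vf_recurrence, label="global J waves")
print(kmf.survival_function_)

# Log-rank comparison between the two groups
result = logrank_test(with_j.months, without_j.months,
                      event_observed_A=with_j.vf_recurrence,
                      event_observed_B=without_j.vf_recurrence)
print(f"log-rank p = {result.p_value:.4f}")
```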


Blood, 2012, Vol 120 (21), pp. 2789-2789
Author(s):  
Frederick Pimm
Richard Szydlo
Letizia Foroni
Francesco Dazzi
Jaspal S Kaeda
...  

Abstract 2789

The use of tyrosine kinase inhibitors (TKI) in the management of chronic myeloid leukemia (CML) has dramatically improved survival, with some 80% of patients achieving a deep and durable molecular remission (MR). The current focus for these patients is the ability to withdraw long-term treatment, and a number of 'stopping' studies have been initiated worldwide. Many of these approaches are derived from the French STIM study, which showed that 40% of patients who had been real-time quantitative PCR (RT-qPCR) negative for BCR-ABL1 for two years could cease treatment without experiencing disease relapse. However, the RT-qPCR assay used in this study was particularly stringent, with a sensitivity of 10^-5, compatible with a five-log reduction in BCR-ABL1 transcripts (MR5), and it is not clear that the same level of success will result from studies using MR4 and MR4.5 as the indication for treatment cessation. Furthermore, because of the lack of accuracy of RT-qPCR assays when the number of BCR-ABL1 transcripts approaches zero, some laboratories report transcript numbers <6, or even <11, as undetectable.

In order to investigate the importance of the depth of molecular response on the risk of subsequent disease recurrence, we studied the long-term follow-up of, and RT-qPCR results from, patients who received allogeneic stem cell transplantation as treatment for CML at a time when minimal residual disease detection was performed by RT-qPCR using ABL1 as the control gene. We analysed data from 180 patients transplanted from January 1998 onwards who received an allo-SCT from an HLA-identical sibling or a matched unrelated donor and who had survived for at least 6 months post-transplant with a consistent sequence of 5 or more RT-qPCR results from the time of transplant to the end of follow-up. Patients were assessed on the depth of their MR; 9 categories of 'complete' MR were defined based on the BCR-ABL1 transcript threshold for negativity (BCR-ABL1=0, BCR-ABL1>0 but <6, BCR-ABL1>5 but <11) and the control transcript number (CTN) (CTN>10^4 but <10^4.5, CTN>10^4.5 but <10^5, or CTN>10^5). We ranked these categories, firstly by BCR-ABL1 transcript threshold, defining negativity at a lower threshold as a deeper response, and then sub-ranked by CTN, defining a larger CTN as a deeper response. Of the 180 patients, 49 (27%) did not achieve 'complete' MR by any definition; for the 131 (73%) patients who did reach some degree of 'complete' MR, the median time from transplant to best molecular response was 8.7 months (range, 1.0–103 months). We defined relapse as progression to an RT-qPCR level that triggered the use of donor lymphocyte infusions, i.e. a BCR-ABL1/ABL ratio that exceeded 0.02% in 3 samples, or exceeded 0.05% in 2 samples, or showed rising levels with the last 2 samples higher than 0.02%, or worse (loss of cytogenetic or haematological remission). The 2-year relapse incidence post SCT was 94% in the group who did not achieve any degree of 'complete' MR, 94% in the group who achieved MR with BCR-ABL1<11 and >0, CTN>10^4 (n=32, 17.8%), 55% in the group BCR-ABL1=0, CTN>10^4 and <10^4.5 (n=19, 11%), 26% in the group BCR-ABL1=0, CTN>10^4.5 and <10^5 (n=47, 26%), and 6% in the group BCR-ABL1=0, CTN>10^5 (n=33, 18%) (p<0.0001).
In multivariate analysis with adjustment for donor type, classifying the 33 patients who achieved BCR-ABL1=0, CTN>10^5 as the optimal molecular responders, the relative risk of relapse was 90.1 in the 49 patients who never achieved MR by any definition (p<0.0001), 21.7 in the group BCR-ABL1<11 and >0, CTN>10^4 (n=32) (p<0.0001), 8.1 in the group BCR-ABL1=0, CTN>10^4 and <10^4.5 (n=19) (p<0.0001), and 2.11 in the group BCR-ABL1=0, CTN>10^4.5 and <10^5 (n=47) (p=0.002). In conclusion, fewer detectable BCR-ABL1 transcripts with larger numbers of control transcripts, i.e. a deeper response, predict a lower risk of relapse in post-transplant survivors and may have important implications for the ability to stop long-term TKI therapy.

Disclosures: No relevant conflicts of interest to declare.
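To make the ranking scheme concrete, the sketch below classifies a single sample into one of the nine 'complete' MR categories and ranks it as described (first by the BCR-ABL1 threshold, then by CTN). The thresholds are taken from the abstract; the function name and return convention are ours, purely for illustration.

```python
# Sketch of the MR-depth ranking scheme described above.
# Lower rank = deeper molecular response.

def mr_depth(bcr_abl1: int, ctn: float):
    """Return (rank, label) for a 'complete' MR sample, or None if the sample
    does not meet any 'complete' MR definition used here."""
    if bcr_abl1 >= 11 or ctn <= 1e4:
        return None

    # BCR-ABL1 threshold bins: 0 is deepest, then >0 but <6, then >5 but <11
    if bcr_abl1 == 0:
        bcr_bin = 0
    elif bcr_abl1 < 6:
        bcr_bin = 1
    else:
        bcr_bin = 2

    # CTN bins: a larger control transcript number counts as a deeper response
    if ctn > 1e5:
        ctn_bin = 0
    elif ctn > 10**4.5:
        ctn_bin = 1
    else:
        ctn_bin = 2

    rank = bcr_bin * 3 + ctn_bin  # 0 (deepest) .. 8 (shallowest)
    return rank, f"BCR-ABL1 bin {bcr_bin}, CTN bin {ctn_bin}"

# Example: no detectable BCR-ABL1 with >10^5 control transcripts is the deepest category
print(mr_depth(0, 2e5))   # (0, 'BCR-ABL1 bin 0, CTN bin 0')
print(mr_depth(7, 3e4))   # (8, 'BCR-ABL1 bin 2, CTN bin 2') – shallowest 'complete' MR
```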


Blood, 1996, Vol 87 (11), pp. 4894-4902
Author(s):  
C Peters
M Balthazor
EG Shapiro
RJ King
C Kollman
...  

Long-term survival and improved neuropsychological function have occurred in selected children with Hurler syndrome (MPS I H) after successful engraftment with genotypically matched sibling bone marrow transplantation (BMT). However, because few children have HLA-identical siblings, the feasibility of unrelated donor (URD) BMT as a vehicle for adoptive enzyme therapy was evaluated in this retrospective study. Forty consecutive children (median age, 1.7 years; range, 0.9 to 3.2 years) with MPS I H received high-dose chemotherapy with or without radiation followed by BMT between January 27, 1989 and May 13, 1994. Twenty-five of the 40 patients initially engrafted. An estimated 49% of patients are alive at 2 years, 63% alloengrafted and 37% autoengrafted. The probability of grade II to IV acute graft-versus-host disease (GVHD) was 30%, and the probability of extensive chronic GVHD was 18%. Eleven patients received a second URD BMT because of graft rejection or failure. Of the 20 survivors, 13 children have complete donor engraftment, two children have mixed chimeric grafts, and five children have autologous marrow recovery. The BM cell dose was correlated with both donor engraftment and survival. Thirteen of 27 evaluable patients were engrafted at 1 year following URD BMT. Neither T-lymphocyte depletion (TLD) of the bone marrow nor irradiation appeared to influence the likelihood of engraftment. Ten of 16 patients alive at 1 year who received a BM cell dose greater than or equal to 3.5 x 10^8 cells/kg engrafted, and 62% are estimated to be alive at 3 years. In contrast, only 3 of 11 patients receiving less than 3.5 x 10^8 cells/kg engrafted, and 24% are estimated to be alive at 3 years (P = .05). The mental developmental index (MDI) was assessed before BMT. Both baseline and post-BMT neuropsychological data were available for 11 engrafted survivors. Eight children with a baseline MDI greater than 70 have undergone URD BMT (median age, 1.5 years; range, 1.0 to 2.4 years). Of these, two children have had BMT too recently for developmental follow-up. Of the remaining six, none has shown any decline in age-equivalent scores. Four children are acquiring skills at a pace equal to or slightly below their same-age peers; two children have shown a plateau in learning or extreme slowing in their learning process. For children with a baseline MDI less than 70 (median age, 2.5 years; range, 0.9 to 2.9 years), post-BMT follow-up indicated that two children have shown deterioration in their developmental skills. The remaining three children are maintaining their skills and are adding to them at a highly variable rate. We conclude that MPS I H patients with a baseline MDI greater than 70 who are engrafted survivors following URD BMT can achieve a favorable long-term outcome and improved cognitive function. Future protocols must address the high risk of graft rejection or failure and the impact of GVHD in this patient population.


Foods, 2021, Vol 10 (5), pp. 909
Author(s):  
Klelia Karagiannaki
Christian Ritz
Ditte Søbye Andreasen
Raphaela Achtelik
Per Møller
...  

Although it is well established that a healthy diet rich in fruit and vegetables can prevent a number of major chronic diseases, national and international guidelines concerning their intake are not being met by a large percentage of the population, including children. Thus, it is of interest to investigate how the consumption of this food group by children could be increased. The aim of this study was to examine the impact of serving style on the consumption of a raw snack vegetable (daikon) and the influence of repeated exposure on liking and intake of the vegetable. A group of 185 children aged 3–5 years participated in the study. Two kindergartens served as intervention groups, while the third was assigned to be the control group of the study (n = 50). The intervention groups were repeatedly exposed to one of three different serving styles of daikon: sticks (n = 42), triangles (n = 46) or grated (n = 47), and all were visited 7 times during the exposure period, at the same frequency (twice per week). Familiarity and liking of the target vegetable, daikon, and six other vegetables (cucumber, celery, celeriac, broccoli, cauliflower and beetroot) were measured at baseline, post-intervention and two follow-up sessions (at 3 and 6 months) to investigate the likelihood of generalisation effects. Intake of daikon was measured at all control sessions and exposures. Moreover, children were asked to rank their favourite serving style of daikon and beetroot among triangle, stick and grated, to explore the influence of shape on the efficacy of the exposure. The results revealed significant differences in liking and intake of daikon between the triangle and stick groups and the control group (p < 0.05). The group that received grated daikon did not show significant differences in liking or intake levels during the exposures but performed well in the long term. Throughout the exposure period, intake levels followed an overall increasing pattern, with all groups showing a decrease in intake at the last session, which was not significant for the triangle group. Mere exposure was effective in increasing liking and intake of the novel vegetable, with all shapes delivering positive results, but based on this study no particular serving style can be recommended.


2002, Vol 18 (3), pp. 229-241
Author(s):  
Kurt A. Heller
Ralph Reimann

Summary: In this paper, conceptual and methodological problems of school program evaluation are discussed. The data were collected in conjunction with a 10-year cross-sectional/longitudinal investigation with partial inclusion of control groups. The experiences and conclusions resulting from this long-term study are revealing not only from the vantage point of the scientific evaluation of new scholastic models, but are also valuable for program evaluation studies in general, particularly in the field of gifted education.


2020, Vol 4 (Supplement_1), pp. 176-176
Author(s):  
Hiroto Yoshida
Yuriko Kihara

Abstract: This study examined the impact of frailty on medical and long-term care expenditures in an older Japanese population. The subjects were those aged 75 years and over who responded to the survey (March 2018) in Bibai, Hokkaido, Japan (n=1,203) and had never received certification for long-term care insurance (LTCI) at the time of the survey. We followed 867 individuals (72.1%) until the end of December 2018 (a 10-month period). We defined frailty as meeting 4 or more of 15 items, which included unintentional weight loss, history of falls, etc. Of the 867 subjects, 233 (26.9%) were judged to be in the frailty group and 634 (73.1%) in the non-frailty group. We compared the time to new LTCI certification and the accumulated medical and long-term care expenditures, adjusted for age and gender, between the two groups during the follow-up period. Cox proportional hazard models were used to examine the association between baseline frailty and new LTCI certification. The hazard ratio (HR) was higher in the frailty group than in the non-frailty group (HR=3.51, 95% CI: 1.30-9.45, P=.013). The adjusted mean accumulated medical and long-term care expenditures per capita during the follow-up were significantly (P=.002) higher in the frailty group (629,699 yen) than in the non-frailty group (450,995 yen). We confirmed a strong economic impact of frailty among adults aged 75 and over in Japan.
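As an illustration of the modelling approach, below is a minimal Cox proportional hazards sketch using the Python lifelines library, with baseline frailty as the exposure and time to new LTCI certification as the outcome, adjusted for age and gender. The toy data and column names are hypothetical, not the Bibai survey dataset.

```python
# Sketch of a Cox proportional hazards model for time to new LTCI certification,
# with frailty as the exposure and age/gender as adjustment covariates.
# Toy data; column names are illustrative only.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "months_to_event": [3, 10, 10, 7, 10, 5, 10, 9],   # follow-up capped at 10 months
    "new_ltci_cert":   [1,  0,  0, 1,  0, 1,  0, 1],   # 1 = newly certified for LTCI
    "frail":           [1,  0,  1, 1,  0, 1,  0, 0],   # baseline frailty (>=4 of 15 items)
    "age":             [78, 82, 80, 85, 76, 79, 83, 77],
    "male":            [1,  0,  1, 0,  1, 0,  1, 0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months_to_event", event_col="new_ltci_cert")
cph.print_summary()  # the 'frail' row gives the adjusted hazard ratio and its 95% CI
```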


2021, Vol 14 (1)
Author(s):  
Robert J. Kreitman
Claire Dearden
Pier Luigi Zinzani
Julio Delgado
...  

Abstract

Background: Moxetumomab pasudotox is a recombinant CD22-targeting immunotoxin. Here, we present the long-term follow-up analysis of the pivotal, multicenter, open-label trial (NCT01829711) of moxetumomab pasudotox in patients with relapsed/refractory (R/R) hairy cell leukemia (HCL).

Methods: Eligible patients had received ≥ 2 prior systemic therapies, including ≥ 2 purine nucleoside analogs (PNAs), or ≥ 1 PNA followed by rituximab or a BRAF inhibitor. Patients received 40 µg/kg moxetumomab pasudotox intravenously on Days 1, 3, and 5 of each 28-day cycle for up to six cycles. Disease response and minimal residual disease (MRD) status were determined by blinded independent central review. The primary endpoint was durable complete response (CR), defined as achieving CR with hematologic remission (HR, blood counts for CR) lasting > 180 days.

Results: Eighty adult patients were treated with moxetumomab pasudotox and 63% completed six cycles. Patients had received a median of three lines of prior systemic therapy; 49% were PNA-refractory, and 38% were unfit for PNA retreatment. At a median follow-up of 24.6 months, the durable CR rate (CR with HR > 180 days) was 36% (29 patients; 95% confidence interval: 26–48%); CR with HR ≥ 360 days was 33%, and overall CR was 41%. Twenty-seven complete responders (82%) were MRD-negative (34% of all patients). CR lasting ≥ 60 months was 61%, and the median progression-free survival without the loss of HR was 71.7 months. Hemolytic uremic and capillary leak syndromes were each reported in ≤ 10% of patients, and ≤ 5% had grade 3–4 events; these events were generally reversible. No treatment-related deaths were reported.

Conclusions: Moxetumomab pasudotox resulted in a high rate of durable responses and MRD negativity in heavily pre-treated patients with HCL, with a manageable safety profile. Thus, it represents a new and viable treatment option for patients with R/R HCL, who currently lack adequate therapy.

Trial registration: ClinicalTrials.gov identifier: NCT01829711; first submitted: April 9, 2013. https://clinicaltrials.gov/ct2/show/NCT01829711


Rheumatology, 2021, Vol 60 (Supplement_1)
Author(s):  
Rosie Barnett
Anita McGrogan
Matthew Young
Charlotte Cavill
Mandy Freeth
...  

Abstract

Background/Aims: Axial spondyloarthritis (axSpA) is a chronic rheumatic condition characterised by inflammatory back pain, often associated with impaired function and mobility, sleep disturbance, fatigue, and reduced quality of life. Despite the vast advances in pharmacological treatments for axSpA over the last few decades, physical activity and rehabilitation remain vital for effective disease management. At the Royal National Hospital for Rheumatic Diseases in Bath (RNHRD), the 2-week inpatient axSpA rehabilitation programme has been integral to axSpA care since the 1970s. Prior research has demonstrated significant short-term improvements in spinal mobility (BASMI), function (BASFI) and disease activity (BASDAI) following course attendance. However, the long-term outcomes are yet to be evaluated in this unique cohort.

Methods: Since the early 1990s, clinical measures of spinal mobility, function and disease activity have been routinely collected at the RNHRD at all clinical appointments through administration of the BASMI, BASFI and BASDAI, respectively. Dates of attending the axSpA course and standard clinical and treatment follow-up data were also collected. Multiple linear regression models were used to investigate the impact of course attendance on final reported BASMI, BASDAI and BASFI scores (final score = most recent). Length of follow-up was defined as the time between the first and last recorded BASMI.

Results: Of the 203 patients within the Bath SPARC200 cohort, 77.8% (158/203) had attended at least one rehabilitation course during follow-up, and 70.0% (140/203) of patients were male. The mean duration of follow-up was 13.5 years (range 0-35 years), with 28.1% (57/203) of individuals having 20+ years of follow-up. Course attendance (yes versus no) significantly reduced the final BASMI score by 0.84 (p = 0.001, 95% CI -1.31 to -0.37) and the final BASDAI score by 0.74 (p = 0.018, 95% CI -1.34 to -0.13). Although course attendance reduced the final BASFI by 0.45 (95% CI -1.17 to 0.28), this relationship did not reach significance (p = 0.225). Whilst a minimally clinically important difference (MCID) is, to our knowledge, yet to be defined for BASMI, MCIDs were achieved long-term for both BASDAI and BASFI, defined by van der Heijde and colleagues in 2016 as 0.7 and 0.4 for BASDAI and BASFI, respectively.

Conclusion: These results provide novel evidence to support the integral role of education, physical activity and rehabilitation in the management of axSpA. Future work should investigate additional outcomes of critical importance to patients and clinicians, such as fatigue, quality of life and work productivity. Furthermore, a greater understanding of the factors that confound these outcomes may provide insights into which patients may benefit most from attending a 2-week rehabilitation course, in addition to facilitating identification of those patients who may require additional clinical support.

Disclosure: R. Barnett: None. A. McGrogan: None. M. Young: None. C. Cavill: None. M. Freeth: None. R. Sengupta: Honoraria; Biogen, Celgene, AbbVie, Novartis, MSD. Grants/research support; Novartis, UCB.
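For illustration, below is a minimal sketch of the kind of multiple linear regression described in the Methods, using the Python statsmodels package. The data and column names are hypothetical, not the Bath SPARC200 cohort.

```python
# Sketch: modelling the final BASMI score as a function of course attendance,
# with follow-up length as an additional covariate. Toy data for illustration.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "final_basmi":     [4.2, 2.1, 5.0, 3.3, 6.1, 2.8, 4.7, 3.0],
    "attended_course": [0,   1,   0,   1,   0,   1,   0,   1],   # 1 = attended >=1 course
    "followup_years":  [12,  20,  8,   25,  15,  30,  5,   18],
})

model = smf.ols("final_basmi ~ attended_course + followup_years", data=df).fit()
print(model.summary())  # the 'attended_course' coefficient is the adjusted effect estimate
```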


2021
Author(s):  
Shibley Rahman
Kit Byatt

Abstract: Delirium is a common presentation in older inpatients with coronavirus disease 2019 (COVID-19), and a risk factor for cognitive decline at discharge. Glaring gaps in service provision for delirium care after a hospital admission, regardless of aetiology, pre-existed the pandemic, but the pandemic arguably now offers an opportunity to address them. Whilst a delirium episode is not in itself a long-term condition, the context in which it occurs may well be, and therefore patients might benefit from personalised care and support planning. There is no reason to believe that delirium following COVID-19 is fundamentally different from any other delirium. We propose that the needs of older patients who have experienced delirium, including from COVID-19, could be addressed through a new model of post-acute delirium care that combines early supported discharge, including discharge-to-assess, with community-based follow-up to assess for persistent delirium and early new long-term cognitive impairment. Such a drive could be structurally integrated with existing memory clinic services. To succeed, such an ambition has to be flexible, adaptable and person-centred. To understand the impact on resource and service utilisation, quality improvement techniques should be implemented, and appropriate metrics reflecting both process and outcome will be essential to underpin robust and sustainable business cases that support the implementation of delirium care as a long-term solution.

