CTIM-10. REPRODUCIBILITY OF CLINICAL TRIALS USING CMV-TARGETED DENDRITIC CELL VACCINES IN PATIENTS WITH GLIOBLASTOMA

2021 · Vol 23 (Supplement_6) · pp. vi51-vi51
Author(s): Kristen Batich, Duane Mitchell, Patrick Healy, James Herndon, Gloria Broadwater, ...

Abstract INTRODUCTION Vaccination with dendritic cells (DCs) has fared poorly in primary and recurrent glioblastoma (GBM). Moreover, GBM vaccine trials are often underpowered due to limited sample size. METHODS To address these limitations, we conducted three sequential clinical trials of cytomegalovirus (CMV)-specific DC vaccines in patients with primary GBM. Autologous DCs were generated and electroporated with mRNA encoding the CMV protein pp65. Serial vaccination was given throughout adjuvant temozolomide cycles, and indium-111 radiolabeling was used to assess the migration efficiency of the DC vaccines. Patients were followed for overall survival (OS), summarized as median OS (mOS) and OS rates. RESULTS Our initial study was the phase II ATTAC trial (NCT00639639; total n=12), in which 6 patients were randomized to vaccine-site preconditioning with tetanus-diphtheria (Td) toxoid. This led to an expanded-cohort trial (ATTAC-GM; NCT00639639) of 11 patients receiving CMV DC vaccines containing granulocyte-macrophage colony-stimulating factor (GM-CSF). Follow-up data from ATTAC and ATTAC-GM revealed 5-year OS rates of 33.3% (mOS 38.3 months; 95% CI 17.5-undefined) and 36.4% (mOS 37.7 months; 95% CI 18.2-109.1), respectively. ATTAC additionally revealed a significant increase in DC migration to draining lymph nodes following Td preconditioning (P=0.049), and increased DC migration was associated with longer OS (Cox proportional hazards model, HR=0.820, P=0.023). Td-mediated enhancement of migration was recapitulated in our larger confirmatory trial ELEVATE (NCT02366728), in which 43 patients were randomized to preconditioning (Wilcoxon rank sum; Td n=24, unpulsed DC n=19; 24 h, P=0.031 and 48 h, P=0.0195). In ELEVATE, at a median follow-up of 42.2 months, OS was significantly longer in patients randomized to Td (P=0.026). The 3-year OS for Td-treated patients in ELEVATE was 34% (95% CI 19-63%) compared with 6% for patients given unpulsed DCs (95% CI 1-42%). CONCLUSION We report reproducibility of our findings across three sequential clinical trials using CMV pp65 DCs. Despite their small sizes, these successive trials demonstrate consistent survival outcomes, supporting the efficacy of CMV DC vaccine therapy in GBM.
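
As a rough illustration of the survival analysis described above (a Cox proportional hazards model relating DC migration to overall survival), the sketch below uses the Python lifelines library on a few invented rows; the column names and values are hypothetical placeholders, not trial data.

```python
# Hedged sketch: Cox PH model with a single continuous covariate (DC migration),
# in the spirit of the migration-vs-OS analysis above. All data below are invented.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "os_months":    [38.3, 12.5, 60.0, 24.1, 41.7, 18.9, 30.2, 55.0],  # follow-up time
    "death":        [1, 1, 0, 1, 0, 1, 1, 0],                          # 1 = died, 0 = censored
    "dc_migration": [4.2, 1.1, 5.8, 3.0, 2.4, 1.5, 4.8, 3.6],          # migrated DC signal (arbitrary units)
})

cph = CoxPHFitter()
cph.fit(df, duration_col="os_months", event_col="death")
cph.print_summary()  # hazard ratio (exp(coef)) and p-value for dc_migration
```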

2021 · Vol 21 (1)
Author(s): Maryam Farhadian, Sahar Dehdar Karsidani, Azadeh Mozayanimonfared, Hossein Mahjub

Abstract Background Due to the limited number of studies with long-term follow-up of patients undergoing Percutaneous Coronary Intervention (PCI), we investigated the occurrence of Major Adverse Cardiac and Cerebrovascular Events (MACCE) during 10 years of follow-up after coronary angioplasty using Random Survival Forest (RSF) and Cox proportional hazards models. Methods This retrospective cohort study included 220 patients (69 women and 151 men) who underwent coronary angioplasty from March 2009 to March 2012 at the Farshchian Medical Center in Hamadan, Iran. Survival time (in months), the response variable, was measured from the date of angioplasty to the main endpoint or the end of the follow-up period (September 2019). To identify factors influencing the occurrence of MACCE, the performance of the Cox and RSF models was compared in terms of the C-index, the Integrated Brier Score (IBS), and prediction error. Results Ninety-six patients (43.7%) experienced MACCE by the end of the follow-up period, and the median survival time was estimated to be 98 months. Survival decreased from 99% during the first year to 39% at 10 years of follow-up. The Cox model identified the following predictors: age (HR = 1.03, 95% CI 1.01–1.05), diabetes (HR = 2.17, 95% CI 1.29–3.66), smoking (HR = 2.41, 95% CI 1.46–3.98), and stent length (HR = 1.74, 95% CI 1.11–2.75). Predictive performance was slightly better with the RSF model (IBS 0.124 vs. 0.135, C-index 0.648 vs. 0.626, and out-of-bag error rate 0.352 vs. 0.374 for RSF vs. Cox). In addition to age, diabetes, smoking, and stent length, the RSF model also identified coronary artery disease (acute or chronic) and hyperlipidemia among the most important variables. Conclusion Machine-learning prediction models such as RSF showed better performance than the Cox proportional hazards model for the prediction of MACCE during long-term follow-up after PCI.
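
For readers who want to see what this kind of head-to-head comparison looks like in code, here is a hedged sketch using the scikit-survival package; the bundled WHAS500 teaching dataset stands in for the PCI cohort, and no train/test split or tuning is shown.

```python
# Rough sketch only: compare a Cox model and a Random Survival Forest with the
# concordance index (C-index) and integrated Brier score (IBS), as in the study above.
import numpy as np
from sksurv.datasets import load_whas500
from sksurv.linear_model import CoxPHSurvivalAnalysis
from sksurv.ensemble import RandomSurvivalForest
from sksurv.metrics import concordance_index_censored, integrated_brier_score

X, y = load_whas500()
X = X.astype(float)                       # covariates; y holds (event, time) pairs

cox = CoxPHSurvivalAnalysis().fit(X, y)
rsf = RandomSurvivalForest(n_estimators=200, random_state=0).fit(X, y)

# Evaluation grid taken from percentiles of the observed event times
times = np.percentile(y["lenfol"][y["fstat"]], np.linspace(10, 80, 10))

for name, model in [("Cox", cox), ("RSF", rsf)]:
    cindex = concordance_index_censored(y["fstat"], y["lenfol"], model.predict(X))[0]
    surv = np.asarray([[fn(t) for t in times]
                       for fn in model.predict_survival_function(X)])
    ibs = integrated_brier_score(y, y, surv, times)
    print(f"{name}: C-index = {cindex:.3f}, IBS = {ibs:.3f}")
```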


Author(s): Yuko Yamaguchi, Marta Zampino, Toshiko Tanaka, Stefania Bandinelli, Yusuke Osawa, ...

Abstract Background Anemia is common in older adults and associated with greater morbidity and mortality. The causes of anemia in older adults have not been completely characterized. Although elevated circulating growth and differentiation factor 15 (GDF-15) has been associated with anemia in older adults, it is not known whether elevated GDF-15 predicts the development of anemia. Methods We examined the relationship between baseline plasma GDF-15 concentrations and incident anemia over 15 years of follow-up in 708 non-anemic adults aged 60 years and older participating in the Invecchiare in Chianti (InCHIANTI) Study. Results During follow-up, 179 (25.3%) participants developed anemia. The proportion of participants who developed anemia, from the lowest to the highest quartile of plasma GDF-15, was 12.9%, 20.1%, 21.2%, and 45.8%, respectively. Adults in the highest quartile of plasma GDF-15 had an increased risk of developing anemia (hazard ratio 1.15, 95% confidence interval 1.09–1.21, P<.0001) compared with those in the lower three quartiles, in a multivariable Cox proportional hazards model adjusting for age, sex, serum iron, soluble transferrin receptor, ferritin, vitamin B12, congestive heart failure, diabetes mellitus, and cancer. Conclusions Circulating GDF-15 is an independent predictor of the development of anemia in older adults.
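
A minimal sketch of the "highest quartile versus lower three quartiles" Cox analysis described above, using pandas and lifelines on synthetic data; every column name and value below is a hypothetical placeholder, not the InCHIANTI data.

```python
# Hedged illustration: dichotomize a biomarker at its top quartile and estimate an
# adjusted hazard ratio with a Cox model. Data are randomly generated.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 708
df = pd.DataFrame({
    "gdf15":      rng.lognormal(mean=7.0, sigma=0.5, size=n),  # synthetic biomarker values
    "age":        rng.integers(60, 95, size=n),
    "sex":        rng.integers(0, 2, size=n),
    "time_years": rng.uniform(1, 15, size=n),                  # follow-up time
    "anemia":     rng.integers(0, 2, size=n),                  # 1 = developed anemia, 0 = censored
})

# Flag the top quartile of the biomarker, mirroring the "Q4 vs Q1-Q3" comparison above
df["gdf15_q4"] = (df["gdf15"] >= df["gdf15"].quantile(0.75)).astype(int)

cph = CoxPHFitter()
cph.fit(df[["time_years", "anemia", "gdf15_q4", "age", "sex"]],
        duration_col="time_years", event_col="anemia")
print(cph.summary.loc["gdf15_q4", ["exp(coef)", "p"]])  # adjusted HR for the top quartile
```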


2020 · Vol 41 (Supplement_2)
Author(s): A Fujino, H Ogawa, S Ikeda, K Doi, Y Hamatani, ...

Abstract Background Atrial fibrillation (AF) commonly progresses from paroxysmal to sustained type in the natural course of the disease, and we previously demonstrated that progression of AF is associated with an increased risk of clinical adverse events. Less frequently, some patients regress from sustained to paroxysmal AF, but the clinical impact of such regression remains unknown. Purpose We sought to investigate whether regression from sustained to paroxysmal AF is associated with better clinical outcomes. Methods Using the dataset of the Fushimi AF Registry, patients diagnosed with sustained (persistent or permanent) AF at baseline were studied. Conversion of sustained to paroxysmal AF during follow-up was defined as regression of AF. Major adverse cardiac events (MACE) were defined as the composite of cardiac death, stroke, and hospitalization for heart failure (HF). Event rates were compared between patients with and without regression of AF. In patients with sustained AF at baseline, predictors of MACE were identified using a Cox proportional hazards model. Results Among 2,253 patients diagnosed with sustained AF at baseline, regression of AF was observed in 9.0% (202/2,253; 2.0 per 100 patient-years) during a median follow-up of 4.0 years. Of these, 24.3% (49/202; 4.6 per 100 patient-years) eventually reverted to sustained AF during follow-up. The proportion of asymptomatic patients was lower in patients with regression of AF than in those without (with vs without regression: 49.0% vs 69.5%, p<0.01). The percentage of beta-blocker use at baseline was similar between the two groups (37.2% vs 33.8%, p=0.34). The proportion of patients who underwent catheter ablation or electrical cardioversion during follow-up was higher in patients with regression of AF (catheter ablation: 15.8% vs 5.5%, p<0.01; cardioversion: 4.0% vs 1.4%, p<0.01). The rate of MACE was significantly lower in patients with regression of AF than in patients who maintained sustained AF (3.7 vs 6.2 per 100 patient-years, log-rank p<0.01). The figure shows the Kaplan-Meier curves for MACE, cardiac death, hospitalization for heart failure, and stroke. In patients with sustained AF at baseline, a multivariable Cox proportional hazards model demonstrated that regression of AF was independently associated with lower risks of MACE (adjusted hazard ratio [HR]: 0.50, 95% confidence interval [CI]: 0.28 to 0.88, p=0.02), stroke (HR: 0.51, 95% CI: 0.30 to 0.88, p=0.02), and hospitalization for HF (HR: 0.50, 95% CI: 0.29 to 0.85, p=0.01). Conclusion Regression from sustained to paroxysmal AF was associated with a lower incidence of adverse cardiac events. Funding Acknowledgement Type of funding source: None
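
The group comparison above (event-free survival with versus without regression of AF, shown as Kaplan-Meier curves with a log-rank test) can be sketched with lifelines as follows; the bundled Rossi recidivism dataset and its 'fin' column merely stand in for the registry data and the regression-of-AF flag.

```python
# Hedged sketch: Kaplan-Meier curves by group plus a log-rank test, using a
# stand-in dataset shipped with lifelines rather than the Fushimi AF Registry.
import matplotlib.pyplot as plt
from lifelines import KaplanMeierFitter
from lifelines.datasets import load_rossi
from lifelines.statistics import logrank_test

df = load_rossi()                  # stand-in data: 'week' = time, 'arrest' = event
flag = df["fin"] == 1              # stands in for "regression of AF" vs not

fig, ax = plt.subplots()
for mask, label in [(flag, "with regression (stand-in)"), (~flag, "without regression (stand-in)")]:
    kmf = KaplanMeierFitter()
    kmf.fit(df.loc[mask, "week"], df.loc[mask, "arrest"], label=label)
    kmf.plot_survival_function(ax=ax)          # overlay both Kaplan-Meier curves

lr = logrank_test(df.loc[flag, "week"], df.loc[~flag, "week"],
                  event_observed_A=df.loc[flag, "arrest"],
                  event_observed_B=df.loc[~flag, "arrest"])
print(f"log-rank p = {lr.p_value:.3f}")
```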


Circulation · 2020 · Vol 142 (Suppl_3)
Author(s): Jian-jun Li, Yexuan Cao, Hui-Wen Zhang, Jing-Lu Jin, Yan Zhang, ...

Introduction: Recent guidelines have underlined the atherogenicity of remnant cholesterol (RC), which has been linked to coronary artery disease (CAD), especially in patients with diabetes mellitus (DM). Hypothesis: This study aimed to examine the prognostic value of plasma RC, clinically presented as triglyceride-rich lipoprotein cholesterol (TRL-C) or remnant-like lipoprotein particle cholesterol (RLP-C), in CAD patients with different glucose metabolism status. Methods: Fasting plasma TRL-C and RLP-C levels were directly calculated or measured in 4331 patients with CAD. Patients were followed for incident MACEs for up to 8.6 years and categorized according to both glucose metabolism status [DM, pre-DM, normal glycaemia regulation (NGR)] and RC levels. A Cox proportional hazards model was used to calculate hazard ratios (HRs) with 95% confidence intervals. Results: During a mean follow-up of 5.1 years, 541 (12.5%) MACEs occurred. The risk of MACEs was significantly higher in patients with elevated RC levels after adjustment for potential confounders. No significant difference in MACEs was observed between the pre-DM and NGR groups (p>0.05). When stratified by glucose metabolism status and RC levels, the highest levels of RLP-C, calculated TRL-C, and measured TRL-C were significant and independent predictors of developing MACEs in pre-DM (HR: 2.10, 1.98, 1.92, respectively; all p<0.05) and DM (HR: 2.25, 2.00, 2.16, respectively; all p<0.05). Conclusions: In this large cohort study with long-term follow-up, our data demonstrated for the first time that higher RC levels were significantly associated with worse prognosis in DM and pre-DM patients with CAD, suggesting that RC might be a treatment target for patients with impaired glucose metabolism.


Stroke · 2013 · Vol 44 (suppl_1)
Author(s): James Torner, Jie Zhang, David Piepgras, John Huston, Irene Meissner, ...

INTRODUCTION: The decision whether to perform an interventional procedure to prevent hemorrhage of an unruptured intracranial aneurysm (UIA) requires careful consideration of procedural risk and the natural history of the UIA. No randomized trial data are available. The International Study of Unruptured Intracranial Aneurysms (ISUIA) included a prospective cohort examining hemorrhage risk and treatment risk. Hypothesis: The purpose of this analysis was to compare the factors related to treatment selection and to estimate the number of hemorrhages prevented. Methods: Patients were allocated to the initial treatment and untreated cohorts based upon observation or treatment practices in 61 centers from 1991-1998. There were 1691 patients in the observational cohort, 471 in the endovascular cohort, and 1917 in the surgical cohort. The cohorts were followed for a median of 9.2 years. Outcomes were determined prospectively and with central review. The data were pooled and analyzed to characterize treatment decisions. A Cox proportional hazards model predicting hemorrhage was developed in the observation cohort and applied to the surgery and endovascular cohorts across the follow-up period. Results: Significant baseline differences between treated and observed patients included aneurysm size, symptoms, age, prior SAH group, geographical region, treatment percentage, aneurysm daughter sacs or multiple lobes, and history of hypertension, smoking, and myocardial infarction. Aneurysm site and family history were not significant. Site, size, and aspirin use were significant predictors of hemorrhage. Over long-term follow-up, the predicted hemorrhage rates were 6.7% at 5 years and 8.0% at 10 years in the surgery group and 8.1% and 9.6%, respectively, in the endovascular group. For comparison, the rates in the observed cohort were 4.1% and 4.8%, respectively. Conclusions: Decisions for treatment are influenced by patient characteristics such as age and medical history, aneurysm characteristics such as size and morphology, and center and regional practices. Patients in the treated cohorts were at moderately increased risk for hemorrhage compared with those in the observed cohort.
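
The fit-here/apply-there step described above, a Cox model for hemorrhage developed in the observation cohort and then applied to the treated cohorts, might look like the following lifelines sketch; the Rossi dataset and an arbitrary split stand in for the ISUIA cohorts, and the horizons are in weeks rather than years.

```python
# Hedged sketch: fit a Cox model in one cohort, then use it to predict event risk
# in another cohort at fixed horizons. Stand-in data, not the ISUIA cohorts.
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi

df = load_rossi()                                 # stand-in data: 'week' = time, 'arrest' = event
observed, treated = df.iloc[:300], df.iloc[300:]  # arbitrary split standing in for the cohorts

cph = CoxPHFitter()
cph.fit(observed, duration_col="week", event_col="arrest")  # model "developed" in one cohort

# Apply the fitted model to the other cohort: predicted event probability by two horizons
covariates = treated.drop(columns=["week", "arrest"])
surv = cph.predict_survival_function(covariates, times=[26, 52])
expected_events = (1 - surv).sum(axis=1)          # expected number of events by each horizon
print(expected_events)
```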


2019 · Vol 104 (1) · pp. 81-86
Author(s): Sung Uk Baek, Ahnul Ha, Dai Woo Kim, Jin Wook Jeoung, Ki Ho Park, ...

Background/Aims: To investigate the risk factors for disease progression of normal-tension glaucoma (NTG) with pretreatment intraocular pressure (IOP) in the low teens. Methods: One hundred and two (102) eyes of 102 patients with NTG with pretreatment IOP ≤12 mm Hg who had been followed up for more than 60 months were retrospectively enrolled. Patients were divided into progressor and non-progressor groups according to visual field (VF) progression as correlated with change of optic disc or retinal nerve fibre layer defect. Baseline demographic and clinical characteristics, including diurnal IOP and 24-hour blood pressure (BP), were compared between the two groups. The Cox proportional hazards model was used to identify the risk factors for disease progression. Results: Thirty-six patients (35.3%) were classified as progressors and 66 (64.7%) as non-progressors. Between the two groups, no significant differences were found in the follow-up periods (8.7±3.4 vs 7.7±3.2 years; p=0.138), baseline VF mean deviation (−4.50±5.65 vs −3.56±4.30 dB; p=0.348) or pretreatment IOP (11.34±1.21 vs 11.17±1.06 mm Hg; p=0.121). The multivariate Cox proportional hazards model indicated that greater diurnal IOP at baseline (HR=1.609; p=0.004), greater fluctuation of diastolic BP (DBP; HR=1.058; p=0.002) and presence of optic disc haemorrhage during follow-up (DH; HR=3.664; p=0.001) were risk factors for glaucoma progression. Conclusion: In the low-teens NTG eyes, 35.3% showed glaucoma progression during an average 8.7 years of follow-up. Fluctuation of DBP and diurnal IOP, as well as DH, were significantly associated with a greater probability of disease progression.


2017 · Vol 5 (5_suppl5) · pp. 2325967117S0019
Author(s): Ben Parkinson, Nicholas Smith, Peter Thompson, Tim Spalding

Background: Meniscal allograft transplantation (MAT) has been shown to provide significant improvement in patient-reported outcomes for individuals with post-meniscectomy syndrome. Patients undergoing MAT typically have multiple other pathologies that require treatment at the time of surgery, and it is difficult to ascertain which factors influence the outcome. Hypothesis/Purpose: The aim of this study was to determine the predictors of meniscal allograft transplantation failure in a large series, in order to refine the indications for surgery and better inform future patients. Study Design: Prospective case series. Methods: All patients undergoing MAT at a single institution between May 2005 and May 2014, with a minimum of one year of follow-up, were prospectively evaluated and included in this study. Failure was defined as removal of the allograft, revision transplantation, or conversion to a joint replacement. Patients were grouped according to articular cartilage status at the time of surgery: Group 1, intact or partial-thickness chondral loss; Group 2, full-thickness chondral loss on one condyle; Group 3, full-thickness chondral loss on both condyles. The Cox proportional hazards model was used to determine significant predictors of failure (cartilage grade at the time of MAT, IKDC score, lateral or medial allograft, gender, additional procedures, and tissue bank source), independently of other factors. Kaplan-Meier survival curves were produced for overall survival and for significant predictors of failure in the Cox proportional hazards model. Results: There were 125 consecutive MATs performed, with one patient lost to follow-up. The median follow-up was 3 years (range 1–10 years). The 5-year graft survival for the entire cohort was 82% (97% in group 1, 82% in group 2, 62% in group 3). The probability of failure in group 1 was 85% lower (95% confidence interval 13–97%) than in group 3 at any time. The probability of failure with lateral allografts was 76% lower (95% confidence interval 16–89%) than with medial allografts at any time. Conclusion: This study showed that severe cartilage damage at the time of MAT and medial allografts were significantly predictive of failure. Surgeons and patients can use this information when considering the risks and benefits of surgery.


2017 · Vol 35 (15_suppl) · pp. 8014-8014
Author(s): Arjun Lakshman, Muhamad Alhaj Moustafa, S. Vincent Rajkumar, Angela Dispenzieri, Morie A. Gertz, ...

8014 Background: t(11;14) is a standard-risk cytogenetic marker in MM. Methods: We reviewed 366 patients with MM who had t(11;14) by FISH and 732 age- and period-matched controls without t(11;14), seen at our institution from 2004 to 2014; outcomes were analyzed using time to first progression or death (PFS1) and overall survival (OS). Results: In the t(11;14) group at diagnosis, the median age was 63.7 yr (range, 22.1-95.4) and 64.5% of patients were male. Eighty-nine (24.3%) patients were above 70 yr of age at diagnosis. 33.8%, 40.3%, and 25.9% of patients had ISS stage I, II, and III disease, respectively. 13% of patients had elevated LDH. Monosomy 17 or del 17p was identified in 10.6% of patients. The median follow-up period was 56.9 months (m) (95% CI: 54.6-62.2), and 209 (57.1%) patients were alive at last follow-up. Among patients receiving proteasome inhibitor (PI)-based, immunomodulator (IMiD)-based, PI+IMiD-based, or other agent-based induction therapy, 71.2%, 70.3%, 90.4%, and 37.5% of patients, respectively, attained ≥PR as best response to induction (p < 0.01). During their course, 223 (60.9%) patients underwent stem cell transplant. Median PFS1 and OS were 23.1 (CI: 20.8-27.9) and 78.6 (CI: 66.7-105.9) m, respectively. Among the controls, high-risk cytogenetics (HRC) was present in 142 (19.4%), and the median OS was 83.8 m (CI: 70.8-97.0), comparable to the t(11;14) group (p = 0.8). For all 1098 patients, using a Cox proportional hazards model with age > 70 years, induction therapy (novel agent-based vs others), cytogenetics [HRC vs t(11;14) without HRC vs no HRC or t(11;14)], and ISS stage III vs II/I as predictors, age > 70 years [HR 2.2 (CI: 1.8-2.8), p < 0.01], ISS III vs II/I [HR 1.4 (CI: 1.1-1.8), p < 0.01], and HRC [HR 2.1 (CI: 1.6-2.8) vs no HRC or t(11;14), p < 0.01, and HR 1.9 (CI: 1.4-2.6) vs t(11;14) without HRC, p < 0.01] were associated with reduced OS. The risk of reduced OS did not differ between t(11;14) without HRC and those without t(11;14) or HRC [HR 1.1 (CI: 0.9-1.4), p = 0.4]. Conclusions: Our study characterizes the outcomes of a large cohort of MM patients with t(11;14) at diagnosis. Advanced age, HRC, and advanced stage at diagnosis were associated with worse OS in our cohort. t(11;14) MM without HRC does not differ in outcome from non-t(11;14) MM without HRC.
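
As an illustration of how a three-level cytogenetic grouping such as [HRC vs t(11;14) without HRC vs neither] can enter a Cox model, the hedged sketch below dummy-codes the grouping against a reference level using pandas and lifelines; all data and column names are synthetic placeholders, not the cohort described above.

```python
# Hedged sketch: a multi-level categorical predictor in a Cox model via dummy coding.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 1098
df = pd.DataFrame({
    "os_months":    rng.exponential(70, n).clip(max=140),   # synthetic follow-up time
    "death":        rng.integers(0, 2, n),                   # 1 = died, 0 = censored
    "age_gt_70":    rng.integers(0, 2, n),
    "iss_iii":      rng.integers(0, 2, n),
    "cytogenetics": rng.choice(["neither", "t11_14_only", "HRC"], n),
})

# Dummy-code the three-level grouping; dropping "neither" makes it the reference level
dummies = pd.get_dummies(df["cytogenetics"], prefix="cyto").drop(columns="cyto_neither").astype(float)
model_df = pd.concat([df.drop(columns="cytogenetics"), dummies], axis=1)

cph = CoxPHFitter()
cph.fit(model_df, duration_col="os_months", event_col="death")
cph.print_summary()  # HRs for cyto_HRC and cyto_t11_14_only relative to the reference group
```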


2019 · Vol 37 (15_suppl) · pp. e19058-e19058
Author(s): Alfadel Alshaibani, Christina Lee, Sarah Camp Rutherford, Kah Poh Loh, Andrea M Baran, ...

e19058 Background: Diffuse large B-cell lymphoma (DLBCL) is the most common subtype of non-Hodgkin lymphoma. In this study, we explored reasons for non-enrollment in clinical trials for DLBCL and the implications for trial design and interpretation. Methods: This is a retrospective analysis of patients (pts) with a pathological diagnosis of DLBCL or high-grade B-cell lymphoma (HGBL) at the University of Rochester (4/14-6/16) and New York-Presbyterian Hospital/Weill Cornell Medicine (NYP/WCM) (4/14-4/17). Ten clinical trials were opened during this time. Participants were divided into 3 groups: those treated in a trial, those not enrolled in a trial because of the need for urgent treatment, and those not enrolled in a trial for any other reason. We used a center-stratified Cox proportional hazards model to estimate the association of trial enrollment with progression-free survival (PFS; time from start of treatment until progression/death or the last date the pt was known to be progression free) and overall survival (OS). Results: We identified 263 pts; 17% (n = 45) enrolled in a trial. Reasons for non-enrollment included not meeting eligibility criteria (n = 98), physician choice (n = 50), and pt choice (n = 38). For 32 pts, reasons were unclear. Of the 50 pts who were not enrolled because of physician choice, the primary reason for non-enrollment was the need for urgent treatment (n = 46). Pts who needed urgent treatment had higher-risk clinical features compared with pts in a trial (Table). Compared with those treated in a trial and those not enrolled in a trial for any other reason, those not enrolled due to the need for urgent treatment had inferior PFS (HR 2.61, 95% CI 1.23–5.16) and OS (HR 2.27, 95% CI 1.21–4.06). Conclusions: At 2 academic institutions, 52% of patients with DLBCL or HGBL required urgent chemotherapy and failed to enroll in trials. Exclusion of such patients limits the applicability and generalizability of clinical trials in DLBCL. This barrier must be overcome so clinical trials may better reflect true DLBCL demographics. [Table: see text]
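
A center-stratified Cox model of the kind described above can be expressed in lifelines via the strata argument, which gives each stratum its own baseline hazard while sharing the covariate effects; in this hedged sketch the bundled Rossi dataset and its 'wexp' column stand in for the two-center DLBCL data.

```python
# Hedged sketch: stratified Cox regression, with 'wexp' standing in for the enrolling center.
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi

df = load_rossi()   # stand-in data: 'week' = time to event, 'arrest' = event indicator

cph = CoxPHFitter()
# strata=["wexp"] plays the role of the center: each stratum gets its own baseline
# hazard, while the covariate effects (the HRs of interest) are shared across strata
cph.fit(df, duration_col="week", event_col="arrest", strata=["wexp"])
cph.print_summary()
```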


2016
Author(s): Michael S. Lauer

Abstract To inform the retirement of NIH-owned chimpanzees, we analyzed the outcomes of 764 NIH-owned chimpanzees that were located, at various points in time, in at least one of 4 specific locations. All chimpanzees considered were alive and at least 10 years of age on January 1, 2005; transfers to a federal sanctuary began a few months later. During a median follow-up of just over 7 years, there were 314 deaths. In a Cox proportional hazards model that accounted for age, sex, and location (treated as a time-dependent covariate), age and sex were strong predictors of mortality, but location was only marginally predictive. Among 273 chimpanzees who were transferred to the federal sanctuary, we found no materially increased risk of mortality in the first 30 days after arrival. During a median follow-up at the sanctuary of 3.5 years, age was strongly predictive of mortality, but other variables – sex, season of arrival, and ambient temperature on the day of arrival – were not. We confirmed our regression findings using random survival forests. In summary, in a large cohort of captive chimpanzees, we find no evidence of materially important associations of location of residence or recent transfer with premature mortality.
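
A hedged illustration of treating location as a time-dependent covariate, as described above: lifelines' CoxTimeVaryingFitter expects data in long (start/stop) format, one row per animal per residence interval. The toy table below is invented and includes only age and a sanctuary flag; the real analysis would also include sex and the actual residence histories.

```python
# Hedged sketch: Cox regression with a time-dependent covariate in long format.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

long_df = pd.DataFrame({
    "id":        [1, 1, 2, 3, 3, 4, 4, 5, 6],
    "start":     [0, 3, 0, 0, 2, 0, 5, 0, 0],
    "stop":      [3, 7, 6, 2, 8, 5, 9, 4, 8],
    "age":       [20, 23, 35, 15, 17, 40, 45, 28, 33],   # age at interval start (years)
    "sanctuary": [0, 1, 0, 0, 1, 0, 1, 0, 0],            # 1 after transfer to the sanctuary
    "died":      [0, 1, 1, 0, 0, 0, 1, 1, 0],            # event flag at the end of each interval
})

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", event_col="died", start_col="start", stop_col="stop")
ctv.print_summary()   # hazard ratios for age and (time-dependent) sanctuary residence
```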

