CD14 and Tissue Factor Positive Extracellular Vesicles Predict Response to Dovitinib in Patients with GBM: A Pilot Study

Blood ◽  
2014 ◽  
Vol 124 (21) ◽  
pp. 4999-4999
Author(s):  
Nikolaos Papadantonakis ◽  
Manmeet S Ahluwalia ◽  
Micheal Khoury ◽  
Shruti Chaturvedi ◽  
Keith R. McCrae

Abstract BACKGROUND: Glioblastoma (GBM) is the most common primary malignant brain tumor and has a median survival of 15-18 months. Dovitinib, an oral multi-tyrosine kinase inhibitor of vascular endothelial growth factor (VEGF), fibroblast growth factor (FGF), and platelet derived growth factor (PDGF) receptors, is currently under study in a phase II trial for GBM at the Cleveland Clinic. Dovitinib is administered 5 days on and 2 days off every 4 weeks until progressive disease (PD) or intolerable toxicity is observed. Extracellular vesicles (EV) are submicron particles that express or contain cellular proteins and nucleic acids and are released from a variety of non-malignant cells (e.g. endothelial cells, platelets, leucocytes) as well as malignant cells. In some settings, EV may serve as biomarkers of inflammation, thrombosis and tumor spread/burden. OBJECTIVE: The aim of our study was to characterize levels of circulating EV and their relation to disease course in patients with GBM enrolled in the Dovitinib study (with or without prior treatment with anti-angiogenic agents). We also examined the association between EV levels and the development of venous thromboembolism (VTE). METHODS: Patients previously treated with anti-angiogenic therapy (Group 1, n=14) and patients without prior anti-angiogenic treatment (Group 2, n=14) were examined separately. EV were measured at study enrollment (pre-treatment), at the end of cycle 1 (day 28), and at PD. EV were isolated from citrated whole blood by differential centrifugation and incubated with fluorochrome-conjugated monoclonal antibodies to CD144-PE (endothelial cells), CD41-PECy4 (platelets), CD14-PE (monocytes) and CD142 (tissue factor, Alexa Fluor 647), then analyzed by flow cytometry. Depending on sample size and the skewed distribution of EV levels, the Student t-test or the Wilcoxon test was used to compare EV levels. P<0.05 was considered significant for all analyses. RESULTS: Three patients from Group 1 and 6 patients from Group 2 were not included in the analysis because of lack of an EV sample, withdrawal of consent, or complications leading to early drug discontinuation. Of the remaining 11 patients in Group 1, 3 had PD and 8 had stable disease (SD) at the end of cycle 1. Of the 8 patients in Group 2 available for analysis after cycle 1, 2 had PD and 6 had SD (one of these developed VTE but continued on the study). In the pre-treatment samples from Group 1, patients who developed PD had significantly higher levels of CD14+ EV (89977±12121 vs. 42237±27651, p=0.048) and CD142+ EV (68701±9010 vs. 9695±12462, p=0.048) than those with SD. However, there was no statistically significant difference in EV levels (all sub-populations) from pre-treatment to the end of cycle 1 in patients with either PD or SD. EV levels did not correlate with peripheral blood counts. Due to the small number of patients in Group 2 with progressive disease, we were unable to assess the correlation with EV. Six (2 in Group 1, 4 in Group 2) of the 27 patients for whom pre-treatment EV were available developed VTE during the study. EV levels did not differ significantly between patients who developed VTE and those who did not, either at pre-treatment or at the day 28 evaluation. However, most patients who developed VTE demonstrated profound increases in EV before or in association with their thrombotic event.
CONCLUSIONS: In patients with GBM receiving Dovitinib without prior exposure to anti-angiogenic therapy, elevated pre-treatment levels of CD14+ and CD142+ EV were associated with progressive disease, suggesting a potential role as predictors of poor response to Dovitinib. Due to the relatively small sample size, no significant differences were observed between patients who developed VTE and those who did not, either at pretreatment or at the Day 28 evaluation; however, these studies are ongoing. In the majority of patients with VTE, EV levels increased substantially before or in association with VTE development. Acknowledgment: This work was supported by a grant from the Scott Hamilton Cares Initiative. Disclosures: No relevant conflicts of interest to declare.
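The group comparison reported above (pre-treatment CD14+ and CD142+ EV in PD versus SD patients, with a t-test or Wilcoxon test chosen according to sample size and skew) can be sketched as follows. This is an illustrative outline only: the EV counts below are placeholders rather than the study's data, and the choice of the rank-sum (Mann-Whitney/Wilcoxon) variant for independent groups is an assumption.

```python
# Hedged sketch: comparing pre-treatment EV counts between patients with
# progressive disease (PD) and stable disease (SD). Values are placeholders,
# not the study's raw data.
from scipy import stats

cd14_pd = [78000, 92000, 100000]                                    # hypothetical CD14+ EV counts, PD patients
cd14_sd = [30000, 45000, 20000, 60000, 55000, 38000, 48000, 42000]  # hypothetical SD patients

# Small, skewed samples -> nonparametric rank-sum (Wilcoxon/Mann-Whitney) test
stat, p = stats.mannwhitneyu(cd14_pd, cd14_sd, alternative="two-sided")
print(f"Mann-Whitney U = {stat:.1f}, p = {p:.3f}")  # p < 0.05 considered significant
```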

2021 ◽  
Vol 10 (01) ◽  
pp. 026-036
Author(s):  
Sudip Kumar Sengupta ◽  
Andrews Navin Kumar ◽  
Vinay Maurya ◽  
Harish Bajaj ◽  
Krishan Kumar Yadav ◽  
...  

Abstract Introduction The absence of a sufficient number of prospective randomized controlled studies, together with the comparatively small sample sizes and short follow-up periods of most studies available so far, has left ambiguity and a lack of standardization across different aspects of cranioplasty. Materials and Methods This is an early report of a computed tomography image-based ambidirectional study of cranioplasties performed with autologous bone flaps preserved in a subcutaneous pocket. The retrospective arm compared bony union, and the factors influencing it, between cranioplasties and craniotomies. Patients with poor bony union and aseptic resorption were followed up in the prospective arm. Results The retrospective arm of the study, with follow-up of up to five years (mean 32.2 months), comprised 42 patients as cases (Group 1) and 29 as controls (Group 2). Twenty-seven individuals (64.3%) in Group 1 had good bony union, compared with 20 of the 29 patients (68.9%) in Group 2. Four patients (9.5%) in Group 1 showed evidence of flap resorption, a finding absent in Group 2. Age, sex, smoking habits, superficial skin infection, and method of fixation did not appear to influence bony union. Craniotomies performed with Gigli saws fared better than those done with a pneumatic saw, showing a smaller flap size–craniectomy size discrepancy, though the difference was not statistically significant. Fifteen patients had been included in the prospective arm at the time of submission of this article. Conclusion Ours is a study with a small sample size, unable to settle the question on its own, but it adds data that may help neurosurgeons choose the best option for their patients.


Author(s):  
Utkarsh Deshmukh ◽  
Rishi Mehta

Background: Among the refractive errors, myopia is the most common in school children. Due to myopia, school children are unable to see the blackboard clearly, which severely affects their performance. Moreover, they are unable to play outdoor sports, thereby hampering their all-round development. Methods: This is a cross-sectional observational and analytical study. All children aged 5-12 years attending the eye OPD were included. A detailed history was taken and a complete ophthalmic examination was done. Low myopia was defined as a refractive error of -0.25 D to -3 D, moderate myopia as a refractive error of -3.25 D to -6 D, and high myopia as a refractive error of -6 D or worse. The children were divided into 3 groups according to their age (in years): group-1 (5-7), group-2 (8-9) and group-3 (10-12). The data obtained were subjected to statistical analysis using IBM SPSS version 24. P values were calculated by the chi-square test. P<0.05 was considered statistically significant. Results: 153 children were examined, of whom 72 (47.1%) were males and 81 (52.9%) were females. Group-1, group-2 and group-3 had 38, 38 and 77 children, respectively. Out of 153 children, 26 (16.99%) were found to be myopic. Of the 26 myopic children, 11 (42.3%) were male and 15 (57.69%) were female (p>0.05). Low, moderate and high myopia were found in 19 (73.07%), 6 (23.07%) and 1 (3.84%) child, respectively. Conclusions: The prevalence of myopia in school children was 16.99%. There is a need for regular screening of school children to diagnose myopia. The limitations of this study are its hospital-based setting and small sample size, so we recommend a community-based study with a larger sample size.
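A minimal sketch of the grading and chi-square steps described above, assuming the abstract's diopter cut-offs and using the reported male/female counts (11 of 72 and 15 of 81 myopic) to build the contingency table; `grade_myopia` is a hypothetical helper, not part of the study's SPSS workflow.

```python
# Hedged sketch of myopia grading and the sex-vs-myopia chi-square test.
# Cut-offs follow the abstract (low: -0.25 to -3 D, moderate: -3.25 to -6 D,
# high: -6 D or worse); contingency counts are derived from the reported totals.
from scipy.stats import chi2_contingency

def grade_myopia(spherical_equivalent_d: float) -> str:
    """Classify myopia severity from the refractive error in diopters."""
    if spherical_equivalent_d <= -6.0:
        return "high"
    if spherical_equivalent_d <= -3.25:
        return "moderate"
    if spherical_equivalent_d <= -0.25:
        return "low"
    return "no myopia"

# 2x2 table: rows = sex (male, female), columns = (myopic, non-myopic)
table = [[11, 61],
         [15, 66]]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")  # p > 0.05 -> no significant sex difference
```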


1999 ◽  
Vol 2 (2) ◽  
pp. 187-197 ◽  
Author(s):  
Sandra Drummond ◽  
Terry Kirk

Abstract Objective To compare the effect of advice to reduce both dietary fat and sugar with advice to reduce fat alone on subsequent dietary intake in Scottish men. Design A parallel design intervention study was employed to measure compliance with the two types of dietary advice. Subjects were randomly assigned to Group 1 (advice to reduce fat and non-milk extrinsic (NME) sugar), Group 2 (advice to reduce fat only, ad libitum sugar) or a control Group 0 (no advice). Compliance was assessed by two 4-day food diaries over 6 months. Setting The study was conducted in the Strathclyde area of Scotland. Subjects Subjects were normal to moderately overweight Scottish men. The men recruited were non-dieting and volunteered for a ‘healthy eating’ study with the aim of improving the ‘healthiness’ of their diet. Results Groups 1 and 2 achieved the dietary target for fat, reducing their mean intake to below 35% of energy. Group 1 achieved a statistically significant reduction in percentage energy from NME sugar in the short term (6 weeks), decreasing their mean intake from 9.9% to 7.2% of energy. This initial decrease slipped back towards baseline levels at 6 months (8.1% energy from NME sugar) and was no longer significantly different from baseline. At 6 months Group 1 reported a significantly lower mean energy intake than at baseline, whereas Group 2 recovered from an initial decrease in energy intake and by 6 months their energy intakes were not significantly different from baseline. Group 2 appeared to compensate for the absolute reduction in dietary fat with a slight increase in total sugars and the maintenance of NME sugar intakes. Conclusions Subjects in Group 1 complied with advice to reduce both fat and sugar over 6 weeks but to a lesser extent over 6 months. The 1.8% reduction in percentage energy from NME sugars in Group 1 at 6 months may not have reached significance due to the small sample size. Alternatively, free-living populations may find it hard to maintain concurrent reductions in fat and sugar owing to the well-documented inverse relationship between intakes of these macronutrients when expressed as a proportion of energy.
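The dietary targets above are expressed as percentages of energy; a minimal sketch of that calculation is shown below, assuming standard Atwater energy factors (9 kcal/g for fat, 4 kcal/g for carbohydrate and sugars) and illustrative intake values rather than the study's diary data.

```python
# Hedged sketch of the percentage-of-energy calculation behind targets such as
# "<35% energy from fat". Atwater factors are assumed; intakes are illustrative.
FAT_KCAL_PER_G = 9.0
SUGAR_KCAL_PER_G = 4.0

def percent_energy(nutrient_g: float, kcal_per_g: float, total_kcal: float) -> float:
    """Share of total energy contributed by one nutrient, as a percentage."""
    return 100.0 * nutrient_g * kcal_per_g / total_kcal

total_kcal = 2400.0              # hypothetical daily intake from a 4-day food diary
fat_g, nme_sugar_g = 90.0, 45.0  # hypothetical gram intakes

print(f"Fat: {percent_energy(fat_g, FAT_KCAL_PER_G, total_kcal):.1f}% energy")        # ~33.8%
print(f"NME sugar: {percent_energy(nme_sugar_g, SUGAR_KCAL_PER_G, total_kcal):.1f}% energy")  # ~7.5%
```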


2020 ◽  
Vol 8 (1) ◽  
Author(s):  
Pavel Klein ◽  
Ivana Tyrlikova ◽  
Giulio Zuccoli ◽  
Adam Tyrlik ◽  
Joseph C. Maroon

Abstract Introduction Glioblastoma (GBM) has poor survival with standard treatment. Experimental data suggest potential for metabolic treatment with a low-carbohydrate ketogenic diet (KD). Few human studies of KD in GBM have been done, limited by the difficulty and variability of the diet, compliance, and feasibility issues. We have developed a novel KD approach: a total meal replacement (TMR) program using standardized recipes with ready-made meals. This pilot study evaluated the feasibility, safety, tolerability, and efficacy of GBM treatment using the TMR program with a "classic" 4:1 KD. Method GBM patients were treated in an open-label study for 6 months with a 4:1 [fat]:[protein + carbohydrate] ratio by weight, 10 g carbohydrate/day, 1600 kcal/day TMR. Patients were either newly diagnosed (group 1) and treated adjunctively to radiation and temozolomide, or had recurrent GBM (group 2). Patients checked blood glucose and blood and urine ketone levels twice daily and had regular MRIs. Primary outcome measures included retention, treatment-emergent adverse events (TEAEs), and TEAE-related discontinuation. Secondary outcome measures were survival time from treatment initiation and time to MRI progression. Results Recruitment was slow, resulting in early termination of the study. Eight patients participated, 4 in group 1 and 4 in group 2. Five (62.5%) subjects completed the 6 months of treatment, 4/4 subjects in group 1 and 1/4 in group 2. Three subjects stopped KD early: 2 (25%) because of GBM progression and one (12.5%) because of diet restrictiveness. Four subjects, all in group 1, continued KD on their own, three until shortly before death, for a total of 26, 19.3, and 7 months, and one ongoing. The diet was well tolerated. TEAEs, all mild and transient, included weight loss and hunger (n = 6), which resolved with a caloric increase, nausea (n = 2), dizziness (n = 2), and fatigue and constipation (n = 1 each). No one discontinued KD because of TEAEs. Seven patients died. For these, mean (range) survival time from diet initiation was 20 months for group 1 (9.5–27) and 12.8 months for group 2 (6.3–19.9). Mean survival time from diagnosis was 21.8 months for group 1 (11–29.2) and 25.4 months for group 2 (13.9–38.7). One patient with recurrent GBM and progression on bevacizumab experienced a remarkable symptom reversal, tumor shrinkage, and edema resolution 6–8 weeks after KD initiation, and survived for 20 months after starting KD. Conclusion Treatment of GBM patients with a 4:1 KD using a total meal replacement program with standardized recipes was well tolerated. The small sample size limits efficacy conclusions. Trial registration NCT01865162 registered 30 May 2013, and NCT02302235 registered 26 November 2014, https://clinicaltrials.gov/
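A short sketch of the macronutrient arithmetic implied by the "classic" 4:1 KD prescription described above (fat grams equal to four times protein-plus-carbohydrate grams at a fixed energy target). The 9 and 4 kcal/g energy factors are assumed, and the helper function is illustrative rather than the trial's actual meal-planning tool.

```python
# Hedged sketch: deriving fat/protein/carbohydrate grams for a 4:1 ketogenic
# ratio at a given energy target, assuming 9 kcal/g fat and 4 kcal/g for
# protein and carbohydrate.
def ketogenic_macros(total_kcal: float, ratio: float = 4.0, carb_g: float = 10.0):
    """Return (fat_g, protein_g, carb_g) for a given kcal target and KD ratio."""
    # total_kcal = 9*ratio*x + 4*x, where x = protein_g + carb_g
    non_fat_g = total_kcal / (9.0 * ratio + 4.0)
    fat_g = ratio * non_fat_g
    protein_g = non_fat_g - carb_g
    return fat_g, protein_g, carb_g

fat, protein, carb = ketogenic_macros(1600.0)  # the study's 1600 kcal/day target
print(f"fat {fat:.0f} g, protein {protein:.0f} g, carbohydrate {carb:.0f} g")
# -> fat 160 g, protein 30 g, carbohydrate 10 g  (160 : 40 by weight, i.e. 4:1)
```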


2018 ◽  
Vol 20 (1) ◽  
Author(s):  
Morgan Leigh Richburg

Background: Mothers of preterm infants often produce insufficient amounts of breast milk (BM). Early initiation of BM expression following delivery is associated with increased lactation success, but lack of nursing time delays BM expression in this population. Purpose: To determine if providing antenatal breast expression education to the support person (SP) of mothers at risk of preterm delivery improves lactation success. Methods: Twenty women at risk of delivering a preterm infant and their SP were randomized into two groups. Mothers and their SP in Group 1 received education on how to use a breast pump, and a breast pump was placed in their hospital room. Group 2 received standard care. Data regarding BM volume produced, time to initiation of BM expression, and time to onset of lactogenesis stage II were collected. Results: While there was no difference in time to onset of lactogenesis stage II, mothers in Group 1 initiated BM expression 2.5 hours earlier than those in Group 2. Overall, BM production was higher in Group 1. Limitations: This study had a relatively small sample size (n=19). Only mothers delivering at 31-33 weeks gestation were included. This is a single-center study, which may limit generalizability. Conclusion: Prenatal lactation education of mothers of preterm infants and their SP is feasible and may increase lactation success.


Blood ◽  
2006 ◽  
Vol 108 (11) ◽  
pp. 885-885
Author(s):  
Anjali A. Sharathkumar ◽  
Jin-Jar Lin ◽  
Ronald Hirschl ◽  
Steven Pipe

Abstract Background: Arteriovenous fistula (AVF) is the preferred vascular access for children with end stage renal disease (ESRD) requiring hemodialysis. Once an AVF is surgically created, it takes 6 to 12 weeks to mature. Nearly 20 to 50% of AVFs fail to mature due to development of primary or secondary thrombosis. Currently there is no uniform strategy to prevent thrombosis at the AVF. We report our experience of using primary thromboprophylaxis (PTP) for prevention of thrombosis at the AVF. Methods & Results: The PTP strategy consisted of an infusion of unfractionated heparin (UFH, 10 IU/kg/hr) for the first 24 hours after AVF surgery, followed by subcutaneous injection of low molecular weight heparin (LMWH, 0.5 to 1 mg/kg/dose) twice daily until the AVF had matured and was successfully accessed. LMWH therapy was monitored by peak and trough anti-Xa levels. Target anti-Xa levels were maintained in the therapeutic range (0.5 to 1.0 IU/ml) for those with a history of thrombosis or associated risk factors for thrombosis, while the remaining patients were maintained in the prophylactic range (0.2 to 0.5 IU/ml). The trough anti-Xa level was targeted at less than 0.2 IU/ml. A total of 26 AVFs were performed in 18 children from January 2001 to July 2006: 19 (73%) were historical controls and 7 (27%) received PTP. Mean time to AVF maturation was 60 days (range: 33 to 88). Among the 19 historical controls, 14 received no thromboprophylaxis while 5 received aspirin (81 mg once daily). Eleven (79%) of the 14 AVFs in the no-treatment group failed: 9/14 (65%) due to thrombosis and 2/14 (14%) due to poor growth of the venous segment. Among the 5 children who received aspirin prophylaxis, 2 (40%) AVFs failed, 1 (20%) developed a hematoma and 1 (20%) had poor growth. In the PTP group, 2/7 (29%) AVFs failed: 1 due to hematoma and 1 due to poor growth. Additional events in the PTP group included vasospasm-induced thrombosis requiring thrombectomy (n=1) and hematoma (n=2, one of which was salvaged by surgical evacuation). The two children who developed hematomas had anti-Xa levels of 1.56 IU/ml and 0.6 IU/ml, respectively. Presently 4/7 (57%) AVFs in the PTP group are functioning well (Figure 1); the 7th patient does not require hemodialysis. Three of the 5 children in the PTP group are still on LMWH (mean duration 6 months, mean anti-Xa level 0.6 IU/ml). Mean AVF survival was higher in children who received PTP (Day 100 survival: 57.14±18.7% versus 42.10±11.32%; p = 0.20; Figure 2). The small sample size thus far limits meaningful statistical analysis. Conclusion: Our experience with LMWH thromboprophylaxis appears encouraging for prevention of AVF failure due to thrombosis. Close clinical and laboratory monitoring is required to prevent bleeding complications related to LMWH. More prospective data to expand our sample size will be required to clarify our observation. Figure 1: Institutional experience of AVF from 2001 to 2006: comparison between heparin thromboprophylaxis and historical controls. Figure 2: D100 AVF survival: comparison between thromboprophylaxis and historical controls.
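The Day 100 AVF survival comparison summarized in Figure 2 is a Kaplan-Meier-style analysis; a hedged sketch using the `lifelines` package is shown below. The durations and event flags are placeholders, not the reported patient-level data, and the use of `lifelines` with a log-rank test is an assumption rather than the authors' actual software or method.

```python
# Hedged sketch: Kaplan-Meier curves and a log-rank comparison of AVF survival
# between the PTP group and historical controls. All values are placeholders.
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# days to AVF failure (or censoring at day 100) and event indicators (1 = failed)
ptp_days,  ptp_event  = [33, 45, 60, 75, 88, 100, 100], [1, 1, 0, 0, 0, 0, 0]
ctrl_days, ctrl_event = [20, 30, 40, 55, 60, 70, 80, 90, 100, 100], [1, 1, 1, 1, 1, 1, 0, 0, 0, 0]

km = KaplanMeierFitter()
km.fit(ptp_days, event_observed=ptp_event, label="PTP")
print(km.survival_function_at_times(100))   # estimated Day-100 AVF survival, PTP group

result = logrank_test(ptp_days, ctrl_days,
                      event_observed_A=ptp_event, event_observed_B=ctrl_event)
print(f"log-rank p = {result.p_value:.2f}")
```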


2016 ◽  
Vol 7;19 (7;9) ◽  
pp. 457-464
Author(s):  
Yasser Mohamed Amr

Background: Pharmacotherapy is the main treatment for management of trigeminal neuralgia. However, many patients become refractory to drugs. Objectives: The present study aimed to evaluate the effect of adding calcitonin to local anesthetic and methylprednisolone using a modified coronoid approach in the management of trigeminal neuralgia pain involving the mandibular and/or maxillary branches. Study Design: Randomized double-blind clinical trial. Setting: Hospital outpatient setting. Methods: Thirty-three patients received maxillary and mandibular blocks by a modified coronoid approach. Patients were allocated into 2 groups. Group 1 received a block with 3 mL of lidocaine 0.5% plus 40 mg of methylprednisolone, with a second syringe containing 1 mL of 0.9% saline. Group 2 received a block with 3 mL of lidocaine 0.5% plus 40 mg of methylprednisolone, with a second syringe containing 50 international units of calcitonin. Pain was evaluated by visual analog scale (VAS) before the block (basal), at 2 weeks and one month after the procedure, and then monthly for one year. The duration of effective pain relief of the first block (VAS ≤ 3) was recorded. Repeated blockade was allowed for any patient reporting a VAS > 30 mm during the one year of follow-up, and the number of blocks was recorded. Adverse effects were also reported. Results: A significantly longer duration of effective pain relief was noticed in group 2 compared with group 1 (P < 0.0004), while the duration of effective pain relief of the second block in group 1 was 28.5 ± 8.9 weeks. Four patients in group 1 did not need repeated blocks versus 15 in group 2. Six patients in group 1 received 2 blocks versus 2 patients in group 2. Moreover, 6 patients needed 3 blocks in group 1 versus none in group 2. No serious adverse events were reported during or after the interventional procedure. VAS was comparable in both groups (P > 0.05). Limitations: Small sample size. Conclusion: Calcitonin may be a useful additive to local anesthetic and steroid in the management of trigeminal neuralgia. Also, the modified coronoid approach to the maxillary and mandibular nerves is simple, free of radiation, safe, and may be an effective percutaneous procedure in trigeminal neuralgia. Key words: Calcitonin, modified coronoid approach, trigeminal neuralgia


2020 ◽  
Vol 21 ◽  
Author(s):  
Roberto Gabbiadini ◽  
Eirini Zacharopoulou ◽  
Federica Furfaro ◽  
Vincenzo Craviotto ◽  
Alessandra Zilli ◽  
...  

Background: Intestinal fibrosis and subsequent strictures represent an important burden in inflammatory bowel disease (IBD). Detecting and grading fibrosis in stricturing Crohn’s disease (CD) is important for choosing the best therapeutic strategy (medical anti-inflammatory therapy, endoscopic dilation, surgery). Ultrasound elastography (USE) is a non-invasive technique that has been proposed in the field of IBD for evaluating intestinal stiffness as a biomarker of intestinal fibrosis. Objective: The aim of this review is to discuss the ability and current role of ultrasound elastography in the assessment of intestinal fibrosis. Results and Conclusion: Data on USE in IBD come from pilot and proof-of-concept studies with small sample sizes. The first type of USE investigated was strain elastography, while shear wave elastography has been introduced more recently. Despite heterogeneous study methods, USE has been shown to be able to assess intestinal fibrosis in patients with stricturing CD. However, before this technique is introduced into current practice, further studies with larger sample sizes and homogeneous parameters, testing reproducibility and identifying validated cut-off values, are needed.

