Clinical Application of Radioembolization in Hepatic Malignancies: Protocol for a Prospective Multicenter Observational Study (Preprint)

2019 ◽  
Author(s):  
Thomas Helmberger ◽  
Dirk Arnold ◽  
José I Bilbao ◽  
Niels de Jong ◽  
Geert Maleux ◽  
...  

Background: Radioembolization, also known as transarterial radioembolization or selective internal radiation therapy with yttrium-90 (90Y) resin microspheres, is an established treatment modality for patients with primary and secondary liver tumors. However, large-scale prospective observational data on the application of this treatment in a real-life clinical setting are lacking. Objective: The main objective is to collect data on the clinical application of radioembolization with 90Y resin microspheres to improve the understanding of the impact of this treatment modality in its routine practice setting. Methods: Eligible patients are 18 years or older, are receiving radioembolization for primary or secondary liver tumors as part of routine practice, and have signed informed consent. Data are collected at baseline, directly after treatment, and at every 3-month follow-up until 24 months or study exit. The primary objective of the Cardiovascular and Interventional Radiological Society of Europe Registry for SIR-Spheres Therapy (CIRT) is to observe the clinical application of radioembolization. Secondary objectives include safety and effectiveness in terms of overall survival, progression-free survival (PFS), liver-specific PFS, imaging response, and change in quality of life. Results: Between January 2015 and December 2017, 1047 patients were included in the study. The 24-month follow-up period ended in December 2019. The first results are expected in the third quarter of 2020. Conclusions: CIRT is the largest observational study on radioembolization to date and will provide valuable insights into the clinical application of this treatment modality and its real-life outcomes. Trial Registration: ClinicalTrials.gov NCT02305459; https://clinicaltrials.gov/ct2/show/NCT02305459. International Registered Report Identifier (IRRID): DERR1-10.2196/16296

10.2196/16296 ◽  
2020 ◽  
Vol 9 (4) ◽  
pp. e16296 ◽  
Author(s):  
Thomas Helmberger ◽  
Dirk Arnold ◽  
José I Bilbao ◽  
Niels de Jong ◽  
Geert Maleux ◽  
...  



2020 ◽  
Vol 79 (Suppl 1) ◽  
pp. 1207.2-1207
Author(s):  
A. García Fernández ◽  
A. Briones-Figueroa ◽  
L. Calvo Sanz ◽  
Á. Andreu-Suárez ◽  
J. Bachiller-Corral ◽  
...  

Background: Biological therapy (BT) has changed the treatment and perspectives of JIA patients, but little is known about the best moment to start BT and the impact of prompt initiation. Objectives: To analyze the response to BT of Juvenile Idiopathic Arthritis (JIA) patients according to the time when BT was started. Methods: A retrospective, descriptive study was conducted on JIA patients followed up in a referral hospital who started BT up to 24 months after diagnosis, from 2000 to 2018. Disease activity was measured at 2 years after diagnosis according to the Wallace criteria for remission (absence of active arthritis, active uveitis, fever, rash or any other manifestation attributable to JIA; normal CRP and ESR; PGA indicating no active disease) for at least 6 months. Results: 55 JIA patients who started BT up to 24 months from diagnosis were analyzed; 69.1% were girls, with a median age at diagnosis of 8 years (IQR 3-13) and a median age at the start of BT of 9 years (IQR 3-13). Regarding JIA categories, 25.5% were Oligoarticular Persistent (OligP), 18.2% Systemic JIA (sJIA), 16.4% Enthesitis-Related Arthritis (ERA), 12.7% Psoriatic Arthritis (APso) and Polyarticular RF- (PolyRF-), 5.5% Oligoarticular Extended (OligE) and Polyarticular RF+ (PolyRF+), and 3.6% Undifferentiated (Und). 20% of patients had uveitis during follow-up. A conventional DMARD (cDMARD) was indicated in 83.6% of patients (95.7% methotrexate) at diagnosis [median 0 months, IQR 0-2.3]. At the end of follow-up (2 years), only 30.9% of patients continued on cDMARDs; the main causes of discontinuation were adverse events (46.7%) and remission (36.7%). TNF inhibitors were prescribed in 81.8% of patients, and 18.2% of patients received two BTs during the first 2 years from diagnosis.
54.5% of BTs were indicated during the first 6 months from diagnosis, 27.3% from 7 to 12 months, 12.7% from 13 to 18 months, and 5.5% from 19 to 24 months. After 2 years from diagnosis, 78.2% of patients were in remission and 21.8% active. Among patients with active disease, 75% had arthritis, 16.7% had uveitis, and 8.3% had both. There were no differences in disease activity among patients with uveitis, nor among those taking cDMARDs. Regarding JIA categories, 66.7% of OligE, 57.1% of PolyRF- and 57.1% of APso patients were active at 2 years from diagnosis, compared with the other categories (p=0.004). Patients in remission at 24 months from diagnosis had started BT sooner than active patients [95% CI 0.46-8.29, p=0.029]. The time when BT was started was correlated with activity at 2 years (K=0.294, p=0.029). In a ROC curve analysis, starting BT later than 7.5 months from diagnosis was associated with a higher probability of active disease at 2 years (sensitivity 0.67, specificity 0.63). Among patients in remission at 2 years, there was a correlation between prompt start of BT and less time to reach remission (K=-0.345, p=0.024). Patients with active disease at 2 years, regardless of the moment of BT initiation, required more BTs during follow-up (p=0.002). Conclusion: Prompt initiation of BT was correlated with a better outcome. JIA patients who started BT early after diagnosis had a higher probability of remission after 2 years. Starting BT after 7.5 months was correlated with a higher probability of active disease at 2 years. Active disease at 24 months was correlated with persistent active disease during follow-up. Disclosure of Interests: None declared
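The cutoff analysis above reports that starting BT later than 7.5 months predicted active disease at 2 years (sensitivity 0.67, specificity 0.63). A minimal sketch of how such a cutoff can be chosen with the Youden index (J = sensitivity + specificity − 1); the cohort below is entirely hypothetical, and only the method mirrors the abstract's ROC analysis.

```python
# Choosing a time-to-treatment cutoff via the Youden index.
# All data below are hypothetical; only the method is from the abstract.

def youden_cutoff(months_to_bt, active_at_2y):
    """Return (best_cutoff, sensitivity, specificity), treating 'BT started
    after the cutoff' as the positive test for active disease at 2 years."""
    best = None
    for cut in sorted(set(months_to_bt)):
        tp = sum(1 for m, a in zip(months_to_bt, active_at_2y) if m > cut and a)
        fn = sum(1 for m, a in zip(months_to_bt, active_at_2y) if m <= cut and a)
        tn = sum(1 for m, a in zip(months_to_bt, active_at_2y) if m <= cut and not a)
        fp = sum(1 for m, a in zip(months_to_bt, active_at_2y) if m > cut and not a)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1
        if best is None or j > best[0]:
            best = (j, cut, sens, spec)
    return best[1], best[2], best[3]

# Hypothetical cohort: months from diagnosis to BT start, activity at 2 years.
months = [2, 3, 4, 5, 6, 8, 9, 10, 14, 20]
active = [False, False, False, False, True, True, True, False, True, True]
cut, sens, spec = youden_cutoff(months, active)
```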


Author(s):  
Krzysztof Jurczuk ◽  
Marcin Czajkowski ◽  
Marek Kretowski

Abstract: This paper concerns the evolutionary induction of decision trees (DTs) for large-scale data. Such a global approach is one of the alternatives to top-down inducers. It searches for the tree structure and tests simultaneously, and thus in many situations improves the prediction and size of the resulting classifiers. However, as a population-based, iterative approach, it can be too computationally demanding to apply directly to big data mining. The paper demonstrates that this barrier can be overcome by smart distributed/parallel processing. Moreover, we ask whether the global approach can truly compete with greedy systems on large-scale data. For this purpose, we propose a novel multi-GPU approach. It combines knowledge of global DT induction and evolutionary algorithm parallelization with efficient utilization of the GPUs' memory and computing resources. The searches for the tree structure and tests are performed simultaneously on a CPU, while the fitness calculations are delegated to the GPUs. A data-parallel decomposition strategy and the CUDA framework are applied. Experimental validation is performed on both artificial and real-life datasets. In both cases, the obtained acceleration is very satisfactory. The solution is able to process even billions of instances in a few hours on a single workstation equipped with 4 GPUs. The impact of data characteristics (size and dimensionality) on the convergence and speedup of the evolutionary search is also shown. As the number of GPUs grows, nearly linear scalability is observed, which suggests that data size boundaries for evolutionary DT mining are fading.
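The division of labour described above (tree search on the CPU, data-parallel fitness evaluation on the GPUs) can be illustrated with a CPU-only sketch: the dataset is split into shards, each shard computes a partial fitness, and the host reduces the partials. The toy tree encoding and thread-based "devices" are illustrative assumptions; real multi-GPU code would launch one CUDA kernel per shard.

```python
# CPU-only sketch of a data-parallel fitness evaluation: the dataset is split
# into shards (one per "device"), each shard counts misclassifications for a
# fixed decision tree, and the host sums the partial results.
from concurrent.futures import ThreadPoolExecutor

def classify(tree, x):
    """Walk a toy decision tree: internal nodes are (feature, threshold,
    left, right); leaves are class labels."""
    while isinstance(tree, tuple):
        feature, threshold, left, right = tree
        tree = left if x[feature] <= threshold else right
    return tree

def partial_errors(tree, shard):
    # One shard = the work assigned to one device.
    return sum(1 for x, y in shard if classify(tree, x) != y)

def fitness(tree, data, n_devices=4):
    shards = [data[i::n_devices] for i in range(n_devices)]
    with ThreadPoolExecutor(max_workers=n_devices) as pool:
        partials = pool.map(lambda s: partial_errors(tree, s), shards)
    return sum(partials)  # host-side reduction

# Toy tree: split on feature 0 at 0.5; label 0 on the left, 1 on the right.
tree = (0, 0.5, 0, 1)
data = [((0.2,), 0), ((0.4,), 0), ((0.6,), 1), ((0.9,), 1), ((0.3,), 1)]
errors = fitness(tree, data)  # only ((0.3,), 1) is misclassified
```

In the paper's setting the evolutionary loop would call such a fitness routine once per candidate tree per generation, which is why offloading it dominates the speedup.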


Author(s):  
Gianluca Bardaro ◽  
Alessio Antonini ◽  
Enrico Motta

Abstract: Over the last two decades, several deployments of robots for in-house assistance of older adults have been trialled. However, these solutions are mostly prototypes and remain unused in real-life scenarios. In this work, we review the historical and current landscape of the field to understand why robots have yet to succeed as personal assistants in daily life. Our analysis focuses on two complementary aspects: the capabilities of the physical platform and the logic of the deployment. The former shows regularities in hardware configurations and functionalities, leading to the definition of a set of six application-level capabilities (exploration, identification, remote control, communication, manipulation, and digital situatedness). The latter focuses on the impact of robots on the daily life of users and categorises the deployment of robots for healthcare interventions into three types of services: support, mitigation, and response. Our investigation reveals that the value of healthcare interventions is limited by a stagnation of functionalities and a disconnection between the robotic platform and the design of the intervention. To address this issue, we propose a novel co-design toolkit, which uses an ecological framework for robot interventions in the healthcare domain. Our approach connects robot capabilities with known geriatric factors to create a holistic view encompassing both the physical platform and the logic of the deployment. As a case-study-based validation, we discuss the use of the toolkit in the pre-design of the robotic platform for a pilot intervention, part of the large-scale pilot of the EU H2020 GATEKEEPER project.


2005 ◽  
Vol 11 (5) ◽  
pp. 562-567 ◽  
Author(s):  
J Haas ◽  
M Maas-Enriquez ◽  
H P Hartung

Use of intravenous immunoglobulins (IVIG) has been recommended for the treatment of relapsing-remitting multiple sclerosis (RRMS) if first-line therapy with beta-interferon or glatiramer acetate is not tolerated, or if contraindications exist. This consensus recommendation is based on the demonstration of the efficacy and tolerability of IVIG in four randomized controlled trials (RCTs). The impact of non-randomized observational trials on evidence-based treatment recommendations is still under discussion. In order to evaluate the transferability of results from RCTs to a routine practice setting, we carried out a retrospective analysis of data on patients with RRMS who had been treated with IVIG during the last five years. Data sets from 308 of 1122 screened patients were available for analysis. Treatment with IVIG resulted in a 69% reduction of the mean annual relapse rate (ARR), calculated over two years, from 1.749±1.15 before IVIG treatment to 0.539±0.61 after the start of IVIG treatment. Mean Expanded Disability Status Scale (EDSS) values remained stable throughout the observation period. The results of this observational study were similar to those of previous RCTs with IVIG.
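The reported 69% reduction follows directly from the two ARR means:

```python
# Relative reduction of the mean annual relapse rate (ARR) reported above.
arr_before = 1.749
arr_after = 0.539
reduction = (arr_before - arr_after) / arr_before
print(f"{reduction:.0%}")  # prints "69%"
```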


2021 ◽  
Vol 42 (Supplement_1) ◽  
Author(s):  
R M Mori ◽  
I N G Nunez

Abstract Background: Recent publications suggest that bioabsorbable vascular scaffolds (BVS) carry an excess of thrombotic complications. Our goal was to describe the real-life, long-term results in a series of patients who received a BVS that is currently off the market. Methods: Two hundred and thirteen consecutive patients who received at least 1 BVS between May 2012 and December 2016 were analyzed. The primary objective was the incidence of the composite event "target vessel failure", which included infarction, target vessel revascularization and cardiac death. Results: Seventy-five percent of patients were men, with a mean age of 61.4 years. They had a high prevalence of dyslipidemia (62.44%) and smoking (65.26%). The most common cause of admission was myocardial infarction without ST elevation (53.52%). A total of 233 coronary lesions were treated, with an average of 1.3±0.3 lesions per patient. The implant was successful in 99.5% of cases. Predilatation was performed in 89.3% and postdilatation in 33.5% of cases. Intracoronary imaging (optical coherence tomography, OCT, and/or intravascular ultrasonography, IVUS) was used to optimize the BVS implant in 86 patients (40.38%). With a mean follow-up of 42.5 months, the incidence of target vessel failure was 6.57% during the first 24 months and 7.98% at the end of follow-up. Regarding the device, this included 6 cases (2.81%) of thrombosis (definite, probable or possible) and 10 cases (4.69%) of restenosis. Patients with a history of diabetes mellitus (HR 1.72, 95% CI 1.01–2.95, P=0.05) and/or on chronic oral anticoagulation (HR 5.71, 95% CI 1.12–28.94, P=0.04) had a higher risk of target vessel failure. The use of intracoronary imaging (OCT and/or IVUS) during BVS implantation showed a trend toward significance as a protective factor (HR 0.32, 95% CI 0.11–1.03, P=0.06).
Conclusions: In this series of patients under real-life conditions, the incidence of target vessel failure was comparable to that previously described in randomized clinical trials. Events were more frequent during the first 2 years of follow-up, in the presence of greater cardiovascular comorbidity, and in the absence of intracoronary imaging during implantation. Funding Acknowledgement: Type of funding sources: Other. Main funding source(s): European Society of Cardiology. [Figures: Kaplan-Meier curve for target vessel failure (TVF); predictor analysis for TVF]
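The percentages reported for the 213-patient cohort imply whole-patient event counts. The 6 thrombosis and 10 restenosis cases are stated explicitly; reversing the arithmetic suggests 14 and 17 target-vessel-failure events at 24 months and at the end of follow-up, respectively (an inference from the percentages, not a figure from the study):

```python
# Cross-check the reported percentages against the cohort size (n = 213).
n = 213
reported = {
    "thrombosis": 2.81,            # stated as 6 cases
    "restenosis": 4.69,            # stated as 10 cases
    "TVF at 24 months": 6.57,      # count not stated; inferred below
    "TVF at follow-up end": 7.98,  # count not stated; inferred below
}
counts = {k: round(pct / 100 * n) for k, pct in reported.items()}
```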


2020 ◽  
pp. 1-7
Author(s):  
Sumit Kumar Gupta ◽  

Nanotechnology is a new frontier of this century. The world faces great challenges in meeting rising demands for basic commodities (e.g., food, water and energy), finished goods (e.g., cell phones, cars and airplanes) and services (e.g., shelter, healthcare and employment) while reducing and minimizing the impact of human activities on Earth's global environment and climate. Nanotechnology has emerged as a versatile platform that could provide efficient, cost-effective and environmentally acceptable solutions to the global sustainability challenges facing society. In recent years there has been a rapid increase in the use of nanotechnology in medicine, and more specifically in targeted drug delivery. There are opportunities to utilize nanotechnology to address global challenges in (1) water purification, (2) clean energy technologies, (3) greenhouse gas management, (4) materials supply and utilization, and (5) green manufacturing and chemistry. Smart delivery of nutrients, bio-separation of proteins, rapid sampling of biological and chemical contaminants, and nanoencapsulation of nutraceuticals are some of the emerging topics of nanotechnology for food and agriculture. Nanotechnology is helping to considerably improve, even revolutionize, many technology and industry sectors: information technology, energy, environmental science, medicine, homeland security, food safety, and transportation, among many others. Today's nanotechnology harnesses current progress in chemistry, physics, materials science, and biotechnology to create novel materials that have unique properties because their structures are determined on the nanometer scale. This paper summarizes the various applications of nanotechnology in recent decades. Nanotechnology is one of the leading scientific fields today, since it combines knowledge from physics, chemistry, biology, medicine, informatics, and engineering.
It is an emerging technological field with great potential to lead to breakthroughs that can be applied in real life. Novel nanomaterials, biomaterials, and nanodevices are fabricated and controlled by nanotechnology tools and techniques, which investigate and tune the properties, responses, and functions of living and non-living matter at sizes below 100 nm. The application and use of nanomaterials in electronic and mechanical devices, in optical and magnetic components, quantum computing, tissue engineering, and other biotechnologies, with smallest feature widths well below 100 nm, are the economically most important parts of nanotechnology today and presumably in the near future. The number of nanoproducts is rapidly growing as more and more nanoengineered materials reach the global market. The continuous revolution in nanotechnology will result in the fabrication of nanomaterials with properties and functionalities that will bring positive changes to the lives of our citizens, be it in health, environment, electronics or any other field. In the energy generation challenge, where conventional fuel resources cannot remain the dominant energy source given increasing consumption demand and CO2 emissions, alternative renewable energy sources based on new technologies have to be promoted. Innovative solar cell technologies that utilize nanostructured materials and composite systems, such as organic photovoltaics, offer great technological potential due to attractive properties such as the potential for large-scale, low-cost roll-to-roll manufacturing processes.


EP Europace ◽  
2020 ◽  
Author(s):  
Xavier Freixa ◽  
Boris Schmidt ◽  
Patrizio Mazzone ◽  
Sergio Berti ◽  
Sven Fischer ◽  
...  

Abstract Aims: Left atrial appendage occlusion (LAAO) may be considered for patients with non-valvular atrial fibrillation (NVAF) and a relative or formal contraindication to anticoagulation. This study aimed to summarize the impact of aging on LAAO outcomes at short- and long-term follow-up. Methods and results: We compared subjects aged <70, ≥70 and <80, and ≥80 years old in the prospective, multicentre Amplatzer™ Amulet™ Occluder Observational Study (Abbott, Plymouth, MN, USA). Serious adverse events (SAEs) were reported from implant through a 2-year post-LAAO visit and adjudicated by an independent clinical events committee. Overall, 1088 subjects were prospectively enrolled: 265 subjects (24.4%) <70 years old, 491 (45.1%) ≥70 and <80 years old, and 332 (30.5%) ≥80 years old, with the majority (≥80%) contraindicated to anticoagulation. As expected, CHA2DS2-VASc and HAS-BLED scores increased with age. Implant success was high (≥98.5%) across all groups, and the proportion of subjects with a procedure- or device-related SAE was similar between groups. At follow-up, the observed ischaemic stroke rate was not significantly different between groups, and the corresponding risk reductions compared with predicted rates were 62%, 56%, and 85% for subjects <70, ≥70 and <80, and ≥80 years old, respectively. Major bleeding and mortality rates increased with age, and the incidence of device-related thrombus also tended to increase with age. Conclusions: Despite the increased risk of ischaemic stroke with increasing age in AF patients, LAAO reduced the risk of ischaemic stroke compared with the predicted rate across all age groups, without differences in procedural SAEs.
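The quoted risk reductions are relative risk reductions against the predicted stroke rates, i.e. RRR = 1 − observed/predicted. The per-group event rates are not given in the abstract, so the observed/predicted pairs below are purely illustrative values chosen only to reproduce the reported 62%, 56% and 85%:

```python
# Relative risk reduction: RRR = 1 - observed/predicted.
# The rates below (events per 100 patient-years) are hypothetical
# placeholders, not data from the study.
def relative_risk_reduction(observed, predicted):
    return 1 - observed / predicted

groups = {"<70": (0.76, 2.0), "70-79": (1.76, 4.0), ">=80": (0.90, 6.0)}
rrr = {g: round(100 * relative_risk_reduction(o, p)) for g, (o, p) in groups.items()}
```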


BMJ Open ◽  
2019 ◽  
Vol 9 (12) ◽  
pp. e033712
Author(s):  
José Miguel Rivera-Caravaca ◽  
Francisco Marín ◽  
María Asunción Esteve-Pastor ◽  
Josefa Gálvez ◽  
Gregory Y.H. Lip ◽  
...  

Introduction: Atrial fibrillation (AF) is characterised by a high stroke risk. Vitamin K antagonists (VKAs) are the most commonly used oral anticoagulants (OACs) in Spain, but their efficacy and safety depend on maintaining a time in therapeutic range (International Normalized Ratio, INR, 2.0–3.0) above 65%–70%. Unfortunately, the difficulty of maintaining an optimal level of anticoagulation and the complications of VKAs (particularly haemorrhagic ones) frequently lead to cessation of this therapy, which has been associated with a higher risk of adverse events (AEs), including ischaemic stroke. Our aims are as follows: (1) to evaluate the quality of oral anticoagulation with VKAs and the prevalence of poor-quality anticoagulation, and to identify factors predisposing to poor-quality anticoagulation; and (2) to identify patients who will stop OACs and to investigate what factors influence the decision of OAC withdrawal. Methods and analysis: Prospective observational cohort study including outpatients newly diagnosed with AF and naïve to OACs, enrolled from July 2016 to June 2018 in an anticoagulation clinic. Patients with prosthetic heart valves, rheumatic mitral valves or valvular AF will be excluded. Follow-up will extend for up to 3 years. During this period, INR results and changes in anticoagulant therapy will be recorded, as well as all AEs and any other information relevant to the proper conduct of the research. Ethics and dissemination: All patients were informed about the nature and purpose of the study, and the protocol was approved by the Ethics Committee of Hospital General Universitario Morales Meseguer (reference: EST:20/16). This is an observational study focusing on 'real-life' practice; therefore, all treatments and follow-up will be performed in accordance with routine clinical practice, with no specific interventions or visits.
The results of our study will be disseminated through presentations at national and international meetings and publications in peer-reviewed journals.
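The quality threshold mentioned above (time in therapeutic range, TTR, above 65%–70%) is commonly computed with the Rosendaal linear interpolation method, which assumes the INR changes linearly between visits and counts the fraction of interpolated time inside the range. The protocol does not specify its TTR method, so this is an assumption; the visit data below are hypothetical:

```python
# Rosendaal-style TTR: linearly interpolate the INR between visits and
# measure the fraction of follow-up time spent inside [low, high].
def rosendaal_ttr(days, inrs, low=2.0, high=3.0):
    """days: visit days in ascending order; inrs: INR measured at each visit."""
    in_range = total = 0.0
    for (d0, i0), (d1, i1) in zip(zip(days, inrs), zip(days[1:], inrs[1:])):
        span = d1 - d0
        total += span
        if i0 == i1:  # flat segment: fully in or fully out of range
            in_range += span if low <= i0 <= high else 0.0
            continue
        # fraction of the linear segment lying inside [low, high]
        t_low = (low - i0) / (i1 - i0)
        t_high = (high - i0) / (i1 - i0)
        t0, t1 = sorted((t_low, t_high))
        overlap = max(0.0, min(t1, 1.0) - max(t0, 0.0))
        in_range += overlap * span
    return in_range / total

# Hypothetical INR series over 90 days: rises through range, overshoots, returns.
ttr = rosendaal_ttr([0, 30, 60, 90], [1.5, 2.5, 3.5, 2.5])
```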


BMJ Open ◽  
2019 ◽  
Vol 9 (6) ◽  
pp. e028434 ◽  
Author(s):  
Emil Vilstrup ◽  
Dennis Schou Graversen ◽  
Linda Huibers ◽  
Morten Bondo Christensen ◽  
Anette Fischer Pedersen

Objectives: Out-of-hours (OOH) telephone triage is used to manage patient flow, but knowledge of the communicative skills of telephone triagists is limited. The aims of this study were to compare communicative parameters in general practitioner (GP)-led and nurse-led OOH telephone triage and to discuss differences in relation to patient-centred communication and safety issues. Design: Observational study. Setting: Two Danish OOH settings: a large-scale general practitioner cooperative in the Central Denmark Region (n=100 GP-led triage conversations) and Medical Helpline 1813 in the Capital Region of Denmark (n=100 nurse-led triage conversations with use of a clinical decision support system). Participants: 200 randomly selected audio-recorded telephone triage conversations. Primary and secondary outcome measures: Conversations were compared with regard to length of call, distribution of speaking time, question types, callers' expression of negative affect, and nurses' and GPs' responses to callers' negative affectivity, using the Mann-Whitney U test and Student's t-test. Results: Compared with GPs, nurses had longer telephone contacts (137 s vs 264 s, p=0.001) and asked significantly more questions (5 vs 9 questions, p=0.001). In 36% of nurse-led triage conversations, the triage nurse either transferred the call to a physician or had to confer with a physician about the call. Nurses gave callers significantly more spontaneous talking time than GPs (23.4 s vs 17.9 s, p=0.01). Compared with nurses, GPs seemed more likely to give an empathic response when a caller spontaneously expressed concern; however, this difference was not statistically significant (36% vs 29%, p=0.6). Conclusions: When comparing communicative parameters in GP-led and nurse-led triage, several differences were observed. However, the impact of these differences from the perspective of patient-centred communication and safety needs further research.
More knowledge is needed to determine what characterises good quality in telephone triage communication.
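The length-of-call comparison above uses the Mann-Whitney U test. A minimal implementation of the U statistic on hypothetical call durations (the pair-counting form, with ties counted as half a pair; real analyses would also compute the p-value, e.g. via a normal approximation):

```python
# Mann-Whitney U statistic: count pairs (a_i, b_j) where a_i > b_j,
# with ties contributing 0.5. All durations below are hypothetical.
def mann_whitney_u(a, b):
    u = 0.0
    for x in a:
        for y in b:
            if x > y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u

# Hypothetical call lengths in seconds (GP-led vs nurse-led triage).
gp = [120, 135, 140, 150, 160]
nurse = [220, 250, 264, 280, 300]
u = mann_whitney_u(nurse, gp)  # 25.0: every nurse call outlasts every GP call
```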

