A Methodology to Select Topology Generators for Ad Hoc Mesh Network Simulations

2020 ◽  
pp. 741-746
Author(s):  
Michael O’Sullivan ◽  
Leonardo Aniello ◽  
Vladimiro Sassone

Many academic and industrial researchers working on Wireless Communications and Networking rely on simulations, at least in the early stages, to obtain preliminary results to be subsequently validated in real settings. Topology generators (TGs) are commonly used to generate the initial placement of nodes in the artificial Ad Hoc Mesh Network topologies where those simulations take place. The significance of these experiments depends heavily on the representativeness of the artificial topologies: if they are not drawn fairly, the results apply only to a subset of possible configurations and therefore lack the generality required to port them to the real world. Although using many TGs could mitigate this issue by generating topologies in several different ways, doing so would entail significant additional effort. Hence the problem arises of which TGs to choose, among the available generators, to maximise the representativeness of the generated topologies while keeping the number of TGs small. In this paper, we address that problem by investigating the presence of bias in the initial placement of nodes in artificial Ad Hoc Mesh Network topologies produced by different TGs. We propose a methodology to assess such bias and introduce a metric that quantifies the diversity of the topologies generated by a TG with respect to all the available TGs, which can be used to select which TGs to use. We carry out experiments on three well-known TGs, namely BRITE, NPART and GT-ITM. The results show that using the artificial networks produced by a single TG can introduce bias.
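As a concrete illustration of how such a cross-generator diversity metric could be computed, the sketch below compares node placements from several generators using the distribution of nearest-neighbour distances as a placement feature. The feature choice, distance measure and random stand-in topologies are assumptions made for illustration only; they are not the metric defined in the paper.

```python
# Minimal sketch (not the paper's actual metric): quantify how different the
# node placements produced by one topology generator (TG) are from those of
# the other TGs, using the nearest-neighbour distance distribution as a
# simple placement feature.
import numpy as np

def nearest_neighbour_histogram(nodes, bins=20, area=1000.0):
    """Histogram of each node's distance to its closest neighbour."""
    d = np.linalg.norm(nodes[:, None, :] - nodes[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    nn = d.min(axis=1)
    hist, _ = np.histogram(nn, bins=bins, range=(0.0, area / 4), density=True)
    return hist

def diversity(tg_name, topologies_by_tg, bins=20):
    """Mean L1 distance between this TG's placement features and all others'."""
    own = [nearest_neighbour_histogram(t, bins) for t in topologies_by_tg[tg_name]]
    others = [nearest_neighbour_histogram(t, bins)
              for name, ts in topologies_by_tg.items() if name != tg_name
              for t in ts]
    return float(np.mean([[np.abs(a - b).sum() for b in others] for a in own]))

# Stand-in topologies: random uniform placements in a 1000 m x 1000 m area.
# In practice these would be node placements exported from BRITE, NPART, GT-ITM.
rng = np.random.default_rng(0)
topologies = {name: [rng.uniform(0, 1000, size=(50, 2)) for _ in range(10)]
              for name in ("BRITE", "NPART", "GT-ITM")}
for name in topologies:
    print(name, round(diversity(name, topologies), 3))
```

A generator whose placements score consistently low against all others contributes little additional diversity and could be dropped from the simulation campaign; a high score suggests it covers configurations the other TGs do not.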

2020 ◽  
Vol 36 (S1) ◽  
pp. 37-37
Author(s):  
Americo Cicchetti ◽  
Rossella Di Bidino ◽  
Entela Xoxi ◽  
Irene Luccarini ◽  
Alessia Brigido

Introduction: Different value frameworks (VFs) have been proposed to translate available evidence on the risk-benefit profiles of new treatments into Pricing & Reimbursement (P&R) decisions. However, limited evidence is available on the impact of their implementation. It is relevant to distinguish between VFs proposed by scientific societies and providers, which are usually applicable to all treatments, and VFs elaborated by regulatory agencies and health technology assessment (HTA) bodies, which focus on specific therapeutic areas. Such heterogeneity in VFs has significant implications in terms of the value dimensions considered and the criteria adopted to define or support a pricing decision.

Methods: A literature search was conducted to identify VFs already proposed or adopted for onco-hematology treatments. Both scientific and grey literature were investigated. An ad hoc data collection was then conducted for multiple myeloma; breast, prostate and urothelial cancer; and non-small cell lung cancer (NSCLC) therapies. Pharmaceutical products authorized by the European Medicines Agency from January 2014 to December 2019 were identified. The primary data sources were European Public Assessment Reports and P&R decisions taken by the Italian Medicines Agency (AIFA) up to September 2019.

Results: The analysis allowed us to define a taxonomy distinguishing the categories of VF relevant to onco-hematological treatments. We identified the "real-world" VF that emerged from past P&R decisions taken at the Italian level. Data were collected on clinical and economic outcomes/indicators, as well as on decisions taken on the innovativeness of therapies. Relevant differences emerge between the real-world value framework and the one that should be applied given the normative framework of the Italian Health System.

Conclusions: The value framework that emerged from the analysis addresses specific aspects of onco-hematological treatments identified during an ad hoc analysis of treatments authorized in the last five years. The perspective adopted to elaborate the VF was that of an HTA agency responsible for P&R decisions at the national level. Furthermore, by comparing a real-world value framework with one based on the general criteria defined by national legislation, our analysis allowed identification of the most critical points of the current national P&R process in terms of the sustainability of current and future therapies, such as advanced therapies and tumor-agnostic therapies.


2009 ◽  
Vol 2 (2) ◽  
Author(s):  
Elizabeth Dean ◽  
Sarah Cook ◽  
Michael Keating ◽  
Joe Murphy

The Centers for Disease Control and Prevention (CDC) has observed consistently increasing obesity trends over the past 25 years. Recent research suggests that avatar behavior and appearance may result in positive changes to real-life individual behavior. Specifically, users may adjust their identity to match that of their avatars. Preliminary results of survey interviews in Second Life support our hypotheses that individuals whose avatars engaged in healthy behaviors were more likely to engage in physical activities in the real world than individuals with less physically active avatars. Furthermore, thinner-looking avatars were associated with lower BMI in real life. One unique feature of interviewing with avatars in Second Life is that researchers can manipulate environmental factors and interviewer characteristics with a consistency that is absent in the real world. In our preliminary results, respondents were more likely to report higher BMI or weight to a heavier-looking avatar than to a thinner-looking avatar.


Robotica ◽  
1992 ◽  
Vol 10 (5) ◽  
pp. 389-396 ◽  
Author(s):  
R. A. Jarvis

SUMMARY
This paper argues the case for extracting as complete a set of sensory data as practicable from scenes consisting of complex assemblages of objects, with the goal of completing the task of scene analysis (placement, pose, identity and relationships amongst the components) in a robust manner that supports goal-directed robotic action, including collision-free trajectory planning, grip-site location and manipulation of selected object classes. The emphasis of the paper is on sensor fusion of range and surface colour data, including preliminary results in proximity, surface-normal directionality and colour-based scene segmentation through semantic-free clustering processes. The larger context is that of embedding the results of such analysis in a graphics world containing an articulated robotic manipulator, and of carrying out experiments in that world prior to replication of safe manipulation sequences in the real world.
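To make the idea of semantic-free clustering over fused range and colour data concrete, the sketch below groups pixels of a synthetic scene by a combined position, depth and colour feature vector using plain k-means. The synthetic scene, the feature scaling and the number of clusters are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of semantic-free clustering over fused range + colour data:
# each pixel becomes a 6-D feature (image position, depth, RGB) and pixels
# are grouped with a basic k-means loop. All scene data here is synthetic.
import numpy as np

def kmeans(features, k, iters=50, seed=0):
    """Basic k-means: returns a cluster label for every feature vector."""
    rng = np.random.default_rng(seed)
    centres = features[rng.choice(len(features), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(
            np.linalg.norm(features[:, None, :] - centres[None, :, :], axis=-1),
            axis=1)
        for c in range(k):
            if np.any(labels == c):
                centres[c] = features[labels == c].mean(axis=0)
    return labels

# Synthetic 64x64 "scene": a depth map plus an RGB image, fused per pixel.
h, w = 64, 64
rng = np.random.default_rng(1)
depth = rng.uniform(0.5, 3.0, size=(h, w))        # metres
colour = rng.uniform(0.0, 1.0, size=(h, w, 3))    # normalised RGB
ys, xs = np.mgrid[0:h, 0:w]
features = np.column_stack([
    xs.ravel() / w, ys.ravel() / h,               # image-plane position
    depth.ravel() / depth.max(),                  # range channel
    colour.reshape(-1, 3),                        # colour channels
])
segments = kmeans(features, k=4).reshape(h, w)
print(segments.shape, np.unique(segments))
```

In a real pipeline the feature vector would come from registered range and colour images of the scene, and the resulting segments would feed the pose, identity and grip-site reasoning described above.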


2021 ◽  
Author(s):  
Isamu Endo ◽  
Kazuki Takashima ◽  
Maakito Inoue ◽  
Kazuyuki Fujita ◽  
Kiyoshi Kiyokawa ◽  
...  

Author(s):  
V. Narayanamurti

Over the last 50 years, solid state physics and technology have blossomed through the application of modern quantum mechanics to the real world. The intimate relationship between basic research and application has been highlighted ever since the invention of the transistor in 1947, the laser in 1958 and the subsequent spawning of the computer and communications revolution which has so changed our lives. The awarding of the 2000 Nobel Prize in Physics to Alferov, Kroemer and Kilby is another important recognition of the unique interplay between basic science and technology. Such advances and discoveries were made in major industrial research laboratories — Bell Labs, IBM, RCA and others. Today many of these industrial laboratories are in decline due to changes in the regulatory environment and global economic competition. In this talk I will examine some of the frontiers in technology and emerging policy issues. My talk will be colored by my own experiences at Bell Labs and subsequently at a major U.S. national laboratory (Sandia) and at universities (University of California at Santa Barbara and Harvard). I will draw on experiences from my role as the Chair of the National Research Council (NRC) panel on the Future of Condensed Matter and Materials Physics (1999) and as a reviewer of the 2001 NRC report, Physics in a New Era. The growth rates of silicon and optical technologies will ultimately flatten as physical and economic limits are reached. If history is any guide, entirely new technologies will be created. Current research in nanoscience and nanotechnology is already leading to new relationships between fields as diverse as chemistry, biology, applied physics, electrical and mechanical engineering. Materials science is becoming even more interdisciplinary than in the past. Different fields of engineering are coming together. The interfaces between engineering and biology are emerging as another frontier. I will spend some time in exploring the frontier where quantum mechanics intersects the real world and the special role played by designer materials and new imaging tools to explore this emerging frontier. To position ourselves for the future, we therefore must find new ways of breaking disciplinary boundaries in academia. The focus provided by applications and the role of interdisciplinary research centers will be examined. Strangely, the reductionist approach inherent in nanoscience must be connected with the world of complex systems. Integrative approaches to science and technology will become more the norm in fields such as systems biology, soft condensed matter and other complex systems. Just like in nature, can we learn to adapt some of the great successes of industrial research laboratories to a university setting? I will take examples from materials science to delineate the roles of different entities so that a true pluralistic approach for science and technology can be facilitated to create the next revolution in our field.


2020 ◽  
Vol 9 (1) ◽  
pp. 1744898 ◽  
Author(s):  
Fabrice Barlesi ◽  
Adrien Dixmier ◽  
Didier Debieuvre ◽  
Christophe Raspaud ◽  
Jean-Bernard Auliac ◽  
...  
