Evaluation of Localization Algorithms

Author(s):  
Michael Allen ◽  
Sebnem Baydere ◽  
Elena Gaura ◽  
Gurhan Kucuk

This chapter introduces a methodological approach to the evaluation of localization algorithms. The chapter discusses evaluation criteria and performance metrics, followed by statistical/empirical simulation models and the parameters that affect the performance of the algorithms and hence their assessment. Two contrasting localization studies are presented and compared with reference to the evaluation criteria discussed throughout the chapter. The chapter concludes with an overview of the localization algorithm development cycle, from simulation to real deployment. The authors argue that algorithms should be simulated, emulated (on test beds or with empirical data sets) and subsequently implemented in hardware, in a realistic Wireless Sensor Network (WSN) deployment environment, as a complete test of their performance. It is hypothesised that establishing a common development and evaluation cycle for localization algorithms among researchers will lead to more realistic results and viable comparisons.
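As a concrete illustration of the kind of performance metric such an evaluation rests on, the sketch below computes per-node localization error, mean error, and RMSE from estimated versus ground-truth coordinates. The 2-D setting and the positions are hypothetical; the chapter's own metrics may differ.

```python
import numpy as np

def localization_errors(estimated, ground_truth):
    """Per-node Euclidean localization error for 2-D coordinates."""
    estimated = np.asarray(estimated, dtype=float)
    ground_truth = np.asarray(ground_truth, dtype=float)
    return np.linalg.norm(estimated - ground_truth, axis=1)

# Hypothetical positions for four nodes (metres).
est = [(1.0, 2.1), (4.2, 0.9), (7.8, 5.1), (3.3, 3.0)]
true = [(1.0, 2.0), (4.0, 1.0), (8.0, 5.0), (3.0, 3.2)]

err = localization_errors(est, true)
print(f"mean error: {err.mean():.3f} m")
print(f"RMSE:       {np.sqrt((err ** 2).mean()):.3f} m")
```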

2020 ◽  
Author(s):  
Axel Lauer ◽  
Fernando Iglesias-Suarez ◽  
Veronika Eyring ◽  
the ESMValTool development team

The Earth System Model Evaluation Tool (ESMValTool) has been developed with the aim of taking model evaluation to the next level by facilitating analysis of many different ESM components, providing well-documented source code and the scientific background of implemented diagnostics and metrics, and allowing for traceability and reproducibility of results (provenance). This has been made possible by a lively and growing development community that continuously improves the tool, supported by multiple national and European projects. The latest version (2.0) of the ESMValTool has been developed as a large community effort to specifically target the increased data volume of the Coupled Model Intercomparison Project Phase 6 (CMIP6) and the related challenges posed by the analysis and evaluation of output from multiple high-resolution and complex ESMs. For this, the core functionalities have been completely rewritten to take advantage of state-of-the-art computational libraries and methods and to allow for efficient and user-friendly data processing. Common operations on the input data, such as regridding or the computation of multi-model statistics, are now centralized in a highly optimized preprocessor written in Python. The diagnostic part of the ESMValTool includes a large collection of standard recipes for reproducing peer-reviewed analyses of many variables across the atmosphere, ocean, and land domains, with diagnostics and performance metrics focusing on the mean state, trends, variability, and important processes and phenomena, as well as emergent constraints. While most of the diagnostics use observational data sets (in particular satellite and ground-based observations) or reanalysis products for model evaluation, some are also based on model-to-model comparisons.

This presentation introduces the diagnostics newly implemented in ESMValTool v2.0, including an extended set of large-scale diagnostics for quasi-operational and comprehensive evaluation of ESMs; new diagnostics for extreme events, regional model and impact evaluation, and analysis of ESMs; as well as diagnostics for emergent constraints and the analysis of future projections from ESMs. The new diagnostics are illustrated with examples using results from the well-established CMIP5 and the newly available CMIP6 data sets.
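To make the preprocessor's role concrete, here is a minimal numpy sketch of the multi-model statistics step mentioned above. It illustrates the concept only; it is not ESMValTool's actual preprocessor API, and the temperature fields and grid are hypothetical.

```python
import numpy as np

# Hypothetical stack of annual-mean surface temperature fields from
# three models, already regridded to a common 4x5 lat-lon grid.
rng = np.random.default_rng(0)
models = rng.normal(loc=288.0, scale=1.5, size=(3, 4, 5))  # (model, lat, lon)

# Multi-model statistics: reduce along the model dimension.
multi_model_mean = models.mean(axis=0)
multi_model_std = models.std(axis=0, ddof=1)

print(multi_model_mean.shape, multi_model_std.shape)  # (4, 5) (4, 5)
```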


2020 ◽  
Author(s):  
Oleksii Nikolaienko ◽  
Per Eystein Lønning ◽  
Stian Knappskog

Abstract

Motivation: With recent advances in the field of epigenetics, the focus is widening from large and frequent disease- or phenotype-related methylation signatures to rare alterations transmitted mitotically or transgenerationally (constitutional epimutations). Emerging evidence indicates that such constitutional alterations, albeit occurring at a low mosaic level, may confer risk of disease later in life. Given their inherently low incidence rate and mosaic nature, there is a need for bioinformatic tools specifically designed to analyse such events.

Results: We have developed a method (ramr) to identify aberrantly methylated DNA regions (AMRs). ramr can be applied to methylation data obtained by array or next-generation sequencing techniques to discover AMRs associated with elevated risk of cancer as well as other diseases. We assessed the accuracy and performance metrics of ramr and confirmed its applicability for the analysis of large public data sets. Using ramr we identified aberrantly methylated regions that are known or may potentially be associated with the development of colorectal cancer, and provided functional annotation of AMRs that arise at early developmental stages.

Availability and implementation: The R package is freely available at https://github.com/BBCG/ramr
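ramr itself is an R package; as a schematic illustration of the underlying idea (flagging samples whose methylation over a region deviates strongly from the rest of the cohort), here is a hedged Python sketch. The beta-value matrix, the robust z-score rule, and the threshold are illustrative assumptions, not ramr's algorithm.

```python
import numpy as np

def flag_amr_samples(beta, z_thresh=3.0):
    """Flag samples whose mean methylation over a region deviates
    strongly from the cohort: a schematic stand-in for AMR calling.

    beta : (n_samples, n_cpgs) array of beta values in [0, 1].
    """
    region_means = beta.mean(axis=1)
    med = np.median(region_means)
    mad = np.median(np.abs(region_means - med)) or 1e-9
    z = 0.6745 * (region_means - med) / mad   # robust z-score
    return np.flatnonzero(np.abs(z) > z_thresh)

rng = np.random.default_rng(1)
beta = rng.beta(2, 8, size=(100, 10))   # mostly unmethylated cohort
beta[7] = rng.beta(8, 2, size=10)       # one epimutated (mosaic) sample
print(flag_amr_samples(beta))           # expected: [7]
```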


2021 ◽  
Author(s):  
Lisa Bock ◽  
Birgit Hassler ◽  
Axel Lauer ◽  

The Earth System Model Evaluation Tool (ESMValTool) has been developed with the aim of taking model evaluation to the next level by facilitating analysis of many different ESM components, providing well-documented source code and the scientific background of implemented diagnostics and metrics, and allowing for traceability and reproducibility of results (provenance). This has been made possible by a lively and growing development community that continuously improves the tool, supported by multiple national and European projects. The latest major release (v2.0) of the ESMValTool was officially introduced in August 2020 as a large community effort, and several additional smaller releases have followed since then.

The diagnostic part of the ESMValTool includes a large collection of standard “recipes” for reproducing peer-reviewed analyses of many variables across ESM compartments, including the atmosphere, ocean, and land domains, with diagnostics and performance metrics focusing on the mean state, trends, variability, and important processes and phenomena, as well as emergent constraints. While most of the diagnostics use observational data sets (in particular satellite and ground-based observations) or reanalysis products for model evaluation, some are also based on model-to-model comparisons. This presentation gives an overview of the latest scientific diagnostics and metrics added during the last year, including examples of applications of these diagnostics to CMIP6 model data.
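A typical grading-style performance metric behind such diagnostics is an area-weighted model-versus-reference error. The sketch below computes a latitude-weighted RMSD; it is a generic illustration under assumed fields and grid, not a specific ESMValTool diagnostic.

```python
import numpy as np

def area_weighted_rmsd(model, obs, lat):
    """Latitude-weighted RMSD between a model field and a reference,
    the kind of scalar metric used to grade a model's mean state.

    model, obs : (n_lat, n_lon) fields on the same grid.
    lat        : (n_lat,) latitudes in degrees.
    """
    w = np.cos(np.deg2rad(lat))[:, None] * np.ones_like(model)
    return np.sqrt(np.average((model - obs) ** 2, weights=w))

lat = np.linspace(-87.5, 87.5, 36)
rng = np.random.default_rng(2)
# Hypothetical reference and model fields (surface temperature, K).
obs = 288.0 + 15.0 * np.cos(np.deg2rad(lat))[:, None] * np.ones((36, 72))
mod = obs + rng.normal(0.0, 1.0, size=obs.shape)   # assumed model error
print(f"RMSD: {area_weighted_rmsd(mod, obs, lat):.2f} K")
```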


2019 ◽  
Vol 17 (2) ◽  
pp. 6-14
Author(s):  
V. N. Gridin ◽  
V. V. Doenin ◽  
V. V. Panishchev ◽  
I. S. Razzhivaykin

In today’s world, many processes and events depend on forecasting. As mathematical models develop, an increasing number of factors influencing the final result of the forecast are taken into account, which in turn leads to the use of neural networks. However, training a neural network requires source data sets, which are often insufficient or may not exist at all. The article describes a method of obtaining information as close to reality as possible. The proposed approach is to generate input data using simulation models of an object. The solution of the problems of generating data sets and training a neural network is shown using the example of a typical marshalling railway station and a simulation of the operations of a shunting hump. The considered examples confirmed the validity of the proposed methodological approach: generating source data for neural networks from simulation models of a real object, based on a digital mathematical model, makes it possible to obtain a simulation model of the movement of transport objects that is reliable for forecasting transport processes and creating relevant control algorithms.
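To illustrate the approach in miniature, the sketch below generates a synthetic training set from a toy stand-in for a shunting-hump simulation model and fits a small neural network to it. The simulator, features, and hyperparameters are hypothetical stand-ins for illustration, not the authors' model.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)

def simulate_hump(n_wagons, n_tracks):
    """Toy stand-in for a shunting-hump simulation model: processing
    time grows with train length and shrinks with available tracks."""
    base = 2.0 * n_wagons + 30.0 / n_tracks
    return base + rng.normal(0.0, 1.0)           # stochastic disturbance

# Generate a synthetic training set from the simulation model.
X = np.column_stack([rng.integers(10, 60, 2000),   # wagons per train
                     rng.integers(2, 9, 2000)])    # free sorting tracks
y = np.array([simulate_hump(w, t) for w, t in X])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
net = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=3000, random_state=0)
net.fit(X_tr, y_tr)
print(f"R^2 on held-out simulated data: {net.score(X_te, y_te):.3f}")
```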


2016 ◽  
Vol 15 (2) ◽  
pp. 49-55
Author(s):  
Pala SuriyaKala ◽  
Ravi Aditya

Human resources is traditionally an area subject to measured change, but with Big Data, data analytics, human capital management, talent acquisition, and performance metrics as new trends, there is bound to be a sea change in this function. This paper is conceptual and tries to introspect and outline the challenges that HRM faces with Big Data. Big Data is, as one knows, the world of enormous data generation that is revolutionizing the world with data sets at the exabyte scale. It has been the driving force behind how governments, companies, and functions will come to perform in the decades to come. The immense amount of information, if properly utilized, can lead to efficiency in various fields like never before. But to achieve this, the cloud of suspicion, fear, and uncertainty regarding the use of Big Data has to be removed from those who can use it to the benefit of their respective areas of application. HR, unlike marketing, finance, etc., has traditionally never been very data-centric in the analysis of its decisions.


2015 ◽  
Vol 24 (4) ◽  
pp. 467-477
Author(s):  
Idris Skloul Ibrahim ◽  
Peter J.B. King ◽  
Hans-Wolfgang Loidl

Abstract

Ns2 is an open-source communications network simulator primarily used in research and teaching. Ns2 provides substantial support for the simulation of TCP, routing, and multicast protocols over wired and wireless networks. Although Ns2 is a widely used and powerful simulator, it lacks built-in facilities for computing the metrics used to assess network reliability and performance (e.g., the number of packets transferred from source to destination, packet delay, packet loss), and it does not analyse the trace files it produces. The data obtained from simulations are not straightforward to analyse, and Ns2 itself cannot provide data-analysis statistics or graphics on request. Moreover, analysing an Ns2 trace file with ad hoc scripts requires additional data-processing steps by a developer before graphical output can be produced, and the lack of standardised tools means that results from different users may not be strictly comparable. Alternative tools exist; however, most of them are not standalone applications, require additional libraries, and lack a user-friendly interface. This article presents the architecture and development considerations for the NsGTFA (Ns2 GUI Trace File Analyser) tool, which is intended to simplify the management and enable the statistical analysis of trace files generated during network simulations. NsGTFA runs under Windows and has a friendly graphical user interface. It is a fast standalone application implemented in VC++ that takes an Ns2 trace file as input. It can output two-dimensional (2D) and 3D graphs (points, lines, and bar charts) or data sets, whatever the trace file format (Tagged, Old, or New), and it can also output standard network performance metrics. NsGTFA satisfies most user needs; there is no complex installation process, and no external libraries are needed.
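For a sense of what such trace-file analysis involves, here is a minimal Python sketch that parses lines in the classic wired ("Old") Ns2 trace layout and derives two of the metrics mentioned above: packet delivery ratio and mean end-to-end delay. The sample trace and the single-hop simplification (first "+" event as the send, "r" as the final receive) are assumptions for illustration, not NsGTFA's implementation.

```python
# Classic (wired, "Old") Ns2 trace columns, as commonly documented:
# event time from to type size flags fid src dst seq pkt_id
SAMPLE = """\
+ 0.10 0 2 tcp 1000 ------- 1 0.0 3.0 0 0
r 0.15 0 2 tcp 1000 ------- 1 0.0 3.0 0 0
+ 0.20 0 2 tcp 1000 ------- 1 0.0 3.0 1 1
d 0.22 0 2 tcp 1000 ------- 1 0.0 3.0 1 1
"""

send_time, recv_time = {}, {}
for line in SAMPLE.splitlines():
    f = line.split()
    event, t, pkt_id = f[0], float(f[1]), f[11]
    if event == "+" and pkt_id not in send_time:
        send_time[pkt_id] = t          # first enqueue counts as the send
    elif event == "r":
        recv_time[pkt_id] = t          # last receive wins

delivered = set(send_time) & set(recv_time)
pdr = len(delivered) / len(send_time)
delay = sum(recv_time[p] - send_time[p] for p in delivered) / len(delivered)
print(f"delivery ratio: {pdr:.2f}, mean delay: {delay * 1000:.1f} ms")
```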


Author(s):  
Verena Harrauer ◽  
Peter Schnedlitz

Purpose: By focusing on the interface between information dissemination and interpretation on the retail sales floor, the paper aims to open up a new practice-theory contribution on management control and performance measurement used in complex environments.

Design/methodology/approach: Problem-centered qualitative interviews in two different contexts (U.S. and Europe) form the methodological approach. Twenty-two interviewees were selected from various retail sectors and hierarchy levels, with the focus on store management. Following content analysis procedures, the data were coded according to contingency-theoretical underpinnings.

Findings: The environment shapes corporate processes as well as retail management in multiple ways. By studying fast-fashion industries, we found similarities in retail management in all researched settings. First, we present relevant operational performance metrics in the retailing context. Second, we see that store managers aim to optimize processes and generate efficient and effective practices to maximize store performance. Third, information and task overload are reasons for neglecting performance information. As a consequence, managers call for decision-facilitating tools, e.g. dashboards, to reduce information complexity.

Originality/value: Widely accepted in the contingency literature, environmental aspects influence business activities and performance outcomes. However, evaluating research studies that deal with performance measurement in retailing contexts reveals contradictory results. With the focus on larger retail companies with multi-branch and department structures in two different national contexts, we can unravel different perspectives on the environment in operational retail settings for the first time.


Author(s):  
Satyanand Singh

Current automatic speaker recognition (ASR) systems have emerged as an important means of identity confirmation in many businesses, e-commerce applications, forensics, and law enforcement. Specialists trained in forensic recognition can perform this task far better by examining an array of acoustic, prosodic, and semantic attributes, a process that has been referred to as structured listening. An algorithm-based system has been developed for forensic speaker recognition by phonetic scientists and forensic linguists to reduce the probability of contextual bias or a pre-centric understanding of a reference model when assessing the validity of an unknown audio sample against any suspect individual. Many researchers continue to develop automatic algorithms in signal processing and machine learning so that improved performance can effectively establish the speaker’s identity, with the automatic system performing on a par with human listeners. In this paper, I examine the literature on the identification of speakers by machines and humans, emphasizing the key technical patterns that have emerged in automatic speaker recognition over the last decade. I focus on many aspects of ASR systems, including speaker-specific features, speaker models, standard assessment data sets, and performance metrics.
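One widely used ASR performance metric is the equal error rate (EER). The sketch below estimates the EER from genuine (target) and impostor (non-target) trial scores; the score distributions are synthetic and the simple threshold sweep is a minimal illustration, not a production scoring tool.

```python
import numpy as np

def equal_error_rate(genuine, impostor):
    """Equal error rate (EER): the operating point where the false
    acceptance rate (FAR) equals the false rejection rate (FRR)."""
    genuine, impostor = np.asarray(genuine), np.asarray(impostor)
    best = (1.0, 0.0)
    for thr in np.sort(np.concatenate([genuine, impostor])):
        far = np.mean(impostor >= thr)   # impostors wrongly accepted
        frr = np.mean(genuine < thr)     # genuine speakers rejected
        if abs(far - frr) < abs(best[0] - best[1]):
            best = (far, frr)
    return (best[0] + best[1]) / 2.0

rng = np.random.default_rng(4)
genuine = rng.normal(2.0, 1.0, 1000)    # hypothetical target-trial scores
impostor = rng.normal(0.0, 1.0, 1000)   # hypothetical non-target scores
print(f"EER: {equal_error_rate(genuine, impostor):.3%}")
```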

