A Bayesian Approach for an Efficient Data Reduction in IoT

Author(s): Cristanel Razafimandimby ◽ Valeria Loscrí ◽ Anna Maria Vegni ◽ Driss Aourir ◽ Alessandro Neri
2017 ◽ Vol 238 ◽ pp. 234-244
Author(s): Jianpei Wang ◽ Shihong Yue ◽ Xiao Yu ◽ Yaru Wang

Author(s): Veronika Strnadová-Neeley ◽ Aydın Buluç ◽ Jarrod Chapman ◽ John R. Gilbert ◽ Joseph Gonzalez ◽ ...

2006 ◽ Vol 25 (3) ◽ pp. 359-374
Author(s): Surong Wang ◽ Manoranjan Dash ◽ Liang-Tien Chia ◽ Min Xu

2019 ◽ Vol 5 ◽ pp. e239
Author(s): Pietro De Lellis ◽ Shinnosuke Nakayama ◽ Maurizio Porfiri

Public participation in scientific activities, often called citizen science, makes it possible to collect and analyze an unprecedentedly large amount of data. However, the diversity of volunteers makes it challenging to obtain accurate information when their data are aggregated. To overcome this problem, we propose a classification algorithm based on Bayesian inference that harnesses the diversity of volunteers to improve data accuracy. In the algorithm, each volunteer is assigned to a distinct class based on a survey regarding either their level of education or their motivation for citizen science. We learned the behavior of each class from a training set, which was then used as prior information to estimate the performance of new volunteers. By applying this approach to an existing citizen science dataset in which images are classified into categories, we demonstrate improved data accuracy compared to traditional majority voting. Our algorithm offers a simple yet powerful way to improve data accuracy under limited volunteer effort by predicting the behavior of a class of individuals, rather than attempting a granular description of each of them.
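The idea of weighting votes by class-level reliability can be sketched with a small Bayesian model. The code below is a minimal illustration, not the authors' implementation: the two volunteer classes, their confusion matrices, and the uniform label prior are hypothetical stand-ins for the quantities the paper estimates from its training set. Each vote multiplies the posterior by the class-conditional probability of that vote given each candidate true label.

```python
import numpy as np

# Hypothetical setup: 2 image categories, volunteers grouped into 2 classes
# (e.g. by education level). All numbers here are illustrative.

# Class-conditional confusion matrices, as would be estimated from a
# training set: confusion[c][t, v] = P(class-c volunteer votes v | true label t)
confusion = {
    0: np.array([[0.9, 0.1],    # class 0: highly accurate volunteers
                 [0.2, 0.8]]),
    1: np.array([[0.6, 0.4],    # class 1: noisier volunteers
                 [0.4, 0.6]]),
}

prior = np.array([0.5, 0.5])    # uniform prior over the true label


def bayesian_aggregate(votes, classes):
    """Posterior over the true label given each volunteer's vote and class."""
    log_post = np.log(prior)
    for vote, cls in zip(votes, classes):
        # Bayes update: add the log-likelihood of this vote under each label
        log_post += np.log(confusion[cls][:, vote])
    post = np.exp(log_post - log_post.max())   # normalize in log space
    return post / post.sum()


# Three volunteers: one accurate volunteer votes 0, two noisy ones vote 1.
posterior = bayesian_aggregate(votes=[0, 1, 1], classes=[0, 1, 1])
print(posterior.argmax())  # prints 0: the accurate vote outweighs the noisy majority
```

Note how plain majority voting would have returned label 1 here; the class-conditional likelihoods let a single reliable volunteer override two unreliable ones.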


2014 ◽ Vol 11 (2) ◽ pp. 665-678
Author(s): Stefanos Ougiaroglou ◽ Georgios Evangelidis

Data reduction techniques improve the efficiency of k-Nearest Neighbour classification on large datasets, since they accelerate the classification process and reduce the storage requirements for the training data. IB2 is an effective prototype selection data reduction technique: it selects some items from the initial training dataset and uses them as representatives (prototypes). Unlike many other techniques, IB2 is a very fast, one-pass method that builds its reduced (condensing) set incrementally. New training data can update the condensing set without the need for the "old" removed items. This paper proposes a variation of IB2 that generates new prototypes instead of selecting them. The variation, called AIB2, attempts to improve on IB2 by positioning the prototypes at the center of the data areas they represent. The empirical experimental study conducted in the present work, as well as the Wilcoxon signed-ranks test, shows that AIB2 performs better than IB2.
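The IB2-versus-AIB2 distinction described above can be sketched in a few lines. The following is an assumption-laden illustration of the idea, not the authors' exact algorithm: like IB2, each training item is classified by its nearest prototype and misclassified items seed new prototypes; unlike IB2, a correct classification pulls the nearest prototype toward the item via a running mean, so each prototype drifts toward the center of the region it represents. Details such as tie handling and the distance metric are my choices.

```python
import numpy as np

def aib2(X, y):
    """One-pass prototype generation: a sketch of the AIB2 idea."""
    protos = [X[0].astype(float)]   # first item starts the condensing set
    labels = [y[0]]
    weights = [1]                   # items represented by each prototype
    for xi, yi in zip(X[1:], y[1:]):
        dists = [np.linalg.norm(xi - p) for p in protos]
        j = int(np.argmin(dists))   # nearest prototype (1-NN rule)
        if labels[j] == yi:
            # correct: move the prototype toward the item (incremental mean)
            weights[j] += 1
            protos[j] += (xi - protos[j]) / weights[j]
        else:
            # misclassified: the item becomes a new prototype, as in IB2
            protos.append(xi.astype(float))
            labels.append(yi)
            weights.append(1)
    return np.array(protos), np.array(labels)

# Tiny usage example: two well-separated clusters collapse to two prototypes,
# each sitting at the mean of the items it represented.
X = np.array([[0, 0], [0.2, 0], [5, 5], [5.2, 5]])
y = np.array([0, 0, 1, 1])
P, L = aib2(X, y)
print(P, L)  # two prototypes near [0.1, 0] and [5.1, 5], labels [0, 1]
```

The running-mean update is what distinguishes this from plain IB2, which would keep the original items `[0, 0]` and `[5, 5]` unchanged as prototypes.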

