Data Refinement
Recently Published Documents

Total documents: 142 (five years: 2)
H-index: 18 (five years: 0)

2021 ◽  
Vol 64 (11) ◽  
pp. 109-117
Author(s):  
Favyen Bastani ◽  
Songtao He ◽  
Satvat Jagwani ◽  
Edward Park ◽  
Sofiane Abbar ◽  
...  

Automatic map inference, data refinement, and machine-assisted map editing promise more accurate map datasets.



Author(s):  
Shiwei Dong ◽  
Yirong Wang ◽  
Yanan Wu ◽  
Yuchun Pan


Entropy ◽  
2020 ◽  
Vol 22 (5) ◽  
pp. 520
Author(s):  
Jinle Xiong ◽  
Xueyu Liang ◽  
Lina Zhao ◽  
Benny Lo ◽  
Jianqing Li ◽  
...  

Due to the wide inter- and intra-individual variability, short-term heart rate variability (HRV) analysis (usually 5 min) might lead to inaccuracy in detecting heart failure. Therefore, RR interval segmentation, which can reflect the individual heart condition, has been a key research challenge for accurate detection of heart failure. Previous studies mainly focus on analyzing the entire 24-h ECG recordings from all individuals in the database, which often leads to a poor detection rate. In this study, we propose a set of data refinement procedures that automatically extract heart failure segments and yield better detection of heart failure. The procedures comprise three steps: (1) select fast heart rate sequences, (2) apply the dynamic time warping (DTW) measure to filter out dissimilar segments, and (3) pick out individuals with large numbers of preserved segments. A physical threshold-based Sample Entropy (SampEn) was applied to distinguish congestive heart failure (CHF) subjects from normal sinus rhythm (NSR) ones, and results using the traditional threshold were also discussed. Experiments on the PhysioNet/MIT RR Interval Databases showed that in the SampEn analysis (embedding dimension m = 1, tolerance threshold r = 12 ms, and time series length N = 300), the accuracy after data refinement increased from 75.07% to 90.46%. Meanwhile, the area under the receiver operating characteristic curve (AUC) for the proposed procedures reached 95.73%, outperforming the original method (i.e., without the proposed data refinement procedures), whose AUC was 76.83%. These results show that our proposed data refinement procedures can significantly improve the accuracy of heart failure detection.
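As a concrete illustration of the entropy measure named above, the sketch below is a minimal, generic Sample Entropy implementation using the reported settings (m = 1, a fixed physical tolerance r = 12 ms, windows of N = 300 RR intervals). It is an assumed reconstruction, not the authors' code; the refinement steps themselves (fast heart rate selection, DTW filtering, subject screening) and the classification threshold are not shown.

```python
import numpy as np

def sample_entropy(rr_ms, m=1, r=12.0):
    """Sample Entropy with a fixed physical tolerance r in milliseconds
    (the 'physical threshold' variant), rather than r scaled by the series SD.

    rr_ms : 1-D sequence of RR intervals in milliseconds (e.g., N = 300).
    m     : embedding dimension (the abstract reports m = 1).
    r     : absolute tolerance in ms (the abstract reports r = 12 ms).
    """
    x = np.asarray(rr_ms, dtype=float)
    N = len(x)
    n_templates = N - m  # same number of starting points for both lengths

    def count_matches(dim):
        templates = np.array([x[i:i + dim] for i in range(n_templates)])
        count = 0
        for i in range(n_templates):
            # Chebyshev distance from template i to all templates,
            # minus 1 to exclude the self-match.
            dist = np.max(np.abs(templates - templates[i]), axis=1)
            count += int(np.sum(dist <= r)) - 1
        return count

    B = count_matches(m)       # matches of length m
    A = count_matches(m + 1)   # matches of length m + 1
    return -np.log(A / B) if A > 0 and B > 0 else float("inf")

# Example: one window of 300 synthetic RR intervals (in ms).
rr_window = 600 + 40 * np.random.default_rng(0).standard_normal(300)
print(sample_entropy(rr_window, m=1, r=12.0))
```

SampEn values computed on the refined segments would then be compared between CHF and NSR recordings; the decision threshold is not given in the abstract and is omitted here.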



IEEE Access ◽  
2020 ◽  
Vol 8 ◽  
pp. 52927-52940
Author(s):  
Shu-Jie Li ◽  
Hai-Sheng Zhu ◽  
Li-Ping Zheng ◽  
Lin Li


Proceedings ◽  
2019 ◽  
Vol 31 (1) ◽  
pp. 52 ◽  
Author(s):  
Mikel Emaldi ◽  
Koldo Zabaleta ◽  
Diego López-de-Ipiña

This work describes how open data and human computation can be brought together through blockchain to foster the collaboration of citizens in the continuous enhancement of open data portals. To that end, it contributes a set of enhancements to the widely adopted data management tool Comprehensive Knowledge Archive Network (CKAN) that allow full audit and management of the change requests submitted by citizens against datasets in open data portals. The long-term sustainability of user contributions is addressed by rewarding users with AudaCoins, a currency that credits citizens according to their refinement contributions, thus encouraging their continuous engagement with city co-creation activities.
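To make the audit-and-reward idea concrete, here is a minimal, hypothetical sketch of the two pieces the abstract mentions: a change-request record chained to the previous audit entry by a hash (the tamper evidence a blockchain backend provides) and a function crediting AudaCoins to a contributor. None of these names or fields come from the paper's CKAN extension; they are assumptions for illustration only.

```python
from dataclasses import dataclass
from hashlib import sha256

@dataclass
class ChangeRequest:
    """Hypothetical record of a citizen's change request to a dataset."""
    dataset_id: str
    target_field: str
    proposed_value: str
    submitted_by: str
    previous_hash: str  # hash of the preceding audit entry

    def audit_hash(self) -> str:
        # Chaining each entry to the previous one yields a tamper-evident log,
        # the property a blockchain backend would supply in the real system.
        payload = "|".join([self.previous_hash, self.dataset_id,
                            self.target_field, self.proposed_value,
                            self.submitted_by])
        return sha256(payload.encode()).hexdigest()

def reward_contributor(balances: dict, user: str, coins: int = 1) -> dict:
    """Credit AudaCoins (amount assumed) once a change request is accepted."""
    balances[user] = balances.get(user, 0) + coins
    return balances
```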



2019 ◽  
Author(s):  
Kevin M. Garner ◽  
Polykarpos Thomadakis ◽  
Thomas Kennedy ◽  
Christos Tsolakis ◽  
Nikos N. Chrisochoides


Entropy ◽  
2019 ◽  
Vol 21 (6) ◽  
pp. 568 ◽  
Author(s):  
Mohammad Shekaramiz ◽  
Todd K. Moon ◽  
Jacob H. Gunther

We examine the deployment of multiple mobile sensors to explore an unknown region and map the areas containing a concentration of a physical quantity such as heat or electron density. The exploration trades off two desiderata: continuing to take data in a region known to contain the quantity of interest, with the intent of refining the measurements, versus taking data in unobserved areas to attempt to discover new regions where the quantity may exist. Making reasonable and practical decisions that simultaneously fulfill both goals, exploration and data refinement, is hard because the goals conflict. For this purpose, we propose a general framework that makes value-laden decisions about the trajectories of the mobile sensors. The framework employs a Gaussian process regression model to predict the distribution of the physical quantity of interest at unseen locations. The decision-making on the sensors' trajectories is then performed using an epistemic utility controller. An example is provided to illustrate the merit and applicability of the proposed framework.
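The following sketch illustrates the general idea under stated assumptions: scikit-learn's GaussianProcessRegressor predicts the quantity at unseen locations, and a simple upper-confidence-bound score stands in for the paper's epistemic utility controller, which is not reproduced here. The synthetic data, kernel, and kappa weight are illustrative choices, not the authors'.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Locations already visited by the mobile sensors and their noisy readings
# of the physical quantity (e.g., heat) -- synthetic data for illustration.
X_seen = rng.uniform(0.0, 10.0, size=(20, 2))
y_seen = (np.exp(-np.sum((X_seen - 5.0) ** 2, axis=1) / 4.0)
          + 0.05 * rng.standard_normal(20))

# Gaussian process regression predicts the quantity at unseen locations.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0) + WhiteKernel(1e-3),
                              normalize_y=True)
gp.fit(X_seen, y_seen)

# Candidate waypoints on a grid over the region.
grid = np.stack(np.meshgrid(np.linspace(0, 10, 25),
                            np.linspace(0, 10, 25)), axis=-1).reshape(-1, 2)
mean, std = gp.predict(grid, return_std=True)

# kappa trades off refining known high-concentration regions (mean)
# against exploring uncertain, unobserved areas (std).
kappa = 2.0
utility = mean + kappa * std
next_waypoint = grid[np.argmax(utility)]
print("next sensing location:", next_waypoint)
```

In this simplified stand-in, raising kappa pushes the sensors toward exploration, while lowering it favors refinement of regions already known to contain the quantity.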


