Intelligent Data Collaboration in Heterogeneous-device IoT Platforms

2021 ◽  
Vol 17 (3) ◽  
pp. 1-17
Author(s):  
Danfeng Sun ◽  
Jia Wu ◽  
Jian Yang ◽  
Huifeng Wu

The merging boundaries between edge computing and deep learning are forging a new blueprint for the Internet of Things (IoT). However, the low quality of data in many IoT platforms, especially those composed of heterogeneous devices, is hindering the development of high-quality applications for those platforms. The solution presented in this article is intelligent data collaboration, i.e., the concept of deep learning providing IoT with the ability to adaptively collaborate to accomplish a task. Here, we outline the concept of intelligent data collaboration in detail and present a mathematical model in general form. To demonstrate one possible case where intelligent data collaboration would be useful, we prepared an implementation called adaptive data cleaning (ADC), designed to filter noisy data out of temperature readings in an IoT base station network. ADC primarily consists of a denoising autoencoder LSTM for predictions and a four-level data processing mechanism to perform the filtering. Comparisons between ADC and a maximum slope method show that ADC achieves the lowest false-error rate and the best filtering rate.
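As a rough illustration of the kind of prediction-plus-thresholding pipeline the abstract describes, the sketch below substitutes a trivial moving-average predictor for the denoising-autoencoder LSTM and uses placeholder thresholds for the four-level decision; the function names, levels, and values are all assumptions, not ADC's actual design.

```python
import numpy as np

def predict_next(window):
    """Stand-in for the denoising-autoencoder LSTM: here just a
    moving average over the most recent readings."""
    return float(np.mean(window))

def classify_reading(pred, actual, t1=0.5, t2=1.5, t3=3.0):
    """Illustrative four-level decision on the prediction residual:
    0 = accept, 1 = accept with smoothing, 2 = flag for review,
    3 = reject as noise. Thresholds are placeholders."""
    r = abs(actual - pred)
    if r < t1:
        return 0
    if r < t2:
        return 1
    if r < t3:
        return 2
    return 3

def filter_stream(readings, window_size=5):
    """Run the level classifier over a temperature stream, dropping
    readings judged to be noise (level 3)."""
    kept = list(readings[:window_size])
    for x in readings[window_size:]:
        level = classify_reading(predict_next(kept[-window_size:]), x)
        if level < 3:          # levels 0-2 are retained
            kept.append(x)
    return kept

clean = filter_stream([20.1, 20.2, 20.0, 20.3, 20.1, 20.2, 35.0, 20.4])
# the 35.0 spike exceeds the top threshold and is filtered out
```

In the real system the predictor would be trained on historical base-station readings, and the four levels would presumably trigger different downstream handling rather than a simple keep/drop.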

Author(s):  
Sindhu P. Menon

In the last couple of years, artificial neural networks have gained considerable momentum. Their results could be enhanced if the networks could be made deeper, with more layers. Of late, a lot of data has been generated, leading to big data, which comes with many challenges, quality being one of the most important. Deep learning models can improve the quality of data. In this chapter, an attempt has been made to review deep supervised and deep unsupervised learning algorithms and the various activation functions used. Challenges in deep learning have also been discussed.


Author(s):  
Mr Almelu ◽  
Dr. S. Veenadhari ◽  
Kamini Maheshwar

The Internet of Things (IoT) systems create a large amount of sensing information. The consistency of this information is an essential problem for ensuring the quality of IoT services. IoT data, however, generally suffer from a variety of factors such as collisions, unstable network communication, noise, manual system closure, incomplete values, and equipment failure. Due to excessive latency, bandwidth limitations, and high communication costs, transferring all IoT data to the cloud to solve the missing-data problem may have a detrimental impact on network performance and service quality. As a result, the issue of missing data should be addressed as close to the source as feasible, by offloading tasks such as data prediction and estimation to the network's edge devices. In this work, we show how deep learning may be used to offload such tasks in IoT applications.
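A minimal sketch of the edge-side gap-filling the abstract argues for, with simple linear interpolation standing in for the deep-learning estimator; the function name and logic are illustrative assumptions, not the authors' method.

```python
def impute_at_edge(series, missing=None):
    """Lightweight gap-filling run on the edge device itself, so raw
    data with holes never has to travel to the cloud. Linear
    interpolation stands in for a learned estimator."""
    out = list(series)
    i = 0
    while i < len(out):
        if out[i] is missing:
            j = i
            while j < len(out) and out[j] is missing:
                j += 1          # find the extent of the gap
            if i > 0:
                left = out[i - 1]
            elif j < len(out):
                left = out[j]
            else:
                return out      # no known values; nothing to anchor on
            right = out[j] if j < len(out) else left
            n = j - i + 1
            for k in range(i, j):
                out[k] = left + (right - left) * (k - i + 1) / n
            i = j
        else:
            i += 1
    return out

filled = impute_at_edge([10.0, None, None, 16.0, 17.0])
# gap filled by interpolation: [10.0, 12.0, 14.0, 16.0, 17.0]
```

The point of running this on the edge device is that only the completed (or summarized) series needs to cross the bandwidth-constrained link to the cloud.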


2019 ◽  
Vol 10 (1) ◽  
pp. 55-68 ◽  
Author(s):  
Iwona Markowicz ◽  
Paweł Baran

Research background: As a system of official EU statistics, Intrastat contains data collected by Member States and aggregated by Eurostat at the Union level in the form of the COMEXT database. Country-level data are based on declarations made by businesses dispatching or acquiring goods from other EU Member States. Since the same transaction is declared twice, as an ICS in one country and at the same time as an ICA in the partner country, the database contains mirror data. Analysis of mirror data lets us assess the quality of public statistics data on international trade. Purpose of the article: The aim of the article is to rank EU Member States according to the quality of data on intra-Community trade in goods collected by Intrastat. Foreign trade stimulates economic development on the one hand and reflects that development on the other. Thus it is very important that official statistics in this area be of good quality. Analysis of mirror data from partner states in intra-Community trade in goods allows us to claim that not every Member State provides data of a satisfactory quality level. Methods: We used the authors' methodology for assessing the quality of mirror data. This includes data asymmetry indices, both those proposed by Eurostat and the authors' own proposals. We have also examined changes in the above-mentioned rankings over time. Findings & Value added: The result of the survey is an ordering of EU Member States according to the quality of data on intra-Community trade in goods. The rankings are presented for the period 2014–2017, during which there were 28 Member States of the EU. Changes in distinct countries' positions were shown to result from changes in the overall quality of statistical data collected in these countries. The research methodology can be used in the process of monitoring the data quality of the Intrastat system.
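For context, mirror-data asymmetry indices of the kind the abstract mentions typically compare the two declarations of the same flow; a common Eurostat-style pair of measures is sketched below. The notation is an assumption for illustration, and the article's own indices may differ in detail.

```latex
% ICS_{ij}: dispatches declared by reporting country i to partner j
% ICA_{ji}: the mirror acquisitions declared by partner j from i
% Absolute asymmetry of the mirrored flow:
A_{ij} = ICA_{ji} - ICS_{ij}
% Relative asymmetry, normalized by the mean of the two declarations:
a_{ij} = \frac{ICA_{ji} - ICS_{ij}}{\tfrac{1}{2}\,(ICA_{ji} + ICS_{ij})}
```

A value of $a_{ij}$ near zero indicates that the two countries' declarations of the same trade flow agree; large values in either direction signal data-quality problems on one side of the mirror.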


Sensors ◽  
2019 ◽  
Vol 19 (3) ◽  
pp. 693
Author(s):  
Giacomo Tanganelli ◽  
Enzo Mingozzi

The Internet of Things (IoT) is becoming real, and recent studies highlight that the number of IoT devices will grow significantly in the next decade. Such massive IoT deployments are typically made available to applications as a service by means of IoT platforms, which are aware of the characteristics of the connected IoT devices (usually constrained in terms of computation, storage, and energy capabilities) and dispatch applications' service requests to appropriate devices based on their capabilities. In this work, we develop an energy-aware allocation policy that aims at maximizing the lifetime of all the connected IoT devices, whilst guaranteeing that applications' Quality of Service (QoS) requirements are met. To this aim, we formally define an IoT service allocation problem as a non-linear Generalized Assignment Problem (GAP). We then develop a time-efficient heuristic algorithm to solve the problem, which is shown to find near-optimal solutions by exploiting the availability of equivalent IoT services provided by multiple IoT devices, as expected especially in the case of massive IoT deployments.
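An allocation problem of this shape can be sketched as follows; the symbols and constraints are illustrative assumptions consistent with the abstract, not the paper's exact formulation.

```latex
% x_{ij} = 1 iff service request i is assigned to device j.
% E_j: residual energy of device j; e_{ij}: energy cost of serving i on j;
% S_i: devices offering an equivalent service that meets request i's QoS.
\begin{align}
  \max_{x} \quad & \min_{j \in D}\; \frac{E_j}{\sum_{i \in R} e_{ij}\, x_{ij}}
      && \text{(maximize the shortest device lifetime)} \\
  \text{s.t.} \quad
  & \sum_{j \in S_i} x_{ij} = 1 \quad \forall i \in R
      && \text{(each request served by one QoS-feasible device)} \\
  & \sum_{i \in R} c_{ij}\, x_{ij} \le C_j \quad \forall j \in D
      && \text{(device capacity)} \\
  & x_{ij} \in \{0,1\}
\end{align}
```

The min-ratio objective is what makes the problem a non-linear GAP; the equivalence sets $S_i$ capture the abstract's observation that in massive deployments many devices can serve the same request, which is what gives a heuristic room to find near-optimal assignments.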


Author(s):  
Harini Akshaya T. J. ◽  
Suresh V. ◽  
Carmel Sobia M.

Electronic health records (EHR) have been adopted in many countries, as they tend to play a major role in healthcare systems. This is because high-quality data can be achieved at very low cost. EHR is a platform where data are stored digitally and users can access and exchange data in a secure manner. The main objective of this chapter is to summarize recent developments in wearable sensors integrated with internet of things (IoT) systems and their application to monitoring patients with chronic disease and older people in their homes and communities. The records are transmitted wirelessly through gateways and stored in a cloud computing environment.


2018 ◽  
Vol 63 (5) ◽  
pp. 1337-1364 ◽  
Author(s):  
Karsten Donnay ◽  
Eric T. Dunford ◽  
Erin C. McGrath ◽  
David Backer ◽  
David E. Cunningham

The growing multitude of sophisticated event-level data collections enables novel analyses of conflict. Even when multiple event data sets are available, researchers tend to rely on only one. We instead advocate integrating information from multiple event data sets. The advantages include facilitating analysis of relationships between different types of conflict, providing more comprehensive empirical measurement, and enabling evaluation of the relative coverage and quality of data sets. Existing integration efforts have been performed manually, with significant limitations. Therefore, we introduce Matching Event Data by Location, Time and Type (MELTT), an automated, transparent, reproducible methodology for integrating event data sets. For the cases of Nigeria 2011, South Sudan 2015, and Libya 2014, we show that using MELTT to integrate data from four leading conflict event data sets (Uppsala Conflict Data Project–Georeferenced Event Data, Armed Conflict Location and Event Data, Social Conflict Analysis Database, and Global Terrorism Database) provides a more complete picture of conflict. We also apply multiple systems estimation to show that each of these data sets has substantial missingness in coverage.
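The core matching step behind location/time/type integration can be sketched as pairing events from two datasets that fall within a spatial window, a temporal window, and the same event type. The sketch below is a greedy first-match toy with made-up coordinates and fixed windows; MELTT itself uses calibrated windows and event taxonomies, so everything here is an illustrative assumption.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    rlat1, rlat2 = math.radians(lat1), math.radians(lat2)
    dlat = rlat2 - rlat1
    dlon = math.radians(lon2 - lon1)
    a = (math.sin(dlat / 2) ** 2
         + math.cos(rlat1) * math.cos(rlat2) * math.sin(dlon / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(a))

def match_events(set_a, set_b, days_window=2, km_window=25.0):
    """Greedily pair events from two datasets when they fall inside
    the temporal window, the spatial window, and share a type.
    Events are dicts: {'day': int, 'lat': float, 'lon': float, 'type': str}."""
    pairs, used_b = [], set()
    for i, ev_a in enumerate(set_a):
        for j, ev_b in enumerate(set_b):
            if j in used_b:
                continue
            if (abs(ev_a['day'] - ev_b['day']) <= days_window
                    and ev_a['type'] == ev_b['type']
                    and haversine_km(ev_a['lat'], ev_a['lon'],
                                     ev_b['lat'], ev_b['lon']) <= km_window):
                pairs.append((i, j))
                used_b.add(j)
                break
    return pairs

# Hypothetical events loosely styled after two of the source datasets:
acled = [{'day': 10, 'lat': 9.06, 'lon': 7.49, 'type': 'riot'}]
gtd   = [{'day': 11, 'lat': 9.07, 'lon': 7.48, 'type': 'riot'},
         {'day': 40, 'lat': 6.52, 'lon': 3.37, 'type': 'riot'}]
pairs = match_events(acled, gtd)
# only the first GTD event falls inside both windows
```

Unmatched events on either side are exactly the ones that feed the abstract's coverage comparison: an event present in one dataset but absent from the others is evidence of missingness.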


2021 ◽  
pp. 002242782110309
Author(s):  
Vijay F. Chillar

Objectives: An initial investigation by the Department of Justice (DOJ) found that the Newark Police Department (NPD) had engaged in a “pattern or practice” of constitutional violations with regard to stop and arrest practices, prompting the city to enter a consent decree. Methods: This study draws on official event-level data on FIs recorded by NPD officers (N = 50,322) and uses random-effects panel regression models to examine how socioeconomic characteristics interact with the implementation of the consent decree at micro places in the short term. Results: Spatial analyses indicate a concentration of FI encounters. The implementation of the consent decree coincided with improvements in the quality of data collected by officers conducting FIs of citizens. It was also associated with decreased rates of reported FIs for the city’s Black and Latino citizens relative to their share of the local population, and with changes in the patterns of FI encounters. Conclusions: Newark’s consent decree improved the quality of data collection. However, the spatial concentration of reported FIs and subsequent arrests of Black and Latino individuals have not experienced the same effect, as these presumably require a culture change that is likely to necessitate a longer time frame to manifest.


Author(s):  
B. L. Armbruster ◽  
B. Kraus ◽  
M. Pan

One goal in electron microscopy of biological specimens is to improve the quality of data to equal the resolution capabilities of modern transmission electron microscopes. Radiation damage and beam-induced movement caused by charging of the sample, low image contrast at high resolution, and sensitivity to external vibration and drift in side-entry specimen holders limit the effective resolution one can achieve. Several methods have been developed to address these limitations: cryomethods are widely employed to preserve and stabilize specimens against some of the adverse effects of the vacuum and electron beam irradiation, spot-scan imaging reduces charging and the associated beam-induced movement, and energy-filtered imaging removes the “fog” caused by inelastic scattering of electrons, which is particularly pronounced in thick specimens. Although most cryoholders can easily achieve a 3.4 Å resolution specification, information perpendicular to the goniometer axis may be degraded due to vibration. Absolute drift after mechanical and thermal equilibration, as well as drift after movement of a holder, may cause loss of resolution in any direction.
