A Dynamic Declarative Composition Scheme for Stream Data Services

2021 ◽  
Vol 2021 ◽  
pp. 1-8
Author(s):  
Zhongmei Zhang ◽  
Zhongguo Yang ◽  
Sikandar Ali ◽  
Muhammad Asshad ◽  
Shaher Suleman Slehat

With the rapid development of sensor networks, the Internet of Things, mobile devices, and pervasive computing, enormous numbers of sensor devices are deployed in the physical world. The data streams produced by these broadly deployed sensor devices can be used to create various value-added applications. Given continuous, real-time, high-frequency, low-value data streams, flexibly and efficiently coordinating them to create valuable applications is crucial. In this study, we propose a service-oriented approach to flexible stream integration. It treats the data stream produced by one sensor as a stream data service and composes multiple services to realize cooperation among sensor devices. First, we propose a stream data service model based on Event-Condition-Action rules, which encapsulates stream data as services and continuously processes stream data into value-added events in a timely manner. Then, we propose a declarative method that dynamically composes stream data services. Based on two kinds of declarative rules, namely sink-rules and connect-rules, multiple data streams can be dynamically integrated through flexible service composition. To ensure the performance of service composition, we also employ a sensor partition strategy and process multiple service compositions in parallel. Comprehensive experimental evaluations show that our service composition method is both efficient and effective.
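The Event-Condition-Action model described in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation; the rule structure, names, and the temperature example are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class ECARule:
    """A hypothetical Event-Condition-Action rule for a stream data service."""
    event: str                           # event type the rule listens for
    condition: Callable[[float], bool]   # predicate over the reading
    action: Callable[[float], str]       # produces a value-added event

def process_reading(rules: List[ECARule], event: str, value: float) -> List[str]:
    """Apply every matching rule to one incoming sensor reading."""
    out = []
    for r in rules:
        if r.event == event and r.condition(value):
            out.append(r.action(value))
    return out

# Example: turn raw temperature readings into "overheat" events.
rules = [ECARule("temperature", lambda v: v > 30.0,
                 lambda v: f"overheat:{v}")]
print(process_reading(rules, "temperature", 35.0))  # ['overheat:35.0']
```

In a composition setting, the events emitted by one service's actions would feed the event inputs of another, which is where the paper's sink-rules and connect-rules come into play.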

2018 ◽  
Vol 14 (4) ◽  
pp. 155014771877399
Author(s):  
Hajin Kim ◽  
Myeong-Seon Gil ◽  
Yang-Sae Moon ◽  
Mi-Jung Choi

In order to rapidly process large amounts of sensor stream data, it is effective to extract and use samples that reflect the characteristics and patterns of the data stream well. In this article, we focus on improving the uniformity confidence of KSample, which has the characteristics of random sampling in the stream environment. To this end, we first analyze the uniformity confidence of KSample and derive two uniformity confidence degradation problems: (1) initial degradation, which rapidly decreases the uniformity confidence in the initial stage, and (2) continuous degradation, which gradually decreases the uniformity confidence in the later stages. We note that the initial degradation is caused by the sample range limitation and the past sample invariance, and the continuous degradation by the sampling range increase. For each cause, we present a corresponding solution: sample range extension for the sample range limitation, past sample change for the past sample invariance, and a UC-window for the sampling range increase. Reflecting these solutions, we then propose a novel sampling method, named UC-KSample, which substantially improves the uniformity confidence. Experimental results show that UC-KSample improves the uniformity confidence over KSample by 2.2 times on average and always keeps the uniformity confidence higher than the user-specified threshold. The sampling accuracy of UC-KSample is also higher than that of KSample on both numeric sensor data and text data. Uniformity confidence is an important sampling metric for sensor data streams, and this is the first attempt to apply it to KSample. We believe the proposed UC-KSample is an effective approach that retains KSample's advantage of dynamic sampling over a fixed sampling ratio while improving the uniformity confidence.
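For context on the uniformity property the abstract discusses, the classic baseline is reservoir sampling, where every item seen so far remains in the sample with equal probability k/n and past samples may be replaced. The sketch below shows that baseline only; it is not KSample or UC-KSample.

```python
import random

def reservoir_sample(stream, k, rng=random.Random(42)):
    """Classic reservoir sampling: after n items, each item is in the
    sample with probability k/n, i.e. fully uniform."""
    sample = []
    for n, item in enumerate(stream, start=1):
        if n <= k:
            sample.append(item)          # fill the reservoir first
        else:
            j = rng.randrange(n)         # uniform index in [0, n)
            if j < k:
                sample[j] = item         # past samples can be replaced
    return sample

sample = reservoir_sample(range(1000), 10)
print(sample)
```

The "past sample change" solution in the abstract restores exactly this kind of replaceability, which fixed-range schemes give up.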


2021 ◽  
Vol 55 (3) ◽  
pp. 70-71
Author(s):  
Vardis M. Tsontos

Abstract Ocean science and decision support applications increasingly rely on the synergistic, interdisciplinary use of multivariate data from distributed agency repositories. While such data are more available than ever, their growing variety and volume, combined with heterogeneous modalities of access, continue to pose a challenge to broadscale uptake. This limits the effective utilization of costly investments in sustained ocean observation by an increasing diversity of user communities that need environmental information on the oceans for the assessment of climate change and other ecosystem impacts. Leveraging an advanced cloud technology stack and an ongoing multi-agency pilot effort spearheaded by NASA, the CEOS Ocean Variables Enabling Research and Applications for GEO (COVERAGE) initiative seeks to collaboratively develop the next-generation data service infrastructure for a more digitally integrated ocean observing system in support of marine science and ecosystem-based management. In particular, we envisage a data services layer atop existing agency repositories that provides more harmonized access to satellite, in-situ, and model data across a fragmented ocean data landscape, together with value-added services that include integrated data search, visualization, and analytics. Here we outline the motivation and importance of this effort and the progress made thus far toward the collective realization of this ambitious vision.


2016 ◽  
Vol 2016 ◽  
pp. 1-17 ◽  
Author(s):  
Mihui Kim ◽  
Mihir Asthana ◽  
Siddhartha Bhargava ◽  
Kartik Krishnan Iyyer ◽  
Rohan Tangadpalliwar ◽  
...  

The increasing number of Internet of Things (IoT) devices with various sensors has resulted in a focus on Cloud-based sensing-as-a-service (CSaaS) as a new value-added service, for example, providing temperature-sensing data via a cloud computing system. However, the industry encounters various challenges in the dynamic provisioning of on-demand CSaaS on diverse sensor networks. We require a system that will provide users with standardized access to various sensor networks and a level of abstraction that hides the underlying complexity. In this study, we aim to develop a cloud-based solution to address the challenges mentioned earlier. Our solution, SenseCloud, includes a sensor virtualization mechanism that interfaces with diverse sensor networks, a multitenancy mechanism that grants multiple users access to virtualized sensor networks while sharing the same underlying infrastructure, and a dynamic provisioning mechanism to allow the users to leverage the vast pool of resources on demand and on a pay-per-use basis. We implement a prototype of SenseCloud by using real sensors and verify the feasibility of our system and its performance. SenseCloud bridges the gap between sensor providers and sensor data consumers who wish to utilize sensor data.
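The interplay of sensor virtualization and multitenancy can be sketched as a registry that maps tenant requests onto shared physical sensors. This is a hypothetical illustration of the general pattern, not SenseCloud's actual design; all class and method names are assumptions.

```python
class VirtualSensorRegistry:
    """Hypothetical sketch: tenants access shared sensors only through
    grants, hiding the underlying sensor network."""

    def __init__(self):
        self._sensors = {}   # sensor_id -> zero-arg read function
        self._grants = {}    # tenant -> set of granted sensor_ids

    def register(self, sensor_id, read_fn):
        """Expose a physical sensor as a virtual one."""
        self._sensors[sensor_id] = read_fn

    def grant(self, tenant, sensor_id):
        """Give a tenant access to a virtualized sensor."""
        self._grants.setdefault(tenant, set()).add(sensor_id)

    def read(self, tenant, sensor_id):
        """Read a sensor on behalf of a tenant, enforcing isolation."""
        if sensor_id not in self._grants.get(tenant, set()):
            raise PermissionError(f"{tenant} has no access to {sensor_id}")
        return self._sensors[sensor_id]()

reg = VirtualSensorRegistry()
reg.register("temp-1", lambda: 21.5)   # stand-in for a real device driver
reg.grant("alice", "temp-1")
print(reg.read("alice", "temp-1"))     # 21.5
```

Pay-per-use provisioning would sit on top of `read`, metering calls per tenant.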


Entropy ◽  
2021 ◽  
Vol 23 (7) ◽  
pp. 859
Author(s):  
Abdulaziz O. AlQabbany ◽  
Aqil M. Azmi

We are living in the age of big data, a majority of which is stream data. The real-time processing of this data requires careful consideration from different perspectives. Concept drift, a change in the data’s underlying distribution, is a significant issue, especially when learning from data streams; it requires learners to be adaptive to dynamic changes. Random forest is an ensemble approach that is widely used in classical non-streaming settings of machine learning applications, while the Adaptive Random Forest (ARF) is a stream learning algorithm that has shown promising results in terms of accuracy and the ability to deal with various types of drift. The continuity of the incoming instances allows their binomial distribution to be approximated by a Poisson(1) distribution. In this study, we propose a mechanism to increase such streaming algorithms’ efficiency by focusing on resampling. Our measure, resampling effectiveness (ρ), fuses the two most essential aspects of online learning: accuracy and execution time. We use six different synthetic data sets, each having a different type of drift, to empirically select the parameter λ of the Poisson distribution that yields the best value for ρ. By comparing the standard ARF with its tuned variations, we show that ARF performance can be enhanced by tackling this important aspect. Finally, we present three case studies from different contexts to test our proposed enhancement method and demonstrate its effectiveness in processing large data sets: (a) Amazon customer reviews (written in English), (b) hotel reviews (in Arabic), and (c) real-time aspect-based sentiment analysis of COVID-19-related tweets in the United States during April 2020. Results indicate that our proposed enhancement method exhibited considerable improvement in most of the situations.
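The Poisson-based resampling the abstract refers to follows the online bagging idea: each incoming instance is replayed w ~ Poisson(λ) times per ensemble member, approximating a bootstrap sample in a single pass. A minimal sketch of drawing those weights (using Knuth's Poisson sampler; the function names are illustrative):

```python
import math
import random

def poisson(lam, rng):
    """Sample from Poisson(lam) via Knuth's multiplication algorithm."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def online_bagging_weights(n_instances, lam, rng=random.Random(0)):
    """Replay weight for each incoming instance: w ~ Poisson(lam).
    w = 0 means the instance is skipped by this ensemble member."""
    return [poisson(lam, rng) for _ in range(n_instances)]

weights = online_bagging_weights(10000, lam=1.0)
print(sum(weights) / len(weights))  # close to lam = 1.0
```

Tuning λ, as the paper does via its ρ measure, trades off how often instances are replayed (accuracy) against how much work each one costs (execution time).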


2021 ◽  
Author(s):  
Zhangyue Shi ◽  
Chenang Liu ◽  
Chen Kan ◽  
Wenmeng Tian ◽  
Yang Chen

Abstract With the rapid development of the Internet of Things and information technologies, more and more manufacturing systems have become cyber-enabled, which significantly improves the flexibility and productivity of manufacturing. Furthermore, a large variety of online sensors are commonly incorporated in manufacturing systems for online quality monitoring and control. However, the cyber-enabled environment may also place the collected online stream sensor data at high risk of cyber-physical attacks. Specifically, cyber-physical attacks could occur during the manufacturing process to maliciously tamper with the sensor data, which could result in false alarms or failures of anomaly detection. In addition, cyber-physical attacks may illegally access the collected data without authorization and cause leakage of key information. Therefore, it is critical to develop an effective approach to protect online stream data from these attacks so that the cyber-physical security of manufacturing systems can be assured. To achieve this goal, an integrative blockchain-enabled method is proposed, leveraging both asymmetric encryption and camouflage techniques. A real-world case study that protects the cyber-physical security of collected stream data in additive manufacturing is provided to demonstrate the effectiveness of the proposed method. The results demonstrate that malicious tampering can be detected in a relatively short time and that the risk of unauthorized data access is significantly reduced as well.
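The tamper-detection side of a blockchain-enabled scheme can be illustrated with a simple hash chain: each stored reading is bound to the hash of its predecessor, so altering any reading invalidates every subsequent hash. This is a generic sketch, not the paper's method; it omits the asymmetric encryption and camouflage components, and all names are illustrative.

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder hash for the first block

def chain_blocks(readings):
    """Link each sensor reading to the previous block's hash."""
    blocks, prev = [], GENESIS
    for r in readings:
        payload = json.dumps({"reading": r, "prev": prev}, sort_keys=True)
        h = hashlib.sha256(payload.encode()).hexdigest()
        blocks.append({"reading": r, "prev": prev, "hash": h})
        prev = h
    return blocks

def verify(blocks):
    """Return the index of the first tampered block, or -1 if intact."""
    prev = GENESIS
    for i, b in enumerate(blocks):
        payload = json.dumps({"reading": b["reading"], "prev": prev},
                             sort_keys=True)
        if hashlib.sha256(payload.encode()).hexdigest() != b["hash"]:
            return i
        prev = b["hash"]
    return -1

blocks = chain_blocks([21.5, 21.7, 35.2])
print(verify(blocks))  # -1: chain intact
```

In a full scheme, each block would additionally be signed with the sensor's private key so that only holders of the public key can verify provenance.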


2018 ◽  
Vol 14 (11) ◽  
pp. 155014771881130 ◽  
Author(s):  
Jaanus Kaugerand ◽  
Johannes Ehala ◽  
Leo Mõtus ◽  
Jürgo-Sören Preden

This article introduces a time-selective strategy for enhancing the temporal consistency of input data for multi-sensor data fusion and in-network data processing in ad hoc wireless sensor networks. Detecting and handling complex time-variable (real-time) situations require methodical consideration of temporal aspects, especially in ad hoc wireless sensor networks with distributed, asynchronous, and autonomous nodes. Examples include assigning processing intervals of network nodes, defining validity and simultaneity requirements for data items, and determining the size of memory required for buffering the data streams produced by ad hoc nodes. The data streams produced periodically, and sometimes intermittently, by sensor nodes arrive at the fusion nodes with variable delays, which results in a sporadic temporal order of inputs. Using data from individual nodes in the order of arrival (i.e. freshest data first) does not, in all cases, yield optimal results in terms of data temporal consistency and fusion accuracy. We propose a time-selective data fusion strategy, which combines temporal alignment, temporal constraints, and a method for computing the delay of sensor readings, to allow the fusion node to select the temporally compatible data from received streams. A real-world experiment (moving vehicles in an urban environment) validating the strategy demonstrates significant improvement in the accuracy of fusion results.
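The core selection step can be sketched as follows: rather than taking the freshest reading per stream, the fusion node picks, for each stream, the reading closest in time to the fusion instant, and only if it lies within a validity window. This is a simplified illustration under assumed names; the paper's strategy additionally models per-stream delays and temporal alignment.

```python
def select_compatible(streams, t_fusion, validity):
    """Per stream, choose the reading closest to the fusion instant
    t_fusion, but only if it falls inside the validity window; readings
    outside the window are dropped rather than used 'freshest first'."""
    chosen = {}
    for name, readings in streams.items():   # readings: [(timestamp, value), ...]
        in_window = [(ts, v) for ts, v in readings
                     if abs(t_fusion - ts) <= validity]
        if in_window:
            chosen[name] = min(in_window, key=lambda r: abs(t_fusion - r[0]))
    return chosen

streams = {"a": [(0.0, 1), (1.0, 2), (5.0, 3)],
           "b": [(10.0, 4)]}
print(select_compatible(streams, t_fusion=1.2, validity=1.0))
```

Here stream "b" contributes nothing because its only reading falls outside the window, which is exactly the behavior that improves temporal consistency over arrival-order fusion.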


2014 ◽  
pp. 291-321 ◽  
Author(s):  
Stephen Voida ◽  
Donald J. Patterson ◽  
Shwetak N. Patel

2021 ◽  
Author(s):  
Yu Du ◽  
Xiaohang Zhang ◽  
Zhengren Li ◽  
Yijun Guo

Abstract For global telecom operators, mobile data services have gradually taken over from traditional voice services to become the main revenue growth point. However, during the upgrade to new-generation networks (such as 5G), new mobile data services are still at the exploration stage, and the network capabilities and application scenarios remain immature. In this phase, it is incomplete and misleading to measure the performance of new services along a single dimension, such as data traffic or revenue, and the measurement should change dynamically as the new services develop. Telecom operators therefore want to improve existing performance measurement in terms of both integrity and dynamics. In this paper, we propose the Mobile-data-service Development Index (MDDI) and build a quantitative model to dynamically measure the overall performance of mobile data services. To achieve a fuller understanding, we incorporate investment indicators and network reliability indicators into the performance indicator system, and we discuss the relationships among the subindices and the selection of outcome criteria in MDDI. In our empirical research, we use the model to analyze the dynamic characteristics of a new mobile data service in China and summarize the development strategies for each stage. The findings can also provide guidelines for new services on 5G and other new-generation networks in the future.

