data stream management
Recently Published Documents


TOTAL DOCUMENTS: 150 (FIVE YEARS: 16)
H-INDEX: 10 (FIVE YEARS: 0)

2021 ◽  
Vol 14 (6) ◽  
pp. 878-889
Author(s):  
Walter Cai ◽  
Philip A. Bernstein ◽  
Wentao Wu ◽  
Badrish Chandramouli

A common stream processing application is alerting, where the data stream management system (DSMS) continuously evaluates a threshold function over incoming streams. If the threshold is crossed, the DSMS raises an alarm. The threshold function is often calculated over two or more streams, such as combining temperature and humidity readings to determine if moisture will form on a machine and therefore cause it to malfunction. This requires taking a temporal join across the input streams. We show that for the broad class of functions called quasiconvex functions, the DSMS needs to retain very few tuples per data stream for any given time interval and still never miss an alarm. This surprising result yields large memory savings during normal operation. That savings is also important if one stream fails, since the DSMS would otherwise have to cache all tuples in other streams until the failed stream recovers. We prove our algorithm is optimal and provide experimental evidence that validates its substantial memory savings.
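The intuition behind the memory saving can be illustrated with a short sketch (not the paper's actual algorithm): for an all-pairs join within one interval, the convex hull of the joined value pairs is the box spanned by the per-stream minima and maxima, and a quasiconvex function attains its maximum over that box at one of its four corners, so retaining only the per-stream extremes is enough to never miss an alarm in that interval. The risk function below is a hypothetical stand-in for the moisture example.

```python
# Illustrative sketch, assuming an all-pairs temporal join within one interval
# and a quasiconvex threshold function f(x, y); not the paper's exact algorithm.

def alarm_in_interval(xs, ys, f, threshold):
    """Return True if f(x, y) >= threshold for some pair (x, y) drawn from
    streams X and Y in this interval, assuming f is quasiconvex: it suffices
    to evaluate f at the four corners of the value box."""
    if not xs or not ys:
        return False
    corners = [(min(xs), min(ys)), (min(xs), max(ys)),
               (max(xs), min(ys)), (max(xs), max(ys))]
    return any(f(x, y) >= threshold for x, y in corners)

if __name__ == "__main__":
    temps = [21.0, 19.5, 23.2, 20.1]          # temperature readings in one interval
    humidities = [0.55, 0.62, 0.71, 0.58]     # humidity readings in the same interval
    # Hypothetical moisture-risk measure: grows with humidity, falls with
    # temperature; affine, hence quasiconvex.
    risk = lambda t, h: 100 * h - 2 * t
    print(alarm_in_interval(temps, humidities, risk, threshold=30.0))  # True
```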


2021 ◽  
Vol 114 ◽  
pp. 155-168
Author(s):  
Muhammad Habib ur Rehman ◽  
Chee Sun Liew ◽  
Teh Ying Wah ◽  
Muhammad Imran ◽  
Khaled Salah ◽  
...  

2020 ◽  
Author(s):  
Frank Appiah

Event processing systems are widely used in a variety of applications to process large streams of events. One of the most prominent applications is the time-series data management system, which processes data in a timely fashion to identify trends, match patterns, and forecast future values. The complexity of event information, coupled with the fact that historical event data is kept in a database, requires an event processing model that provides the user with high-level abstractions. In this paper, I survey StreamEPS to help developers and researchers alike understand the conceptual view and processing of the event processing software system. StreamEPS forms part of the Complex Event Processing (CEP), Data Stream Management System (DSMS), and Information Flow Processing (IFP) domains.


Author(s):  
Elisabeth Källström ◽  
Tomas Olsson ◽  
John Lindström ◽  
Lars Hakansson ◽  
Jonas Larsson

In order to reduce unnecessary stops and expensive downtime caused by clutch failure in construction equipment, adequate real-time sensor data measured on the machine can be combined with feature extraction and classification methods. This paper presents a framework with feature extraction methods and an anomaly detection module combined with Case-Based Reasoning (CBR) for on-board clutch slippage detection and diagnosis in heavy-duty equipment. The feature extraction methods used are Moving Average Square Value Filtering (MASVF) and a measure of the fourth-order statistical properties of the signals, implemented as continuous queries over data streams. The anomaly detection module has two components, a Gaussian Mixture Model (GMM) and a Logistic Regression classifier. CBR is a learning approach that classifies faults by creating a solution for a new fault case from the solutions of previous fault cases. Through the use of a data stream management system and continuous queries (CQs), the anomaly detection module continuously waits for a clutch slippage event detected by the feature extraction methods; when the query returns a set of features, the anomaly detection module is activated. Its first component fits a GMM to the extracted features, while the second component uses the Logistic Regression classifier to separate normal from anomalous data. When an anomaly is detected, the Case-Based diagnosis module is activated for fault severity estimation.
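A minimal sketch of this pipeline shape, assuming NumPy, SciPy, and scikit-learn with synthetic vibration signals; the window length, feature layout, burst model, and the shortcut of training and scoring on the same batch are illustrative assumptions, not the authors' configuration.

```python
import numpy as np
from scipy.stats import kurtosis
from sklearn.mixture import GaussianMixture
from sklearn.linear_model import LogisticRegression

def masvf(signal, window=64):
    """Moving Average Square Value Filtering: moving average of the squared signal."""
    squared = np.asarray(signal, dtype=float) ** 2
    return np.convolve(squared, np.ones(window) / window, mode="valid")

def window_features(signal, window=256):
    """One feature vector (mean MASVF, excess kurtosis) per non-overlapping window."""
    rows = []
    for start in range(0, len(signal) - window + 1, window):
        chunk = signal[start:start + window]
        rows.append([masvf(chunk).mean(), kurtosis(chunk)])
    return np.array(rows)

rng = np.random.default_rng(0)
normal = rng.normal(0.0, 1.0, 40_000)                 # stand-in for normal vibration
slipping = rng.normal(0.0, 1.0, 10_000)               # stand-in for slippage episodes
burst = rng.random(10_000) > 0.97
slipping[burst] += rng.normal(0.0, 6.0, burst.sum())  # occasional heavy-tailed bursts

# First component: GMM fitted to features of normal operation.
gmm = GaussianMixture(n_components=2, random_state=0).fit(window_features(normal))

X_norm, X_slip = window_features(normal[:10_000]), window_features(slipping)
X = np.vstack([X_norm, X_slip])
y = np.array([0] * len(X_norm) + [1] * len(X_slip))

# Second component: logistic regression over GMM log-likelihood scores.
scores = gmm.score_samples(X).reshape(-1, 1)
clf = LogisticRegression().fit(scores, y)
print("windows flagged as slippage:", int(clf.predict(scores).sum()))
```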


2020 ◽  
Vol 1 ◽  
pp. 1-23
Author(s):  
Tobias Werner ◽  
Thomas Brinkhoff

Abstract. Unmanned aerial and submersible vehicles are used in an increasing number of applications, especially for data collection in environments hostile to humans. During a mission, such vehicles generate multiple spatio-temporal data streams suitable to be processed by data stream management systems (DSMS). The main approach of a DSMS is limiting the elements of a stream by using sliding and tilting windows with time intervals as the temporal condition. However, due to varying vehicle speed and limited on-board resources, such temporal windows do not provide adequate support for spatio-temporal problems. To solve this problem, we propose a set of six new spatio-temporal window operators in this paper. The set comprises sliding distance, tilting distance, tilting waypoint, session distance, and jumping distance windows as well as an area window, which limit stream elements based on spatial conditions. Each of the listed operators provides individual behaviour to support sophisticated applications such as spatial interpolation and forecasting. An evaluation based on an example trajectory shows the benefit of the presented operators for spatio-temporal applications.
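As an illustration of the idea behind such operators, the sketch below implements a sliding-distance-style window that evicts elements by path length travelled rather than by elapsed time; the class name and exact eviction semantics are assumptions for illustration, not the paper's definitions.

```python
# Sliding-distance window sketch: keep only trajectory fixes recorded within
# the last `max_distance` units of travel, regardless of how long that took.
from collections import deque
from math import hypot

class SlidingDistanceWindow:
    def __init__(self, max_distance):
        self.max_distance = max_distance
        self.window = deque()     # entries: (x, y, cumulative distance at arrival)
        self.travelled = 0.0
        self.last = None

    def insert(self, x, y):
        if self.last is not None:
            self.travelled += hypot(x - self.last[0], y - self.last[1])
        self.last = (x, y)
        self.window.append((x, y, self.travelled))
        # Evict fixes that lie more than max_distance of travel behind the newest one.
        while self.window and self.travelled - self.window[0][2] > self.max_distance:
            self.window.popleft()
        return list(self.window)

# Usage: a slow vehicle keeps many fixes in the window, a fast one keeps few,
# which a purely temporal window cannot provide.
w = SlidingDistanceWindow(max_distance=100.0)
for point in [(0, 0), (30, 0), (60, 0), (90, 0), (120, 0), (150, 0)]:
    contents = w.insert(*point)
print(len(contents))  # -> 4 fixes within the last 100 units of travel
```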

