A Self-Organizing Neural Network to Approach Novelty Detection

Author(s):  
Marcelo Keese Albertini ◽  
Rodrigo Fernandes de Mello

Machine learning is a field of artificial intelligence which aims at developing techniques to automatically transfer human knowledge into analytical models. Recently, those techniques have been applied to time series with unknown dynamics and fluctuations in established behavior patterns, such as human-computer interaction, inspection robotics, and climate change. In order to detect novelties in those time series, techniques are required to learn and update knowledge structures, adapting themselves to data tendencies. The learning and updating process should integrate and accommodate novelty events into the normal behavior model, possibly requiring the re-evaluation of long-term memories. This sort of application has been addressed by incremental techniques based on unsupervised neural networks and regression. Such proposals have introduced two new concepts in time-series novelty detection. The first defines temporal novelty, which indicates the occurrence of an unexpected series of events. The second measures how novel a single event is, based on historical knowledge. However, current studies do not fully consider both detecting and quantifying temporal novelties. This motivated the proposal of the self-organizing novelty detection neural network architecture (SONDE), which incrementally learns patterns in order to represent unknown dynamics and fluctuations of established behavior. The knowledge accumulated by SONDE is employed to estimate Markov chains which model causal relationships. This architecture is applied to detect and measure temporal and non-temporal novelties. The evaluation of the proposed technique is carried out through simulations and experiments, which have presented promising results.
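The abstract's use of estimated Markov chains to score temporal novelty can be illustrated with a minimal sketch (hypothetical code, not the SONDE implementation): estimate first-order transition probabilities from a symbol history, then score each new event by its surprise, -log P(current | previous). Unseen transitions receive a small probability floor so the score stays finite.

```python
from collections import defaultdict
import math

def estimate_transitions(sequence):
    """Count first-order transitions and normalize to probabilities."""
    counts = defaultdict(lambda: defaultdict(int))
    for prev, curr in zip(sequence, sequence[1:]):
        counts[prev][curr] += 1
    probs = {}
    for prev, nxt in counts.items():
        total = sum(nxt.values())
        probs[prev] = {s: c / total for s, c in nxt.items()}
    return probs

def novelty(probs, prev, curr, floor=1e-6):
    """Surprise of observing curr after prev: -log P(curr | prev).
    Transitions never seen in the history fall back to `floor`."""
    p = probs.get(prev, {}).get(curr, floor)
    return -math.log(p)

history = list("ABABABABAB")
probs = estimate_transitions(history)
print(novelty(probs, "A", "B"))  # expected transition: low surprise
print(novelty(probs, "A", "A"))  # never observed: high surprise
```

A high surprise score flags a temporal novelty; an incremental learner would then fold the new transition back into the counts, mirroring the accommodation of novelty events into the normal behavior model.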

2012 ◽  
pp. 262-282


2021 ◽  
Vol 292 ◽  
pp. 116912
Author(s):  
Rong Wang Ng ◽  
Kasim Mumtaj Begam ◽  
Rajprasad Kumar Rajkumar ◽  
Yee Wan Wong ◽  
Lee Wai Chong

2018 ◽  
Vol 7 (4.15) ◽  
pp. 25 ◽  
Author(s):  
Said Jadid Abdulkadir ◽  
Hitham Alhussian ◽  
Muhammad Nazmi ◽  
Asim A Elsheikh

Forecasting time-series data is imperative, especially when planning requires modelling with uncertain knowledge of future events. Recurrent neural network models have been applied in industry and outperform standard artificial neural networks in forecasting, but they fail in long-term time-series forecasting due to the vanishing gradient problem. This study offers a robust solution for long-term forecasting using a special recurrent neural network architecture, the Long Short-Term Memory (LSTM) model, which overcomes the vanishing gradient problem. LSTM is specifically designed to avoid the long-term dependency problem as its default behavior. Empirical analysis is performed using quantitative forecasting metrics and comparative model performance on the forecasted outputs. An evaluation analysis validates that the LSTM model provides better forecasts of the Standard & Poor’s 500 Index (S&P 500), in terms of error metrics, than other forecasting models.
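The gating mechanism that lets an LSTM sidestep the vanishing gradient problem can be sketched as a single forward step in NumPy (an illustrative toy, not the study's model; the weight initialization, sizes, and input series below are arbitrary assumptions):

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM forward step. W: (4H, D), U: (4H, H), b: (4H,).
    Gate order in the stacked weights: input, forget, output, candidate."""
    H = h.shape[0]
    z = W @ x + U @ h + b
    i = 1 / (1 + np.exp(-z[0:H]))        # input gate
    f = 1 / (1 + np.exp(-z[H:2 * H]))    # forget gate
    o = 1 / (1 + np.exp(-z[2 * H:3 * H]))  # output gate
    g = np.tanh(z[3 * H:4 * H])          # candidate cell state
    c_new = f * c + i * g  # additive cell update: gradients flow through f, not a squashing nonlinearity
    h_new = o * np.tanh(c_new)
    return h_new, c_new

rng = np.random.default_rng(0)
D, H = 1, 4  # one input feature, four hidden units (toy sizes)
W = rng.normal(size=(4 * H, D))
U = rng.normal(size=(4 * H, H))
b = np.zeros(4 * H)

h, c = np.zeros(H), np.zeros(H)
for x in np.sin(np.linspace(0, 6, 50)):  # stand-in for a price series
    h, c = lstm_step(np.array([x]), h, c, W, U, b)
```

The key point is the additive cell update `c_new = f * c + i * g`: when the forget gate stays near 1, the error signal propagates across many time steps largely undiminished, which is what makes long-horizon forecasting feasible where plain recurrent networks fail.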


2009 ◽  
Vol 2009 ◽  
pp. 1-21
Author(s):  
Sanjay L. Badjate ◽  
Sanjay V. Dudul

Multistep-ahead prediction of a chaotic time series is a difficult task that has attracted increasing interest in recent years. The focus of this work is the development of nonlinear neural network models for multistep chaotic time-series prediction. The literature offers a wide range of approaches, but their success depends on the predictive performance of the individual methods. Moreover, the most popular neural models are based on statistical and traditional feed-forward neural networks, which may present disadvantages when long-term prediction is required. In this paper, a focused time-lagged recurrent neural network (FTLRNN) model with gamma memory is developed for different prediction horizons. This predictor performs remarkably well for short-term as well as medium-term predictions. For chaotic time series generated from differential equations, such as Mackey-Glass and Duffing, the FTLRNN-based predictor performs consistently well for prediction depths ranging from short term to long term, with only slight deterioration after the depth k is increased beyond 50. For real-world, highly complex, and nonstationary time series such as sunspots and laser data, the proposed predictor performs reasonably for short- and medium-term predictions, but its prediction ability drops for long-term-ahead prediction. Even so, these remain strong results considering that the series are nonstationary, and no other neural network configuration tested matched the performance of the FTLRNN model. The authors evaluated the FTLRNN model on the dynamic behavior of the chaotic Mackey-Glass and Duffing time series and on two real-world chaotic time series, monthly sunspots and laser measurements.
A static multilayer perceptron (MLP) model is also attempted and compared against the proposed model on performance measures such as mean squared error (MSE), normalized mean squared error (NMSE), and correlation coefficient (r). The standard back-propagation algorithm with a momentum term is used for both models.
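The gamma memory used by the FTLRNN is, in essence, a cascade of leaky integrators whose taps hold progressively smoothed and delayed views of the input, giving the network a tunable trade-off between memory depth and resolution. A minimal sketch (the parameter values `taps` and `mu` are illustrative assumptions, not the paper's configuration):

```python
import numpy as np

def gamma_memory(signal, taps=3, mu=0.5):
    """Run a signal through a gamma memory: tap 0 is the raw input,
    and each deeper tap is a leaky integrator over the previous one,
    x_k(t) = (1 - mu) * x_k(t-1) + mu * x_{k-1}(t-1)."""
    x = np.zeros(taps + 1)
    out = []
    for s in signal:
        prev = x.copy()
        x[0] = s
        for k in range(1, taps + 1):
            x[k] = (1 - mu) * prev[k] + mu * prev[k - 1]
        out.append(x.copy())
    return np.array(out)  # shape (T, taps + 1)

# Feed a constant signal: every tap converges to the input level,
# with deeper taps responding more slowly (longer effective memory).
taps = gamma_memory(np.ones(100), taps=3, mu=0.5)
```

In the FTLRNN these tap outputs, rather than a fixed tapped delay line, feed the network, and the parameter mu (and hence the memory depth) can be adapted during training along with the weights.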

