An Incremental Learning of Concept Drifts Using Evolving Type-2 Recurrent Fuzzy Neural Networks
2017 · Vol 25 (5) · pp. 1175-1192
Author(s): Mahardhika Pratama, Jie Lu, Edwin Lughofer, Guangquan Zhang, Meng Joo Er
2012 · Vol 3 (3) · pp. 179-188
Author(s): Sevil Ahmed, Nikola Shakev, Andon Topalov, Kostadin Shiev, Okyay Kaynak

Author(s): Tsung-Chih Lin, Yi-Ming Chang, Tun-Yuan Lee

This paper proposes a novel fuzzy modeling approach for the identification of dynamic systems. A fuzzy model, the recurrent interval type-2 fuzzy neural network (RIT2FNN), is constructed from a recurrent neural network whose recurrent weights and whose membership-function means and standard deviations are updated. The complete back-propagation (BP) tuning equations used to adjust the antecedent and consequent parameters of interval type-2 fuzzy neural networks (IT2FNNs) are developed to handle training data corrupted by noise or rule uncertainties in nonlinear system identification with external disturbances. Using only the current inputs and the most recent outputs at the input layer, the system can be completely identified with RIT2FNNs. To show that IT2FNNs can handle measurement uncertainties, the training data are corrupted by white Gaussian noise at a signal-to-noise ratio (SNR) of 20 dB. Simulation results for nonlinear system identification show better performance than recurrent type-1 fuzzy neural networks (RT1FNNs).
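Two ingredients the abstract names can be sketched concretely: an interval type-2 Gaussian membership function with an uncertain mean (the footprint of uncertainty that type-1 membership functions lack), and corruption of training data with white Gaussian noise at a prescribed SNR. This is a minimal illustration, not the paper's implementation; the function names are assumptions, and the full BP tuning equations are not reproduced.

```python
import math
import random

def it2_gaussian(x, m1, m2, sigma):
    """Interval type-2 Gaussian MF with uncertain mean in [m1, m2] and
    fixed sigma; returns (lower, upper) membership bounds at x."""
    if x < m1:
        upper = math.exp(-0.5 * ((x - m1) / sigma) ** 2)
    elif x > m2:
        upper = math.exp(-0.5 * ((x - m2) / sigma) ** 2)
    else:
        upper = 1.0  # some admissible mean sits exactly at x
    # Lower bound comes from the mean endpoint farthest from x.
    if x <= (m1 + m2) / 2:
        lower = math.exp(-0.5 * ((x - m2) / sigma) ** 2)
    else:
        lower = math.exp(-0.5 * ((x - m1) / sigma) ** 2)
    return lower, upper

def add_awgn(signal, snr_db, rng=None):
    """Corrupt a signal with white Gaussian noise at a target SNR in dB,
    as in the abstract's 20 dB experiment."""
    rng = rng or random.Random(0)
    p_signal = sum(s * s for s in signal) / len(signal)
    p_noise = p_signal / (10.0 ** (snr_db / 10.0))
    sd = math.sqrt(p_noise)
    return [s + rng.gauss(0.0, sd) for s in signal]
```

Inside the band [m1, m2] the upper membership saturates at 1, so the gap between the bounds widens exactly where the mean is uncertain, which is what lets the network absorb rule uncertainty.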


Symmetry · 2020 · Vol 12 (4) · pp. 679
Author(s): Muhammad Anwar Ma’sum

Classification of multi-modal data is one of the challenges in the machine learning field. Multi-modal data need special treatment, as their features are distributed over several regions. This study proposes multi-codebook fuzzy neural networks that use intelligent clustering and dynamic incremental learning for multi-modal data classification. We utilized intelligent K-means clustering based on anomalous patterns and intelligent K-means clustering based on histogram information. Clustering is used to generate codebook candidates before the training process, while incremental learning is applied when the condition for generating a new codebook is met; that condition is based on the similarity between the winner class and the other classes. The proposed method was evaluated on synthetic and benchmark datasets. The experimental results showed that the proposed multi-codebook fuzzy neural networks with dynamic incremental learning improve significantly on the original fuzzy neural networks. The improvements were 15.65%, 5.31%, and 11.42% on the synthetic dataset, the benchmark dataset, and the average of all datasets, respectively, for incremental version 1. Incremental learning version 2 improved by 21.08%, 4.63%, and 14.35% on the synthetic dataset, the benchmark dataset, and the average of all datasets, respectively. The multi-codebook fuzzy neural networks that use intelligent clustering also improved significantly on the original fuzzy neural networks, achieving 23.90%, 2.10%, and 15.02% improvements on the synthetic dataset, the benchmark dataset, and the average of all datasets, respectively.
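The codebook-spawning condition described in the abstract, based on how the winner of the sample's own class compares with codebooks of other classes, can be sketched as follows. The Euclidean distance, the ratio threshold, and the function name are assumptions for illustration, not the paper's exact rule.

```python
import math

def maybe_add_codebook(x, label, codebooks, ratio=0.8):
    """Incremental-learning sketch. `codebooks` is a list of
    (vector, class) pairs. A new codebook is spawned for x's class when
    the best same-class codebook is not clearly closer than the best
    codebook of any other class (distance ratio above `ratio`)."""
    def dist(a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

    same = [dist(x, v) for v, c in codebooks if c == label]
    other = [dist(x, v) for v, c in codebooks if c != label]
    if not same or (other and min(same) > ratio * min(other)):
        codebooks.append((list(x), label))  # x seeds a codebook of its class
    return codebooks
```

Spawning a codebook only when the winner of the sample's class is losing to another class keeps the codebook count small for unimodal classes while letting multi-modal classes acquire one codebook per mode.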

