Differential Evolution-Based Optimization of Kernel Parameters in Radial Basis Function Networks for Classification

2013 ◽  
Vol 4 (1) ◽  
pp. 56-80 ◽  
Author(s):  
Ch. Sanjeev Kumar Dash ◽  
Ajit Kumar Behera ◽  
Satchidananda Dehuri ◽  
Sung-Bae Cho

In this paper, a two-phase learning algorithm with a modified kernel for radial basis function neural networks is proposed for classification. In phase one, the meta-heuristic differential evolution is used to determine the parameters of the modified kernel. The second phase focuses on optimizing the weights for learning the networks. Further, a predefined set of basis functions is analyzed empirically to determine which basis function is better suited to which kind of domain. The simulation results show that the proposed learning mechanism clearly produces better classification accuracy than radial basis function neural networks (RBFNs) and genetic algorithm-radial basis function (GA-RBF) neural networks.
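As a concrete, deliberately simplified illustration of the two-phase scheme, the sketch below uses a plain Gaussian kernel with a single width parameter: phase one runs a minimal DE/rand/1 search over that width, and phase two, nested inside the fitness evaluation, solves the output weights by linear least squares. The toy data, the fixed centers, and the DE settings are all assumptions for the sketch, not the paper's modified kernel or benchmarks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class problem (the paper's benchmark datasets are not reproduced):
# label 1 inside the unit circle, 0 outside.
X = rng.normal(size=(80, 2))
y = (X[:, 0] ** 2 + X[:, 1] ** 2 < 1.0).astype(float)

# Fixed Gaussian centers for the sketch: a random subset of the training points.
centers = X[rng.choice(len(X), 10, replace=False)]

def fitness(width):
    """Phase two nested in the fitness: solve output weights by least squares,
    then return the training error rate for this candidate kernel width."""
    Phi = np.exp(-((X[:, None, :] - centers[None]) ** 2).sum(-1) / (2 * width ** 2))
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return ((Phi @ w > 0.5) != (y > 0.5)).mean()

# Phase one: a minimal DE/rand/1 search over the (scalar) kernel width.
pop = rng.uniform(0.1, 3.0, size=15)
scores = np.array([fitness(w) for w in pop])
for _ in range(30):
    for i in range(len(pop)):
        a, b, c = pop[rng.choice(len(pop), 3, replace=False)]
        trial = float(np.clip(a + 0.8 * (b - c), 0.05, 5.0))  # mutation + bounds
        ft = fitness(trial)
        if ft <= scores[i]:  # greedy DE selection
            pop[i], scores[i] = trial, ft

best_width = float(pop[scores.argmin()])
```

Because the search dimension here is one, DE's crossover step is omitted; a real implementation over several kernel parameters would add binomial crossover between the target and the mutant vector.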

Author(s):  
Ch. Sanjeev Kumar Dash ◽  
Ajit Kumar Behera ◽  
Satchidananda Dehuri ◽  
Sung-Bae Cho

The classification of diseases is one of the fundamental problems facing a medical practitioner, and it can be substantially improved by intelligent systems. The present work shows how an intelligent system supporting medical decisions can be developed by hybridizing radial basis function neural networks (RBFNs) and differential evolution (DE). To this end, a two-phase learning algorithm with a modified kernel for radial basis function neural networks is proposed for classification. In phase one, differential evolution is used to determine the parameters of the modified kernel. The second phase focuses on optimizing the weights for learning the networks. The proposed method is validated on five medical datasets: bupa liver disorders, pima Indians diabetes, new thyroid, statlog (heart), and hepatitis. In addition, a predefined set of basis functions is considered to gain insight, through an empirical analysis, into which basis function is better for which kind of domain. The experimental results indicate that the classification accuracy of the proposed method is better than that of the baseline classifier (i.e., simple RBFNs) on all the aforementioned datasets at the 95% and 98% confidence levels. In the case of an imbalanced dataset like new thyroid, the authors note that, at the 98% confidence level, the classification accuracy of the proposed method based on the multi-quadric kernel is better than that of the other kernels; in the case of hepatitis, however, the proposed method based on the cubic kernel is the most promising.
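Comparing candidate basis functions amounts to swapping the hidden-layer design matrix while keeping the rest of the training pipeline fixed. The sketch below builds design matrices for three common RBF choices; these are standard textbook forms, not the authors' modified kernel, and the width parameterization is an assumption.

```python
import numpy as np

def design_matrix(X, centers, width, kind="gaussian"):
    """Hidden-layer design matrix for several candidate basis functions.
    r[i, j] is the distance from sample i to center j."""
    r = np.sqrt(((X[:, None, :] - centers[None]) ** 2).sum(-1))
    if kind == "gaussian":
        return np.exp(-((r / width) ** 2))
    if kind == "multiquadric":
        return np.sqrt(r ** 2 + width ** 2)
    if kind == "cubic":
        return r ** 3
    raise ValueError(f"unknown basis function: {kind}")

X = np.random.default_rng(1).normal(size=(20, 3))
C = X[:5]  # centers: an arbitrary subset of the samples, for illustration
shapes = {k: design_matrix(X, C, 1.0, k).shape
          for k in ("gaussian", "multiquadric", "cubic")}
```

With the design matrix factored out this way, the empirical comparison across domains reduces to fitting the same output weights against each `Phi` and tallying the resulting accuracies.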


1991 ◽  
Vol 3 (4) ◽  
pp. 579-588 ◽  
Author(s):  
Chris Bishop

An important feature of radial basis function neural networks is the existence of a fast, linear learning algorithm in a network capable of representing complex nonlinear mappings. Satisfactory generalization in these networks requires that the network mapping be sufficiently smooth. We show that a modification to the error functional allows smoothing to be introduced explicitly without significantly affecting the speed of training. A simple example is used to demonstrate the resulting improvement in the generalization properties of the network.
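Because the output weights of an RBF network enter linearly, adding a quadratic smoothing penalty keeps training a single linear solve, which is why the speed of training is essentially unaffected. The sketch below uses plain weight decay as a stand-in for the paper's curvature-based penalty (a simplification, not Bishop's exact functional) and measures smoothness of the learned mapping with a discrete second-difference proxy.

```python
import numpy as np

rng = np.random.default_rng(4)

# Noisy samples of a smooth 1-D target.
x = rng.uniform(-1, 1, size=40)
y = np.sin(3 * x) + 0.3 * rng.normal(size=40)

centers = np.linspace(-1, 1, 15)
Phi = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * 0.2 ** 2))

def fit(lam):
    # The penalty is quadratic in the output weights, so training remains
    # one linear system regardless of the regularization strength.
    A = Phi.T @ Phi + lam * np.eye(len(centers))
    return np.linalg.solve(A, Phi.T @ y)

# Evaluate the mapping on a dense grid and use mean squared second
# differences as a curvature (roughness) proxy.
xt = np.linspace(-1, 1, 200)
Pt = np.exp(-((xt[:, None] - centers[None, :]) ** 2) / (2 * 0.2 ** 2))
def roughness(w):
    return np.mean(np.diff(Pt @ w, 2) ** 2)

w_plain = fit(1e-8)   # essentially unregularized
w_smooth = fit(0.1)   # explicit smoothing
```

The regularized solution trades a little training error for a visibly smoother mapping, which is the generalization effect the abstract describes.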


2002 ◽  
Vol 14 (12) ◽  
pp. 2997-3011 ◽  
Author(s):  
Michael Schmitt

We establish versions of Descartes' rule of signs for radial basis function (RBF) neural networks. The RBF rules of signs provide tight bounds for the number of zeros of univariate networks with certain parameter restrictions. Moreover, they can be used to infer that the Vapnik-Chervonenkis (VC) dimension and pseudodimension of these networks are no more than linear. This contrasts with previous work showing that RBF neural networks with two or more input nodes have superlinear VC dimension. The rules also give rise to lower bounds for network sizes, thus demonstrating the relevance of network parameters for the complexity of computing with RBF neural networks.


2014 ◽  
Vol 2014 ◽  
pp. 1-9 ◽  
Author(s):  
Syed Saad Azhar Ali ◽  
Muhammad Moinuddin ◽  
Kamran Raza ◽  
Syed Hasan Adil

Radial basis function neural networks are used in a variety of applications such as pattern recognition, nonlinear identification, control, and time series prediction. In this paper, the learning algorithm of radial basis function neural networks is analyzed in a feedback structure. The robustness of the learning algorithm is discussed in the presence of uncertainties that might be due to noisy perturbations at the input or to modeling mismatch. An intelligent adaptation rule is developed for the learning rate of the RBFNN, which yields faster convergence via an estimate of the error energy while guaranteeing l2 stability through an upper bound obtained via the small-gain theorem. Simulation results are presented to support our theoretical development.
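The flavor of an energy-dependent adaptive learning rate can be sketched with an NLMS-style update on the output weights of a fixed-feature RBF network: dividing the step by the instantaneous feature energy bounds every update, which is the kind of safeguard that underlies l2-stability arguments. The widths, rates, and data below are assumptions for the sketch; the paper's actual adaptation rule is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(2)

centers = np.linspace(-2, 2, 7)
def phi(x):
    # Gaussian hidden layer, width 0.6 (an assumed value, not from the paper).
    return np.exp(-(((x[:, None] - centers[None, :]) / 0.6) ** 2))

X = rng.uniform(-2, 2, size=200)
y = np.sin(2 * X)

w = np.zeros(len(centers))
mu = 0.5  # normalized step size; stability requires 0 < mu < 2
for _ in range(5):  # a few passes over the data
    for xi, ti in zip(X, y):
        h = phi(np.array([xi]))[0]
        e = ti - h @ w
        # Energy-normalized step: the update magnitude stays bounded no
        # matter how large the instantaneous feature energy h @ h is.
        w += (mu / (1e-8 + h @ h)) * e * h

mse = np.mean((phi(X) @ w - y) ** 2)
```

The normalization makes the error recursion contractive for `mu` in (0, 2), which is the small-gain-style bound the abstract alludes to, while larger `mu` within that range speeds up convergence.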


2015 ◽  
Vol 761 ◽  
pp. 120-124 ◽
Author(s):  
K.A.A. Aziz ◽  
Abdul Kadir ◽  
Rostam Affendi Hamzah ◽  
Amat Amir Basari

This paper presents product identification using image processing and radial basis function neural networks. The system identifies a specific product based on its shape. Image processing was applied to the acquired image, and the product was recognized using the Radial Basis Function Neural Network (RBFNN). RBF neural networks offer several advantages over other neural network architectures: they can be trained using a fast two-stage training algorithm, and the network possesses the best-approximation property. The output of the network can be optimized by setting suitable values for the centers and the spread of the RBFs. In this paper, a fixed spread value was used for every cluster. The system detected all four products with a 100% success rate using a ±0.2 tolerance.
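The two-stage training the abstract refers to can be sketched as follows: stage one fixes the centers and a single spread, stage two solves the output weights by linear least squares; classification then accepts the winning output if it lies within ±0.2 of the one-hot target, mirroring the tolerance quoted above. The synthetic "product shape" features here are a stand-in for the real image-processing front end.

```python
import numpy as np

rng = np.random.default_rng(3)

# Four synthetic product-feature clusters (stand-ins for image-derived
# shape features; the actual feature extraction is not shown).
means = np.array([[0, 0], [4, 0], [0, 4], [4, 4]], float)
X = np.vstack([m + 0.3 * rng.normal(size=(25, 2)) for m in means])
labels = np.repeat(np.arange(4), 25)
T = np.eye(4)[labels]  # one-hot targets, one output per product

# Stage 1: fix the centers (here: the cluster means) and a single spread.
centers, spread = means, 1.0
Phi = np.exp(-((X[:, None, :] - centers[None]) ** 2).sum(-1) / (2 * spread ** 2))

# Stage 2: solve the output weights by linear least squares.
W, *_ = np.linalg.lstsq(Phi, T, rcond=None)

# Classify: the winning output should fall within +/-0.2 of the 1.0 target.
out = Phi @ W
pred = out.argmax(1)
within_tol = np.abs(out.max(1) - 1.0) <= 0.2
accuracy = (pred == labels).mean()
```

Keeping a single fixed spread for every cluster, as the paper does, leaves only the linear solve as the trained component, which is what makes the two-stage procedure fast.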


2016 ◽  
Vol 60 ◽  
pp. 151-164 ◽  
Author(s):  
Afshin Tatar ◽  
Saeid Naseri ◽  
Mohammad Bahadori ◽  
Ali Zeinolabedini Hezave ◽  
Tomoaki Kashiwao ◽  
...  
