Improving the Generalization Properties of Radial Basis Function Neural Networks

1991 ◽  
Vol 3 (4) ◽  
pp. 579-588 ◽  
Author(s):  
Chris Bishop

An important feature of radial basis function neural networks is the existence of a fast, linear learning algorithm in a network capable of representing complex nonlinear mappings. Satisfactory generalization in these networks requires that the network mapping be sufficiently smooth. We show that a modification to the error functional allows smoothing to be introduced explicitly without significantly affecting the speed of training. A simple example is used to demonstrate the resulting improvement in the generalization properties of the network.
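The fast linear training described above can be sketched as regularized least squares over a fixed set of Gaussian basis functions. This is a minimal illustration, not the paper's exact modified error functional: the curvature-based smoothing term is approximated here by a simple weight-decay penalty `lam`, and the centers, width, and toy data are all assumptions for the example.

```python
import numpy as np

def rbf_design(X, centers, width):
    # Gaussian basis: phi[i, j] = exp(-||x_i - c_j||^2 / (2 * width^2))
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_rbf(X, y, centers, width, lam=0.0):
    # Linear learning: solve (Phi^T Phi + lam I) w = Phi^T y in closed form.
    # lam > 0 penalizes large weights, a crude stand-in for explicit smoothing.
    Phi = rbf_design(X, centers, width)
    A = Phi.T @ Phi + lam * np.eye(Phi.shape[1])
    return np.linalg.solve(A, Phi.T @ y)

def predict(X, centers, width, w):
    return rbf_design(X, centers, width) @ w

# Toy 1-D example: noisy sine; the regularized solve stays fast and linear
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (30, 1))
y = np.sin(2 * np.pi * X[:, 0]) + 0.1 * rng.standard_normal(30)
centers = np.linspace(0, 1, 10)[:, None]
w = fit_rbf(X, y, centers, width=0.1, lam=1e-3)
```

Because the smoothing enters only through the linear system, training cost is unchanged up to the added diagonal term, which mirrors the abstract's point that generalization can be improved without sacrificing training speed.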

2013 ◽  
Vol 4 (1) ◽  
pp. 56-80 ◽  
Author(s):  
Ch. Sanjeev Kumar Dash ◽  
Ajit Kumar Behera ◽  
Satchidananda Dehuri ◽  
Sung-Bae Cho

In this paper, a two-phase learning algorithm with a modified kernel is proposed for classification with radial basis function neural networks. In phase one, a meta-heuristic, differential evolution, is used to determine the parameters of the modified kernel. The second phase optimizes the weights of the network. Further, a predefined set of basis functions is analyzed empirically to determine which basis function is better suited to which kind of domain. Simulation results show that the proposed learning mechanism produces better classification accuracy than standard radial basis function neural networks (RBFNs) and genetic algorithm-radial basis function (GA-RBF) neural networks.
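The two-phase scheme can be sketched as follows: a classic DE/rand/1/bin loop searches over a kernel parameter (phase one), and for each candidate the output weights are obtained by linear least squares (phase two). This is a hedged illustration under simplifying assumptions: only a single shared Gaussian width is evolved rather than the paper's full modified-kernel parameter set, and the centers, DE hyperparameters, and data are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

def rbf_design(X, centers, width):
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def solve_weights(X, y, centers, width):
    # Phase two: closed-form least-squares weights for a fixed kernel
    Phi = rbf_design(X, centers, width)
    return np.linalg.lstsq(Phi, y, rcond=None)[0]

def fitness(width, X, y, centers):
    w = solve_weights(X, y, centers, width)
    pred = rbf_design(X, centers, width) @ w
    return np.mean((pred - y) ** 2)

def de_optimize_width(X, y, centers, pop_size=10, gens=30, F=0.5, CR=0.9):
    # Phase one: differential evolution over the kernel parameter
    pop = rng.uniform(0.05, 1.0, pop_size)
    cost = np.array([fitness(p, X, y, centers) for p in pop])
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = rng.choice(
                [j for j in range(pop_size) if j != i], 3, replace=False)
            trial = np.clip(pop[a] + F * (pop[b] - pop[c]), 0.01, 2.0)
            if rng.random() > CR:      # binomial crossover (scalar case)
                trial = pop[i]
            tc = fitness(trial, X, y, centers)
            if tc < cost[i]:           # greedy selection
                pop[i], cost[i] = trial, tc
    return pop[np.argmin(cost)]

X = rng.uniform(0, 1, (40, 1))
y = np.sin(2 * np.pi * X[:, 0])
centers = np.linspace(0, 1, 8)[:, None]
width = de_optimize_width(X, y, centers)
w = solve_weights(X, y, centers, width)
```

Nesting the linear solve inside the DE fitness evaluation is what makes the split practical: the expensive, non-convex part of the search is confined to the low-dimensional kernel parameters.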


2014 ◽  
Vol 2014 ◽  
pp. 1-9 ◽  
Author(s):  
Syed Saad Azhar Ali ◽  
Muhammad Moinuddin ◽  
Kamran Raza ◽  
Syed Hasan Adil

Radial basis function neural networks are used in a variety of applications such as pattern recognition, nonlinear identification, control, and time series prediction. In this paper, the learning algorithm of radial basis function neural networks is analyzed in a feedback structure. The robustness of the learning algorithm is discussed in the presence of uncertainties that might be due to noisy perturbations at the input or to modeling mismatch. An intelligent adaptation rule is developed for the learning rate of the RBFNN that gives faster convergence via an estimate of the error energy while guaranteeing l2 stability, with the upper bound established via the small gain theorem. Simulation results are presented to support our theoretical development.
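An energy-dependent learning rate of the kind described can be sketched with a normalized-LMS-style update on the output weights, where the step size shrinks as the instantaneous basis-vector energy grows. This is not the paper's exact adaptation rule or its small-gain analysis; the normalization `1 / (eps + ||phi||^2)`, the centers, width, and toy data are all assumptions for illustration.

```python
import numpy as np

def gaussian_phi(x, centers, width):
    # Basis activations for a single input sample x
    d2 = ((x - centers) ** 2).sum(axis=1)
    return np.exp(-d2 / (2.0 * width ** 2))

def train_rbf_adaptive(X, y, centers, width, eps=1e-3, epochs=20):
    # Sample-by-sample weight update with an energy-normalized step size:
    # mu_k = 1 / (eps + ||phi_k||^2), so high input energy -> smaller step.
    # This normalization is the classic NLMS route to a bounded error map.
    w = np.zeros(len(centers))
    for _ in range(epochs):
        for x, t in zip(X, y):
            phi = gaussian_phi(x, centers, width)
            e = t - phi @ w                  # instantaneous prediction error
            mu = 1.0 / (eps + phi @ phi)     # adaptive learning rate
            w += mu * e * phi
    return w

rng = np.random.default_rng(2)
X = rng.uniform(0, 1, (50, 1))
y = np.sin(2 * np.pi * X[:, 0])
centers = np.linspace(0, 1, 8)[:, None]
w = train_rbf_adaptive(X, y, centers, width=0.15)
```

The appeal of tying the step size to the signal energy is that convergence speed no longer depends on hand-tuning a fixed learning rate against the worst-case input magnitude.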


2004 ◽  
Vol 02 (03) ◽  
pp. 511-531 ◽  
Author(s):  
ZHENG RONG YANG ◽  
EMILY A. BERRY

This paper presents a new neural learning algorithm for protease cleavage site prediction. The basic idea is to replace the radial basis function used in radial basis function neural networks with a so-called bio-basis function built on amino acid similarity matrices. Mutual information is used to select bio-bases, and a corresponding selection algorithm is developed. The algorithm has been successfully applied to the prediction of HIV and hepatitis C virus protease cleavage sites in proteins.
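The replacement of the radial basis function by a sequence-similarity kernel can be sketched as follows. This is a loose illustration, not the authors' exact formulation: the tiny 4-letter substitution table is invented (real work would use a full 20x20 amino acid matrix such as BLOSUM62 or Dayhoff), and the normalization of the pairwise score by the basis sequence's self-score is one plausible way to map similarity into a bounded basis activation.

```python
import numpy as np

# Made-up substitution scores for a 4-letter toy alphabet; a real
# application would load a standard 20x20 amino acid similarity matrix.
SIM = {
    ("A", "A"): 4, ("A", "G"): 0, ("A", "L"): -1, ("A", "V"): 0,
    ("G", "G"): 6, ("G", "L"): -4, ("G", "V"): -3,
    ("L", "L"): 4, ("L", "V"): 1,
    ("V", "V"): 4,
}

def score(a, b):
    # Symmetric lookup into the substitution table
    return SIM.get((a, b), SIM.get((b, a), 0))

def pair_similarity(x, basis):
    # Ungapped, position-wise similarity of two equal-length peptides
    return sum(score(a, b) for a, b in zip(x, basis))

def bio_basis(x, basis, gamma=1.0):
    # Normalize by the basis sequence's self-similarity so the activation
    # peaks at 1.0 when x equals the basis sequence, then decays.
    h = pair_similarity(x, basis)
    h_max = pair_similarity(basis, basis)
    return np.exp(gamma * (h / h_max - 1.0))

# Feature vector for a peptide against a set of selected bio-bases
bases = ["AG", "LV"]
phi = np.array([bio_basis("AV", b) for b in bases])
```

The point of the substitution, as in the abstract, is that a biologically meaningful similarity replaces Euclidean distance, so no numerical encoding of the amino acids is needed; the downstream linear weight training of the RBF network is unchanged.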

