Using Radial Basis Function Networks for Function Approximation and Classification

2012 ◽  
Vol 2012 ◽  
pp. 1-34 ◽  
Author(s):  
Yue Wu ◽  
Hui Wang ◽  
Biaobiao Zhang ◽  
K.-L. Du

The radial basis function (RBF) network has its foundation in conventional approximation theory and has the capability of universal approximation. The RBF network is a popular alternative to the well-known multilayer perceptron (MLP), since it has a simpler structure and a much faster training process. In this paper, we give a comprehensive survey on the RBF network and its learning. Many aspects associated with the RBF network, such as network structure, universal approximation capability, radial basis functions, RBF network learning, structure optimization, normalized RBF networks, application to dynamic system modeling, and nonlinear complex-valued signal processing, are described. We also compare the features and capability of the two models.
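The structure surveyed above — a single hidden layer of radial units followed by a linear output layer — can be sketched in a few lines (a minimal illustration, not any particular implementation from the survey; centers, widths, and output weights are assumed already given, e.g. from clustering and least squares):

```python
import math

def gaussian_rbf(x, center, width):
    """Gaussian radial unit: the response depends only on the distance to its center."""
    dist_sq = sum((xi - ci) ** 2 for xi, ci in zip(x, center))
    return math.exp(-dist_sq / (2.0 * width ** 2))

def rbf_network(x, centers, widths, weights, bias=0.0):
    """RBF network output: a linear combination of radial hidden-unit activations."""
    return bias + sum(w * gaussian_rbf(x, c, s)
                      for w, c, s in zip(weights, centers, widths))
```

Because only the output weights enter linearly, they can be fitted by linear least squares once the centers and widths are fixed, which is one reason RBF training is typically much faster than backpropagation through an MLP.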

2009 ◽  
Vol 19 (04) ◽  
pp. 253-267 ◽  
Author(s):  
R. SAVITHA ◽  
S. SURESH ◽  
N. SUNDARARAJAN

In this paper, a fully complex-valued radial basis function (FC-RBF) network with a fully complex-valued activation function has been proposed, and its complex-valued gradient descent learning algorithm has been developed. The fully complex activation function of the proposed network, sech(·), satisfies all the properties needed for a complex-valued activation function and has Gaussian-like characteristics. It maps C^n → C, unlike the existing activation functions of complex-valued RBF networks, which map C^n → R. Since the performance of the complex RBF network depends on the number of neurons and the initialization of the network parameters, we propose a K-means clustering based neuron selection and center initialization scheme. First, we present a study on convergence using the complex XOR problem. Next, we present a synthetic function approximation problem and the two-spiral classification problem. Finally, we present the results for two practical applications, viz., a non-minimum phase equalization and an adaptive beam-forming problem. The performance of the network was compared with other well-known complex-valued RBF networks available in the literature, viz., the split-complex CRBF, CMRAN, and the CELM. The results indicate that the proposed fully complex-valued network has better convergence, approximation, and classification ability.
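The key ingredient above is the fully complex-valued activation. A minimal sketch of sech(·) over the complex plane, and of a hidden unit mapping C^n → C, follows (the argument of the hidden unit — a complex inner product with the center offset — is an illustrative assumption, not the paper's exact parameterization):

```python
import cmath

def csech(z):
    """Hyperbolic secant over the complex plane: sech(z) = 2 / (e^z + e^-z)."""
    return 2.0 / (cmath.exp(z) + cmath.exp(-z))

def fc_hidden_unit(x, center, scale):
    """Fully complex hidden unit, C^n -> C (the argument form is an assumption)."""
    z = sum(vk * (xk - ck) for vk, xk, ck in zip(scale, x, center))
    return csech(z)
```

By contrast, a split-complex CRBF applies a real-valued Gaussian to the distance |x − c|, mapping C^n → R and discarding the phase coupling that the fully complex map retains.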


1996 ◽  
Vol 07 (02) ◽  
pp. 167-179 ◽  
Author(s):  
ROBERT SHORTEN ◽  
RODERICK MURRAY-SMITH

Normalisation of the basis function activations in a Radial Basis Function (RBF) network is a common way of achieving the partition of unity often desired for modelling applications. It results in the basis functions covering the whole of the input space to the same degree. However, normalisation of the basis functions can lead to other effects which are sometimes less desirable for modelling applications. This paper describes some side effects of normalisation which fundamentally alter properties of the basis functions, e.g. the shape is no longer uniform, maxima of basis functions can be shifted from their centres, and the basis functions are no longer guaranteed to decrease monotonically as distance from their centre increases—in many cases basis functions can ‘reactivate’, i.e. re-appear far from the basis function centre. This paper examines how these phenomena occur, discusses their relevance for non-linear function approximation and examines the effect of normalisation on the network condition number and weights.
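The 'reactivation' effect described above is easy to reproduce in one dimension with two Gaussians of unequal width: after normalisation, the wider basis function dips near the narrower one's centre and then rises back towards 1 far beyond it (an illustrative construction; the centres and widths below are arbitrary choices, not values from the paper):

```python
import math

def gaussian(x, c, width):
    return math.exp(-((x - c) ** 2) / (2.0 * width ** 2))

def normalized_activation(x, i, centers, widths):
    """Basis i's share of the total activation (partition of unity)."""
    acts = [gaussian(x, c, s) for c, s in zip(centers, widths)]
    return acts[i] / sum(acts)

centers = [0.0, 2.0]   # wide basis at 0, narrow basis at 2
widths = [2.0, 0.5]

# Raw activation of the wide basis decreases monotonically with distance ...
raw_near, raw_far = gaussian(2.0, 0.0, 2.0), gaussian(6.0, 0.0, 2.0)
# ... but its normalised activation dips at the narrow centre and 'reactivates' beyond it.
norm_near = normalized_activation(2.0, 0, centers, widths)
norm_far = normalized_activation(6.0, 0, centers, widths)
```

Here norm_near ≈ 0.38 while norm_far ≈ 1.0, even though x = 6 is three times further from the wide basis centre than x = 2: exactly the non-monotone behaviour and shifted maxima the paper analyses.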


2012 ◽  
Vol 24 (5) ◽  
pp. 1297-1328 ◽  
Author(s):  
R. Savitha ◽  
S. Suresh ◽  
N. Sundararajan

Recent studies on human learning reveal that self-regulated learning in a metacognitive framework is the best strategy for efficient learning. As machine learning algorithms are inspired by the principles of human learning, one needs to incorporate the concept of metacognition to develop efficient machine learning algorithms. In this letter we present a metacognitive learning framework that controls the learning process of a fully complex-valued radial basis function network and is referred to as a metacognitive fully complex-valued radial basis function (Mc-FCRBF) network. Mc-FCRBF has two components: a cognitive component containing the FC-RBF network and a metacognitive component, which regulates the learning process of FC-RBF. In every epoch, when a sample is presented to Mc-FCRBF, the metacognitive component decides what to learn, when to learn, and how to learn based on the knowledge acquired by the FC-RBF network and the new information contained in the sample. The Mc-FCRBF learning algorithm is described in detail, and both its approximation and classification abilities are evaluated using a set of benchmark and practical problems. Performance results indicate the superior approximation and classification performance of Mc-FCRBF compared to existing methods in the literature.
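The what-to-learn / when-to-learn control described above can be sketched as a threshold rule on the prediction error of the cognitive component (the single-error criterion and the threshold values are illustrative assumptions; the actual Mc-FCRBF criteria are based on the network's knowledge of each complex-valued sample):

```python
def metacognitive_decision(prediction_error, delete_threshold=0.01, learn_threshold=0.1):
    """Self-regulated control of the learning process (thresholds are assumptions).

    'delete'  -> the sample carries no new knowledge; discard it.
    'learn'   -> the sample is novel enough; update the network on it (what-to-learn).
    'reserve' -> defer the sample to a later epoch (when-to-learn).
    """
    if prediction_error < delete_threshold:
        return "delete"
    if prediction_error > learn_threshold:
        return "learn"
    return "reserve"
```

Reserving borderline samples rather than learning them immediately is what lets the metacognitive component revisit them in a later epoch, once the cognitive component has changed.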


Heat Transfer ◽  
2021 ◽  
Author(s):  
Maryam Fallah Najafabadi ◽  
Hossein Talebi Rostami ◽  
Khashayar Hosseinzadeh ◽  
Davood Domiri Ganji

2014 ◽  
Vol 2014 ◽  
pp. 1-14 ◽  
Author(s):  
Yunfeng Wu ◽  
Xin Luo ◽  
Fang Zheng ◽  
Shanshan Yang ◽  
Suxian Cai ◽  
...  

This paper presents a novel adaptive linear and normalized combination (ALNC) method that can be used to combine component radial basis function networks (RBFNs) to implement better function approximation and regression tasks. The optimization of the fusion weights is obtained by solving a constrained quadratic programming problem. According to the instantaneous errors generated by the component RBFNs, the ALNC is able to perform a selective ensemble of multiple learners by adaptively adjusting the fusion weights from one instance to another. The results of the experiments on eight synthetic function approximation and six benchmark regression data sets show that the ALNC method can effectively help the ensemble system achieve higher accuracy (measured in terms of mean-squared error) and better fidelity (characterized by the normalized correlation coefficient) of approximation, relative to the popular simple average, weighted average, and Bagging methods.
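For two component networks the constrained quadratic programme has a closed form: minimizing ||w·p1 + (1−w)·p2 − y||² over w ∈ [0, 1] gives the unconstrained optimum clipped to the simplex (a sketch of the combination idea only; the ALNC of the paper solves the general multi-network QP and adapts the weights per instance):

```python
def convex_fusion_weight(p1, p2, y):
    """Optimal convex weight w for combining two predictors: w*p1 + (1-w)*p2 ~ y."""
    num = sum((yi - b) * (a - b) for a, b, yi in zip(p1, p2, y))
    den = sum((a - b) ** 2 for a, b in zip(p1, p2))
    if den == 0.0:          # identical predictions: any weight is optimal
        return 0.5
    return min(1.0, max(0.0, num / den))

def fused_prediction(p1, p2, w):
    """Weighted ensemble output for a given fusion weight."""
    return [w * a + (1.0 - w) * b for a, b in zip(p1, p2)]
```

If one component already matches the target exactly, the weight collapses onto it — the selective-ensemble behaviour that, per instance, favours the currently more accurate learner.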

