Adaptive Linear and Normalized Combination of Radial Basis Function Networks for Function Approximation and Regression

2014
Vol 2014
pp. 1-14
Author(s):  
Yunfeng Wu ◽  
Xin Luo ◽  
Fang Zheng ◽  
Shanshan Yang ◽  
Suxian Cai ◽  
...  

This paper presents a novel adaptive linear and normalized combination (ALNC) method for combining component radial basis function networks (RBFNs) to perform better function approximation and regression. The optimal fusion weights are obtained by solving a constrained quadratic programming problem. Based on the instantaneous errors generated by the component RBFNs, the ALNC performs a selective ensemble of multiple learners by adaptively adjusting the fusion weights from one instance to another. Experiments on eight synthetic function approximation and six benchmark regression data sets show that the ALNC method helps the ensemble system achieve both higher accuracy (measured by mean-squared error) and better fidelity of approximation (characterized by the normalized correlation coefficient) than the popular simple-average, weighted-average, and Bagging methods.
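The weight-fusion step described above can be sketched numerically. The snippet below is a hypothetical illustration, not the paper's exact formulation: the function name `fusion_weights` and the closed-form solve are assumptions. It computes sum-to-one fusion weights that minimize the ensemble's squared error given the component networks' instantaneous errors, using the closed-form solution of the equality-constrained quadratic program (the non-negativity constraints a full QP solver would also enforce are omitted here).

```python
import numpy as np

def fusion_weights(errors):
    """Sum-to-one fusion weights minimizing combined squared error.

    errors: array of shape (n_instances, n_components) holding the
    instantaneous errors of each component RBFN.
    """
    # Estimate the error covariance of the component networks.
    C = errors.T @ errors / len(errors)
    C += 1e-8 * np.eye(C.shape[0])          # regularize for invertibility
    ones = np.ones(C.shape[0])
    # Minimize w^T C w subject to sum(w) = 1 (Lagrange-multiplier solution).
    w = np.linalg.solve(C, ones)
    return w / (ones @ w)                   # enforce the sum-to-one constraint

# Toy usage: three components whose errors have increasing variance;
# the most accurate component should receive the largest weight.
rng = np.random.default_rng(0)
errs = rng.normal(scale=[0.1, 0.3, 0.5], size=(200, 3))
w = fusion_weights(errs)
print(w, w.sum())
```

In this sketch the weights favour low-variance components automatically; adjusting them per instance, as the ALNC does, would amount to re-solving with locally estimated errors.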

2000
Vol 10 (06)
pp. 453-465
Author(s):  
MARK ORR ◽  
JOHN HALLAM ◽  
KUNIO TAKEZAWA ◽  
ALAN MURRAY ◽  
SEISHI NINOMIYA ◽  
...  

We describe a method for non-parametric regression which combines regression trees with radial basis function networks. The method is similar to that of Kubat [1], who was the first to suggest such a combination, but includes some significant improvements. We demonstrate the features of the new method, compare its performance with other methods on DELVE data sets, and apply it to a real-world problem involving the classification of soybean plants from digital images.
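One way to read "regression trees combined with RBF networks" is to let a tree partition the input space and place one basis function per leaf. The sketch below is an assumption-laden illustration of that idea, not the paper's actual algorithm: a simple median-split partition stands in for the regression tree, leaf means become RBF centres, leaf extents set the widths, and the output-layer weights are fitted by linear least squares.

```python
import numpy as np

def leaf_partitions(X, depth):
    """Recursively split X at the median of its widest dimension."""
    if depth == 0 or len(X) < 4:
        return [X]
    d = np.argmax(X.max(axis=0) - X.min(axis=0))
    m = np.median(X[:, d])
    left, right = X[X[:, d] <= m], X[X[:, d] > m]
    if len(left) == 0 or len(right) == 0:
        return [X]
    return leaf_partitions(left, depth - 1) + leaf_partitions(right, depth - 1)

def tree_rbf_fit(X, y, depth=3):
    """Place one Gaussian RBF per leaf, then fit weights by least squares."""
    leaves = leaf_partitions(X, depth)
    centres = np.array([L.mean(axis=0) for L in leaves])      # centre = leaf mean
    widths = np.array([np.ptp(L, axis=0).mean() + 1e-3 for L in leaves])
    Phi = np.exp(-np.sum((X[:, None, :] - centres[None]) ** 2, axis=2)
                 / (2 * widths ** 2))
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return centres, widths, w

# Toy usage: fit a noisy sine with 8 leaf-derived basis functions.
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(300, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.normal(size=300)
c, s, w = tree_rbf_fit(X, y)
Phi = np.exp(-np.sum((X[:, None, :] - c[None]) ** 2, axis=2) / (2 * s ** 2))
mse = np.mean((Phi @ w - y) ** 2)
print(mse)
```

The appeal of the tree-based placement is that leaves concentrate where the data is dense, so centres and widths adapt to the sample distribution rather than to a fixed grid.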


1996
Vol 07 (02)
pp. 167-179
Author(s):  
ROBERT SHORTEN ◽  
RODERICK MURRAY-SMITH

Normalisation of the basis function activations in a Radial Basis Function (RBF) network is a common way of achieving the partition of unity often desired for modelling applications. It results in the basis functions covering the whole of the input space to the same degree. However, normalisation of the basis functions can lead to other effects which are sometimes less desirable for modelling applications. This paper describes some side effects of normalisation which fundamentally alter properties of the basis functions, e.g. the shape is no longer uniform, maxima of basis functions can be shifted from their centres, and the basis functions are no longer guaranteed to decrease monotonically as distance from their centre increases—in many cases basis functions can ‘reactivate’, i.e. re-appear far from the basis function centre. This paper examines how these phenomena occur, discusses their relevance for non-linear function approximation and examines the effect of normalisation on the network condition number and weights.
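The shifted maxima and "reactivation" effects described above can be reproduced numerically with just two Gaussian basis functions of different widths; the snippet below is an illustrative example constructed for this listing, not taken from the paper.

```python
import numpy as np

def gauss(x, c, s):
    """Unnormalised Gaussian basis function with centre c and width s."""
    return np.exp(-(x - c) ** 2 / (2 * s ** 2))

x = np.linspace(-10, 10, 2001)
g_narrow = gauss(x, 0.0, 0.5)    # narrow basis function centred at 0
g_wide = gauss(x, 2.0, 2.0)      # wide basis function centred at 2
n_wide = g_wide / (g_narrow + g_wide)   # normalised activation of the wide one

# 'Reactivation': moving left from its centre at 2, the normalised wide
# basis function dips near x = 0 (where the narrow one dominates) and
# then rises back towards 1 far from either centre, so it is no longer
# monotonically decreasing with distance from its centre.
i0 = np.argmin(np.abs(x - 0.0))
print(n_wide[i0], n_wide[0])     # ≈ 0.378 near x = 0, ≈ 1.0 at x = -10
```

The same computation also shows the shifted maximum: the unnormalised wide Gaussian peaks at its centre, but its normalised counterpart attains its maximum far away, where the narrow basis function has decayed.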

