Function approximation from noisy data by an incremental RBF network

1999 ◽  
Vol 32 (12) ◽  
pp. 2081-2083 ◽  
Author(s):  
Menita Carozza ◽  
Salvatore Rampone


2006 ◽  
Vol 16 (04) ◽  
pp. 283-293 ◽  
Author(s):  
PEI-YI HAO ◽  
JUNG-HSIEN CHIANG

This paper presents pruning and model-selection algorithms for support vector learning in sample classification and function regression. When constructing an RBF network by support vector learning, we occasionally obtain redundant support vectors that do not significantly affect the final classification or function approximation results. The pruning algorithms are based primarily on a sensitivity measure and a penalty term. The kernel function parameters and the position of each support vector are updated so as to minimize the increase in error, which makes the structure of the SVM network more flexible. We illustrate this approach on synthetic data simulations and a face detection problem to demonstrate the effectiveness of the pruning.
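Below is a minimal sketch of the kind of sensitivity-based pruning described above; it is an illustration under assumed names, not the authors' exact algorithm. The sensitivity measure used here (the mean absolute contribution of each support vector to the output over a validation set) and the tolerance rule are assumptions chosen for clarity.

# A minimal sketch (not the authors' exact algorithm) of sensitivity-based
# pruning for an RBF/SVM expansion f(x) = sum_i a_i * K(x, c_i) + b.
# The per-vector sensitivity measure below is an illustrative assumption.
import numpy as np

def rbf_kernel(X, C, gamma=1.0):
    """Gaussian kernel matrix between samples X (n,d) and centers C (m,d)."""
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

def prune_support_vectors(X_val, centers, alphas, gamma=1.0, tol=1e-3):
    """Drop support vectors whose removal barely changes f on X_val."""
    K = rbf_kernel(X_val, centers, gamma)       # (n_val, n_sv)
    contrib = np.abs(K * alphas).mean(axis=0)   # per-SV mean |contribution|
    keep = contrib > tol * np.abs(K @ alphas).mean()
    return centers[keep], alphas[keep]

# Usage: centers, alphas = prune_support_vectors(X_val, centers, alphas)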


2012 ◽  
Vol 2012 ◽  
pp. 1-34 ◽  
Author(s):  
Yue Wu ◽  
Hui Wang ◽  
Biaobiao Zhang ◽  
K.-L. Du

The radial basis function (RBF) network has its foundation in conventional approximation theory and is capable of universal approximation. The RBF network is a popular alternative to the well-known multilayer perceptron (MLP), since it has a simpler structure and a much faster training process. In this paper, we give a comprehensive survey of the RBF network and its learning. Many aspects of the RBF network are described, including network structure, universal approximation capability, radial basis functions, RBF network learning, structure optimization, normalized RBF networks, application to dynamic system modeling, and nonlinear complex-valued signal processing. We also compare the features and capabilities of the two models.
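As a concrete illustration of the basic model surveyed here, the following is a minimal RBF network sketch: Gaussian basis functions with fixed centers (chosen, for simplicity, as a random subset of the training data) and output weights fitted by linear least squares. The class name and the center-selection rule are illustrative assumptions, not a method prescribed by the survey.

# A minimal Gaussian RBF network: fixed centers, least-squares output weights.
import numpy as np

class RBFNetwork:
    def __init__(self, n_centers=20, width=1.0, seed=0):
        self.n_centers = n_centers
        self.width = width
        self.rng = np.random.default_rng(seed)

    def _phi(self, X):
        # Design matrix of Gaussian activations, shape (n_samples, n_centers).
        d2 = ((X[:, None, :] - self.centers[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-d2 / (2 * self.width ** 2))

    def fit(self, X, y):
        # Pick centers as a random subset of the data (illustrative choice).
        idx = self.rng.choice(len(X), self.n_centers, replace=False)
        self.centers = X[idx]
        # Output weights by linear least squares on the design matrix.
        self.w, *_ = np.linalg.lstsq(self._phi(X), y, rcond=None)
        return self

    def predict(self, X):
        return self._phi(X) @ self.w

# Example: approximate a noisy sine.
X = np.linspace(0, 2 * np.pi, 200)[:, None]
y = np.sin(X).ravel() + 0.1 * np.random.default_rng(1).normal(size=200)
model = RBFNetwork(n_centers=25, width=0.5).fit(X, y)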


1996 ◽  
Vol 07 (02) ◽  
pp. 167-179 ◽  
Author(s):  
ROBERT SHORTEN ◽  
RODERICK MURRAY-SMITH

Normalisation of the basis function activations in a Radial Basis Function (RBF) network is a common way of achieving the partition of unity often desired for modelling applications. It results in the basis functions covering the whole of the input space to the same degree. However, normalisation of the basis functions can lead to other effects which are sometimes less desirable for modelling applications. This paper describes some side effects of normalisation which fundamentally alter properties of the basis functions: the shape is no longer uniform, maxima of basis functions can be shifted from their centres, and the basis functions are no longer guaranteed to decrease monotonically as distance from their centre increases; in many cases basis functions can 'reactivate', i.e. re-appear far from the basis function centre. This paper examines how these phenomena occur, discusses their relevance for non-linear function approximation, and examines the effect of normalisation on the network condition number and weights.
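The 'reactivation' effect is easy to reproduce numerically. The following small example (an assumed setup, not taken from the paper) normalises two Gaussian basis functions with unequal widths and shows that the wide one's normalised activation dips near the narrow one's centre and then climbs back toward 1 far away, so it is no longer monotonically decreasing in the distance from its own centre.

# Illustration (assumed setup) of normalisation side effects in an RBF layer.
import numpy as np

x = np.linspace(-10, 10, 2001)
centers = np.array([0.0, 2.0])
widths = np.array([1.0, 3.0])             # unequal widths trigger the effect

phi = np.exp(-(x[:, None] - centers) ** 2 / (2 * widths ** 2))
phi_norm = phi / phi.sum(axis=1, keepdims=True)   # partition of unity

# The wide basis function (index 1) peaks near its centre x=2, is suppressed
# near the narrow function's centre x=0, then 'reactivates' far to the left,
# where the narrow function's tail decays much faster.
for xv in (2.0, 0.0, -8.0):
    i = np.argmin(np.abs(x - xv))
    print(f"x={xv:+.1f}  normalised wide activation = {phi_norm[i, 1]:.3f}")
# Roughly ~0.88 at its centre, dipping to ~0.44 near x=0, back to ~1.0 at x=-8.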


2013 ◽  
Vol 380-384 ◽  
pp. 1166-1169 ◽  
Author(s):  
Yan Hui Wang ◽  
Kun Zhang

An adaptive genetic algorithm is applied to the training of multi-layer RBF networks, optimizing the hidden-layer centers and width values; the weight vector is then obtained by the regularized least-squares method. Computer simulations show that this algorithm approximates real functions with much higher precision than a clustering algorithm for multi-layer RBF networks.
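For reference, the final weight-computation step described above can be sketched as follows, with the genetic algorithm itself omitted: given hidden-layer centers and widths (standing in here for the GA's output), the output weights solve the regularized least-squares problem w = (Phi^T Phi + lambda I)^{-1} Phi^T y. The function name and regularization constant are illustrative assumptions.

# Regularized least-squares output weights for a Gaussian RBF layer,
# given centers and widths (e.g. as produced by a genetic algorithm).
import numpy as np

def ridge_rbf_weights(X, y, centers, widths, lam=1e-3):
    """Solve (Phi^T Phi + lam*I) w = Phi^T y for the output weights."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    Phi = np.exp(-d2 / (2 * widths ** 2))            # (n, m) design matrix
    A = Phi.T @ Phi + lam * np.eye(Phi.shape[1])
    return np.linalg.solve(A, Phi.T @ y)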


2018 ◽  
Vol 371 ◽  
pp. 363-381 ◽  
Author(s):  
Yeonjong Shin ◽  
Kailiang Wu ◽  
Dongbin Xiu

2014 ◽  
Vol 2 (1) ◽  
pp. 1
Author(s):  
Richard Schwartz