Radial Basis Function network learning using localized generalization error bound

2009 ◽ Vol 179 (19) ◽ pp. 3199-3217
Author(s): Daniel S. Yeung, Patrick P.K. Chan, Wing W.Y. Ng
Processes
2022 ◽ Vol 10 (1) ◽ pp. 140
Author(s): Yanxia Yang, Pu Wang, Xuejin Gao

A radial basis function neural network (RBFNN), with its strong function approximation ability, has proven to be an effective tool for nonlinear process modeling. In many instances, however, the sample set is limited and the model evaluation error is fixed, which makes it very difficult to construct an optimal network structure that ensures the generalization ability of the established nonlinear process model. To solve this problem, a novel RBFNN with high generalization performance (RBFNN-GP) is proposed in this paper. The proposed RBFNN-GP makes three contributions. First, a localized generalization error bound, incorporating the sample mean and variance, is developed to obtain a tighter error bound and thus narrow the range of the error. Second, a self-organizing structure method, based on the generalization error bound and the network sensitivity, is established to determine a suitable number of neurons and improve the generalization ability. Third, the convergence of the proposed RBFNN-GP is proved theoretically for both the fixed-structure and structure-adjustment cases. Finally, the performance of the proposed RBFNN-GP is compared with that of several popular algorithms on two numerical simulations and a practical application. The comparison results verify the effectiveness of RBFNN-GP.
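To make the structure-selection idea concrete, the following is a minimal Python sketch, not the authors' exact algorithm: it grows an RBF network one hidden unit at a time and keeps the size that minimizes a localized-generalization-error-style criterion, approximated here as the training MSE plus a stochastic sensitivity term measured over a small input-perturbation neighbourhood. The function names (rbf_design_matrix, loc_gen_error_bound, select_structure) and the parameters (the kernel width, the perturbation radius q) are illustrative assumptions rather than quantities defined in the paper.

import numpy as np

def rbf_design_matrix(X, centers, width):
    # Gaussian activations: one row per sample, one column per hidden unit.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_rbf(X, y, n_hidden, width, rng):
    # Pick centers from the training data and solve output weights by least squares.
    idx = rng.choice(len(X), size=n_hidden, replace=False)
    centers = X[idx]
    Phi = rbf_design_matrix(X, centers, width)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return centers, w

def loc_gen_error_bound(X, y, centers, w, width, q=0.1, n_perturb=20, rng=None):
    # Surrogate criterion: training MSE plus the average output change under
    # random input perturbations drawn from a q-neighbourhood (a stand-in for
    # the paper's sensitivity/variance term, not its exact formula).
    rng = rng if rng is not None else np.random.default_rng(0)
    pred = rbf_design_matrix(X, centers, width) @ w
    emp = np.mean((y - pred) ** 2)
    sens = 0.0
    for _ in range(n_perturb):
        Xp = X + rng.uniform(-q, q, size=X.shape)
        sens += np.mean((rbf_design_matrix(Xp, centers, width) @ w - pred) ** 2)
    return emp + sens / n_perturb

def select_structure(X, y, width=0.5, max_hidden=30, seed=0):
    # Self-organizing structure search: try increasing hidden-layer sizes and
    # keep the one with the smallest surrogate generalization-error criterion.
    rng = np.random.default_rng(seed)
    best = None
    for n_hidden in range(2, max_hidden + 1):
        centers, w = fit_rbf(X, y, n_hidden, width, rng)
        bound = loc_gen_error_bound(X, y, centers, w, width, rng=rng)
        if best is None or bound < best[0]:
            best = (bound, n_hidden, centers, w)
    return best

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.uniform(-3.0, 3.0, size=(200, 1))
    y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(200)
    bound, n_hidden, _, _ = select_structure(X, y)
    print(f"selected {n_hidden} hidden units, criterion = {bound:.4f}")

The sensitivity term here plays the role of the neighbourhood/variance term in a localized generalization error bound: a network that fits the training data but reacts strongly to small input perturbations receives a larger criterion value, so the search favours smoother, better-generalizing structures.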


2016
Author(s): Olímpio Murilo Capeli, Euvaldo Ferreira Cabral Junior, Sadao Isotani, Antonio Roberto Pereira Leite de Albuquerque
