Training algorithms for Radial Basis Function Networks to tackle learning processes with imbalanced data-sets

2014 ◽  
Vol 25 ◽  
pp. 26-39 ◽  
Author(s):  
M.D. Pérez-Godoy ◽  
Antonio J. Rivera ◽  
C.J. Carmona ◽  
M.J. del Jesus
2014 ◽  
Vol 2014 ◽  
pp. 1-14 ◽  
Author(s):  
Yunfeng Wu ◽  
Xin Luo ◽  
Fang Zheng ◽  
Shanshan Yang ◽  
Suxian Cai ◽  
...  

This paper presents a novel adaptive linear and normalized combination (ALNC) method for combining component radial basis function networks (RBFNs) to achieve better function approximation and regression. The fusion weights are optimized by solving a constrained quadratic programming problem. Based on the instantaneous errors generated by the component RBFNs, the ALNC performs a selective ensemble of multiple learners by adaptively adjusting the fusion weights from one instance to another. Experiments on eight synthetic function approximation and six benchmark regression data sets show that the ALNC method helps the ensemble system achieve higher accuracy (measured by mean-squared error) and better fidelity of approximation (characterized by the normalized correlation coefficient) than the popular simple average, weighted average, and Bagging methods.
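
The per-instance weighting described above lends itself to a compact numerical sketch. The following is a minimal illustration, not the authors' implementation: it assumes the fusion weights for a single instance minimize a quadratic objective built from the components' instantaneous errors, subject to the non-negativity and normalization constraints; the helper name alnc_weights and the diagonal error-based objective are assumptions of ours.

# Minimal sketch of a linear, normalized combination of component
# predictions: weights are non-negative, sum to one, and are refit per
# instance from the components' instantaneous errors via a small
# constrained quadratic program (solved here with SciPy's SLSQP).
import numpy as np
from scipy.optimize import minimize

def alnc_weights(errors):
    """Fusion weights minimizing a quadratic penalty on components
    with large instantaneous error, s.t. w >= 0 and sum(w) == 1."""
    k = len(errors)
    Q = np.diag(np.square(errors))          # assumed diagonal objective
    objective = lambda w: w @ Q @ w
    constraints = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},)
    bounds = [(0.0, 1.0)] * k
    w0 = np.full(k, 1.0 / k)                # start from the simple average
    res = minimize(objective, w0, method="SLSQP",
                   bounds=bounds, constraints=constraints)
    return res.x

# Usage: fuse three component RBFN outputs for one instance.
preds = np.array([1.10, 0.95, 1.30])        # component predictions
errors = np.array([0.20, 0.05, 0.40])       # their instantaneous errors
w = alnc_weights(errors)
fused = float(w @ preds)                    # normalized linear combination

Because the weights are recomputed from instance to instance, components that are momentarily inaccurate are downweighted, which is what makes the ensemble selective rather than a fixed average.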


2000 ◽  
Vol 10 (06) ◽  
pp. 453-465 ◽  
Author(s):  
MARK ORR ◽  
JOHN HALLAM ◽  
KUNIO TAKEZAWA ◽  
ALAN MURRAY ◽  
SEISHI NINOMIYA ◽  
...  

We describe a method for non-parametric regression that combines regression trees with radial basis function networks. The method is similar to that of Kubat [1], who was the first to suggest such a combination, but incorporates some significant improvements. We demonstrate the features of the new method, compare its performance with other methods on DELVE data sets, and apply it to a real-world problem involving the classification of soybean plants from digital images.
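
As a rough sketch of the general idea (our assumptions, not Orr et al.'s exact algorithm): a regression tree partitions the input space; each leaf supplies one Gaussian unit, with its center at the mean of the leaf's training points and its width taken from their spread; the output weights are then fitted by linear least squares. The function names and the width heuristic below are illustrative.

# Tree-seeded RBF network: tree leaves -> Gaussian centers/widths,
# then a linear least-squares fit for the output weights.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def tree_rbf_fit(X, y, max_leaves=20):
    tree = DecisionTreeRegressor(max_leaf_nodes=max_leaves).fit(X, y)
    leaves = tree.apply(X)                   # leaf id for each sample
    centers, widths = [], []
    for leaf in np.unique(leaves):
        pts = X[leaves == leaf]
        centers.append(pts.mean(axis=0))     # center = leaf centroid
        widths.append(pts.std(axis=0).mean() + 1e-6)  # width = leaf spread
    C, s = np.array(centers), np.array(widths)
    # Design matrix of Gaussian responses, one column per leaf/unit.
    H = np.exp(-np.sum((X[:, None, :] - C[None]) ** 2, axis=2) / (2 * s**2))
    w, *_ = np.linalg.lstsq(H, y, rcond=None)
    return C, s, w

def tree_rbf_predict(X, C, s, w):
    H = np.exp(-np.sum((X[:, None, :] - C[None]) ** 2, axis=2) / (2 * s**2))
    return H @ w

The appeal of the combination is that the tree handles structure discovery (where to place units and at what scale), leaving the RBF network with only a cheap linear fit.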


1991 ◽  
Vol 3 (2) ◽  
pp. 246-257 ◽  
Author(s):  
J. Park ◽  
I. W. Sandberg

There have been several recent studies concerning feedforward networks and the problem of approximating arbitrary functionals of a finite number of real variables. Some of these studies deal with cases in which the hidden-layer nonlinearity is not a sigmoid. This was motivated by successful applications of feedforward networks with nonsigmoidal hidden-layer units. This paper reports on a related study of radial-basis-function (RBF) networks, and it is proved that RBF networks having one hidden layer are capable of universal approximation. Here the emphasis is on the case of typical RBF networks, and the results show that a certain class of RBF networks with the same smoothing factor in each kernel node is broad enough for universal approximation.
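
In the paper's setting, the networks under study have (in illustrative notation of ours, not necessarily the paper's symbols) the form

    q(x) = \sum_{i=1}^{M} w_i \, K\!\left( \frac{x - z_i}{\sigma} \right), \qquad x \in \mathbb{R}^r,

where K is the kernel function, z_i \in \mathbb{R}^r are the centers, w_i \in \mathbb{R} are the output weights, and a single smoothing factor \sigma > 0 is shared by all M kernel nodes. Universal approximation then means that, under mild conditions on K (for instance, integrability with a nonzero integral), this class is dense in L^p(\mathbb{R}^r), i.e., any target function in that space can be approximated to arbitrary accuracy by some network of this form.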

