Ensemble Learning via Extreme Learning Machines for Imbalanced Data

Author(s):  
Adnan Omer Abuassba ◽  
Dezheng O. Zhang ◽  
Xiong Luo

Ensembles reduce the risk of selecting the wrong model by aggregating all candidate models, and they are generally more accurate than single models. Accuracy has been identified as an important factor in explaining the success of ensembles, and several techniques have been proposed to improve it, but no single technique works best in all settings. The focus of this research is on how to create an accurate ensemble of extreme learning machines (ELMs) for classification of supervised, noisy, imbalanced, and semi-supervised data. To deal with these issues, the authors propose AELME, a heterogeneous ensemble whose base learners are drawn from different ELM algorithms, including regularized ELM (RELM) and kernel ELM (KELM). AELME is a diverse AdaBoost-based ELM ensemble for binary and multiclass classification, designed to cope with imbalanced data.
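
The following is a minimal, hypothetical sketch (Python, numpy only) of the general idea behind an AdaBoost-style ensemble of ELM base learners; the hidden-layer size, regularization constant, and weighting scheme are illustrative assumptions and not the authors' exact AELME procedure.

```python
# Hypothetical sketch: AdaBoost-style loop over regularized ELM base learners.
# Not the authors' AELME algorithm; hyperparameters are illustrative.
import numpy as np

class RegularizedELM:
    """Single-hidden-layer ELM with ridge-regularized, sample-weighted output weights."""
    def __init__(self, n_hidden=50, C=10.0, rng=None):
        self.n_hidden, self.C = n_hidden, C
        self.rng = rng or np.random.default_rng(0)

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, y, sample_weight):
        # Random input weights and biases, as in a standard ELM.
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = self._hidden(X)
        D = np.diag(sample_weight)                       # weight samples in the least-squares fit
        A = H.T @ D @ H + np.eye(self.n_hidden) / self.C
        self.beta = np.linalg.solve(A, H.T @ D @ y)
        return self

    def predict(self, X):
        return np.sign(self._hidden(X) @ self.beta)      # labels in {-1, +1}

def adaboost_elm(X, y, n_rounds=10):
    """AdaBoost.M1-style ensemble of ELMs for binary labels in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    learners, alphas = [], []
    for t in range(n_rounds):
        clf = RegularizedELM(rng=np.random.default_rng(t)).fit(X, y, w)
        pred = clf.predict(X)
        err = np.clip(w[pred != y].sum(), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        w *= np.exp(-alpha * y * pred)                   # up-weight misclassified samples
        w /= w.sum()
        learners.append(clf)
        alphas.append(alpha)
    def predict(X_new):
        votes = sum(a * c.predict(X_new) for a, c in zip(alphas, learners))
        return np.sign(votes)
    return predict
```

Up-weighting misclassified samples at each round is what lets a boosted ensemble concentrate on the minority class, which is where a single learner trained on imbalanced data tends to fail.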

2014 ◽  
Vol 548-549 ◽  
pp. 1735-1738 ◽  
Author(s):  
Jian Tang ◽  
Dong Yan ◽  
Li Jie Zhao

Modeling concrete compressive strength is useful for ensuring quality in civil engineering. This paper compares several extreme learning machine (ELM) based modeling approaches for predicting concrete compressive strength: the normal ELM algorithm, the partial least squares-based ELM (PLS-ELM) algorithm, and the kernel ELM (KELM) algorithm. Results indicate that the normal ELM algorithm has the highest modeling speed, while KELM gives the best prediction accuracy. Each method is validated for modeling concrete compressive strength, and the appropriate approach should be selected according to the purpose at hand.
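
As a rough illustration of the contrast between the normal ELM and KELM regressors compared above, here is a minimal sketch assuming numpy, an RBF kernel, and illustrative hyperparameters; it is not the paper's implementation.

```python
# Hypothetical sketch: plain ELM vs. kernel ELM (KELM) for regression.
# Hidden-layer size, C, and gamma are illustrative assumptions.
import numpy as np

def elm_regressor(X_train, y_train, X_test, n_hidden=100, seed=0):
    """Plain ELM: random hidden layer + least-squares output weights."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X_train.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H_train = np.tanh(X_train @ W + b)
    beta, *_ = np.linalg.lstsq(H_train, y_train, rcond=None)
    return np.tanh(X_test @ W + b) @ beta

def kelm_regressor(X_train, y_train, X_test, C=10.0, gamma=0.1):
    """Kernel ELM: replaces the random hidden layer with an RBF kernel matrix."""
    def rbf(A, B):
        sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * sq)
    K = rbf(X_train, X_train)
    alpha = np.linalg.solve(K + np.eye(len(X_train)) / C, y_train)
    return rbf(X_test, X_train) @ alpha
```

The trade-off reported in the paper is visible in the structure: the plain ELM only solves a system whose size is the hidden-layer width (fast), whereas KELM solves an n-by-n kernel system (slower but often more accurate).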


2014 ◽  
Vol 2014 ◽  
pp. 1-11 ◽  
Author(s):  
Xinran Zhou ◽  
Zijian Liu ◽  
Congxu Zhu

To apply single hidden-layer feedforward neural networks (SLFNs) to the identification of time-varying systems, this paper presents an online regularized extreme learning machine (ELM) with a forgetting mechanism (FORELM) and an online kernelized ELM with a forgetting mechanism (FOKELM). FORELM updates the output weights of the SLFN recursively using the Sherman-Morrison formula and combines the advantages of online sequential ELM with forgetting mechanism (FOS-ELM) and regularized online sequential ELM (ReOS-ELM): it captures the latest properties of the identified system by learning from a fixed number of the newest samples, while regularization avoids ill-conditioned matrix inversion. FOKELM tackles the matrix-expansion problem of kernel-based incremental ELM (KB-IELM) by deleting the oldest sample, via the block matrix inverse formula, as new samples arrive continually. Experimental results show that the proposed FORELM and FOKELM are more stable than FOS-ELM and more accurate than ReOS-ELM in nonstationary environments; moreover, under certain conditions they are more time-efficient than the dynamic regression extreme learning machine (DR-ELM).
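
A minimal sketch of the forgetting idea behind FORELM is given below, assuming a fixed sliding window and a tanh hidden layer; the rank-one Sherman-Morrison updates add the newest sample and discard the oldest, but the details (window handling, initialization) are assumptions rather than the paper's exact algorithm.

```python
# Hypothetical sketch: online ELM with a forgetting mechanism, maintaining the
# inverse of the regularized Gram matrix via rank-one Sherman-Morrison updates.
import numpy as np
from collections import deque

class OnlineForgettingELM:
    def __init__(self, n_features, n_hidden=30, C=100.0, window=200, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(size=(n_features, n_hidden))
        self.b = rng.normal(size=n_hidden)
        self.P = C * np.eye(n_hidden)      # inverse of the regularized Gram matrix (I/C)^{-1}
        self.beta = np.zeros(n_hidden)     # output weights
        self.window = window
        self.buffer = deque()              # (hidden vector, target) pairs inside the window

    def _hidden(self, x):
        return np.tanh(np.asarray(x, dtype=float) @ self.W + self.b)

    def _rank_one(self, h, sign):
        """Sherman-Morrison update of P for A <- A + sign * h h^T."""
        Ph = self.P @ h
        self.P -= sign * np.outer(Ph, Ph) / (1.0 + sign * h @ Ph)

    def update(self, x, y):
        h = self._hidden(x)
        # Incorporate the newest sample (recursive least squares step).
        self._rank_one(h, +1.0)
        self.beta += self.P @ h * (y - h @ self.beta)
        self.buffer.append((h, y))
        # Forget the oldest sample once the sliding window is full.
        if len(self.buffer) > self.window:
            h_old, y_old = self.buffer.popleft()
            self._rank_one(h_old, -1.0)
            self.beta -= self.P @ h_old * (y_old - h_old @ self.beta)

    def predict(self, x):
        return self._hidden(x) @ self.beta
```

Because every update is a rank-one correction of an n_hidden-by-n_hidden matrix, the per-sample cost stays constant regardless of how many samples have streamed past, which is the point of combining recursion with forgetting.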


2017 ◽  
Vol 22 (6) ◽  
pp. 691-701 ◽  
Author(s):  
Adnan O. M. Abuassba ◽  
Yao Zhang ◽  
Xiong Luo ◽  
Dezheng Zhang ◽  
Wulamu Aziguli
