Deep Network Based on Stacked Orthogonal Convex Incremental ELM Autoencoders

2016 ◽  
Vol 2016 ◽  
pp. 1-17 ◽  
Author(s):  
Chao Wang ◽  
Jianhui Wang ◽  
Shusheng Gu

Extreme learning machine (ELM), as an emerging technology, has recently attracted many researchers' interest due to its fast learning speed and state-of-the-art generalization ability. Meanwhile, the incremental extreme learning machine (I-ELM), based on an incremental learning algorithm, was proposed and outperforms many popular learning algorithms. However, incremental ELM algorithms do not recalculate the output weights of all existing nodes when a new node is added and therefore cannot obtain the least-squares solution of the output weight vectors. In this paper, we propose the orthogonal convex incremental extreme learning machine (OCI-ELM), which combines the Gram-Schmidt orthogonalization method with Barron's convex optimization learning method to solve the nonconvex optimization and least-squares solution problems, and we give rigorous theoretical proofs. Moreover, we propose a deep architecture based on stacked OCI-ELM autoencoders, following the stacked generalization philosophy, for solving large and complex data problems. Experimental results on both UCI datasets and large datasets demonstrate that the deep network based on stacked OCI-ELM autoencoders (DOC-IELM-AEs) outperforms the other methods considered in the paper on both regression and classification problems.
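To illustrate the core idea, the following is a minimal NumPy sketch (not the authors' DOC-IELM-AEs code; the function name, activation, and node count are assumptions) of how Gram-Schmidt orthogonalization of each new hidden node's output lets an incremental ELM retain a closed-form least-squares solution for all nodes as the network grows.

```python
import numpy as np

def oci_elm_sketch(X, y, n_nodes=50, rng=np.random.default_rng(0)):
    """Illustrative incremental ELM whose hidden-node outputs are
    Gram-Schmidt orthogonalized as nodes are added."""
    n, d = X.shape
    H_orth = []                                # orthonormalized hidden-output columns
    for _ in range(n_nodes):
        w = rng.standard_normal(d)
        b = rng.standard_normal()
        h = np.tanh(X @ w + b)                 # new random hidden node's output
        for q in H_orth:                       # Gram-Schmidt against existing columns
            h = h - (q @ h) * q
        norm = np.linalg.norm(h)
        if norm < 1e-10:                       # node adds no new direction; skip it
            continue
        H_orth.append(h / norm)
    Q = np.column_stack(H_orth)
    beta = Q.T @ y                             # least squares is a single projection
    return Q, beta                             # beta is expressed in the orthonormal basis

# toy usage
X = np.random.default_rng(1).standard_normal((200, 5))
y = np.sin(X[:, 0]) + 0.1 * X[:, 1]
Q, beta = oci_elm_sketch(X, y)
print(np.mean((Q @ beta - y) ** 2))
```

Because the retained columns are orthonormal, the output weights for the whole network are recovered in one projection whenever a node is added; the authors' convex-optimization step and the stacked-autoencoder architecture are not reproduced here.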

2014 ◽  
Vol 2014 ◽  
pp. 1-8 ◽  
Author(s):  
Qingsong Xu

Extreme learning machine (ELM) is a learning algorithm for single-hidden-layer feedforward neural networks designed for extremely fast learning. However, the performance of ELM in structural impact localization has not yet been investigated. In this paper, a comparison study of ELM with the least squares support vector machine (LSSVM) is presented for the application of impact localization on a plate structure with surface-mounted piezoelectric sensors. Both basic and kernel-based ELM regression models have been developed for location prediction. Comparative studies of the basic ELM, kernel-based ELM, and LSSVM models are carried out. Results show that the kernel-based ELM requires the shortest learning time and produces suboptimal localization accuracy among the three models. Hence, ELM offers a promising approach to structural impact detection.
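For context, the sketch below contrasts the two regression variants compared in the study: a basic ELM with random hidden weights and a kernel-based ELM solved in the sample space. It is a generic NumPy illustration, not the paper's implementation, and the hyperparameters (hidden-node count L, regularization C, RBF width gamma) and the toy data are assumed values.

```python
import numpy as np

def basic_elm_fit(X, y, L=100, C=1.0, rng=np.random.default_rng(0)):
    """Basic ELM regression: random hidden layer, ridge-regularized output weights."""
    W = rng.standard_normal((X.shape[1], L))
    b = rng.standard_normal(L)
    H = np.tanh(X @ W + b)
    beta = np.linalg.solve(H.T @ H + np.eye(L) / C, H.T @ y)
    return lambda Xt: np.tanh(Xt @ W + b) @ beta

def kernel_elm_fit(X, y, C=1.0, gamma=0.1):
    """Kernel-based ELM regression: no explicit hidden layer, RBF kernel in sample space."""
    def rbf(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)
    alpha = np.linalg.solve(rbf(X, X) + np.eye(len(X)) / C, y)
    return lambda Xt: rbf(Xt, X) @ alpha

# toy usage: predict a 2-D "impact location" from 8 sensor features
rng = np.random.default_rng(1)
X = rng.standard_normal((300, 8))
y = np.column_stack([X[:, 0] + 0.5 * X[:, 1], X[:, 2] - X[:, 3]])
for fit in (basic_elm_fit, kernel_elm_fit):
    model = fit(X, y)
    print(fit.__name__, np.mean((model(X) - y) ** 2))
```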


2015 ◽  
Vol 2015 ◽  
pp. 1-12 ◽  
Author(s):  
Pengbo Zhang ◽  
Zhixin Yang

Extreme learning machine (ELM) has been well recognized as an effective learning algorithm with extremely fast learning speed and high generalization performance. However, to deal with regression applications involving big data, the stability and accuracy of ELM need to be further enhanced. In this paper, a new hybrid machine learning method called robust AdaBoost.RT based ensemble ELM (RAE-ELM) is proposed for regression problems; it combines ELM with a robust AdaBoost.RT algorithm to achieve better approximation accuracy than a single ELM network. The robust threshold for each weak learner is adapted according to the weak learner's performance on the corresponding problem dataset. Therefore, RAE-ELM outputs its final hypothesis as an optimally weighted ensemble of weak learners. Moreover, ELM is a quick learner with high regression performance, which makes it a good candidate for the "weak" learners. We prove that the empirical error of RAE-ELM is within a significantly superior bound. Experimental verification shows that the proposed RAE-ELM outperforms other state-of-the-art algorithms on many real-world regression problems.
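As a rough illustration of the ensemble structure, the sketch below trains ELM regressors under an AdaBoost.RT-style loop with a fixed relative-error threshold phi; the paper's robust, adaptively chosen threshold is not reproduced, and all names and hyperparameters are assumptions.

```python
import numpy as np

def rae_elm_sketch(X, y, T=10, phi=0.2, L=50, rng=np.random.default_rng(0)):
    """AdaBoost.RT-style ensemble of ELM regressors (fixed threshold phi,
    unlike the paper's adaptive robust threshold)."""
    n = len(y)
    D = np.full(n, 1.0 / n)                     # sample distribution
    models, log_weights = [], []
    for _ in range(T):
        idx = rng.choice(n, size=n, p=D)        # resample training set according to D
        W = rng.standard_normal((X.shape[1], L)); b = rng.standard_normal(L)
        beta = np.linalg.pinv(np.tanh(X[idx] @ W + b)) @ y[idx]
        pred = np.tanh(X @ W + b) @ beta
        rel_err = np.abs(pred - y) / (np.abs(y) + 1e-12)
        eps = float(D[rel_err > phi].sum())     # weighted misprediction rate
        eps = min(max(eps, 1e-12), 1 - 1e-12)
        D = np.where(rel_err > phi, D, D * eps) # shrink weights of well-predicted samples
        D /= D.sum()
        models.append((W, b, beta))
        log_weights.append(np.log(1.0 / eps))   # better learners get larger weights
    wsum = sum(log_weights)

    def predict(Xt):
        return sum(w * (np.tanh(Xt @ W + b) @ beta)
                   for w, (W, b, beta) in zip(log_weights, models)) / wsum
    return predict
```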


Symmetry ◽  
2019 ◽  
Vol 11 (10) ◽  
pp. 1284
Author(s):  
Licheng Cui ◽  
Huawei Zhai ◽  
Hongfei Lin

An extreme learning machine (ELM) is an innovative algorithm for single-hidden-layer feed-forward neural networks; essentially, it only needs to find the optimal output weights that minimize the output error via least-squares regression from the hidden layer to the output layer. Focusing on the output weights, we introduce an orthogonality constraint on the output weight matrix and propose a novel orthogonal extreme learning machine (NOELM) based on column-by-column optimization, whose main characteristic is that the optimization of the full output weight matrix is decomposed into optimizing the individual column vectors of the matrix. The complex orthogonal Procrustes problem is thus transformed into a simple least-squares regression with an orthogonality constraint, which preserves more information from the ELM feature space in the output subspace and gives NOELM stronger regression and discrimination ability. Experiments show that NOELM outperforms ELM and OELM in training time, testing time, and accuracy.
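For reference, the snippet below shows a closed-form SVD construction for orthogonal ELM output weights: it maximizes tr(BᵀHᵀT) subject to BᵀB = I, an SVD-based baseline in the spirit of OELM. The column-by-column optimization that distinguishes NOELM is not reproduced here, and the toy data are assumptions.

```python
import numpy as np

def orthogonal_output_weights(H, T):
    """Return B with orthonormal columns maximizing tr(B^T H^T T),
    i.e., an SVD-based orthogonality constraint on the ELM output weights."""
    U, _, Vt = np.linalg.svd(H.T @ T, full_matrices=False)
    return U @ Vt

# toy usage: random hidden-layer outputs H and one-hot class targets T
rng = np.random.default_rng(0)
H = np.tanh(rng.standard_normal((100, 30)))        # hidden outputs (n x L)
T = np.eye(3)[rng.integers(0, 3, size=100)]        # targets (n x m)
B = orthogonal_output_weights(H, T)
print(np.allclose(B.T @ B, np.eye(3), atol=1e-8))  # columns are orthonormal
```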


Author(s):  
Junhai Zhai ◽
Hongyu Xu ◽
Yan Li

Extreme learning machine (ELM) is an efficient and practical learning algorithm for training single-hidden-layer feed-forward neural networks (SLFNs). ELM can provide good generalization performance at extremely fast learning speed. However, ELM suffers from instability and over-fitting, especially on relatively large datasets. Based on probabilistic SLFNs, this paper proposes a fusion of extreme learning machines (F-ELM) with the fuzzy integral. The proposed algorithm consists of three stages. First, the bootstrap technique is employed to generate several subsets of the original dataset. Second, probabilistic SLFNs are trained with the ELM algorithm on each subset. Finally, the trained probabilistic SLFNs are fused with the fuzzy integral. The experimental results show that the proposed approach can alleviate the problems mentioned above to some extent and can increase prediction accuracy.
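The sketch below mirrors the three-stage structure described above (bootstrap subsets, ELM training on each subset, fuzzy-integral fusion) in plain NumPy. It is illustrative only: the ELMs here are ordinary rather than probabilistic SLFNs, the Sugeno integral uses equal, additively combined fuzzy densities for simplicity, and all names and sizes are assumptions.

```python
import numpy as np

def train_elm(X, T, L=40, rng=np.random.default_rng(0)):
    W = rng.standard_normal((X.shape[1], L)); b = rng.standard_normal(L)
    beta = np.linalg.pinv(np.tanh(X @ W + b)) @ T
    return lambda Xt: np.tanh(Xt @ W + b) @ beta

def sugeno_fuse(scores, densities):
    """Sugeno fuzzy integral of per-model supports for one class and one sample."""
    order = np.argsort(scores)[::-1]
    h = scores[order]
    g = np.clip(np.cumsum(densities[order]), 0, 1)   # additive measure for simplicity
    return np.max(np.minimum(h, g))

# stage 1 + 2: bootstrap subsets, one ELM per subset
rng = np.random.default_rng(1)
X = rng.standard_normal((300, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
T = np.eye(2)[y]
models = []
for _ in range(5):
    idx = rng.integers(0, len(X), size=len(X))
    models.append(train_elm(X[idx], T[idx], rng=rng))

# stage 3: fuse class supports with the Sugeno integral
densities = np.full(len(models), 1.0 / len(models))          # equal fuzzy densities (assumed)
supports = np.stack([np.clip(m(X), 0, 1) for m in models])   # (models, samples, classes)
fused = np.array([[sugeno_fuse(supports[:, i, c], densities)
                   for c in range(2)] for i in range(len(X))])
print("training accuracy:", (fused.argmax(1) == y).mean())
```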


2019 ◽  
Vol 36 (4) ◽  
pp. 3263-3269 ◽  
Author(s):  
Chunmei He ◽  
Yaqi Liu ◽  
Tong Yao ◽  
Fanhua Xu ◽  
Yanyun Hu ◽  
...  

2015 ◽  
Vol 2015 ◽  
pp. 1-10 ◽  
Author(s):  
Yang Liu ◽  
Bo He ◽  
Diya Dong ◽  
Yue Shen ◽  
Tianhong Yan ◽  
...  

A novel particle swarm optimization based selective ensemble (PSOSEN) of online sequential extreme learning machine (OS-ELM) is proposed. It builds on the original OS-ELM with an adaptive selective ensemble framework. This paper makes two novel contributions. First, a novel selective ensemble algorithm, particle swarm optimization selective ensemble (PSOSEN), is proposed; PSOSEN is a general selective ensemble method applicable to any learning algorithm, including both batch and online learning. Second, an adaptive selective ensemble framework for online learning is designed to balance the accuracy and speed of the algorithm. Experiments on both regression and classification problems with UCI datasets are carried out. Comparisons between OS-ELM, simple ensemble OS-ELM (EOS-ELM), genetic algorithm based selective ensemble (GASEN) of OS-ELM, and the proposed particle swarm optimization based selective ensemble of OS-ELM empirically show that the proposed algorithm achieves good generalization performance and fast learning speed.
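As background for the base learner, the class below is a minimal OS-ELM sketch: the output weights are updated chunk by chunk with the recursive least-squares formulas, so no retraining from scratch is needed. The class name, ridge constant, and activation are assumptions, and the PSO-based selection and weighting of multiple such learners is not shown.

```python
import numpy as np

class OSELM:
    """Minimal OS-ELM sketch: recursive least-squares update of the output weights."""
    def __init__(self, n_features, n_hidden, n_outputs, rng=np.random.default_rng(0)):
        self.W = rng.standard_normal((n_features, n_hidden))
        self.b = rng.standard_normal(n_hidden)
        self.beta = np.zeros((n_hidden, n_outputs))
        self.P = None

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)

    def init_fit(self, X0, T0):
        """Batch initialization on the first chunk (1e-6 is an assumed ridge term)."""
        H = self._hidden(X0)
        self.P = np.linalg.inv(H.T @ H + 1e-6 * np.eye(H.shape[1]))
        self.beta = self.P @ H.T @ T0

    def partial_fit(self, X, T):
        """Sequential update for a new chunk without revisiting old data."""
        H = self._hidden(X)
        K = np.linalg.inv(np.eye(len(X)) + H @ self.P @ H.T)
        self.P = self.P - self.P @ H.T @ K @ H @ self.P
        self.beta = self.beta + self.P @ H.T @ (T - H @ self.beta)

    def predict(self, X):
        return self._hidden(X) @ self.beta
```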


2021 ◽  
Vol 2021 ◽  
pp. 1-11
Author(s):  
Qinwei Fan ◽  
Tongke Fan

Extreme learning machine (ELM), as a simple new feedforward neural network learning algorithm, has been extensively used in practical applications because of its good generalization performance and fast learning speed. However, standard ELM requires more hidden nodes in practice due to the random assignment of hidden-layer parameters, which in turn brings disadvantages such as poor hidden-layer sparsity, low adjustment ability, and a complex network structure. In this paper, we propose a hybrid ELM algorithm based on the bat and cuckoo search algorithms to optimize the input weights and thresholds of ELM. We evaluate the numerical performance on function approximation and classification problems over several benchmark datasets; simulation results show that the proposed algorithm obtains significantly better prediction accuracy than similar algorithms.
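The sketch below shows the general shape of such hybrids: the ELM input weights and biases are encoded as a candidate vector whose fitness is the validation error of the resulting ELM, and a population-based search improves the candidates. A simple best-guided random perturbation stands in for the paper's bat/cuckoo hybrid, and every name and hyperparameter here is an assumption.

```python
import numpy as np

def elm_val_rmse(params, Xtr, ytr, Xval, yval, d, L):
    """Fitness: validation RMSE of an ELM whose input weights/biases are `params`."""
    W = params[: d * L].reshape(d, L)
    b = params[d * L :]
    beta = np.linalg.pinv(np.tanh(Xtr @ W + b)) @ ytr
    pred = np.tanh(Xval @ W + b) @ beta
    return np.sqrt(np.mean((pred - yval) ** 2))

def optimize_elm_inputs(Xtr, ytr, Xval, yval, L=30, pop=20, iters=50,
                        rng=np.random.default_rng(0)):
    d = Xtr.shape[1]
    dim = d * L + L
    swarm = rng.uniform(-1, 1, size=(pop, dim))
    fitness = np.array([elm_val_rmse(p, Xtr, ytr, Xval, yval, d, L) for p in swarm])
    for _ in range(iters):
        best = swarm[fitness.argmin()]
        # move candidates toward the current best with a random perturbation
        trial = swarm + 0.5 * (best - swarm) + 0.1 * rng.standard_normal((pop, dim))
        trial_fit = np.array([elm_val_rmse(p, Xtr, ytr, Xval, yval, d, L) for p in trial])
        improved = trial_fit < fitness
        swarm[improved], fitness[improved] = trial[improved], trial_fit[improved]
    return swarm[fitness.argmin()], fitness.min()

# toy usage
rng = np.random.default_rng(1)
X = rng.standard_normal((300, 5)); y = np.sin(X[:, 0]) + X[:, 1] ** 2
best_params, best_rmse = optimize_elm_inputs(X[:200], y[:200], X[200:], y[200:])
print(best_rmse)
```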


2018 ◽  
Vol 7 (4.35) ◽  
pp. 347
Author(s):  
Chong Tak Yaw ◽  
Shen Yuong Wong ◽  
Keem Siah Yap ◽  
Chin Hooi Tan

Extreme learning machine (ELM) is widely known as a learning algorithm that is more effective than conventional learning methods in terms of both learning speed and generalization. The hidden neurons need not be tuned; only the output weights linking the hidden layer to the output layer need to be learned. On the other hand, an ensemble model integrates the independent predictions of several ELMs to produce a final output, and this approach can be framed as a multi-agent system (MAS). By hybridizing these two approaches, a novel extreme learning machine based multi-agent system (ELM-MAS) for handling classification problems is presented in this paper. It contains two layers of ELMs, i.e., an individual-agent layer and a parent-agent layer. The developed ELM-MAS was tested with several activation functions on benchmark datasets and real-world applications, i.e., satellite image classification, image segmentation, and fault diagnosis in power generation (including circulating water systems and the GAST governor). Our experimental results suggest that ELM-MAS achieves good accuracy rates relative to other approaches.
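A minimal sketch of the two-layer idea follows, assuming the parent ELM is trained on the concatenated outputs of the individual-agent ELMs (the paper's exact agent configuration may differ); the activation choices, sizes, and toy data are assumptions.

```python
import numpy as np

def elm_train(X, T, L, act, rng):
    W = rng.standard_normal((X.shape[1], L)); b = rng.standard_normal(L)
    beta = np.linalg.pinv(act(X @ W + b)) @ T
    return lambda Xt: act(Xt @ W + b) @ beta

rng = np.random.default_rng(0)
X = rng.standard_normal((400, 6))
y = (X[:, 0] * X[:, 1] > 0).astype(int)
T = np.eye(2)[y]

# individual-agent layer: ELMs with different activation functions
acts = [np.tanh, lambda z: 1 / (1 + np.exp(-z)), np.sin]
agents = [elm_train(X, T, L=40, act=a, rng=rng) for a in acts]

# parent-agent layer: an ELM trained on the concatenated agent outputs
Z = np.hstack([a(X) for a in agents])
parent = elm_train(Z, T, L=40, act=np.tanh, rng=rng)

pred = parent(np.hstack([a(X) for a in agents])).argmax(1)
print("training accuracy:", (pred == y).mean())
```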


2018 ◽  
Vol 246 ◽  
pp. 03018
Author(s):  
Zuozhi Liu ◽  
JinJian Wu ◽  
Jianpeng Wang

Extreme learning machine (ELM) is a novel learning algorithm for generalized single-hidden-layer feedforward networks (SLFNs). Although it shows fast learning speed in many areas, there is still room for improvement in computational cost. To address this issue, this paper proposes an improved ELM (FRCF-ELM) which employs the full-rank Cholesky factorization to compute the output weights instead of the traditional SVD. In addition, this paper proves in theory that the proposed FRCF-ELM has lower computational complexity. Experimental results over some benchmark applications indicate that the proposed FRCF-ELM learns faster than the original ELM algorithm while preserving good generalization performance.
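To make the computational trade-off concrete, the sketch below contrasts the standard SVD-based pseudo-inverse with a Cholesky solve of the (lightly regularized) normal equations for the output weights. This covers only the full-column-rank case and is not the paper's full-rank factorization algorithm; the tiny ridge term is an assumption.

```python
import numpy as np

def elm_beta_svd(H, T):
    """Baseline: Moore-Penrose pseudo-inverse via SVD (as in standard ELM)."""
    return np.linalg.pinv(H) @ T

def elm_beta_cholesky(H, T, reg=1e-8):
    """Cholesky route: factor the regularized normal equations H^T H beta = H^T T.
    Cheaper than SVD when H has full column rank; `reg` is a small assumed ridge term."""
    A = H.T @ H + reg * np.eye(H.shape[1])
    L = np.linalg.cholesky(A)                 # A = L L^T
    z = np.linalg.solve(L, H.T @ T)           # forward solve
    return np.linalg.solve(L.T, z)            # backward solve

rng = np.random.default_rng(0)
H = np.tanh(rng.standard_normal((1000, 100)))
T = rng.standard_normal((1000, 3))
print(np.allclose(elm_beta_svd(H, T), elm_beta_cholesky(H, T), atol=1e-4))
```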

