Incremental conic functions algorithm for large scale classification problems

2018 ◽  
Vol 77 ◽  
pp. 187-194 ◽  
Author(s):  
Emre Cimen ◽  
Gurkan Ozturk ◽  
Omer Nezih Gerek

Electronics ◽ 
2020 ◽  
Vol 9 (5) ◽  
pp. 792
Author(s):  
Dongbao Jia ◽  
Yuka Fujishita ◽  
Cunhua Li ◽  
Yuki Todo ◽  
Hongwei Dai

Thanks to its simple structure and low computational cost, the dendritic neuron model (DNM) is used as a neuron model for solving complex, nonlinear problems with high precision. Although the DNM achieves higher accuracy and effectiveness than a multilayer perceptron with a middle (hidden) layer on small-scale classification problems, it has not previously been applied to large-scale classification problems. To achieve better performance on practical problems, this experiment uses a neural network with random weights trained by an approximate Newton-type method as the comparison baseline, and applies three learning algorithms, back-propagation (BP), biogeography-based optimization (BBO), and a competitive swarm optimizer (CSO), to the DNM. Three classification problems are then solved with these learning algorithms to verify their precision and effectiveness on large-scale classification tasks. The results indicate that DNM + BP is the best choice for execution time; DNM + CSO offers the best combination of accuracy, stability, and execution time; and DNM + BBO is a sound choice when overall stability of performance and convergence rate matter most.
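The abstract does not spell out the DNM equations, but the standard dendritic neuron model consists of four layers: a synaptic layer, a dendritic (branch) layer, a membrane layer, and a soma layer. The NumPy sketch below illustrates that forward pass under those assumptions; the parameter names and shapes (w, q, k, ks, qs, M branches, D inputs) are illustrative choices, not taken from this paper.

```python
# Minimal sketch of a dendritic neuron model (DNM) forward pass for binary
# classification. Layer structure follows the standard DNM formulation;
# parameter names and default constants are illustrative assumptions.
import numpy as np

def dnm_forward(x, w, q, k=5.0, ks=5.0, qs=0.5):
    """Forward pass of a DNM with M dendritic branches and D inputs.

    x : (D,)   input features
    w : (M, D) synaptic weights
    q : (M, D) synaptic thresholds
    """
    # Synaptic layer: sigmoid connection between each input and each branch.
    Y = 1.0 / (1.0 + np.exp(-k * (w * x - q)))   # shape (M, D)
    # Dendritic layer: multiplicative interaction along each branch.
    Z = np.prod(Y, axis=1)                       # shape (M,)
    # Membrane layer: sum of all branch outputs.
    V = np.sum(Z)
    # Soma layer: sigmoid producing a class score in (0, 1).
    return 1.0 / (1.0 + np.exp(-ks * (V - qs)))

# Tiny usage example with random parameters.
rng = np.random.default_rng(0)
D, M = 4, 3                                      # inputs, dendritic branches
x = rng.random(D)
w = rng.normal(size=(M, D))
q = rng.normal(size=(M, D))
print(dnm_forward(x, w, q))                      # class score in (0, 1)
```

Any of the learning algorithms mentioned in the abstract could, in principle, tune w and q: BP would differentiate this forward pass, while population-based methods such as BBO or CSO would flatten w and q into a candidate vector and score it by classification error.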


Author(s):  
Ziad Akram Ali Hammouri ◽  
Manuel Fernandez Delgado ◽  
Eva Cernadas ◽  
Senen Barro

2020 ◽  
Vol 168 ◽  
pp. 26-33
Author(s):  
Natacha Gueorguieva ◽  
Iren Valova ◽  
Dominic Klusek
