Computationally efficient models for high-dimensional and large-scale classification problems

2009 ◽  
Author(s):  
Li Ma
2020 ◽  
Vol 2020 ◽  
pp. 1-7 ◽  
Author(s):  
Aboubakar Nasser Samatin Njikam ◽  
Huan Zhao

This paper introduces an extremely lightweight (just over two hundred thousand parameters) and computationally efficient CNN architecture, named CharTeC-Net (Character-based Text Classification Network), for character-based text classification problems. The architecture is composed of four building blocks for feature extraction. Each building block, except the last, uses 1 × 1 pointwise convolutional layers to add nonlinearity to the network and to increase the dimensions within the block. In addition, shortcut connections are used in each building block to facilitate the flow of gradients through the network and, more importantly, to ensure that the original signal present in the training data is shared across the building blocks. Experiments on eight standard large-scale text classification and sentiment analysis datasets demonstrate that CharTeC-Net outperforms baseline methods and achieves accuracy competitive with state-of-the-art methods, even though it has only between 181,427 and 225,323 parameters and weighs less than 1 megabyte.
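The abstract does not include code, but the building-block structure it describes (1 × 1 pointwise convolutions followed by a shortcut connection) can be sketched in NumPy. This is an illustrative sketch, not the authors' implementation: all function names and shapes are assumptions, and the channel expansion is omitted so the shortcut can be a plain addition.

```python
import numpy as np

def pointwise_conv(x, w, b):
    """A 1x1 convolution over a (length, channels) sequence is just a
    per-position linear map that mixes channels without touching
    neighboring positions."""
    return x @ w + b  # (L, C_in) @ (C_in, C_out) -> (L, C_out)

def building_block(x, w1, b1, w2, b2):
    """Sketch of one feature-extraction block: two pointwise
    convolutions with ReLU nonlinearity, plus a shortcut connection
    that adds the block's input back to its output."""
    h = np.maximum(0.0, pointwise_conv(x, w1, b1))  # nonlinearity
    h = np.maximum(0.0, pointwise_conv(h, w2, b2))
    return h + x  # shortcut keeps the original signal flowing

rng = np.random.default_rng(0)
length, channels = 16, 8
x = rng.standard_normal((length, channels))
w1, b1 = 0.1 * rng.standard_normal((channels, channels)), np.zeros(channels)
w2, b2 = 0.1 * rng.standard_normal((channels, channels)), np.zeros(channels)
y = building_block(x, w1, b1, w2, b2)
print(y.shape)  # (16, 8)
```

Because a 1 × 1 convolution acts independently at each sequence position, it adds depth and nonlinearity at very low parameter cost, which is consistent with the sub-megabyte model size reported above.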


2018 ◽  
Vol 77 ◽  
pp. 187-194 ◽  
Author(s):  
Emre Cimen ◽  
Gurkan Ozturk ◽  
Omer Nezih Gerek

2020 ◽  
Author(s):  
Saber Meamardoost ◽  
Mahasweta Bhattacharya ◽  
EunJung Hwang ◽  
Takaki Komiyama ◽  
Claudia Mewes ◽  
...  

The inference of neuronal connectomes from large-scale neuronal activity recordings, such as two-photon calcium imaging, is an active area of research in computational neuroscience. In this work, we developed FARCI (Fast and Robust Connectome Inference), a MATLAB package for neuronal connectome inference from high-dimensional two-photon calcium fluorescence data. We employed partial correlations as a measure of the strength of functional association between pairs of neurons to reconstruct a neuronal connectome. Using gold-standard datasets from the Neural Connectomics Challenge (NCC), we demonstrated that FARCI provides an accurate connectome and that its performance is robust to network size, missing neurons, and noise level. Moreover, FARCI is computationally efficient and highly scalable to large networks. Compared with the best-performing algorithm in the NCC, FARCI produces more accurate networks across different network sizes and subsampling levels, while running over two orders of magnitude faster.
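FARCI itself is a MATLAB package, but the core idea of scoring neuron pairs by partial correlation can be sketched in a few lines of Python. This is an illustrative sketch, not FARCI's actual code: partial correlations are read off the inverse of the covariance (precision) matrix of the activity traces, and the well-conditioned covariance here is an assumption.

```python
import numpy as np

def partial_correlations(activity):
    """Partial-correlation matrix from a (samples, neurons) activity
    matrix: invert the covariance to get the precision matrix P, then
    normalize via rho_ij = -P_ij / sqrt(P_ii * P_jj)."""
    cov = np.cov(activity, rowvar=False)
    prec = np.linalg.inv(cov)  # assumes cov is well-conditioned
    d = np.sqrt(np.diag(prec))
    rho = -prec / np.outer(d, d)
    np.fill_diagonal(rho, 1.0)
    return rho

rng = np.random.default_rng(1)
n_samples, n_neurons = 500, 5
traces = rng.standard_normal((n_samples, n_neurons))
# couple neuron 1 to neuron 0 so their partial correlation is high
traces[:, 1] += 0.8 * traces[:, 0]
rho = partial_correlations(traces)
print(rho.shape)  # (5, 5)
```

Unlike plain correlation, the partial correlation between two neurons conditions on all the others, which suppresses spurious links mediated by a shared neighbor; this is what makes it a reasonable proxy for direct functional connectivity.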


Electronics ◽  
2020 ◽  
Vol 9 (5) ◽  
pp. 792
Author(s):  
Dongbao Jia ◽  
Yuka Fujishita ◽  
Cunhua Li ◽  
Yuki Todo ◽  
Hongwei Dai

With its simple structure and low cost, the dendritic neuron model (DNM) is used as a neuron model to solve complex, nonlinear problems with high precision. Although the DNM achieves higher accuracy and effectiveness than the middle layer of a multilayer perceptron on small-scale classification problems, it has not previously been applied to large-scale classification problems. To achieve better performance on practical problems, this experiment uses an approximate Newton-type method neural network with random weights for comparison, and applies three learning algorithms to the DNM: back-propagation (BP), biogeography-based optimization (BBO), and a competitive swarm optimizer (CSO). Three classification problems are then solved with these learning algorithms to verify their precision and effectiveness on large-scale classification problems. In terms of execution time, DNM + BP is the optimum; DNM + CSO is the best in terms of both accuracy stability and execution time; and, considering the stability of comprehensive performance and the convergence rate, DNM + BBO is a wise choice.
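The DNM's forward pass follows a standard four-layer structure (synaptic, dendritic, membrane, soma) that can be sketched in NumPy. This is an illustrative sketch under assumed parameter names and gains, not the paper's exact notation or code.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def dnm_forward(x, w, theta, k=5.0, k_soma=5.0, theta_soma=0.5):
    """Forward pass of a dendritic neuron model (illustrative names).
    x: (n_inputs,) input vector; w, theta: (n_branches, n_inputs)
    synaptic weights and thresholds."""
    Y = sigmoid(k * (w * x - theta))        # synaptic layer, per branch/input
    Z = np.prod(Y, axis=1)                  # dendritic layer: product along each branch
    V = np.sum(Z)                           # membrane layer: sum over branches
    return sigmoid(k_soma * (V - theta_soma))  # soma layer

rng = np.random.default_rng(2)
n_branches, n_inputs = 4, 3
x = rng.random(n_inputs)
w = rng.standard_normal((n_branches, n_inputs))
theta = rng.standard_normal((n_branches, n_inputs))
out = dnm_forward(x, w, theta)
print(0.0 < out < 1.0)  # True: the soma output is a sigmoid
```

The multiplicative dendritic layer is what distinguishes the DNM from a perceptron's weighted sum; it is also what makes gradient-based training (BP) cheap per step while population-based optimizers such as BBO and CSO trade extra evaluations for stability, matching the comparison reported above.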


Author(s):  
Ziad Akram Ali Hammouri ◽  
Manuel Fernandez Delgado ◽  
Eva Cernadas ◽  
Senen Barro
