Tucker Decomposition
Recently Published Documents


TOTAL DOCUMENTS: 148 (FIVE YEARS: 80)

H-INDEX: 14 (FIVE YEARS: 3)

Symmetry, 2022, Vol 14 (1), pp. 113
Author(s): Rafał Zdunek, Krzysztof Fonał

Nonnegative Tucker decomposition (NTD) is a robust method for nonnegative multilinear feature extraction from nonnegative multi-way arrays. The standard version of NTD assumes that all of the observed data are available for batch processing. However, in many real-world applications the data are not static, or are represented by a number of multi-way samples too large to be processed in one batch. To tackle this problem, a dynamic approach to NTD can be explored. In this study, we extend the standard NTD model to an incremental (online) version, assuming volatility of the observed multi-way data along one mode. We propose two computational approaches for updating the factors in the incremental model: one based on a recursive update model, and the other on the block Kaczmarz method, which belongs to the family of coordinate descent methods. Experimental results on various datasets and streaming data demonstrate the high efficiency of both algorithmic approaches with respect to the baseline NTD methods.
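For readers who want to experiment with the incremental setting described above, the sketch below fits a nonnegative Tucker model with standard multiplicative updates and then warm-starts the factorization when new samples arrive along mode 0. This is a minimal illustration only: it is not the paper's recursive-update or block Kaczmarz algorithm, and all names, ranks, and sizes (`ntd_step`, `mode_dot`, etc.) are our own assumptions.

```python
import numpy as np

def unfold(T, n):
    """Mode-n unfolding: rows indexed by mode n, columns by the rest."""
    return np.moveaxis(T, n, 0).reshape(T.shape[n], -1)

def mode_dot(T, M, n):
    """Multiply tensor T by matrix M along mode n."""
    return np.moveaxis(np.tensordot(M, T, axes=(1, n)), 0, n)

def ntd_step(X, G, A, eps=1e-9):
    """One sweep of multiplicative updates for nonnegative Tucker:
    X ~= G x_1 A[0] x_2 A[1] ..., with all factors kept nonnegative."""
    N = X.ndim
    for n in range(N):
        # Core multiplied by every factor except mode n, unfolded along n.
        T = G
        for m in range(N):
            if m != n:
                T = mode_dot(T, A[m], m)
        M = unfold(T, n)
        Xn = unfold(X, n)
        A[n] *= (Xn @ M.T) / (A[n] @ (M @ M.T) + eps)
    # Core update: contract X with the factor transposes (numerator)
    # and G with the factor Gram matrices (denominator).
    num, den = X, G
    for m in range(N):
        num = mode_dot(num, A[m].T, m)
        den = mode_dot(den, A[m].T @ A[m], m)
    G *= num / (den + eps)
    return G, A

rng = np.random.default_rng(0)
ranks, shape = (4, 5, 5), (20, 30, 30)
X = rng.random(shape)                     # initial nonnegative batch
G = rng.random(ranks)
A = [rng.random((s, r)) for s, r in zip(shape, ranks)]
for _ in range(100):
    G, A = ntd_step(X, G, A)

# Five new multi-way samples arrive along mode 0: extend the mode-0
# factor with fresh rows and warm-start from the previous solution.
X = np.concatenate([X, rng.random((5, 30, 30))], axis=0)
A[0] = np.concatenate([A[0], rng.random((5, ranks[0]))], axis=0)
for _ in range(20):                       # far fewer sweeps than a cold start
    G, A = ntd_step(X, G, A)
```

Because the warm start reuses the previous core and the untouched factor modes, far fewer update sweeps are typically needed than when refitting the grown tensor from scratch, which is the basic appeal of any incremental NTD scheme.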


2021, pp. 107841
Author(s): Pengpeng Shao, Dawei Zhang, Guohua Yang, Jianhua Tao, Feihu Che, ...

2021, Vol 11 (16), pp. 7397
Author(s): Mauricio Maldonado-Chan, Andres Mendez-Vazquez, Ramon Osvaldo Guardado-Medina

Gated networks are networks that contain gating connections, in which the outputs of at least two neurons are multiplied. The basic idea of a gated restricted Boltzmann machine (RBM) is to use binary hidden units to learn the conditional distribution of one image (the output) given another image (the input). This allows the hidden units of a gated RBM to model the transformations between two successive images. Inference in the model consists of extracting the transformations given a pair of images. However, a fully connected multiplicative network has cubically many parameters, forming a three-dimensional interaction tensor that requires substantial memory and computation for inference and training. In this paper, we parameterize the bilinear interactions in the gated RBM through a multimodal tensor-based Tucker decomposition, which factorizes a tensor into a set of matrices and one (usually smaller) core tensor. This parameterization reduces the number of model parameters, lowers the computational cost of learning, and effectively strengthens structured feature learning. When trained on affine transformations of still images, we show how a completely unsupervised network learns explicit encodings of image transformations.
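To make the parameter savings concrete, here is a minimal sketch, assuming illustrative sizes and ranks rather than the authors' architecture, of computing a gated-RBM-style hidden pre-activation directly from the Tucker factors, without ever materializing the full three-way interaction tensor:

```python
import numpy as np

rng = np.random.default_rng(1)
I = J = 28 * 28          # flattened input / output image sizes (assumed)
K = 256                  # number of hidden units (assumed)
r1, r2, r3 = 32, 32, 32  # illustrative Tucker ranks

# Tucker factors of the three-way interaction tensor W (I x J x K):
# W[i,j,k] = sum_{a,b,c} G[a,b,c] * U[i,a] * V[j,b] * P[k,c]
G = rng.standard_normal((r1, r2, r3))
U = rng.standard_normal((I, r1))
V = rng.standard_normal((J, r2))
P = rng.standard_normal((K, r3))

x = rng.standard_normal(I)   # input image
y = rng.standard_normal(J)   # output image

# Hidden pre-activation sum_{i,j} W[i,j,k] x[i] y[j], computed without
# forming W: project x and y, contract with the core, map to hidden units.
s = np.einsum('abc,a,b->c', G, U.T @ x, V.T @ y)   # shape (r3,)
pre_h = P @ s                                       # shape (K,)

full = I * J * K
tucker = I * r1 + J * r2 + K * r3 + r1 * r2 * r3
print(f"full tensor: {full:,} parameters; Tucker: {tucker:,}")
# full tensor: 157,351,936 parameters; Tucker: 91,136
```

The contraction order (project the two images into the low-rank spaces first, then touch the small core) is what keeps both the parameter count and the per-example cost far below those of the dense interaction tensor.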


2021
Author(s): Zitong Li, Qiming Fang, Grey Ballard

Author(s): YuNing Qiu, GuoXu Zhou, XinQi Chen, DongPing Zhang, XinHai Zhao, ...
