Partitioned Alternating Least Squares Technique for Canonical Polyadic Tensor Decomposition

2016 · Vol 23 (7) · pp. 993-997
Author(s): Petr Tichavsky, Anh-Huy Phan, Andrzej Cichocki

2019 · Vol 11 (24) · pp. 2932
Author(s): Geunseop Lee

Hyperspectral imaging is widely used in many applications because it captures both the spatial and spectral distributions of a target scene. However, compression, in the form of a low multilinear rank approximation of the hyperspectral data, is required because the massive amount of data is difficult to manipulate directly. In this paper, we propose an efficient algorithm for higher-order singular value decomposition that decomposes a tensor into a compressed core tensor multiplied by orthogonal factor matrices. Specifically, we sequentially compute low-rank factor matrices from Tucker-1 model optimization problems via an alternating least squares approach. Experiments with real-world hyperspectral images show that the proposed algorithm computes the compressed tensor faster than other tensor decomposition-based compression algorithms, with no significant difference in compression accuracy.
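
The abstract describes the compression as a sequence of Tucker-1 subproblems solved by alternating least squares, but gives no implementation details. Below is a minimal NumPy sketch of a sequentially truncated Tucker compression in that spirit: each mode contributes an orthonormal factor matrix from a truncated SVD of the current unfolding, and the tensor is projected before moving to the next mode. The function name, the use of a plain SVD in place of the paper's ALS subroutine, and the example ranks are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def sequential_tucker_compression(X, ranks):
    """Compress a tensor into a small core times orthogonal factor matrices.

    Minimal sketch of a sequentially truncated higher-order SVD: for each mode,
    the partially compressed tensor is unfolded, a truncated SVD supplies an
    orthonormal factor matrix, and the tensor is projected onto that subspace
    before moving to the next mode.
    """
    core = X
    factors = []
    for mode, r in enumerate(ranks):
        # Mode-n matricization of the current core.
        unfolded = np.moveaxis(core, mode, 0).reshape(core.shape[mode], -1)
        # Orthonormal basis for the dominant r-dimensional column space.
        U, _, _ = np.linalg.svd(unfolded, full_matrices=False)
        U = U[:, :r]
        factors.append(U)
        # Project: core <- core x_mode U^T, then fold back into tensor form.
        projected = U.T @ unfolded
        new_shape = (r,) + tuple(s for i, s in enumerate(core.shape) if i != mode)
        core = np.moveaxis(projected.reshape(new_shape), 0, mode)
    return core, factors

# Example: compress a 100 x 100 x 50 hyperspectral-like cube to a 20 x 20 x 10 core.
X = np.random.rand(100, 100, 50)
core, factors = sequential_tucker_compression(X, ranks=(20, 20, 10))
```

The approximation is rebuilt by multiplying the core back along each mode with the stored factors, i.e. X is approximately core contracted with U1, U2, and U3 along modes 1, 2, and 3.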


2019 · Vol 18 (01) · pp. 129-147
Author(s): Xianpeng Mao, Gonglin Yuan, Yuning Yang

Although the alternating least squares (ALS) algorithm is a classic, easily implemented method that has been widely applied to tensor decomposition and approximation problems, it has some drawbacks: its convergence is not guaranteed, and in some cases the swamp phenomenon appears, slowing the convergence rate dramatically. To overcome these shortcomings, the regularized ALS algorithm (RALS) was proposed in the literature. In this paper, we develop a self-adaptive regularized alternating least squares method (SA-RALS) that accelerates RALS by employing an optimal step-size selection rule. Theoretically, we show that the step size is always larger than unity and can be larger than [Formula: see text], which is quite different from several optimization algorithms. Furthermore, under mild assumptions, we prove that the whole sequence generated by SA-RALS converges to a stationary point of the objective function. Numerical results verify that SA-RALS outperforms RALS in terms of both the number of iterations and the CPU time.
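
For readers unfamiliar with the regularized variant, the sketch below shows a proximally regularized ALS update for a third-order CP model in NumPy: each factor solves a ridge-like subproblem that penalizes the distance to its previous value, and a relaxation parameter `step` marks where a step size larger than unity would enter. The regularization weight `mu`, the fixed `step`, and the function names are assumptions for illustration; the self-adaptive step-size rule analyzed in the paper is not reproduced here.

```python
import numpy as np

def khatri_rao(U, V):
    """Column-wise Kronecker product of two factor matrices."""
    return (U[:, None, :] * V[None, :, :]).reshape(U.shape[0] * V.shape[0], -1)

def regularized_cp_als(X, rank, mu=0.1, step=1.0, n_iter=200, seed=0):
    """Sketch of a proximally regularized ALS (RALS-style) for a 3-way CP model.

    Each factor update minimizes ||X_(n) - F M^T||^2 + mu ||F - F_old||^2,
    which keeps successive iterates close and damps swamp behaviour; `step`
    applies a simple over-relaxation of the resulting update.
    """
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A, B, C = (rng.standard_normal((n, rank)) for n in (I, J, K))

    # Mode-n unfoldings consistent with C-order reshaping.
    X1 = X.reshape(I, J * K)
    X2 = np.moveaxis(X, 1, 0).reshape(J, I * K)
    X3 = np.moveaxis(X, 2, 0).reshape(K, I * J)

    def update(unfolding, factor, M):
        # Solve the regularized normal equations F (M^T M + mu I) = X_(n) M + mu F_old.
        gram = M.T @ M + mu * np.eye(rank)
        new = np.linalg.solve(gram, (unfolding @ M + mu * factor).T).T
        return factor + step * (new - factor)   # over-relaxed update

    for _ in range(n_iter):
        A = update(X1, A, khatri_rao(B, C))
        B = update(X2, B, khatri_rao(A, C))
        C = update(X3, C, khatri_rao(A, B))
    return A, B, C
```

With `step=1.0` this reduces to a plain regularized ALS iteration; values above one over-relax each factor update.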


1996 · Vol 8 (3) · pp. 133-144
Author(s): María del Mar del Pozo Andrés, Jacques F A Braster

In this article we propose two research techniques that can bridge the gap between quantitative and qualitative historical research: (1) a multiple regression approach that reveals general patterns among numerical variables and selects outliers for qualitative analysis; (2) a homogeneity analysis with alternating least squares that yields a two-dimensional picture in which the relationships between categorical variables are displayed graphically.
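
As an illustration of how the two techniques might be set up computationally, the sketch below pairs an ordinary least squares fit with a residual-based outlier flag, and derives two-dimensional object scores from an indicator-matrix SVD as a stand-in for homogeneity analysis. The threshold `z_thresh`, the centring choice, and the function names are assumptions for illustration; the iterative alternating-least-squares fitting of HOMALS itself is not reproduced.

```python
import numpy as np

def regression_outliers(X, y, z_thresh=2.5):
    """Fit an OLS model and flag cases with unusually large residuals.

    The flagged observations are candidates for closer qualitative study,
    in the spirit of the article's first technique. `z_thresh` is an
    arbitrary illustrative cut-off.
    """
    X1 = np.column_stack([np.ones(len(y)), X])          # add an intercept column
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    z = (resid - resid.mean()) / resid.std()
    return beta, np.flatnonzero(np.abs(z) > z_thresh)

def homogeneity_scores(categorical, n_dims=2):
    """Two-dimensional object scores from an indicator-matrix SVD.

    Each categorical column is expanded into dummy indicators, the matrix is
    centred, and the leading singular vectors give plotting coordinates, a
    minimal stand-in for a homogeneity-analysis solution.
    """
    blocks = []
    for col in categorical.T:
        cats = np.unique(col)
        blocks.append((col[:, None] == cats[None, :]).astype(float))
    G = np.hstack(blocks)
    G = G - G.mean(axis=0)                               # centre the indicators
    U, s, _ = np.linalg.svd(G, full_matrices=False)
    return U[:, :n_dims] * s[:n_dims]
```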

