A Generalized Graph Regularized Non-Negative Tucker Decomposition Framework for Tensor Data Representation

2020 ◽ pp. 1-14
Author(s): Yuning Qiu ◽ Guoxu Zhou ◽ Yanjiao Wang ◽ Yu Zhang ◽ Shengli Xie
Author(s): YuNing Qiu ◽ GuoXu Zhou ◽ XinQi Chen ◽ DongPing Zhang ◽ XinHai Zhao ◽ ...

2020 ◽ Vol 16 (4) ◽ pp. 155014772091640
Author(s): Chenquan Gan ◽ Junwei Mao ◽ Zufan Zhang ◽ Qingyi Zhu

Tensor compression algorithms play an important role in the processing of multidimensional signals. In previous work, tensor data structures are often destroyed by vectorization, causing information loss and introducing new noise. To this end, this article proposes a tensor compression algorithm that combines Tucker decomposition with dictionary dimensionality reduction, comprising three parts: tensor dictionary representation, dictionary preprocessing, and dictionary update. Specifically, sparse representation and Tucker decomposition are applied to the tensor, yielding the dictionary, the sparse coefficients, and the core tensor. The sparse representation can then be recovered through the relationship between the sparse coefficients and the core tensor. In addition, the dimensionality of the input tensor is reduced using concentrated dictionary learning. Finally, experiments show that, compared with other algorithms, the proposed algorithm has clear advantages in preserving the original data information and in denoising ability.
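The Tucker step underlying this kind of compression can be sketched with a plain truncated HOSVD in NumPy. This is a generic illustration of extracting a small core tensor and per-mode factor matrices, not the authors' dictionary-learning pipeline; the function names and ranks are illustrative assumptions:

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: move `mode` to the front, flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(T, ranks):
    """Truncated HOSVD (a standard way to compute a Tucker model):
    take the leading left singular vectors of each unfolding as the
    factor matrices, then project T onto them to get the core."""
    factors = []
    for mode, r in enumerate(ranks):
        U, _, _ = np.linalg.svd(unfold(T, mode), full_matrices=False)
        factors.append(U[:, :r])
    core = T
    for mode, U in enumerate(factors):
        # contract mode `mode` of the core with U^T
        core = np.moveaxis(
            np.tensordot(U.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
    return core, factors

def reconstruct(core, factors):
    """Multiply the core back by each factor matrix along its mode."""
    T = core
    for mode, U in enumerate(factors):
        T = np.moveaxis(
            np.tensordot(U, np.moveaxis(T, mode, 0), axes=1), 0, mode)
    return T

rng = np.random.default_rng(0)
X = rng.standard_normal((8, 9, 10))
core, factors = hosvd(X, ranks=(4, 4, 4))
X_hat = reconstruct(core, factors)
print(core.shape)  # (4, 4, 4): the compressed representation
```

Because the data keep their multi-way layout throughout, no vectorization step flattens the tensor, which is the structural advantage the abstract contrasts against earlier vectorization-based compression.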


2020 ◽  
Author(s):  
Dimitris G. Chachlakis ◽  
Mayur Dhanaraj ◽  
Ashley Prater-Bennette ◽  
Panos P. Markopoulos

Tucker decomposition is a standard method for processing multi-way (tensor) measurements and finds many applications in machine learning and data mining, among other fields. When tensor measurements arrive in a streaming fashion or are too many to jointly decompose, incremental Tucker analysis is preferred. In addition, dynamic basis adaptation is desired when the nominal data subspaces change. At the same time, it has been documented that outliers in the data can significantly compromise the performance of existing methods for dynamic Tucker analysis. In this work, we present Dynamic L1-Tucker: an algorithm for dynamic and outlier-resistant Tucker analysis of tensor data. Our experimental studies on both real and synthetic datasets corroborate that the proposed method (i) attains high basis estimation performance, (ii) identifies/rejects outliers, and (iii) adapts to nominal subspace changes.
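A minimal single-mode sketch of the streaming idea in NumPy: project each incoming measurement onto the current basis, reject it when too much of its energy lies outside the subspace, and otherwise fold it into a forgetting-factor basis update. This sketch uses an ordinary L2 residual test and SVD update rather than the paper's L1-norm formulation; `dyn_tucker_step`, `lam`, and `tol` are illustrative assumptions, not the authors' API:

```python
import numpy as np

def dyn_tucker_step(U, x, lam=0.9, tol=0.5):
    """One streaming update of an n x r orthonormal mode basis U.
    Rejects x as an outlier when the relative energy outside span(U)
    exceeds `tol` (a hypothetical rule); otherwise blends x into a
    forgetting-factor snapshot and keeps the r leading singular vectors."""
    x = np.asarray(x, dtype=float).reshape(-1, 1)
    resid = np.linalg.norm(x - U @ (U.T @ x)) / max(np.linalg.norm(x), 1e-12)
    if resid > tol:
        return U, False          # outlier: basis left unchanged
    W, _, _ = np.linalg.svd(np.hstack([lam * U, x]), full_matrices=False)
    return W[:, :U.shape[1]], True

U = np.eye(5)[:, :2]                              # nominal 2-D subspace of R^5
inlier = np.array([1.0, 2.0, 0.0, 0.0, 0.0])      # lies in span(U)
outlier = np.array([0.0, 0.0, 5.0, 0.0, 0.0])     # orthogonal to span(U)
U, ok_in = dyn_tucker_step(U, inlier)
U, ok_out = dyn_tucker_step(U, outlier)
print(ok_in, ok_out)  # True False
```

The rejection test is what keeps a corrupted measurement from dragging the basis away from the nominal subspace, while accepted measurements still let the basis drift when the subspace genuinely changes.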



