Robust Tensor Decomposition via Orientation Invariant Tubal Nuclear Norms

2020 ◽  
Vol 34 (04) ◽  
pp. 6102-6109 ◽  
Author(s):  
Andong Wang ◽  
Chao Li ◽  
Zhong Jin ◽  
Qibin Zhao

Low-rank tensor recovery has been widely applied in computer vision and machine learning. Recently, tubal nuclear norm (TNN) based optimization was proposed, with superior performance compared to other tensor nuclear norms. However, one major limitation is its orientation sensitivity: low-rankness is strictly defined along the tubal orientation, so TNN cannot simultaneously model spectral low-rankness in multiple orientations. To this end, we introduce two new tensor norms, OITNN-O and OITNN-L, to exploit multi-orientational spectral low-rankness for arbitrary K-way (K ≥ 3) tensors. We further formulate two robust tensor decomposition models via the proposed norms and develop two algorithms as the solutions. Theoretically, we establish non-asymptotic error bounds which can predict the scaling behavior of the estimation error. Experiments on real-world datasets demonstrate the superiority and effectiveness of the proposed norms.
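As background for the tubal framework, the TNN of a 3-way tensor is commonly computed by taking the FFT along the third (tubal) mode and summing the nuclear norms of the resulting frontal slices; an orientation-invariant norm in the spirit of OITNN-O can then be sketched as a weighted sum of TNNs over the three orientations. The snippet below is a minimal NumPy sketch of these definitions, not the paper's exact formulation — the 1/n3 scaling and the uniform weights are convention-dependent assumptions:

```python
import numpy as np

def tubal_nuclear_norm(X, axis=2):
    """TNN of a 3-way tensor: FFT along the tubal axis, then the sum of
    nuclear norms (singular values) of the frontal slices in the Fourier
    domain, scaled by 1/n3 (scaling varies across conventions)."""
    Xf = np.fft.fft(X, axis=axis)
    n = X.shape[axis]
    total = sum(np.linalg.norm(np.take(Xf, k, axis=axis), ord='nuc')
                for k in range(n))
    return total / n

def oitnn_o_sketch(X, weights=(1/3, 1/3, 1/3)):
    """Hypothetical orientation-invariant norm: a weighted sum of TNNs
    computed with each of the three modes taken as the tubal axis."""
    return sum(w * tubal_nuclear_norm(X, axis=a)
               for a, w in enumerate(weights))
```

For example, a tensor whose first frontal slice is the 3×3 identity (all other slices zero) has every Fourier-domain slice equal to the identity, so its TNN under this scaling is 3.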

Author(s):  
Longhao Yuan ◽  
Chao Li ◽  
Danilo Mandic ◽  
Jianting Cao ◽  
Qibin Zhao

In tensor completion tasks, traditional low-rank tensor decomposition models suffer from a laborious model selection problem due to their high model sensitivity. In particular, for tensor ring (TR) decomposition, the number of model possibilities grows exponentially with the tensor order, which makes it rather challenging to find the optimal TR decomposition. In this paper, by exploiting the low-rank structure of the TR latent space, we propose a novel tensor completion method that is robust to model selection. In contrast to imposing the low-rank constraint on the data space, we introduce nuclear norm regularization on the latent TR factors, so that the singular value decomposition (SVD) steps of the optimization are performed at a much smaller scale. By leveraging the alternating direction method of multipliers (ADMM) scheme, the latent TR factors with optimal rank and the recovered tensor can be obtained simultaneously. Our proposed algorithm is shown to effectively alleviate the burden of TR-rank selection, thereby greatly reducing the computational cost. Extensive experimental results on both synthetic and real-world data demonstrate the superior performance and efficiency of the proposed approach against state-of-the-art algorithms.
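The TR format underlying this approach represents a tensor by a chain of 3-way cores G_k of shape (r_k, n_k, r_{k+1}), closed into a ring by a trace; the latent-space idea is that the nuclear-norm regularizer acts on unfoldings of these small cores rather than on the full data tensor. A minimal NumPy sketch follows — the particular core unfolding used in the regularizer here is an assumption for illustration, not necessarily the one in the paper:

```python
import numpy as np

def tr_reconstruct(cores):
    """Reconstruct a tensor from tensor-ring (TR) cores.
    Core G_k has shape (r_k, n_k, r_{k+1}), with the last rank
    wrapping around to the first (the 'ring')."""
    res = cores[0]                                  # (r0, n1, r1)
    for G in cores[1:]:
        res = np.tensordot(res, G, axes=([-1], [0]))
    # res has shape (r0, n1, ..., nK, r0); close the ring by tracing
    return np.trace(res, axis1=0, axis2=-1)

def latent_nuclear_reg(cores):
    """Sum of nuclear norms of each core's mode-2 unfolding
    (n_k x r_k*r_{k+1}) -- a sketch of the latent-space regularizer."""
    total = 0.0
    for G in cores:
        r1, n, r2 = G.shape
        mat = G.transpose(1, 0, 2).reshape(n, r1 * r2)
        total += np.linalg.norm(mat, ord='nuc')
    return total
```

Because each SVD inside the regularizer touches only an n_k × r_k·r_{k+1} matrix, it is far cheaper than an SVD of any unfolding of the full data tensor.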


2021 ◽  
Vol 13 (20) ◽  
pp. 4116
Author(s):  
Meng Cao ◽  
Wenxing Bao ◽  
Kewen Qu

The hyperspectral image super-resolution (HSI-SR) problem aims at reconstructing the high-resolution spatial–spectral information of a scene by fusing a low-resolution hyperspectral image (LR-HSI) with the corresponding high-resolution multispectral image (HR-MSI). In order to effectively preserve the spatial and spectral structure of hyperspectral images, a new joint regularized low-rank tensor decomposition method (JRLTD) is proposed for HSI-SR. This model alleviates the problem that traditional tensor-decomposition-based HSI-SR methods fail to adequately take into account the manifold structure of the high-dimensional high-resolution HSI (HR-HSI) and are sensitive to outliers and noise. The model first applies the classical Tucker decomposition to transform the hyperspectral data into a core tensor multiplied by three mode dictionaries, after which graph regularization and unidirectional total variation (TV) regularization are introduced to constrain the three mode dictionaries. In addition, we impose the l1-norm on the core tensor to characterize its sparsity. While effectively preserving the spatial and spectral structures of the fused hyperspectral images, the presence of anomalous noise values in the images is reduced. In this paper, the hyperspectral image super-resolution problem is transformed into a joint regularization optimization problem based on tensor decomposition and solved by a hybrid framework combining the alternating direction method of multipliers (ADMM) and the proximal alternate optimization (PAO) algorithm. Experimental results conducted on two benchmark datasets and one real dataset show that JRLTD outperforms state-of-the-art hyperspectral super-resolution algorithms.
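The classical Tucker step that the method starts from can be sketched via the truncated higher-order SVD (HOSVD): each mode dictionary is taken from the SVD of the corresponding unfolding, and the core is obtained by projecting the data onto all factors. This is only the plain Tucker baseline, not the full JRLTD with graph/TV regularization and l1 sparsity:

```python
import numpy as np

def unfold(X, mode):
    """Mode-n unfolding: the chosen mode becomes the rows."""
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def hosvd(X, ranks):
    """Truncated HOSVD: one orthonormal factor per mode from the SVD of
    its unfolding; the core is X projected onto all the factors."""
    factors = []
    for mode, r in enumerate(ranks):
        U, _, _ = np.linalg.svd(unfold(X, mode), full_matrices=False)
        factors.append(U[:, :r])
    core = X
    for U in factors:
        # contracting the leading axis each time cycles the modes, so
        # after all modes the core axes are back in their original order
        core = np.tensordot(core, U.conj(), axes=([0], [0]))
    return core, factors

def tucker_reconstruct(core, factors):
    """Multiply the core back by every mode dictionary."""
    X = core
    for U in factors:
        X = np.tensordot(X, U, axes=([0], [1]))
    return X
```

With full mode ranks the reconstruction is exact; truncating the ranks gives the low-rank approximation on which the joint regularizers then operate.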


2021 ◽  
Vol 2 (4) ◽  
pp. 401-417
Author(s):  
Seyed M. Ghafari ◽  
Amin Beheshti ◽  
Aditya Joshi ◽  
Cecile Paris ◽  
Shahpar Yakhchi ◽  
...  

Trust among users in online social networks is a key factor in determining the amount of information that is perceived as reliable. Compared to the number of users in online social networks, user-specified trust relations are very sparse, which makes pair-wise trust prediction a challenging task. Social studies have investigated trust and why people trust each other, and the relation between trust and the personality traits of the people who establish it has been supported by social theories. In this work, we attempt to alleviate the sparsity of trust relations by extracting implicit information from the users, in particular by focusing on users' personality traits and seeking a low-rank representation of users. We investigate the potential impact of incorporating users' personality traits, based on the Big Five factor personality model, on the prediction of trust relations. We evaluate the impact of similarities between users' personality traits and the effect of each personality trait on pair-wise trust relations. Next, we formulate a new unsupervised trust prediction model based on tensor decomposition. Finally, we empirically evaluate this model using two real-world datasets. Our extensive experiments confirm the superior performance of our model compared to state-of-the-art approaches.
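The abstract does not specify which decomposition is used, so purely as a generic illustration: a multi-aspect trust tensor (e.g., user × user × personality-trait slice) could be factorized with a plain CP decomposition fitted by alternating least squares (ALS), and missing trust entries scored from the low-rank reconstruction. Both the tensor layout and the CP/ALS choice here are assumptions, not the paper's model:

```python
import numpy as np

def cp_als(X, rank, n_iter=60, seed=0):
    """Plain CP decomposition fitted by alternating least squares."""
    rng = np.random.default_rng(seed)
    dims = X.shape
    factors = [rng.standard_normal((d, rank)) for d in dims]
    for _ in range(n_iter):
        for mode in range(len(dims)):
            others = [f for i, f in enumerate(factors) if i != mode]
            # Khatri-Rao product of the other factors (first factor's index
            # varies slowest, matching the C-order unfolding below)
            kr = others[0]
            for f in others[1:]:
                kr = (kr[:, None, :] * f[None, :, :]).reshape(-1, rank)
            unfolded = np.moveaxis(X, mode, 0).reshape(dims[mode], -1)
            gram = np.ones((rank, rank))
            for i, f in enumerate(factors):
                if i != mode:
                    gram *= f.T @ f
            factors[mode] = unfolded @ kr @ np.linalg.pinv(gram)
    return factors

def cp_reconstruct(factors):
    """Sum of rank-one outer products defined by the factor columns."""
    rank = factors[0].shape[1]
    X = np.zeros([f.shape[0] for f in factors])
    for r in range(rank):
        outer = factors[0][:, r]
        for f in factors[1:]:
            outer = np.multiply.outer(outer, f[:, r])
        X += outer
    return X
```

Entries of the reconstructed tensor at unobserved (user, user) positions would then serve as trust-prediction scores.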


2021 ◽  
Author(s):  
Shengchuan Li ◽  
Yanmei Wang ◽  
Qiong Luo ◽  
Kai Wang ◽  
Zhi Han ◽  
...  

Author(s):  
Jize Xue ◽  
Yongqiang Zhao ◽  
Shaoguang Huang ◽  
Wenzhi Liao ◽  
Jonathan Cheung-Wai Chan ◽  
...  

2020 ◽  
Vol 29 ◽  
pp. 9044-9059
Author(s):  
Lin Chen ◽  
Xue Jiang ◽  
Xingzhao Liu ◽  
Zhixin Zhou

2020 ◽  
Vol 532 ◽  
pp. 170-189
Author(s):  
Yu-Bang Zheng ◽  
Ting-Zhu Huang ◽  
Xi-Le Zhao ◽  
Tai-Xiang Jiang ◽  
Teng-Yu Ji ◽  
...  
