Nonconvex Low-Rank Tensor Completion from Noisy Data

2021 ◽  
Author(s):  
Changxiao Cai ◽  
Gen Li ◽  
H. Vincent Poor ◽  
Yuxin Chen

This paper investigates a problem of broad practical interest, namely, the reconstruction of a large-dimensional low-rank tensor from highly incomplete and randomly corrupted observations of its entries. Although a number of papers have been dedicated to this tensor completion problem, prior algorithms either are computationally too expensive for large-scale applications or come with suboptimal statistical performance. Motivated by this, we propose a fast two-stage nonconvex algorithm—a gradient method following a rough initialization—that achieves the best of both worlds: optimal statistical accuracy and computational efficiency. Specifically, the proposed algorithm provably completes the tensor and retrieves all low-rank factors within nearly linear time, while at the same time enjoying near-optimal statistical guarantees (i.e., minimal sample complexity and optimal estimation accuracy). The insights conveyed through our analysis of nonconvex optimization might have implications for a broader family of tensor reconstruction problems beyond tensor completion.
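The two-stage template described in the abstract can be illustrated in a few lines. The sketch below is a generic stand-in, not the authors' exact procedure: it assumes an order-3 tensor, a CP-rank model with squared loss on the observed entries, SVD-based rough initialization, and an untuned constant step size.

```python
# Minimal sketch of the two-stage template: rough (spectral-style) initialization
# followed by gradient descent on CP factors, fit only to the observed noisy entries.
# Step size, iteration count, and the rescaling heuristic are illustrative choices.
import numpy as np

def cp_reconstruct(U, V, W):
    """Sum of rank-1 components: sum_r U[:, r] (outer) V[:, r] (outer) W[:, r]."""
    return np.einsum('ir,jr,kr->ijk', U, V, W)

def two_stage_completion(Y, mask, rank, steps=300, lr=0.05):
    """Y: zero-filled observed tensor; mask: boolean tensor marking observed entries."""
    n1, n2, n3 = Y.shape
    p = mask.mean()                       # empirical sampling rate
    Ys = Y / p                            # rescale so the observed part estimates the full tensor

    # Stage 1: rough initialization from the top singular vectors of each unfolding.
    U = np.linalg.svd(Ys.reshape(n1, -1), full_matrices=False)[0][:, :rank]
    V = np.linalg.svd(np.moveaxis(Ys, 1, 0).reshape(n2, -1), full_matrices=False)[0][:, :rank]
    W = np.linalg.svd(np.moveaxis(Ys, 2, 0).reshape(n3, -1), full_matrices=False)[0][:, :rank]
    # crude rescaling so the initial reconstruction matches the data's energy
    s = (np.linalg.norm(Ys) / (np.linalg.norm(cp_reconstruct(U, V, W)) + 1e-12)) ** (1 / 3)
    U, V, W = U * s, V * s, W * s

    # Stage 2: gradient descent on the averaged squared loss over observed entries.
    n_obs = mask.sum()
    for _ in range(steps):
        R = mask * (cp_reconstruct(U, V, W) - Y)          # residual on observed entries only
        gU = np.einsum('ijk,jr,kr->ir', R, V, W) / n_obs
        gV = np.einsum('ijk,ir,kr->jr', R, U, W) / n_obs
        gW = np.einsum('ijk,ir,jr->kr', R, U, V) / n_obs
        U, V, W = U - lr * gU, V - lr * gV, W - lr * gW
    return cp_reconstruct(U, V, W)
```

The step size and iteration count typically need per-problem tuning; the paper's analysis prescribes specific choices that this sketch does not attempt to reproduce.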

2018 ◽  
Vol 2018 ◽  
pp. 1-11 ◽  
Author(s):  
Jinzhi Liao ◽  
Jiuyang Tang ◽  
Xiang Zhao ◽  
Haichuan Shang

POI recommendation is of significant importance in various real-life applications, especially in location-based services such as check-in social networks. In this paper, we propose to solve POI recommendation through a novel dynamic tensor model, which is among the first attempts of its kind. To deliver timely recommendations, we predict POIs using a fast low-rank tensor completion algorithm. In particular, the dynamic tensor structure is complemented by the fast low-rank tensor completion algorithm to achieve better prediction performance, with parameter optimization carried out by a pigeon-inspired heuristic algorithm. In short, our dynamic tensor method for POI recommendation exploits the intrinsic characteristics of check-in data, since multi-mode features such as current categories, subsequent categories, temporal information, and seasonal variations are all integrated into the model. Extensive experimental results not only validate the superiority of the proposed method but also suggest its applicability to large-scale and real-time POI recommendation environments.
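To make the multi-mode structure concrete, the hedged sketch below assembles a toy check-in tensor. The modes shown (user, current POI category, time slot) are illustrative assumptions; the paper's model additionally encodes subsequent categories and seasonal variations.

```python
# Hedged illustration (not the paper's exact construction) of assembling a
# multi-mode check-in tensor for POI recommendation.
import numpy as np

# toy check-in records: (user_id, category_id, hour_of_day)
checkins = [(0, 2, 9), (0, 5, 13), (1, 2, 9), (2, 7, 20), (1, 5, 22)]

n_users, n_categories, n_slots = 3, 8, 24
T = np.zeros((n_users, n_categories, n_slots))      # check-in count tensor
for user, category, hour in checkins:
    T[user, category, hour] += 1.0

observed = T > 0                                    # cells with direct evidence
# Any low-rank tensor completion routine (e.g., the gradient sketch above) can now
# impute scores for the unobserved (user, category, slot) cells; the top-scoring
# categories per user and time slot become the POI recommendations.
print(f"observed fraction: {observed.mean():.3f}")
```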


2021 ◽  
Author(s):  
Vasanth S. Murali ◽  
Didem Ağaç Çobanoğlu ◽  
Michael Hsieh ◽  
Meyer Zinn ◽  
Venkat S. Malladi ◽  
...  

The heterogeneity of cancer necessitates developing a multitude of targeted therapies. We propose the view that cancer drug discovery is a low-rank tensor completion problem. We implement this vision by using heterogeneous public data to construct a tensor of drug-target-disease associations. We show the validity of this approach computationally by simulations and experimentally by testing drug candidates. Specifically, we show that a novel drug candidate, SU11652, controls melanoma tumor growth, including BRAFWT melanoma. Independently, we show that another molecule, TC-E 5008, controls tumor proliferation in ex vivo ER+ human breast cancer. Most importantly, we identify these chemicals with only a few computationally selected experiments as opposed to brute-force screens. The efficiency of our approach enables the use of ex vivo human tumor assays as a primary screening tool. We provide a web server, the Cancer Vulnerability Explorer (accessible at https://cavu.biohpc.swmed.edu), to facilitate the use of our methodology.
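As an illustration of the "drug discovery as tensor completion" framing, the sketch below builds a toy drug x target x disease association tensor and imputes the unobserved cells with a masked CP alternating-least-squares routine. The toy data, the rank, and the ALS routine are all assumptions for illustration, not the authors' data sources or completion algorithm.

```python
# Hedged, toy illustration of casting drug discovery as low-rank tensor completion:
# a small drug x target x disease tensor with a few known associations, imputed by
# a generic masked CP alternating-least-squares routine (not the authors' method).
import numpy as np

def masked_cp_als(Y, mask, rank, iters=30, reg=1e-3):
    """Fit CP factors to the observed entries only, then reconstruct the full tensor."""
    dims = Y.shape
    rng = np.random.default_rng(0)
    U = [rng.standard_normal((n, rank)) * 0.1 for n in dims]
    for _ in range(iters):
        for mode in range(3):
            A, B = [U[m] for m in range(3) if m != mode]          # the other two factors
            KR = np.einsum('jr,kr->jkr', A, B).reshape(-1, rank)  # Khatri-Rao product
            Ym = np.moveaxis(Y, mode, 0).reshape(dims[mode], -1)
            Mm = np.moveaxis(mask, mode, 0).reshape(dims[mode], -1)
            for i in range(dims[mode]):                           # row-wise least squares
                K = KR[Mm[i]]
                rhs = Ym[i, Mm[i]]
                U[mode][i] = np.linalg.solve(K.T @ K + reg * np.eye(rank), K.T @ rhs)
    return np.einsum('ir,jr,kr->ijk', U[0], U[1], U[2])

# Toy association data: (drug, target, disease) triples scored 1 for a known link.
n_drugs, n_targets, n_diseases = 6, 5, 4
known = [(0, 1, 2), (0, 2, 2), (1, 1, 2), (2, 3, 0), (3, 3, 0), (4, 0, 3), (5, 4, 1)]
Y = np.zeros((n_drugs, n_targets, n_diseases))
mask = np.zeros_like(Y, dtype=bool)
for d, t, s in known:
    Y[d, t, s], mask[d, t, s] = 1.0, True
# Also mark a few confirmed negatives so the fit is not based on positives alone.
for d, t, s in [(0, 0, 0), (1, 4, 3), (2, 0, 1), (5, 1, 0)]:
    mask[d, t, s] = True

scores = masked_cp_als(Y, mask, rank=2)
# High scores on unobserved (drug, target, disease) cells nominate candidates to test.
print(np.round(scores[:, :, 2], 2))       # predicted associations for disease index 2
```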


2022 ◽  
Vol 4 ◽  
Author(s):  
Kaiqi Zhang ◽  
Cole Hawkins ◽  
Zheng Zhang

A major challenge in many machine learning tasks is that model expressive power depends on model size. Low-rank tensor methods are an efficient tool for handling the curse of dimensionality in many large-scale machine learning models. The major challenges in training a tensor learning model include how to process the high-volume data, how to determine the tensor rank automatically, and how to estimate the uncertainty of the results. While existing tensor learning methods focus on a specific task, this paper proposes a generic Bayesian framework that can be employed to solve a broad class of tensor learning problems such as tensor completion, tensor regression, and tensorized neural networks. We develop a low-rank tensor prior for automatic rank determination in nonlinear problems. Our method is implemented with both stochastic gradient Hamiltonian Monte Carlo (SGHMC) and Stein Variational Gradient Descent (SVGD), and we compare the automatic rank determination and uncertainty quantification of these two solvers. We demonstrate that the proposed method can determine the tensor rank automatically and can quantify the uncertainty of the obtained results. We validate our framework on tensor completion tasks and tensorized neural network training tasks.
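The rank-determination idea can be sketched as follows: give the CP factor columns a shared ARD-style Gaussian prior whose per-column precisions shrink unneeded components, and sample the factors with a stochastic-gradient Langevin update. The prior form, the fixed precisions, and the SGLD stand-in for SGHMC/SVGD are all simplifying assumptions, not the paper's implementation.

```python
# Hedged sketch of a low-rank (ARD-style) prior for automatic rank determination in
# CP tensor completion, sampled with a plain SGLD update as a stand-in for the
# paper's SGHMC / SVGD solvers.  Prior form, precisions, and step size are assumptions.
import numpy as np

def posterior_grads(U, V, W, lam, Y, mask, noise_var=0.1):
    """Gradients of the negative log posterior of a CP completion model.
    lam[r] is the prior precision shared by column r of U, V, W; a large lam[r]
    shrinks component r towards zero, which is what prunes the effective rank."""
    R = mask * (np.einsum('ir,jr,kr->ijk', U, V, W) - Y) / noise_var
    gU = np.einsum('ijk,jr,kr->ir', R, V, W) + U * lam
    gV = np.einsum('ijk,ir,kr->jr', R, U, W) + V * lam
    gW = np.einsum('ijk,ir,jr->kr', R, U, V) + W * lam
    return gU, gV, gW

def sgld_step(theta, grad, step, rng):
    """One stochastic-gradient Langevin step: gradient move plus injected noise."""
    return theta - 0.5 * step * grad + np.sqrt(step) * rng.standard_normal(theta.shape)
```

Usage outline: start from an over-estimated rank, run many `sgld_step` updates on U, V, W, and monitor the column norms of the averaged samples; columns whose norms collapse towards zero indicate components pruned by the prior. In a fuller treatment the precisions `lam` get a hyperprior and are resampled as well.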


2021 ◽  
Vol 7 (7) ◽  
pp. 110
Author(s):  
Zehan Chao ◽  
Longxiu Huang ◽  
Deanna Needell

Matrix completion, the problem of completing missing entries in a data matrix with low-dimensional structure (such as rank), has seen many fruitful approaches and analyses. Tensor completion is the tensor analog that attempts to impute missing tensor entries from similar low-rank type assumptions. In this paper, we study the tensor completion problem when the sampling pattern is deterministic and possibly non-uniform. We first propose an efficient weighted Higher Order Singular Value Decomposition (HOSVD) algorithm for the recovery of the underlying low-rank tensor from noisy observations and then derive the error bounds under a properly weighted metric. Additionally, the efficiency and accuracy of our algorithm are both tested using synthetic and real datasets in numerical simulations.
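A hedged sketch of the HOSVD-based recovery step is given below: zero-fill the unobserved entries, reweight the observed ones by estimated rank-1 sampling propensities to compensate for the non-uniform pattern, and project onto a prescribed multilinear rank. The propensity model and the placement of the weights are assumptions; the paper's weighted estimator and its error analysis differ in detail.

```python
# Hedged sketch of weighted-HOSVD-style recovery under non-uniform, deterministic
# sampling: estimate rank-1 sampling propensities from the observation pattern,
# reweight the zero-filled data, and truncate each unfolding to the target rank.
# The weighting scheme here is illustrative, not the paper's exact estimator.
import numpy as np

def truncated_hosvd(X, ranks):
    """Project X onto the top-r_m left singular vectors of every mode-m unfolding."""
    out = X
    for mode, r in enumerate(ranks):
        unfold = np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)
        Um = np.linalg.svd(unfold, full_matrices=False)[0][:, :r]
        out = np.moveaxis(np.tensordot(Um @ Um.T, np.moveaxis(out, mode, 0), axes=(1, 0)), 0, mode)
    return out

def weighted_hosvd_complete(Y, mask, ranks, eps=1e-3):
    """Y: zero-filled noisy observations; mask: boolean (possibly non-uniform) pattern."""
    m = mask.astype(float)
    # Rank-1 propensity estimate: product of per-mode marginal sampling rates.
    p1, p2, p3 = m.mean(axis=(1, 2)), m.mean(axis=(0, 2)), m.mean(axis=(0, 1))
    prop = np.einsum('i,j,k->ijk', p1, p2, p3) / max(m.mean(), eps) ** 2
    Yw = np.where(mask, Y / np.maximum(prop, eps), 0.0)   # inverse-propensity reweighting
    return truncated_hosvd(Yw, ranks)
```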


Author(s):  
Tianheng Zhang ◽  
Jianli Zhao ◽  
Qiuxia Sun ◽  
Bin Zhang ◽  
Jianjian Chen ◽  
...  

2019 ◽  
Vol 73 ◽  
pp. 62-69 ◽  
Author(s):  
Wen-Hao Xu ◽  
Xi-Le Zhao ◽  
Teng-Yu Ji ◽  
Jia-Qing Miao ◽  
Tian-Hui Ma ◽  
...  

Author(s):  
Jize Xue ◽  
Yongqiang Zhao ◽  
Shaoguang Huang ◽  
Wenzhi Liao ◽  
Jonathan Cheung-Wai Chan ◽  
...  
