Abstract
We study the problem of estimating a low-rank tensor from noisy observations of a subset of its entries. A rank-$r$, order-$d$, $N \times N \times \cdots \times N$ tensor, where $r=O(1)$, has $O(dN)$ free variables. On the other hand, prior to our work, the best sample complexity achieved in the literature was $O\left(N^{\frac{d}{2}}\right)$, obtained by solving a tensor nuclear-norm minimization problem. In this paper, we consider the ‘M-norm’, an atomic norm whose atoms are rank-1 sign tensors. We also consider a generalization of the matrix max-norm to tensors, which results in a quasi-norm that we call ‘max-qnorm’. We prove that solving an M-norm constrained least squares (LS) problem results in nearly optimal sample complexity for low-rank tensor completion (TC); a similar result holds for the max-qnorm as well. Furthermore, we show that these bounds are nearly minimax rate-optimal. We also provide promising numerical results for max-qnorm constrained TC, showing improved recovery compared to matricization and alternating LS.
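To make the estimator concrete, the M-norm constrained LS problem mentioned above can be sketched as follows. The notation here is not fixed in the abstract and is assumed for illustration: $\Omega$ denotes the set of observed indices, $Y$ the noisy observations, $T_{\pm}$ the set of rank-1 sign tensors (outer products of $\pm 1$ vectors), and $R$ an M-norm bound.

```latex
% Assumed notation: T_{\pm} is the set of rank-1 sign tensors,
%   T_{\pm} = \{ u_1 \circ u_2 \circ \cdots \circ u_d : u_k \in \{\pm 1\}^N \}.
% M-norm: the atomic norm generated by these atoms,
\|T\|_{M} = \inf\Big\{ \sum_j |c_j| \;:\; T = \sum_j c_j A_j,\ A_j \in T_{\pm} \Big\}.

% M-norm constrained least squares over the observed entries \Omega:
\hat{T} = \operatorname*{arg\,min}_{\|T\|_{M} \le R}
  \sum_{\omega \in \Omega} \big( T_{\omega} - Y_{\omega} \big)^2.
```

The max-qnorm variant replaces the constraint $\|T\|_{M} \le R$ with a bound on the max-qnorm; since that generalization of the matrix max-norm is only a quasi-norm for $d>2$, the resulting feasible set is nonconvex.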