Near-optimal sample complexity bounds for circulant binary embedding

Author(s):  
Samet Oymak ◽  
Christos Thrampoulidis ◽  
Babak Hassibi
2020 ◽  
Vol 67 (6) ◽  
pp. 1-42
Author(s):  
Hassan Ashtiani ◽  
Shai Ben-David ◽  
Nicholas J. A. Harvey ◽  
Christopher Liaw ◽  
Abbas Mehrabian ◽  
...  

2008 ◽  
Vol 8 (3&4) ◽  
pp. 345-358
Author(s):  
M. Hayashi ◽  
A. Kawachi ◽  
H. Kobayashi

One of the central issues in the hidden subgroup problem is to bound the sample complexity, i.e., the number of identical samples of coset states necessary and sufficient to solve the problem. In this paper, we present general bounds for the sample complexity of the identification and decision versions of the hidden subgroup problem. As a consequence of these bounds, we show that the sample complexity of both the decision and identification versions is $\Theta(\log|\mathcal{H}|/\log p)$ for a candidate set $\mathcal{H}$ of hidden subgroups in the case where the candidate nontrivial subgroups have the same prime order $p$, which implies that the decision version is at least as hard as the identification version in this case. In particular, this holds for important cases such as the dihedral and symmetric hidden subgroup problems. Moreover, the upper bound for the identification version is attained by a variant of the pretty good measurement, which implies that the pretty good measurement is useful for identifying hidden subgroups over an arbitrary group with optimal sample complexity.
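
As a concrete check of the $\Theta(\log|\mathcal{H}|/\log p)$ scaling, the short Python sketch below (the function name and the dihedral numbers are illustrative, not taken from the paper) evaluates the bound for the dihedral group $D_N$, one of the cases the abstract highlights:

    import math

    def hsp_sample_order(num_candidates, prime_order):
        """Evaluate the Theta(log|H| / log p) sample-complexity scaling for a
        candidate set of hidden subgroups sharing the same prime order p."""
        return math.log(num_candidates) / math.log(prime_order)

    # Dihedral group D_N: the N reflections generate N distinct order-2
    # subgroups, so |H| is of order N and p = 2, i.e. Theta(log N)
    # coset-state samples.
    N = 1024
    print(hsp_sample_order(N, 2))  # approximately 10, i.e. log2(1024)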


2017 ◽  
Vol 3 (1) ◽  
Author(s):  
Shelby Kimmel ◽  
Cedric Yen-Yu Lin ◽  
Guang Hao Low ◽  
Maris Ozols ◽  
Theodore J. Yoder

Author(s):  
Philipp Trunschke ◽  
Martin Eigel ◽  
Reinhold Schneider

We consider best approximation problems in a nonlinear subset $\mathcal{M}$ of a Banach space of functions $(\mathcal{V},\|\cdot\|)$. The norm is assumed to be a generalization of the $L^2$-norm for which only a weighted Monte Carlo estimate $\|\cdot\|_n$ can be computed. The objective is to obtain an approximation $v\in\mathcal{M}$ of an unknown function $u\in\mathcal{V}$ by minimizing the empirical norm $\|u-v\|_n$. We consider this problem for general nonlinear subsets and establish error bounds for the empirical best approximation error. Our results are based on a restricted isometry property (RIP) which holds in probability and is independent of the nonlinear least squares setting. Several model classes are examined where analytical statements can be made about the RIP and the results are compared to existing sample complexity bounds from the literature. We find that for well-studied model classes our general bound is weaker but exhibits many of the same properties as these specialized bounds. Notably, we demonstrate the advantage of an optimal sampling density (as known for linear spaces) for sets of functions with sparse representations.
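
For intuition, here is a minimal numpy sketch of the empirical quantity the abstract describes: a weighted Monte Carlo estimate $\|\cdot\|_n$ of the $L^2$-norm and the resulting empirical best approximation, restricted to a linear model class for simplicity. The function names, the uniform sampling density, and the polynomial basis are illustrative assumptions, not the authors' setup:

    import numpy as np

    def empirical_best_approx(u, basis, sampler, n_samples=200, seed=0):
        """Minimize the empirical norm ||u - v||_n over v in span(basis), where
        ||w||_n^2 = (1/n) * sum_i weight(x_i) * w(x_i)^2 and the points x_i are
        drawn from the sampling density provided by `sampler`."""
        rng = np.random.default_rng(seed)
        x, w = sampler(rng, n_samples)                 # sample points and weights
        Phi = np.column_stack([b(x) for b in basis])   # design matrix (n, dim)
        s = np.sqrt(w / n_samples)                     # row scaling for the norm
        coef, *_ = np.linalg.lstsq(s[:, None] * Phi, s * u(x), rcond=None)
        return coef

    # Usage: approximate exp on [-1, 1] in the space of cubic polynomials,
    # sampling uniformly so the importance weight is constant (w = 1).
    uniform = lambda rng, n: (rng.uniform(-1.0, 1.0, n), np.ones(n))
    basis = [lambda x, k=k: x**k for k in range(4)]
    print(empirical_best_approx(np.exp, basis, uniform))

For a nonlinear model class the single least-squares solve would be replaced by an iterative minimization of the same weighted residual; the RIP discussed in the abstract concerns when such an empirical minimizer is close to the true best approximation.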


2018 ◽  
Vol 8 (3) ◽  
pp. 577-619 ◽  
Author(s):  
Navid Ghadermarzy ◽  
Yaniv Plan ◽  
Özgür Yilmaz

We study the problem of estimating a low-rank tensor when we have noisy observations of a subset of its entries. A rank-$r$, order-$d$, $N \times N \times \cdots \times N$ tensor, where $r=O(1)$, has $O(dN)$ free variables. On the other hand, prior to our work, the best sample complexity that was achieved in the literature is $O\left(N^{\frac{d}{2}}\right)$, obtained by solving a tensor nuclear-norm minimization problem. In this paper, we consider the ‘M-norm’, an atomic norm whose atoms are rank-1 sign tensors. We also consider a generalization of the matrix max-norm to tensors, which results in a quasi-norm that we call ‘max-qnorm’. We prove that solving an M-norm constrained least squares (LS) problem results in nearly optimal sample complexity for low-rank tensor completion (TC). A similar result holds for max-qnorm as well. Furthermore, we show that these bounds are nearly minimax rate-optimal. We also provide promising numerical results for max-qnorm constrained TC, showing improved recovery compared to matricization and alternating LS.
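
The alternating LS baseline that the abstract compares against can be sketched in a few lines of numpy for a 3-way tensor. This is the generic masked CP-ALS completion heuristic, not the authors' M-norm or max-qnorm estimator, and all names below are illustrative:

    import numpy as np

    def cp_als_complete(T_obs, mask, rank=2, n_iters=50, seed=0):
        """Complete a 3-way tensor from its observed entries by fitting a
        rank-`rank` CP factorization with alternating least squares.
        T_obs holds the observed values (zeros elsewhere); mask is boolean."""
        rng = np.random.default_rng(seed)
        I, J, K = T_obs.shape
        A = rng.standard_normal((I, rank))
        B = rng.standard_normal((J, rank))
        C = rng.standard_normal((K, rank))
        for _ in range(n_iters):
            # Each factor row solves a small least-squares problem restricted
            # to the observed entries of the corresponding tensor slice.
            for i in range(I):
                j, k = np.nonzero(mask[i])
                if j.size:
                    A[i], *_ = np.linalg.lstsq(B[j] * C[k], T_obs[i, j, k], rcond=None)
            for jj in range(J):
                i, k = np.nonzero(mask[:, jj, :])
                if i.size:
                    B[jj], *_ = np.linalg.lstsq(A[i] * C[k], T_obs[i, jj, k], rcond=None)
            for kk in range(K):
                i, j = np.nonzero(mask[:, :, kk])
                if i.size:
                    C[kk], *_ = np.linalg.lstsq(A[i] * B[j], T_obs[i, j, kk], rcond=None)
        return np.einsum('ir,jr,kr->ijk', A, B, C)

    # Usage: recover a random rank-2 tensor from roughly 30% of its entries.
    rng = np.random.default_rng(1)
    A0, B0, C0 = (rng.standard_normal((20, 2)) for _ in range(3))
    T = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
    mask = rng.random(T.shape) < 0.3
    T_hat = cp_als_complete(T * mask, mask, rank=2)
    print(np.linalg.norm(T_hat - T) / np.linalg.norm(T))

A rank-$r$ CP factorization like the one above has only $O(dN)$ parameters, which is the counting heuristic the abstract contrasts with the $O(N^{d/2})$ sample complexity of nuclear-norm minimization.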

