Random polytopes obtained by matrices with heavy-tailed entries

2019 ◽  
Vol 22 (04) ◽  
pp. 1950027 ◽
Author(s):  
O. Guédon ◽  
A. E. Litvak ◽  
K. Tatarko

Let Γ be an N × n random matrix with independent entries such that in each row the entries are i.i.d. Assume also that the entries are symmetric, have unit variances, and satisfy a small ball probability estimate uniformly. We investigate properties of the corresponding random polytope Γ^* B_1^N in ℝ^n (the absolute convex hull of the rows of Γ). In particular, we show that Γ^* B_1^N ⊇ b^{-1} (B_∞^n ∩ √(ln(N/n)) B_2^n), where b depends only on the parameters in the small ball inequality. This extends results of [A. E. Litvak, A. Pajor, M. Rudelson and N. Tomczak-Jaegermann, Smallest singular value of random matrices and geometry of random polytopes, Adv. Math. 195 (2005) 491–523] and recent results of [F. Krahmer, C. Kümmerle and H. Rauhut, A quotient property for matrices with heavy-tailed entries and its application to noise-blind compressed sensing, preprint (2018); arXiv:1806.04261]. This inclusion is equivalent to the so-called ℓ_1-quotient property and plays an important role in compressed sensing (see [F. Krahmer, C. Kümmerle and H. Rauhut, A quotient property for matrices with heavy-tailed entries and its application to noise-blind compressed sensing, preprint (2018); arXiv:1806.04261] and references therein).
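A rough numerical illustration (not part of the paper; the Student-t entries, the dimensions, and the sampled directions below are illustrative assumptions): the support function of Γ^* B_1^N in a direction u equals ‖Γu‖_∞, while the support function of B_∞^n ∩ √(ln(N/n)) B_2^n is at most min(‖u‖_1, √(ln(N/n)) ‖u‖_2), so a ratio ‖Γu‖_∞ / min(‖u‖_1, √(ln(N/n)) ‖u‖_2) that stays bounded away from zero along a direction u confirms the support-function inequality required by the inclusion in that direction.

```python
# Heuristic Monte Carlo check of the inclusion along random directions.
# The heavy-tailed distribution (Student-t with 3 degrees of freedom, rescaled
# to unit variance), N, n, and the number of directions are illustrative.
import numpy as np

rng = np.random.default_rng(0)
N, n = 2000, 50
df = 3.0
Gamma = rng.standard_t(df, size=(N, n)) / np.sqrt(df / (df - 2.0))  # symmetric, unit variance

R = np.sqrt(np.log(N / n))                 # radius sqrt(ln(N/n)) of the Euclidean ball
ratios = []
for _ in range(1000):
    u = rng.standard_normal(n)
    u /= np.linalg.norm(u)                 # random direction on the unit sphere
    h_poly = np.max(np.abs(Gamma @ u))     # support function of Gamma^* B_1^N at u
    h_cap = min(np.linalg.norm(u, 1), R)   # dominates the support function of B_inf^n ∩ R B_2^n
    ratios.append(h_poly / h_cap)

print("smallest observed ratio:", min(ratios))  # stays bounded away from zero
```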

2005 ◽  
Vol 195 (2) ◽  
pp. 491-523 ◽  
Author(s):  
A.E. Litvak ◽  
A. Pajor ◽  
M. Rudelson ◽  
N. Tomczak-Jaegermann

2021 ◽  
Vol 49 (3) ◽  
Author(s):  
Galyna V. Livshyts ◽  
Konstantin Tikhomirov ◽  
Roman Vershynin

2018 ◽  
Vol 07 (01) ◽  
pp. 1750014 ◽  
Author(s):  
Kyle Luh

Let ζ = ξ + iξ′, where ξ, ξ′ are iid copies of a mean zero, variance one, subgaussian random variable. Let N_n be an n × n random matrix with entries that are iid copies of ζ. We prove that there exists a constant c > 0 such that the probability that N_n has any real eigenvalues is less than exp(−cn), where c only depends on the subgaussian moment of ξ. The bound is optimal up to the value of the constant c. The principal component of the proof is an optimal tail bound on the least singular value of matrices of the form M + N_n, where M is a deterministic complex matrix with the condition that ‖M‖ ≤ K √n for some constant K depending on the subgaussian moment of ξ. For this class of random variables, this result improves on the results of Pan–Zhou [Circular law, extreme singular values and potential theory, J. Multivariate Anal. 101(3) (2010) 645–656] and Rudelson–Vershynin [The Littlewood–Offord problem and invertibility of random matrices, Adv. Math. 218(2) (2008) 600–633]. In the proof of the tail bound, we develop an optimal small-ball probability bound for complex random variables that generalizes the Littlewood–Offord theory developed by Tao–Vu [From the Littlewood–Offord problem to the circular law: Universality of the spectral distribution of random matrices, Bull. Amer. Math. Soc. (N.S.) 46(3) (2009) 377–396; Inverse Littlewood–Offord theorems and the condition number of random discrete matrices, Ann. of Math. (2) 169(2) (2009) 595–632] and Rudelson–Vershynin [The Littlewood–Offord problem and invertibility of random matrices, Adv. Math. 218(2) (2008) 600–633; Smallest singular value of a random rectangular matrix, Comm. Pure Appl. Math. 62(12) (2009) 1707–1739].
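A quick simulation (Gaussian ξ and ξ′ as one admissible subgaussian choice; the dimension, the number of trials, and the numerical tolerance are illustrative assumptions) is consistent with the theorem: matrices with iid complex entries ζ = ξ + iξ′ show no numerically real eigenvalues, whereas real iid Gaussian matrices typically have on the order of √n of them.

```python
# Count (numerically) real eigenvalues of n x n matrices with iid complex
# entries xi + i*xi', versus matrices with iid real entries. Gaussian entries,
# n, the number of trials, and the tolerance are illustrative choices.
import numpy as np

rng = np.random.default_rng(1)
n, trials, tol = 200, 20, 1e-8

def count_real_eigs(A):
    lam = np.linalg.eigvals(A)
    return int(np.sum(np.abs(lam.imag) <= tol))   # eigenvalues numerically on the real axis

complex_counts, real_counts = [], []
for _ in range(trials):
    xi, xi_prime = rng.standard_normal((n, n)), rng.standard_normal((n, n))
    complex_counts.append(count_real_eigs(xi + 1j * xi_prime))      # entries are iid copies of zeta
    real_counts.append(count_real_eigs(rng.standard_normal((n, n))))

print("complex entries:", complex_counts)   # expected: all zeros
print("real entries:   ", real_counts)      # typically around sqrt(2n/pi) real eigenvalues
```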


2015 ◽  
Vol 04 (02) ◽  
pp. 1550006 ◽  
Author(s):  
F. Götze ◽  
A. Naumov ◽  
A. Tikhomirov

Let X be a random matrix whose pairs of entries X_{jk} and X_{kj} are correlated and whose vectors (X_{jk}, X_{kj}), for 1 ≤ j < k ≤ n, are mutually independent. Assume also that the diagonal entries are independent of the off-diagonal entries. We assume that 𝔼 X_{jk} = 0 and 𝔼 X_{jk}^2 = 1 for any j, k = 1, …, n, and 𝔼 X_{jk}X_{kj} = ρ for 1 ≤ j < k ≤ n. Let M_n be a non-random n × n matrix with ‖M_n‖ ≤ K n^Q, for some positive constants K > 0 and Q ≥ 0. Let s_n(X + M_n) denote the least singular value of the matrix X + M_n. It is shown that there exist positive constants A and B depending on K, Q, ρ only such that ℙ(s_n(X + M_n) ≤ n^{-A}) ≤ n^{-B}. As an application of this result, we prove the elliptic law for this class of matrices with non-identically distributed correlated entries.
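The model is straightforward to simulate. Below is a sketch (Python/NumPy; the bivariate Gaussian pairs, ρ = 0.5, n = 300, and the shift M_n = √n·I are hypothetical choices, not taken from the paper) that draws the correlated pairs (X_{jk}, X_{kj}), forms X + M_n, and records the least singular value.

```python
# Simulate the correlated-entry model and compute the least singular value
# s_n(X + M_n). The Gaussian pairs, rho, n, and M_n are illustrative only.
import numpy as np

rng = np.random.default_rng(2)
n, rho = 300, 0.5
L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))   # correlates each off-diagonal pair

X = np.zeros((n, n))
for j in range(n):
    X[j, j] = rng.standard_normal()                  # diagonal independent of off-diagonal entries
    for k in range(j + 1, n):
        pair = L @ rng.standard_normal(2)            # (X_jk, X_kj): mean 0, variance 1, correlation rho
        X[j, k], X[k, j] = pair[0], pair[1]

M = np.sqrt(n) * np.eye(n)                           # deterministic shift with ||M|| = n^{1/2}
s_n = np.linalg.svd(X + M, compute_uv=False)[-1]     # least singular value s_n(X + M_n)
print("s_n(X + M_n) =", s_n)
```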


2012 ◽  
Vol 212 (3) ◽  
pp. 195-218 ◽  
Author(s):  
Alexander E. Litvak ◽  
Omar Rivasplata

2016 ◽  
Vol 09 (04) ◽  
pp. 1650075 ◽
Author(s):  
Yang Liu ◽  
Yang Wang

In this paper, we study the decay of the smallest singular value of submatrices consisting of bounded column vectors. We find that this smallest singular value is related to the minimal distance from a point to the line connecting two other points in a bounded point set. Using a technique from integral geometry and the perspective of combinatorial geometry, we establish the decay rate of this minimal distance for point sets in which the number of points lying on the boundary of the convex hull of any subset is not too large relative to the cardinality of the set. On the computational side, we conduct numerical experiments on a variety of point sets and analyze the smallest distance for some extremal configurations.
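In the same spirit as the numerical experiments mentioned above, a small sketch (Python/NumPy; uniform points in the unit square and the set sizes are illustrative assumptions, not the paper's configurations) computes, for random bounded planar point sets of increasing size, the minimal distance from a point to the line through two other points.

```python
# For random points in [0,1]^2, compute the minimal distance from a point to
# the line through two other points, and observe its decay as the set grows.
import numpy as np
from itertools import combinations

rng = np.random.default_rng(3)

def min_point_line_distance(P):
    """Minimum over triples of dist(p, line through q and r)."""
    best = np.inf
    for i, j, k in combinations(range(len(P)), 3):
        for a, b, c in ((i, j, k), (j, i, k), (k, i, j)):        # each point plays the role of p once
            p, q, r = P[a], P[b], P[c]
            d = q - r
            cross = (p[0] - r[0]) * d[1] - (p[1] - r[1]) * d[0]  # twice the area of triangle pqr
            best = min(best, abs(cross) / np.linalg.norm(d))     # point-to-line distance
    return best

for m in (10, 20, 40, 80):
    P = rng.random((m, 2))
    print(f"{m:3d} points: minimal point-line distance = {min_point_line_distance(P):.2e}")
```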

