Asymptotic Expansion of the Misclassification Probabilities of D- and A-Criteria for Discrimination from Two High Dimensional Populations Using the Theory of Large Dimensional Random Matrices

1993, Vol. 46(1), pp. 154-174
Author(s): H. Saranadasa
Symmetry, 2019, Vol. 11(5), pp. 638
Author(s): Xianjie Gao, Chao Zhang, Hongwei Zhang

Random matrices play an important role in many fields, including machine learning, quantum information theory, and optimization. One of the main research focuses is deviation inequalities for eigenvalues of random matrices. Although large-deviation inequalities for random matrices have been studied intensively, only a few works discuss the small-deviation behavior of random matrices. In this paper, we present small-deviation inequalities for the largest eigenvalues of sums of random matrices. Since the resulting inequalities are independent of the matrix dimension, they apply to high-dimensional and even infinite-dimensional cases.
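The quantity such results control, the small-deviation probability P(lambda_max <= eps) for the largest eigenvalue of a sum of independent random matrices, is easy to probe numerically. The sketch below is only an illustration of that probability by Monte Carlo, not the paper's bounds; the dimension, number of summands, noise scale, and threshold eps are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def largest_eig_of_sum(n_terms, dim, scale, rng):
    """Largest eigenvalue of a sum of independent random symmetric matrices."""
    total = np.zeros((dim, dim))
    for _ in range(n_terms):
        g = rng.normal(scale=scale, size=(dim, dim))
        total += (g + g.T) / 2.0            # symmetrize each Gaussian summand
    return np.linalg.eigvalsh(total)[-1]    # eigvalsh returns eigenvalues in ascending order

# Monte Carlo estimate of the small-deviation probability P(lambda_max <= eps)
eps, n_rep = 0.5, 2000
samples = np.array([largest_eig_of_sum(n_terms=10, dim=50, scale=0.01, rng=rng)
                    for _ in range(n_rep)])
print("estimated P(lambda_max <= eps):", np.mean(samples <= eps))
```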


Author(s):  
Gao-Fan Ha ◽  
Qiuyan Zhang ◽  
Zhidong Bai ◽  
You-Gan Wang

In this paper, a ridgelized Hotelling's T^2 test is developed for a hypothesis on a large-dimensional mean vector under certain moment conditions. It generalizes the main result of Chen et al. [A regularized Hotelling's T^2 test for pathway analysis in proteomic studies, J. Am. Stat. Assoc. 106(496) (2011) 1345-1360] by relaxing their Gaussian assumption. This is achieved by establishing an exact four-moment theorem that is a simplified version of Tao and Vu's [Random matrices: universality of local statistics of eigenvalues, Ann. Probab. 40(3) (2012) 1285-1315] work. Simulation results demonstrate the superiority of the proposed test over the traditional Hotelling's T^2 test and its several extensions in high-dimensional situations.
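For readers unfamiliar with the regularized variant this line of work builds on, the sketch below shows the general ridge-regularized Hotelling's T^2 statistic: the sample covariance is shifted by lam*I so the statistic stays defined when the dimension exceeds the sample size. This is a minimal illustrative sketch of the statistic only, not the authors' test; the function name, penalty value, and toy data are assumptions, and the calibration of the null distribution (the paper's contribution) is not reproduced here.

```python
import numpy as np

def ridge_hotelling_t2(X, mu0, lam):
    """Ridge-regularized Hotelling's T^2 statistic for H0: E[X] = mu0.

    X   : (n, p) data matrix, one observation per row
    mu0 : (p,) hypothesized mean vector
    lam : ridge penalty added to the sample covariance (lam > 0)
    """
    n, p = X.shape
    xbar = X.mean(axis=0)
    S = np.cov(X, rowvar=False)            # p x p sample covariance
    diff = xbar - mu0
    # (S + lam*I) is invertible even when p > n, unlike S itself
    return n * diff @ np.linalg.solve(S + lam * np.eye(p), diff)

# Toy usage with p > n, where the classical Hotelling's T^2 is undefined
rng = np.random.default_rng(1)
X = rng.normal(size=(30, 100))
print(ridge_hotelling_t2(X, mu0=np.zeros(100), lam=0.1))
```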

