First-Order Perturbation Analysis of Singular Vectors in Singular Value Decomposition

2008 · Vol 56 (7) · pp. 3044-3049
Author(s): Jun Liu, Xiangqian Liu, Xiaoli Ma

2010 · Vol 20 (04) · pp. 293-318
Author(s): Alexander Kaiser, Wolfram Schenck, Ralf Möller

We derive coupled on-line learning rules for the singular value decomposition (SVD) of a cross-covariance matrix. In coupled SVD rules, the singular value is estimated alongside the singular vectors, and the effective learning rates for the singular vector rules are influenced by the singular value estimates. In addition, we use a first-order approximation of Gram-Schmidt orthonormalization as a decorrelation method for the estimation of multiple singular vectors and singular values. Experiments on synthetic data show that coupled learning rules converge faster than Hebbian learning rules, and that the first-order approximation of Gram-Schmidt orthonormalization produces more precise estimates and better orthonormality than the standard deflation method.
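The paper's coupled rules, in which the singular value estimate feeds back into the effective learning rates, are more involved than can be reproduced from the abstract alone. As a rough, illustrative sketch of the on-line SVD setting they address, the following shows a generic Hebbian-style stochastic rule (not the authors' coupled rules) that estimates the top singular triplet of a cross-covariance matrix from sample pairs; all names and parameter values here are assumptions for the demo.

```python
import numpy as np

# Illustrative sketch only: a generic Hebbian-style stochastic rule for the
# top singular triplet of E[x y^T]. It is NOT the paper's coupled rule set.
rng = np.random.default_rng(0)
m, n = 4, 3
A = rng.normal(size=(m, n))          # ground-truth cross-covariance (synthetic)

u = rng.normal(size=m); u /= np.linalg.norm(u)
v = rng.normal(size=n); v /= np.linalg.norm(v)
sigma = 1.0
gamma = 0.01                         # fixed learning rate (assumed value)

for _ in range(20000):
    x = rng.normal(size=m)           # sample pair constructed so E[x y^T] = A
    y = A.T @ x
    u += gamma * (y @ v) * x         # Hebbian update: expected step is gamma * A v
    v += gamma * (x @ u) * y         # Hebbian update: expected step is gamma * A^T u
    u /= np.linalg.norm(u)           # explicit renormalization each step
    v /= np.linalg.norm(v)
    sigma += gamma * ((x @ u) * (y @ v) - sigma)   # running estimate of u^T A v

# Compare against a batch SVD of the true cross-covariance matrix.
U, S, Vt = np.linalg.svd(A)
print(abs(u @ U[:, 0]), abs(v @ Vt[0]), sigma, S[0])
```

In this sketch the singular vectors are renormalized explicitly after every step; the coupled rules studied in the paper instead keep the estimates near unit length through the structure of the updates themselves, which is part of what the abstract's convergence comparison is about.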


2019 · Vol 62 (4) · pp. 975-984
Author(s): Michael Albert, Vincent Vatter

Abstract: Bevan established that the growth rate of a monotone grid class of permutations is equal to the square of the spectral radius of a related bipartite graph. We give an elementary and self-contained proof of a generalization of this result using only Stirling's formula, the method of Lagrange multipliers, and the singular value decomposition of matrices. Our proof relies on showing that the maximum, over the space of n × n matrices with non-negative entries summing to one, of a certain function of those entries, parametrized by the entries of another matrix Γ of non-negative real numbers, equals the square of the largest singular value of Γ, and that the maximizing point can be expressed as a Hadamard product of Γ with the tensor product of singular vectors for its greatest singular value.
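The bridge between the spectral radius of a bipartite graph and singular values rests on a standard fact that is easy to check numerically: if G is the biadjacency matrix of a bipartite graph, the nonzero eigenvalues of the full adjacency matrix are plus and minus the singular values of G, so the spectral radius equals the largest singular value of G. A small NumPy check (the matrix G here is arbitrary illustrative data, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
G = rng.random((3, 4))               # arbitrary non-negative biadjacency matrix

# Adjacency matrix of the bipartite graph whose biadjacency matrix is G.
B = np.block([[np.zeros((3, 3)), G],
              [G.T, np.zeros((4, 4))]])

spectral_radius = max(abs(np.linalg.eigvals(B)))
top_singular = np.linalg.svd(G, compute_uv=False)[0]
print(spectral_radius, top_singular)  # these agree: rho(B) = sigma_max(G)
```

This is why squaring the largest singular value of Γ in the abstract's statement matches Bevan's formulation in terms of the square of the spectral radius of the associated bipartite graph.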

