An Implicit Form of Krasulina's k-PCA Update without the Orthonormality Constraint

2020, Vol. 34 (04), pp. 3179–3186
Author(s): Ehsan Amid, Manfred K. Warmuth

We shed new light on the two commonly used updates for the online k-PCA problem, namely Krasulina's and Oja's updates. We show that Krasulina's update corresponds to a projected gradient descent step on the Stiefel manifold of orthonormal k-frames, while Oja's update amounts to a gradient descent step using the unprojected gradient. Following these observations, we derive a more implicit form of Krasulina's k-PCA update, that is, a version that uses the information of the future gradient as much as possible. Most interestingly, our implicit Krasulina update avoids the costly QR-decomposition step by bypassing the orthonormality constraint. A related update, known as Sanger's rule, can be seen as an explicit approximation of our implicit update. We show that the new update in fact corresponds to an online EM step applied to a probabilistic k-PCA model. The probabilistic view of the update allows us to combine multiple models in a distributed setting. We show experimentally that the implicit Krasulina update yields superior convergence while being significantly faster. We also give strong evidence that the new update can benefit from parallelism and is more stable with respect to the tuning of the learning rate.
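To make the contrast between the two explicit updates concrete, here is a minimal NumPy sketch of the standard (explicit) Oja and Krasulina steps for online k-PCA, as characterized in the abstract: Oja takes a step along the unprojected gradient x xᵀW, while Krasulina first projects that gradient onto the tangent space of the Stiefel manifold via (I − W Wᵀ); both explicit versions then re-orthonormalize with a QR decomposition (the step the paper's implicit update avoids). The function names, learning rate, and toy data stream are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def oja_step(W, x, lr):
    # Oja's rule: gradient step with the unprojected gradient x x^T W,
    # followed by QR re-orthonormalization of the k-frame.
    W = W + lr * np.outer(x, x @ W)
    Q, _ = np.linalg.qr(W)
    return Q

def krasulina_step(W, x, lr):
    # Krasulina's rule: projected gradient step on the Stiefel manifold.
    g = np.outer(x, x @ W)        # unprojected gradient x x^T W
    g = g - W @ (W.T @ g)         # project onto the tangent space: (I - W W^T) g
    Q, _ = np.linalg.qr(W + lr * g)
    return Q

# Toy data stream concentrated near a k-dimensional subspace (an assumption
# for illustration only).
rng = np.random.default_rng(0)
d, k = 20, 3
basis = np.linalg.qr(rng.standard_normal((d, k)))[0]   # true subspace
W = np.linalg.qr(rng.standard_normal((d, k)))[0]       # random orthonormal init
for _ in range(2000):
    x = basis @ rng.standard_normal(k) + 0.01 * rng.standard_normal(d)
    W = krasulina_step(W, x, lr=0.01)

# ||basis^T W||_F approaches sqrt(k) as the columns of W align with the subspace.
overlap = np.linalg.norm(basis.T @ W)
```

Both explicit steps cost an O(dk²) QR factorization per sample; the paper's implicit Krasulina update drops the orthonormality constraint to bypass exactly this step.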

