Estimating Gene Function With Least Squares Nonnegative Matrix Factorization

Author(s):
Guoli Wang
Michael F. Ochs


2019, Vol 31 (2), pp. 417-439

Author(s):
Andersen Man Shun Ang
Nicolas Gillis

We propose a general framework to significantly accelerate algorithms for nonnegative matrix factorization (NMF). The framework is inspired by the extrapolation scheme used to accelerate gradient methods in convex optimization and by the method of parallel tangents. However, the use of extrapolation in the context of exact coordinate descent algorithms tackling the nonconvex NMF problem is novel. We illustrate the performance of this approach on two state-of-the-art NMF algorithms, accelerated hierarchical alternating least squares and alternating nonnegative least squares, using synthetic, image, and document data sets.
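The core loop of such a scheme can be sketched compactly. Below is a minimal NumPy illustration (not the authors' code) of extrapolated HALS: after each pair of HALS updates, the factors are pushed along the direction of their last change and projected back onto the nonnegative orthant, with a restart whenever the fit worsens. The fixed extrapolation step `beta` and the simple restart rule are assumptions made for brevity; the paper itself uses an adaptive scheme.

```python
# Minimal sketch of extrapolated HALS for NMF: min ||X - W H||_F^2, W, H >= 0.
# The extrapolated points (Wy, Hy) are used as the starting points of the next
# HALS sweep, so the extrapolation itself adds essentially no cost per iteration.
import numpy as np

def hals_update_H(X, W, H, eps=1e-16):
    """One HALS sweep over the rows of H for min ||X - W H||_F^2, H >= 0."""
    A = W.T @ X          # r x n
    B = W.T @ W          # r x r
    for k in range(H.shape[0]):
        H[k, :] = np.maximum(eps, H[k, :] + (A[k, :] - B[k, :] @ H) / max(B[k, k], eps))
    return H

def extrapolated_hals(X, r, beta=0.5, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, r)); H = rng.random((r, n))
    Wy, Hy = W.copy(), H.copy()              # extrapolated points
    err_prev = np.linalg.norm(X - W @ H)
    for _ in range(iters):
        W_old, H_old = W.copy(), H.copy()
        # HALS updates computed from the extrapolated points
        H = hals_update_H(X, Wy, Hy.copy())
        W = hals_update_H(X.T, H.T, Wy.T.copy()).T   # same update applied to X^T
        err = np.linalg.norm(X - W @ H)
        if err > err_prev:
            # restart: drop the extrapolation and keep the plain iterate
            Wy, Hy = W.copy(), H.copy()
        else:
            # extrapolate along the last change and project onto the nonnegative orthant
            Hy = np.maximum(0, H + beta * (H - H_old))
            Wy = np.maximum(0, W + beta * (W - W_old))
        err_prev = err
    return W, H
```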


2012, Vol 24 (4), pp. 1085-1105

Author(s):
Nicolas Gillis
François Glineur

Nonnegative matrix factorization (NMF) is a data analysis technique used in a great variety of applications such as text mining, image processing, hyperspectral data analysis, computational biology, and clustering. In this letter, we consider two well-known algorithms designed to solve NMF problems: the multiplicative updates of Lee and Seung and the hierarchical alternating least squares of Cichocki et al. We propose a simple way to significantly accelerate these schemes, based on a careful analysis of the computational cost needed at each iteration, while preserving their convergence properties. This acceleration technique can also be applied to other algorithms, which we illustrate on the projected gradient method of Lin. The efficiency of the accelerated algorithms is empirically demonstrated on image and text data sets and compares favorably with a state-of-the-art alternating nonnegative least squares algorithm.
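The acceleration rests on an observation about per-iteration cost: for both the multiplicative updates and HALS, forming the products W^T X and W^T W (or their counterparts for the W update) dominates the work, while the update that follows is comparatively cheap, so those products can be reused across several inner updates. A minimal NumPy sketch of this idea for HALS follows; the fixed `inner` count is an assumption of this sketch, standing in for the dynamic stopping rule used in the paper.

```python
# Minimal sketch of accelerated HALS: the expensive products W^T X and W^T W
# are formed once per outer iteration, then the cheap row-wise inner sweeps
# over H are repeated several times to reuse them.
import numpy as np

def accelerated_hals_H(X, W, H, inner=5, eps=1e-16):
    """Update H for min ||X - W H||_F^2, H >= 0, reusing W^T X and W^T W."""
    A = W.T @ X          # r x n, O(mnr): the dominant cost
    B = W.T @ W          # r x r, O(m r^2)
    for _ in range(inner):                   # cheap inner sweeps, O(n r^2) each
        for k in range(H.shape[0]):
            H[k, :] = np.maximum(eps, H[k, :] + (A[k, :] - B[k, :] @ H) / max(B[k, k], eps))
    return H

def accelerated_hals(X, r, outer=100, inner=5, seed=0):
    """Alternate accelerated updates of H and W (the W step follows by symmetry on X^T)."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, r)); H = rng.random((r, n))
    for _ in range(outer):
        H = accelerated_hals_H(X, W, H, inner)
        # update W by applying the same routine to X^T, then transpose back
        W = accelerated_hals_H(X.T, H.T, W.T.copy(), inner).T
    return W, H
```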

