A Low-Rank Matrix Equation Method for Solving PDE-Constrained Optimization Problems

2021, pp. S637-S654
Author(s): Alexandra Bünger, Valeria Simoncini, Martin Stoll

2020, Vol. 40 (4), pp. 2626-2651
Author(s): André Uschmajew, Bart Vandereycken

Abstract: The absence of spurious local minima in certain nonconvex low-rank matrix recovery problems has been of recent interest in computer science, machine learning and compressed sensing, since it explains the convergence of some low-rank optimization methods to global optima. One such example is low-rank matrix sensing under restricted isometry properties (RIPs). It can be formulated as a minimization problem for a quadratic function on the Riemannian manifold of low-rank matrices, with a positive semidefinite Riemannian Hessian that acts almost like an identity on low-rank matrices. In this work, new estimates for the singular values of local minima of such problems are given, which lead to improved bounds on RIP constants that ensure the absence of nonoptimal local minima and sufficiently negative curvature at all other critical points. A geometric viewpoint is taken, inspired by the fact that the Euclidean distance function to a rank-$k$ matrix possesses no critical points on the corresponding embedded submanifold of rank-$k$ matrices except for the single global minimum.
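For concreteness, the sensing formulation referenced in this abstract can be sketched as follows; the measurement map $\mathcal{A}$, data vector $b$, and RIP constant $\delta_{2k}$ are generic notation assumed here, not fixed by the abstract itself:

$$ \min_{X \in \mathcal{M}_k} \; f(X) = \tfrac{1}{2}\,\|\mathcal{A}(X) - b\|_2^2, \qquad \mathcal{M}_k = \{\, X \in \mathbb{R}^{m \times n} : \operatorname{rank}(X) = k \,\}, $$

where the linear map $\mathcal{A}\colon \mathbb{R}^{m \times n} \to \mathbb{R}^p$ satisfies a restricted isometry property of order $2k$ if, for some $\delta_{2k} \in [0,1)$,

$$ (1 - \delta_{2k})\,\|X\|_F^2 \;\le\; \|\mathcal{A}(X)\|_2^2 \;\le\; (1 + \delta_{2k})\,\|X\|_F^2 \quad \text{for all } X \text{ with } \operatorname{rank}(X) \le 2k. $$

A small $\delta_{2k}$ corresponds to the abstract's description of a Riemannian Hessian that acts almost like an identity on low-rank matrices.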


2013, Vol. 2013, pp. 1-9
Author(s): Lingchen Kong, Levent Tunçel, Naihua Xiu

Low-rank matrix recovery (LMR) is a rank minimization problem subject to linear equality constraints, and it arises in many fields such as signal and image processing, statistics, computer vision, and system identification and control. This class of optimization problems is generally $\mathcal{NP}$-hard. A popular approach replaces the rank function with the nuclear norm of the matrix variable. In this paper, we extend and characterize the concept of $s$-goodness for a sensing matrix in sparse signal recovery (proposed by Juditsky and Nemirovski (Math Program, 2011)) to linear transformations in LMR. Using the two characteristic $s$-goodness constants, $\gamma_s$ and $\hat{\gamma}_s$, of a linear transformation, we derive necessary and sufficient conditions for a linear transformation to be $s$-good. Moreover, we establish the equivalence of $s$-goodness and the null space properties. Therefore, $s$-goodness is a necessary and sufficient condition for exact $s$-rank matrix recovery via nuclear norm minimization.
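For orientation, the nuclear norm relaxation mentioned above can be written schematically; the linear transformation $\mathcal{A}$ and measurement vector $b$ are generic notation assumed here, not taken from the paper:

$$ \min_{X} \; \operatorname{rank}(X) \quad \text{s.t.} \quad \mathcal{A}(X) = b \qquad \leadsto \qquad \min_{X} \; \|X\|_* \quad \text{s.t.} \quad \mathcal{A}(X) = b, $$

where $\|X\|_* = \sum_i \sigma_i(X)$, the sum of the singular values of $X$, is the nuclear norm. In this notation, the abstract's final statement says that $s$-goodness of $\mathcal{A}$ is equivalent to exact recovery of every matrix of rank at most $s$ by the relaxed problem from its own measurements $b = \mathcal{A}(X)$.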

