Alternating minimization methods for strongly convex optimization

2021, Vol. 0 (0)
Author(s):  
Nazarii Tupitsa
Pavel Dvurechensky
Alexander Gasnikov
Sergey Guminov

Abstract We consider alternating minimization procedures for convex and non-convex optimization problems in which the vector of variables is divided into several blocks, each block being amenable to minimization with respect to its own variables while the other blocks are held fixed. In the case of two blocks, we prove a linear convergence rate for an alternating minimization procedure under the Polyak–Łojasiewicz (PL) condition, which can be seen as a relaxation of the strong convexity assumption. Under the strong convexity assumption in the many-block setting, we provide an accelerated alternating minimization procedure whose linear convergence rate depends on the square root of the condition number, as opposed to the condition number itself for the non-accelerated method. We also consider the problem of finding an approximate non-negative solution to a linear system of equations Ax = y by alternating minimization of the Kullback–Leibler (KL) divergence between Ax and y.
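To make the two-block procedure concrete, here is a minimal sketch of alternating minimization on a strongly convex quadratic, where each block subproblem is solved exactly; the quadratic, the block sizes and the stopping rule are illustrative assumptions, not details taken from the paper.

```python
# Two-block alternating minimization on a strongly convex quadratic
# f(x) = 0.5 * x^T Q x - b^T x, with x split into blocks (x1, x2).
import numpy as np

rng = np.random.default_rng(0)
n1, n2 = 3, 4
n = n1 + n2
M = rng.standard_normal((n, n))
Q = M @ M.T + n * np.eye(n)        # symmetric positive definite -> strongly convex
b = rng.standard_normal(n)

Q11, Q12 = Q[:n1, :n1], Q[:n1, n1:]
Q21, Q22 = Q[n1:, :n1], Q[n1:, n1:]
b1, b2 = b[:n1], b[n1:]

x1, x2 = np.zeros(n1), np.zeros(n2)
x_star = np.linalg.solve(Q, b)     # reference minimizer, used only for monitoring

for k in range(100):
    # Exact minimization over block 1 with block 2 held fixed.
    x1 = np.linalg.solve(Q11, b1 - Q12 @ x2)
    # Exact minimization over block 2 with block 1 held fixed.
    x2 = np.linalg.solve(Q22, b2 - Q21 @ x1)
    err = np.linalg.norm(np.concatenate([x1, x2]) - x_star)
    if err < 1e-10:
        break

print(f"error {err:.2e} after {k + 1} iterations")  # decreases linearly
```

Under strong convexity the distance to the minimizer contracts by a fixed factor per sweep, which is the linear rate referred to above.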

Author(s):  
Jakub Wiktor Both

Abstract In this paper, the convergence of the fundamental alternating minimization scheme is established for non-smooth, non-strongly convex optimization problems in Banach spaces, and novel rates of convergence are provided. The objective function is a composite of a smooth part and a block-separable non-smooth part, covering a large range of applications. For the smooth part, three different relaxations of strong convexity are considered: (i) quasi-strong convexity; (ii) quadratic functional growth; and (iii) plain convexity. With new and improved rates benefiting from both separate steps of the scheme, linear convergence is proved for (i) and (ii), whereas sublinear convergence is shown for (iii).
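As an illustration of the composite setting, the sketch below alternates exact block minimizations for a smooth least-squares term plus a block-separable non-smooth part, taken here to be indicators of the non-negative orthant so that each block subproblem is a non-negative least-squares problem; the data, block structure and iteration count are illustrative assumptions, not taken from the paper.

```python
# Alternating minimization for a composite objective
#   F(x1, x2) = 0.5 * ||A1 @ x1 + A2 @ x2 - y||^2   (smooth part)
#               + i_{>=0}(x1) + i_{>=0}(x2)          (block-separable non-smooth part)
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)
m, n1, n2 = 20, 5, 6
A1 = rng.standard_normal((m, n1))
A2 = rng.standard_normal((m, n2))
y = rng.standard_normal(m)

x1, x2 = np.zeros(n1), np.zeros(n2)
for _ in range(100):
    # Exact minimization over block 1 (non-negative least squares), block 2 fixed.
    x1, _ = nnls(A1, y - A2 @ x2)
    # Exact minimization over block 2, block 1 fixed.
    x2, _ = nnls(A2, y - A1 @ x1)

print(f"residual after alternating minimization: {np.linalg.norm(A1 @ x1 + A2 @ x2 - y):.4f}")
```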


2019, Vol. 2019 (1)
Author(s):  
Shijie Sun
Meiling Feng
Luoyi Shi

Abstract This paper considers an iterative algorithm for solving the multiple-sets split equality problem (MSSEP) whose step size is independent of the norms of the related operators, and investigates its sublinear and linear convergence rates. In particular, we present a notion of a bounded Hölder regularity property for the MSSEP, which generalizes the well-known concept of bounded linear regularity, and give several sufficient conditions that ensure it. We then use this property to establish the sublinear and linear convergence rates of the algorithm. Finally, some numerical experiments are provided to verify the validity of our results.
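For intuition, here is a minimal sketch of a projection-type iteration for a single-set split equality problem (find x in C and y in Q with Ax = By) using a self-adaptive, Polyak-type step size that requires no knowledge of the operator norms; the sets, operators and step rule are illustrative assumptions and need not coincide with the MSSEP algorithm analyzed in the paper.

```python
# Simultaneous projection iteration for the split equality problem Ax = By,
# x in C (non-negative orthant), y in Q (Euclidean ball), with a self-adaptive
# step size that avoids computing ||A|| and ||B||.
import numpy as np

rng = np.random.default_rng(2)
p, n, m = 8, 6, 5
A = rng.standard_normal((p, n))
B = rng.standard_normal((p, m))

def proj_C(x):                         # projection onto the non-negative orthant
    return np.maximum(x, 0.0)

def proj_Q(y, radius=5.0):             # projection onto a Euclidean ball
    nrm = np.linalg.norm(y)
    return y if nrm <= radius else y * (radius / nrm)

x, y = rng.standard_normal(n), rng.standard_normal(m)
rho = 1.0                              # relaxation parameter, assumed in (0, 2)
for k in range(5000):
    w = A @ x - B @ y                  # current equality residual
    gA, gB = A.T @ w, B.T @ w          # gradient of 0.5 * ||Ax - By||^2
    denom = gA @ gA + gB @ gB
    if np.linalg.norm(w) < 1e-9 or denom < 1e-16:
        break
    gamma = rho * 0.5 * (w @ w) / denom   # step size independent of ||A||, ||B||
    x = proj_C(x - gamma * gA)
    y = proj_Q(y + gamma * gB)

print(f"||Ax - By|| = {np.linalg.norm(A @ x - B @ y):.2e} after {k + 1} iterations")
```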


Author(s):  
Ran Gu
Qiang Du

Abstract How to choose the step size of the gradient descent method has been a popular subject of research. In this paper we propose a modified limited memory steepest descent method (MLMSD). In each iteration we apply a selection rule to pick a unique step size from a candidate set, which is calculated by Fletcher’s limited memory steepest descent method (LMSD), instead of going through all the step sizes in a sweep, as in Fletcher’s original LMSD algorithm. MLMSD is motivated by an inexact super-linear convergence rate analysis. The R-linear convergence of MLMSD is proved for a strictly convex quadratic minimization problem. Numerical tests are presented to show that our algorithm is efficient and robust.
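The following sketch illustrates these ingredients on a strictly convex quadratic: the Ritz values of the Hessian restricted to the span of the most recent gradients form the LMSD candidate set, and a single step size is picked from it in every iteration. The selection rule used here (the reciprocal of the largest Ritz value) is only a hypothetical placeholder, and the Ritz values are formed directly from the Hessian rather than, as in Fletcher’s method, from stored gradients alone; both choices are assumptions made for illustration.

```python
# Limited-memory steepest descent step sizes on f(x) = 0.5 * x^T H x - b^T x:
# Ritz values of H on the span of the last few gradients give candidate steps,
# and one candidate is selected per iteration (MLMSD-style, hypothetical rule).
import numpy as np

rng = np.random.default_rng(3)
n, memory = 10, 3
M = rng.standard_normal((n, n))
H = M @ M.T / n + np.eye(n)            # symmetric positive definite Hessian
b = rng.standard_normal(n)
x = np.zeros(n)

recent_grads = []
for k in range(500):
    g = H @ x - b
    if np.linalg.norm(g) < 1e-10:
        break
    recent_grads = (recent_grads + [g])[-memory:]   # keep the last few gradients
    G = np.column_stack(recent_grads)
    Qk, _ = np.linalg.qr(G)                         # orthonormal basis of their span
    ritz = np.linalg.eigvalsh(Qk.T @ H @ Qk)        # candidate curvatures (Ritz values)
    step = 1.0 / ritz.max()                         # hypothetical selection rule
    x = x - step * g

print(f"||grad|| = {np.linalg.norm(H @ x - b):.2e} after {k + 1} iterations")
```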

