linear convergence rate
Recently Published Documents

TOTAL DOCUMENTS: 35 (five years: 18)
H-INDEX: 5 (five years: 1)

2021 ◽  
Vol 0 (0) ◽  
Author(s):  
Nazarii Tupitsa ◽  
Pavel Dvurechensky ◽  
Alexander Gasnikov ◽  
Sergey Guminov

Abstract We consider alternating minimization procedures for convex and non-convex optimization problems in which the vector of variables is divided into several blocks, each block being amenable to minimization with respect to its variables while the other blocks are held constant. In the case of two blocks, we prove a linear convergence rate for an alternating minimization procedure under the Polyak–Łojasiewicz (PL) condition, which can be seen as a relaxation of the strong convexity assumption. Under the strong convexity assumption in the many-block setting, we provide an accelerated alternating minimization procedure whose linear convergence rate depends on the square root of the condition number, as opposed to the condition number itself for the non-accelerated method. We also consider the problem of finding an approximate non-negative solution to a linear system of equations Ax = y by alternating minimization of the Kullback–Leibler (KL) divergence between Ax and y.
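The two-block scheme described above can be sketched on a toy quadratic objective; the matrices, the zero initialization, and the exact least-squares block solves below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def alt_min(A, B, c, iters=100):
    """Two-block alternating minimization for f(x, y) = ||A x + B y - c||^2.

    Each block subproblem is solved exactly (a least-squares solve)
    while the other block is held constant, as in the two-block setting.
    """
    x = np.zeros(A.shape[1])
    y = np.zeros(B.shape[1])
    for _ in range(iters):
        # minimize over x with y fixed
        x, *_ = np.linalg.lstsq(A, c - B @ y, rcond=None)
        # minimize over y with x fixed
        y, *_ = np.linalg.lstsq(B, c - A @ x, rcond=None)
    return x, y

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
B = rng.standard_normal((20, 5))
c = rng.standard_normal(20)
x, y = alt_min(A, B, c)
residual = np.linalg.norm(A @ x + B @ y - c)
```

Because each block update exactly minimizes over its variables, the objective is monotonically non-increasing along the iterations.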


Author(s):  
Min Li ◽  
Zhongming Wu

In this paper, we propose an inexact majorized symmetric Gauss–Seidel (sGS) alternating direction method of multipliers (ADMM) with indefinite proximal terms for multi-block convex composite programming. This method is a specific form of the inexact majorized ADMM, which we further propose for solving a general two-block separable optimization problem. The new methods adopt certain relative error criteria to solve the involved subproblems approximately, and the step-sizes can be chosen in the range [Formula: see text]. Under mild conditions, we establish the global convergence and the Q-linear convergence rate of the proposed methods.
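The sGS decomposition and the inexactness criteria are beyond a short sketch, but the basic two-block ADMM scaffold such methods build on can be illustrated; the toy lasso-style problem and the parameters lam and rho below are illustrative assumptions, not the paper's method:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_toy(a, lam=0.5, rho=1.0, iters=200):
    """Two-block ADMM for min 0.5*||x - a||^2 + lam*||z||_1  s.t.  x = z.

    Both block updates have closed forms; u is the scaled dual variable.
    """
    x = np.zeros_like(a)
    z = np.zeros_like(a)
    u = np.zeros_like(a)
    for _ in range(iters):
        x = (a + rho * (z - u)) / (1.0 + rho)   # smooth block update
        z = soft_threshold(x + u, lam / rho)    # nonsmooth block update
        u = u + x - z                           # scaled dual ascent step
    return z

a = np.array([2.0, 0.3, -1.5, 0.1])
z = admm_toy(a)
```

For this separable toy problem the limit is the soft-thresholded vector soft_threshold(a, lam), which gives a direct sanity check on the iteration.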


Symmetry ◽  
2020 ◽  
Vol 12 (9) ◽  
pp. 1456
Author(s):  
Aviv Gibali ◽  
Yekini Shehu

The forward–backward–forward (FBF) splitting method is a popular iterative procedure for finding zeros of the sum of a maximally monotone operator and a Lipschitz continuous monotone operator. In this paper, we introduce a forward–backward–forward splitting method with reflection steps (symmetric) in real Hilbert spaces. Weak and strong convergence of the proposed method are established under suitable assumptions. Moreover, a linear convergence rate of an inertial modified forward–backward–forward splitting method is also presented.
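A plain FBF (Tseng-type) iteration, without the reflection or inertial modifications introduced in the paper, can be sketched for a monotone linear variational inequality over the non-negative orthant, where the backward (resolvent) step is a projection; the matrix M, vector q, and step size below are illustrative assumptions:

```python
import numpy as np

def fbf(M, q, x0, step, iters=500):
    """Tseng's forward-backward-forward iteration for 0 in (F + N_C)(x),
    with F(x) = M x + q Lipschitz monotone and C the non-negative orthant,
    so the backward step is the projection onto C.  Requires step < 1/L,
    where L = ||M|| is the Lipschitz constant of F.
    """
    proj = lambda v: np.maximum(v, 0.0)
    x = x0.copy()
    for _ in range(iters):
        Fx = M @ x + q
        y = proj(x - step * Fx)           # forward-backward step
        x = y - step * (M @ y + q - Fx)   # second forward (correction) step
    return x

# the skew part of M makes F monotone but not a gradient map
M = np.array([[2.0, 1.0], [-1.0, 2.0]])
q = np.array([-2.0, 1.0])
x = fbf(M, q, np.zeros(2), step=0.3)   # ||M|| = sqrt(5), so step < 1/sqrt(5)
```

For this instance the variational inequality has the solution (1, 0), since F((1, 0)) = 0 there, which the iterates approach.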


Author(s):  
Yi Zhou ◽  
Zhe Wang ◽  
Kaiyi Ji ◽  
Yingbin Liang ◽  
Vahid Tarokh

Various types of parameter restart schemes have been proposed for proximal gradient algorithms with momentum to facilitate their convergence in convex optimization. Under parameter restart, however, the convergence of proximal gradient algorithms with momentum remains unclear in nonconvex optimization. In this paper, we propose a novel proximal gradient algorithm with momentum and parameter restart for solving nonconvex and nonsmooth problems. Our algorithm is designed to 1) allow for adopting flexible parameter restart schemes that cover many existing ones; 2) have a global sub-linear convergence rate in nonconvex and nonsmooth optimization; and 3) have guaranteed convergence to a critical point, with various types of asymptotic convergence rates depending on the parameterization of the local geometry in nonconvex and nonsmooth optimization. Numerical experiments demonstrate the convergence and effectiveness of the proposed algorithm.
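A gradient-based restart rule of the kind such flexible schemes cover can be sketched with FISTA-style momentum on a convex lasso instance; the problem data, the lam value, and the specific restart test below are illustrative assumptions, not the authors' algorithm:

```python
import numpy as np

def fista_restart(A, b, lam, iters=300):
    """Proximal gradient with Nesterov momentum and a gradient-based
    restart heuristic for the lasso 0.5*||Ax - b||^2 + lam*||x||_1.

    Momentum is reset whenever it points against the last proximal
    gradient step (an O'Donoghue-Candes style restart test).
    """
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    y = x.copy()
    t = 1.0
    for _ in range(iters):
        grad = A.T @ (A @ y - b)
        v = y - grad / L
        x_new = np.sign(v) * np.maximum(np.abs(v) - lam / L, 0.0)  # prox step
        if np.dot(y - x_new, x_new - x) > 0:
            # restart: momentum opposes the descent direction
            y = x_new.copy()
            t = 1.0
        else:
            t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
            y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # momentum step
            t = t_new
        x = x_new
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true
x_hat = fista_restart(A, b, lam=0.01)
```

With a small regularization weight the recovered vector stays close to the sparse x_true that generated the data, so the error norm serves as a simple check.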

