A dual Bregman proximal gradient method for relatively-strongly convex optimization

2021
Author(s): Jin-Zan Liu, Xin-Wei Liu

We consider a convex composite minimization problem whose objective is the sum of a relatively-strongly convex function and a closed proper convex function. A dual Bregman proximal gradient method is proposed for solving this problem, and the primal sequence is shown to converge at the rate $O(\frac{1}{k})$. Moreover, based on an acceleration scheme, we prove that the convergence rate of the primal sequence improves to $O(\frac{1}{k^{\gamma}})$, where $\gamma \in [1,2]$ is determined by the triangle scaling property of the Bregman distance.
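For orientation, the abstract's method builds on the Bregman proximal gradient step $x_{k+1} = \arg\min_x \{\langle \nabla f(x_k), x\rangle + g(x) + \frac{1}{t} D_h(x, x_k)\}$, where $D_h(x,y) = h(x) - h(y) - \langle \nabla h(y), x-y\rangle$ is the Bregman distance generated by a kernel $h$. Below is a minimal sketch of the primal iteration with the negative-entropy kernel on the probability simplex, where the step has a closed form; it is not the paper's dual variant, and the function names and test problem are illustrative assumptions.

```python
import numpy as np

def bregman_prox_grad(grad_f, x0, step, iters=500):
    # Bregman proximal gradient (entropic mirror descent) on the simplex.
    # Kernel h(x) = sum_i x_i log x_i, so D_h is the KL divergence and the
    # update is x+ = x * exp(-step * grad_f(x)), renormalized to sum to 1.
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x * np.exp(-step * grad_f(x))  # closed-form entropic prox step
        x /= x.sum()                       # renormalize onto the simplex
    return x

# Illustrative problem: minimize f(x) = 0.5*||A x - b||^2 over the simplex.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((8, 5)), rng.standard_normal(8)
grad_f = lambda x: A.T @ (A @ x - b)
x_star = bregman_prox_grad(grad_f, np.full(5, 0.2), step=0.02)
print(x_star, x_star.sum())  # a point on the simplex
```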


Author(s): Hui Zhang, Yu-Hong Dai, Lei Guo, Wei Peng

We introduce a unified algorithmic framework, called the proximal-like incremental aggregated gradient (PLIAG) method, for minimizing the sum of a convex function that consists of additive relatively smooth convex components and a proper lower semicontinuous convex regularization function over an abstract feasible set whose geometry is captured by the domain of a Legendre function. The PLIAG method includes many existing algorithms in the literature as special cases, such as the proximal gradient method, the Bregman proximal gradient method (also called the NoLips algorithm), the incremental aggregated gradient method, the incremental aggregated proximal method, and the proximal incremental aggregated gradient method. It also includes several novel iteration schemes. First, we show that the PLIAG method is globally sublinearly convergent without requiring a growth condition, which extends the sublinear convergence result for the proximal gradient algorithm to incremental aggregated-type first-order methods. Then, by embedding a so-called Bregman distance growth condition into a descent-type lemma to construct a special Lyapunov function, we show that the PLIAG method is globally linearly convergent in terms of both function values and Bregman distances to the optimal solution set, provided that the step size is not greater than some positive constant. The convergence results derived in this paper are all established beyond the standard assumptions in the literature (i.e., without requiring the strong convexity and the Lipschitz gradient continuity of the smooth part of the objective). When specialized to many existing algorithms, our results recover or supplement their convergence results under strictly weaker conditions.
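The aggregated-gradient mechanism can be sketched compactly. Under the Euclidean kernel $h(x) = \frac{1}{2}\|x\|^2$ the Bregman proximal step reduces to the classical prox, giving the proximal incremental aggregated gradient special case named above; the sketch below maintains a table of stored component gradients, refreshes one component per inner step, and applies a prox step with the running aggregate. All names and the least-squares test problem are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def pliag_sketch(grads, prox_g, x0, step, epochs=50):
    # Incremental aggregated proximal gradient loop: grads[i](x) returns the
    # gradient of the i-th smooth component f_i; prox_g(z, t) is the proximal
    # map of the regularizer g with step t. Only one stored gradient is
    # refreshed per inner iteration; the others stay (possibly) stale.
    x = np.asarray(x0, dtype=float)
    table = [g(x) for g in grads]     # stored component gradients
    agg = np.sum(table, axis=0)       # aggregated gradient sum_i grad f_i
    for _ in range(epochs):
        for i in range(len(grads)):   # cyclic component selection
            new_gi = grads[i](x)      # refresh component i only
            agg += new_gi - table[i]
            table[i] = new_gi
            x = prox_g(x - step * agg, step)  # prox step with the aggregate
    return x

# Illustrative problem: row-wise least squares plus an l1 regularizer.
rng = np.random.default_rng(1)
A, b = rng.standard_normal((6, 4)), rng.standard_normal(6)
grads = [lambda x, a=A[i], bi=b[i]: a * (a @ x - bi) for i in range(6)]
soft = lambda z, t: np.sign(z) * np.maximum(np.abs(z) - 0.1 * t, 0.0)  # prox of 0.1*||.||_1
print(pliag_sketch(grads, soft, np.zeros(4), step=0.02))
```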

