Strongly Convex Optimization: Recently Published Documents

Total documents: 23 (last five years: 13)
H-index: 5 (last five years: 2)

Author(s):  
Jakub Wiktor Both

Abstract: In this paper, the convergence of the fundamental alternating minimization scheme is established for non-smooth, non-strongly convex optimization problems in Banach spaces, and novel rates of convergence are provided. As objective function, the sum of a smooth part and a block-separable, non-smooth part is considered, covering a wide range of applications. For the smooth part, three relaxations of strong convexity are considered: (i) quasi-strong convexity; (ii) quadratic functional growth; and (iii) plain convexity. With new and improved rates benefiting from both separate steps of the scheme, linear convergence is proved for (i) and (ii), whereas sublinear convergence is shown for (iii).
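As a concrete illustration of the alternating minimization scheme analyzed above, the sketch below alternates exact block minimizations on a toy composite objective: a smooth coupled quadratic part plus block-separable L1 terms. The specific objective, parameter values, and function names are assumptions for illustration only, not taken from the paper.

```python
def soft_threshold(v, t):
    """Closed-form minimizer for a scalar quadratic plus t*|.| term."""
    if v > t:
        return v - t
    if v < -t:
        return v + t
    return 0.0

def alternating_minimization(a=3.0, b=-2.0, lam=1.0, iters=50):
    """Alternating minimization for the toy composite objective
        F(x, y) = 0.5*(x - a)**2 + 0.5*(y - b)**2 + 0.5*x*y   (smooth part)
                  + lam*|x| + lam*|y|                         (block-separable non-smooth part)
    Each sweep minimizes F exactly over one block with the other fixed;
    both block subproblems have a closed soft-thresholding solution."""
    x, y = 0.0, 0.0
    for _ in range(iters):
        # argmin_x 0.5*(x - a)^2 + 0.5*x*y + lam*|x|
        x = soft_threshold(a - 0.5 * y, lam)
        # argmin_y 0.5*(y - b)^2 + 0.5*x*y + lam*|y|
        y = soft_threshold(b - 0.5 * x, lam)
    return x, y
```

Here the smooth part is in fact strongly convex, so the iterates contract linearly (by a factor of 1/4 per sweep on this objective); the paper's contribution is to recover linear or sublinear rates under the weaker conditions (i)-(iii).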


2021, Vol 0 (0), pp. 0
Author(s):
Jin-Zan Liu, Xin-Wei Liu

We consider a convex composite minimization problem whose objective is the sum of a relatively-strongly convex function and a closed proper convex function. A dual Bregman proximal gradient method is proposed for solving this problem, and the convergence rate of the primal sequence is shown to be $O(\frac{1}{k})$. Moreover, based on an acceleration scheme, we prove that the convergence rate of the primal sequence is $O(\frac{1}{k^{\gamma}})$, where $\gamma \in [1,2]$ is determined by the triangle scaling property of the Bregman distance.
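To make the Bregman proximal gradient idea concrete, the sketch below implements a generic (primal) Bregman proximal gradient step, not the authors' dual variant, using the entropy kernel $h(x) = \sum_i x_i \log x_i$ with the simplex indicator as the non-smooth term; for this kernel the Bregman proximal step has a closed multiplicative-update form. The objective, step size, and iteration count are assumptions for illustration.

```python
import math

def entropic_bregman_prox_grad(grad_f, x0, step=0.5, iters=200):
    """Bregman proximal gradient with the entropy kernel h(x) = sum_i x_i*log(x_i)
    and g = indicator of the probability simplex. The update
        x+ = argmin_x { <grad_f(xk), x> + (1/step) * D_h(x, xk) + g(x) }
    reduces to a multiplicative weights step followed by normalization."""
    x = list(x0)
    for _ in range(iters):
        g = grad_f(x)
        # Multiplicative update: x_i+ proportional to x_i * exp(-step * grad_i)
        w = [xi * math.exp(-step * gi) for xi, gi in zip(x, g)]
        s = sum(w)
        x = [wi / s for wi in w]
    return x
```

For example, minimizing the linear objective $f(x) = \langle c, x\rangle$ with $c = (1, 0, 2)$ over the simplex drives all mass onto the coordinate with the smallest cost. The paper's $O(1/k^{\gamma})$ accelerated rate depends on how well the chosen Bregman distance satisfies the triangle scaling property.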

