Proximal-like incremental aggregated gradient method with Bregman distance in weakly convex optimization problems

Author(s): Zehui Jia, Jieru Huang, Xingju Cai

2012, Vol. 2012, pp. 1-10
Author(s): Shunhou Fan, Yonghong Yao

The projected-gradient method is a powerful tool for solving constrained convex optimization problems and has been studied extensively. In the present paper, a projected-gradient method is presented for solving a constrained minimization problem, and a strong convergence analysis of the proposed method is given.
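
For readers unfamiliar with the scheme, a minimal sketch of a generic projected-gradient iteration in Python follows; the quadratic objective, the unit-ball constraint, and the fixed step size are illustrative assumptions, not the specific setting analyzed in the paper.

    import numpy as np

    def projected_gradient(grad, project, x0, step=0.1, iters=500):
        # Classical projected-gradient step for min f(x) s.t. x in C:
        #   x_{k+1} = P_C(x_k - step * grad f(x_k))
        x = x0
        for _ in range(iters):
            x = project(x - step * grad(x))
        return x

    # Illustrative instance: minimize ||x - b||^2 over the unit Euclidean ball.
    b = np.array([2.0, 1.0])
    grad_f = lambda x: 2.0 * (x - b)                       # gradient of ||x - b||^2
    proj_ball = lambda x: x / max(1.0, np.linalg.norm(x))  # projection onto {x : ||x|| <= 1}
    x_star = projected_gradient(grad_f, proj_ball, np.zeros(2))
    print(x_star)  # approaches b / ||b||, the constrained minimizer

For this smooth instance any fixed step below 1/L (here the gradient's Lipschitz constant is L = 2) suffices; the paper's focus is on strong convergence guarantees for such iterations.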


Author(s): V. Kungurtsev, F. Rinaldi

In this paper, we consider stochastic weakly convex optimization problems without access to a stochastic subgradient oracle. We present a derivative-free algorithm that uses a two-point approximation to compute a gradient estimate of the smoothed function. We prove convergence at a rate similar to that of state-of-the-art methods, albeit with a larger constant, and report numerical results showing the effectiveness of the approach.
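
A standard two-point zeroth-order estimator of this kind can be sketched as follows; this is a generic construction consistent with the abstract, not the authors' exact algorithm, and the L1 test objective, smoothing radius delta, and step size are illustrative assumptions.

    import numpy as np

    def two_point_grad(f, x, delta, rng):
        # Two-point estimate of the gradient of the smoothed function
        # f_delta(x) = E_u[f(x + delta * u)], u uniform on the unit sphere:
        #   g = (d / (2 * delta)) * (f(x + delta*u) - f(x - delta*u)) * u
        d = x.size
        u = rng.standard_normal(d)
        u /= np.linalg.norm(u)
        return (d / (2.0 * delta)) * (f(x + delta * u) - f(x - delta * u)) * u

    def zo_method(f, x0, step=1e-3, delta=1e-2, iters=5000, seed=0):
        # Plain zeroth-order iteration driven by the two-point estimator;
        # only function values of f are ever evaluated, no subgradients.
        rng = np.random.default_rng(seed)
        x = x0.copy()
        for _ in range(iters):
            x = x - step * two_point_grad(f, x, delta, rng)
        return x

    # Illustrative convex, nonsmooth test objective (every convex function
    # is weakly convex): f(x) = ||x - b||_1, minimized at b.
    b = np.array([1.0, -2.0, 0.5])
    f = lambda x: np.abs(x - b).sum()
    print(zo_method(f, np.zeros(3)))  # approaches b, up to smoothing/stochastic error

The d/(2*delta) scaling makes the estimate unbiased for the gradient of the ball-smoothed function; the dimension factor d is the usual source of the larger constant mentioned in the abstract.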

