A Multilevel Proximal Gradient Algorithm for a Class of Composite Optimization Problems

2017
Vol 39 (5)
pp. S681-S701
Author(s):  
Panos Parpas

2019
Vol 35 (3)
pp. 371-378
Author(s):
Porntip Promsinchai
Narin Petrot

In this paper, we consider convex constrained optimization problems with composite objective functions over the set of minimizers of another function. The main aim is to test a new algorithm numerically, namely a stochastic block-coordinate proximal-gradient algorithm with penalization, by comparing both the number of iterations and the CPU times of this algorithm against well-known block-coordinate descent algorithms on randomly generated optimization problems with a regularization term.
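The abstract gives no implementation details, so the following is only a minimal sketch of a stochastic block-coordinate proximal-gradient iteration with penalization, assuming an l1 regularizer and illustrative step-size and penalty schedules; the names grad_f, grad_h, c_step, and c_pen are hypothetical, not from the paper.

```python
import numpy as np

def prox_l1(v, t):
    """Soft-thresholding: proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def stochastic_block_prox_grad(grad_f, grad_h, x0, n_blocks, lam=0.1,
                               n_iters=1000, c_step=1.0, c_pen=1.0, seed=0):
    """Sketch: min f(x) + lam*||x||_1 over argmin h, handled through the
    penalized surrogate f(x) + lam*||x||_1 + beta_k * h(x). One coordinate
    block is updated per iteration. The alpha_k and beta_k schedules below
    are assumptions, not the schedules analyzed in the paper."""
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    blocks = np.array_split(np.arange(x.size), n_blocks)
    for k in range(1, n_iters + 1):
        alpha = c_step / k           # diminishing step size (assumed schedule)
        beta = c_pen * np.sqrt(k)    # growing penalty parameter (assumed schedule)
        b = blocks[rng.integers(n_blocks)]       # one block, chosen uniformly
        g = grad_f(x)[b] + beta * grad_h(x)[b]   # block gradient of f + beta*h
        x[b] = prox_l1(x[b] - alpha * g, alpha * lam)  # prox step on that block
    return x
```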


2015
Vol 63 (22)
pp. 6013-6023
Author(s):
Wei Shi
Qing Ling
Gang Wu
Wotao Yin

Author(s):  
Youcheng Niu
Huaqing Li
Zheng Wang
Qingguo Lu
Dawen Xia
...  

Author(s):  
Dmitry Grishchenko ◽  
Franck Iutzeler ◽  
Jérôme Malick

Many applications in machine learning and signal processing involve nonsmooth optimization problems. This nonsmoothness induces a low-dimensional structure in the optimal solutions. In this paper, we propose a randomized proximal gradient method harnessing this underlying structure. We introduce two key components: (i) a random subspace proximal gradient algorithm; and (ii) an identification-based sampling of the subspaces. Their interplay brings a significant performance improvement on typical learning problems in terms of the number of dimensions explored.
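As an illustration of how the two components interact, here is a minimal sketch specialized to coordinate subspaces and an l1 regularizer, where the "identified" structure is the current support of the iterate. The sampling rule and all parameter names are assumptions for illustration, not the paper's exact scheme.

```python
import numpy as np

def prox_l1(v, t):
    """Soft-thresholding: proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def subspace_prox_grad(grad_f, x0, lam=0.1, step=0.01, n_iters=500,
                       n_extra=5, seed=0):
    """Sketch of a random-subspace proximal-gradient iteration with
    identification-based sampling. Coordinates in the current support
    (the structure identified by the l1 prox) are always updated; a few
    extra coordinates are sampled uniformly to keep exploring."""
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    n = x.size
    for _ in range(n_iters):
        support = np.flatnonzero(x)                  # structure identified so far
        extra = rng.choice(n, size=min(n_extra, n), replace=False)
        S = np.union1d(support, extra).astype(int)   # sampled coordinate subspace
        g = grad_f(x)   # full gradient for simplicity; only g[S] is really needed
        x[S] = prox_l1(x[S] - step * g[S], step * lam)  # prox-grad step on S only
    return x
```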


2013
Vol 2013
pp. 1-6
Author(s):
Congying Han
Mingqiang Li
Tong Zhao
Tiande Guo

Recently, proximal gradient algorithms have been used to solve nonsmooth convex optimization problems. As a special nonsmooth convex problem, singly linearly constrained quadratic programs with box constraints appear in a wide range of applications. Hence, we propose an accelerated proximal gradient algorithm for singly linearly constrained quadratic programs with box constraints. At each iteration, the subproblem, whose Hessian matrix is diagonal and positive definite, is an easy model that can be solved efficiently by searching for the root of a piecewise linear function. It is proved that the new algorithm terminates at an ε-optimal solution within O(1/ε) iterations. Moreover, no line search is needed in this algorithm, and global convergence can be proved under mild conditions. Numerical results are reported for quadratic programs arising from the training of support vector machines, which show that the new algorithm is efficient.
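The diagonal subproblem the abstract alludes to can indeed be solved by root-finding on a piecewise linear function of the constraint multiplier. Below is a minimal sketch assuming the subproblem takes the separable form min 0.5*(x - v)^T diag(d) (x - v) subject to a^T x = b and box constraints; the exact parametrization in the paper may differ, and all names here are illustrative.

```python
import numpy as np

def diagonal_qp_subproblem(v, d, a, b, lo, hi, tol=1e-10, max_iter=100):
    """Solve min 0.5*(x-v)^T diag(d) (x-v)  s.t.  a^T x = b,  lo <= x <= hi,
    with d > 0. The KKT conditions give x_i(mu) = clip(v_i - mu*a_i/d_i,
    lo_i, hi_i) for a multiplier mu, so phi(mu) = a^T x(mu) - b is piecewise
    linear and nonincreasing; bisection finds its root."""
    def x_of(mu):
        return np.clip(v - mu * a / d, lo, hi)

    def phi(mu):
        return a @ x_of(mu) - b

    # Bracket the root by expanding outward (assumes a feasible problem).
    mu_lo, mu_hi = -1.0, 1.0
    while phi(mu_lo) < 0:
        mu_lo *= 2.0
    while phi(mu_hi) > 0:
        mu_hi *= 2.0
    for _ in range(max_iter):           # bisection on the monotone phi
        mu = 0.5 * (mu_lo + mu_hi)
        if phi(mu) > 0:
            mu_lo = mu
        else:
            mu_hi = mu
        if mu_hi - mu_lo < tol:
            break
    return x_of(0.5 * (mu_lo + mu_hi))
```

Bisection applies because each coordinate of x(mu) is a clipped affine function of mu, so phi is monotone; no line search is involved, consistent with the abstract's claim.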

