Convergence of a stochastic subgradient method with averaging for nonsmooth nonconvex constrained optimization

2020 ◽  
Vol 14 (7) ◽  
pp. 1615-1625
Author(s):  
Andrzej Ruszczyński

Author(s):
Gabriele Eichfelder ◽  
Kathrin Klamroth ◽  
Julia Niebling

Abstract
A major difficulty in optimization with nonconvex constraints is finding feasible solutions. As simple examples show, the $\alpha$BB algorithm for single-objective optimization may fail to compute feasible solutions, even though it is a popular method in global optimization. In this work, we introduce a filtering approach motivated by a multiobjective reformulation of the constrained optimization problem. Moreover, the multiobjective reformulation makes it possible to identify the trade-off between constraint satisfaction and objective value, which is also reflected in the quality guarantee. Numerical tests confirm that we can indeed find feasible, and often optimal, solutions where the classical single-objective $\alpha$BB method fails, i.e., where it terminates without ever finding a feasible solution.
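The abstract does not spell out the reformulation. A standard biobjective reformulation of $\min f(x)$ subject to $g_j(x) \le 0$ that exhibits the stated trade-off between objective value and constraint satisfaction is the following; this is an illustration only, and the paper's exact construction may differ:

```latex
% Constrained problem: minimize f(x) subject to g_j(x) <= 0, j = 1, ..., m.
% Illustrative biobjective reformulation trading off objective value
% against constraint violation (requires amsmath):
\begin{align*}
  \min_{x \in X} \; \bigl( f(x),\, \varphi(x) \bigr),
  \qquad
  \varphi(x) := \max\{\, 0,\, g_1(x), \dots, g_m(x) \,\},
\end{align*}
% where \varphi measures the maximum constraint violation; the feasible
% points are exactly those with \varphi(x) = 0.
```

Under such a reformulation, candidate regions can presumably be filtered by comparing the pairs $(f(x), \varphi(x))$ in the Pareto sense, which would motivate the filtering approach the abstract describes.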


2014 ◽  
Vol 2014 ◽  
pp. 1-10 ◽  
Author(s):  
Wenling Zhao ◽  
Daojin Song ◽  
Bingzhuang Liu

We present a global error bound for the projected gradient of nonconvex constrained optimization problems, and a local error bound for the distance from a feasible solution to the optimal solution set of convex constrained optimization problems, using the merit function of the sequential quadratic programming (SQP) method. For the solution sets (the stationary point set and the KKT point set) of nonconvex constrained optimization problems, we introduce the notions of generalized nondegeneracy and generalized weak sharp minima. On this basis, we give necessary and sufficient conditions for finite termination at these two solution sets, respectively. Accordingly, the results in this paper improve and generalize existing results in the literature. Furthermore, we use the global error bound for the projected gradient, whose merit function is easy to compute, to characterize these necessary and sufficient conditions.
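The abstract does not display the bounds themselves. An error bound for the projected gradient of the kind described typically takes the following form; this is an illustrative statement under smoothness assumptions, not the paper's exact result, and the constant $\tau$ and the set $S$ are placeholders:

```latex
% Illustrative projected-gradient error bound for min f(x) s.t. x in C,
% with P_C the Euclidean projection onto C and some constant tau > 0:
\begin{align*}
  \operatorname{dist}(x, S)
  \;\le\;
  \tau \,\bigl\| x - P_C\bigl(x - \nabla f(x)\bigr) \bigr\|
  \qquad \text{for all } x \in C,
\end{align*}
% where S is the solution (stationary-point) set. The residual on the
% right vanishes exactly at stationary points of the constrained problem,
% so it serves as a computable measure of distance to S.
```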


2018 ◽  
Vol 12 (2) ◽  
pp. 165-176
Author(s):  
Zhujun Wang ◽  
Li Cai ◽  
Zheng Peng

We present a family of new inexact secant methods, combined with an Armijo line search, for solving nonconvex constrained optimization problems. Unlike existing secant methods, the algorithms proposed in this paper do not require the computation of exact search directions. By adopting a nonsmooth exact penalty function as the merit function, we establish the global convergence of the proposed algorithms under reasonable conditions. Numerical results indicate that the proposed algorithms are both feasible and effective.
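As a concrete illustration of the line-search component only, the sketch below implements a backtracking Armijo search on an $\ell_1$ exact penalty merit function. The function names, the penalty weight `mu`, and the directional-derivative argument `D_phi` are assumptions made for this sketch; the search direction `d` would come from the paper's inexact secant step, which is not reproduced here.

```python
import numpy as np

def l1_merit(f, g, x, mu):
    """Nonsmooth l1 exact penalty merit function
    phi(x) = f(x) + mu * sum_j max(0, g_j(x))
    for the problem: minimize f(x) subject to g(x) <= 0."""
    return f(x) + mu * np.sum(np.maximum(0.0, g(x)))

def armijo_step(f, g, x, d, mu, D_phi, beta=0.5, sigma=1e-4, max_trials=50):
    """Backtracking Armijo line search on the merit function.

    d      : search direction (e.g., from an inexact secant step)
    D_phi  : estimate of the directional derivative of the merit
             function at x along d (negative for a descent direction)
    Returns a step size t achieving sufficient decrease, or the
    smallest trial step if none is accepted within max_trials."""
    phi0 = l1_merit(f, g, x, mu)
    t = 1.0
    for _ in range(max_trials):
        # Sufficient-decrease (Armijo) test on the merit function.
        if l1_merit(f, g, x + t * d, mu) <= phi0 + sigma * t * D_phi:
            break
        t *= beta  # shrink the step and try again
    return t
```

Because the $\ell_1$ penalty is nonsmooth, the sufficient-decrease test uses a directional derivative of the merit function along `d` rather than a plain gradient inner product; this is the standard way an Armijo search is paired with an exact penalty merit function.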

