On Nonconvex Optimization Problems with D.C. Equality and Inequality Constraints

2018
Vol 51 (32)
pp. 895-900
Author(s):  
Alexander S. Strekalovsky
2013
Vol 2013
pp. 1-9
Author(s):  
Yuan Lu
Wei Wang
Li-Ping Pang
Dan Li

A class of constrained nonsmooth nonconvex optimization problems, namely piecewise C² objectives with smooth inequality constraints, is discussed in this paper. Based on the 𝒱𝒰-theory, a superlinearly convergent 𝒱𝒰-algorithm, which uses a nonconvex redistributed proximal bundle subroutine, is designed to solve these optimization problems. An illustrative example shows how this method works on a second-order cone programming problem.
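As a rough sketch of the problem class the abstract names (the notation here is illustrative, not taken from the paper), the setting is a piecewise-C² objective minimized over smooth inequality constraints:

\begin{align*}
  \min_{x \in \mathbb{R}^n} \quad & f(x), && f \ \text{piecewise } C^2 \ \text{(nonsmooth, nonconvex)}, \\
  \text{s.t.} \quad & c_j(x) \le 0, \quad j = 1, \dots, m, && c_j \ \text{smooth}.
\end{align*}

In 𝒱𝒰-theory, the space is decomposed at a point into a 𝒱-subspace, along which f is nonsmooth, and a 𝒰-subspace, along which f behaves smoothly; superlinear convergence is obtained by taking Newton-like steps along 𝒰 while the proximal bundle subroutine handles 𝒱.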


2021
Author(s):  
Tianyi Liu
Zhehui Chen
Enlu Zhou
Tuo Zhao

The momentum stochastic gradient descent (MSGD) algorithm has been widely applied to many nonconvex optimization problems in machine learning (e.g., training deep neural networks and variational Bayesian inference). Despite its empirical success, the convergence properties of MSGD are still not well understood theoretically. To fill this gap, we analyze the algorithmic behavior of MSGD via diffusion approximations for nonconvex optimization problems with strict saddle points and isolated local optima. Our study shows that momentum helps the iterates escape saddle points but hurts convergence within the neighborhood of optima (unless the step size or the momentum parameter is annealed). This theoretical finding partially corroborates the empirical success of MSGD in training deep neural networks.
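For concreteness, here is a minimal sketch of the heavy-ball MSGD update the abstract analyzes, run on a toy objective with a strict saddle point and isolated local minima; the objective, step size, momentum, and noise model below are illustrative assumptions, not the paper's setting.

import numpy as np

def grad(x):
    # Gradient of f(x, y) = (x^2 - 1)^2 + y^2, a toy nonconvex objective
    # with a strict saddle at (0, 0) and isolated minima at (+1, 0), (-1, 0).
    return np.array([4.0 * x[0] * (x[0] ** 2 - 1.0), 2.0 * x[1]])

def msgd(x0, lr=0.01, momentum=0.9, noise=0.1, steps=2000, seed=0):
    # Heavy-ball momentum SGD; Gaussian noise stands in for minibatch noise.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)
    for _ in range(steps):
        g = grad(x) + noise * rng.standard_normal(x.shape)  # stochastic gradient
        v = momentum * v - lr * g  # momentum accumulates past descent directions
        x = x + v
    return x

# Started next to the saddle, the iterate typically escapes toward (+-1, 0);
# without annealing lr or momentum, it keeps oscillating in that neighborhood.
print(msgd([0.0, 1e-3]))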


2021
Vol 2 (Original research articles)
Author(s):  
Lisa C. Hegerhorst-Schultchen
Christian Kirches
Marc C. Steinbach

This work continues an ongoing effort to compare non-smooth optimization problems in abs-normal form to Mathematical Programs with Complementarity Constraints (MPCCs). We study general Nonlinear Programs with equality and inequality constraints in abs-normal form, so-called Abs-Normal NLPs, and their relation to equivalent MPCC reformulations. We introduce the concepts of Abadie's and Guignard's kink qualification and prove relations to MPCC-ACQ and MPCC-GCQ for the counterpart MPCC formulations. Due to the non-uniqueness of a specific slack reformulation suggested in [10], these relations are non-trivial. It turns out that constraint qualifications of Abadie type are preserved. We also prove the weaker result that equivalence of Guignard's (and Abadie's) constraint qualifications holds for all branch problems, while the question of GCQ preservation remains open. Finally, we introduce M-stationarity and B-stationarity concepts for abs-normal NLPs and prove first-order optimality conditions corresponding to the MPCC counterpart formulations.
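For orientation, a sketch of the abs-normal structure under its standard assumptions (notation illustrative; the paper and [10] fix the precise formulation): all nonsmoothness is routed through switching variables z that enter only via |z|, and z can be computed by forward substitution because the partial Jacobian with respect to |z| is strictly lower triangular.

\begin{align*}
  \min_{x} \quad & f(x, |z|) \\
  \text{s.t.} \quad & c_E(x, |z|) = 0, \qquad c_I(x, |z|) \ge 0, \\
                    & z = c_Z(x, |z|), \qquad \frac{\partial c_Z}{\partial |z|} \ \text{strictly lower triangular}.
\end{align*}

One standard route to an MPCC is to split z = u - w with slacks u, w ≥ 0 and the complementarity condition uᵀw = 0, so that |z| = u + w; the relations studied in the paper concern a specific slack reformulation of this kind from [10].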

