Extensions of Firefly Algorithm for Nonsmooth Nonconvex Constrained Optimization Problems

Author(s):  
Rogério B. Francisco ◽  
M. Fernanda P. Costa ◽  
Ana Maria A. C. Rocha
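
As background only (not taken from the paper itself): the canonical unconstrained firefly update that such extensions start from moves firefly $i$ toward a brighter firefly $j$ by

\[
  x_i \;\leftarrow\; x_i \;+\; \beta_0\, e^{-\gamma r_{ij}^{2}} \,(x_j - x_i) \;+\; \alpha\, \varepsilon_i ,
\]

where $r_{ij} = \lVert x_i - x_j \rVert$, $\beta_0$ is the base attractiveness, $\gamma$ the light-absorption coefficient, and $\varepsilon_i$ a random perturbation; how the paper handles the constraints and the nonsmoothness is specific to its extensions and is not reproduced here.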
2014 ◽ Vol 2014 ◽ pp. 1-10
Author(s):  
Wenling Zhao ◽  
Daojin Song ◽  
Bingzhuang Liu

We present a global error bound for the projected gradient of nonconvex constrained optimization problems and a local error bound for the distance from a feasible solution to the optimal solution set of convex constrained optimization problems, using the merit function involved in the sequential quadratic programming (SQP) method. For the solution sets (the stationary point set and the KKT point set) of nonconvex constrained optimization problems, we introduce the notions of generalized nondegeneracy and generalized weak sharp minima. On this basis, we give necessary and sufficient conditions for finite termination at each of these two solution sets, starting from a feasible solution. These results improve and generalize existing results in the literature. Further, we use the global error bound for the projected gradient, whose merit function is easily computed, to characterize these necessary and sufficient conditions.
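
For orientation (this is a generic formulation, not the specific bounds or constants established in the paper), an error bound for the projected gradient typically takes the form

\[
  r(x) \;=\; x - P_{\Omega}\!\bigl(x - \nabla f(x)\bigr),
  \qquad
  \operatorname{dist}(x, S) \;\le\; \kappa\,\lVert r(x)\rVert \quad \text{for all } x \in \Omega ,
\]

where $P_{\Omega}$ denotes projection onto the feasible set $\Omega$, $S$ is the relevant solution set (stationary or KKT points), and $\kappa > 0$ is an error-bound modulus; finite-termination results of the kind described above are usually derived by combining such a bound with a weak-sharp-minima condition on $S$.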


2018 ◽ Vol 2018 ◽ pp. 1-21
Author(s):  
Wilhelm P. Freire ◽  
Afonso C. C. Lemonge ◽  
Tales L. Fonseca ◽  
Hernando J. R. Franco

The Interior Epigraph Directions (IED) method solves constrained nonsmooth and nonconvex optimization problems via Generalized Augmented Lagrangian Duality: it considers the dual problem induced by a Generalized Augmented Lagrangian scheme and recovers the primal solution by generating a sequence of iterates in the interior of the epigraph of the dual function. In this approach, the value of the dual function at a point of the dual space is obtained by minimizing the Lagrangian. The first version of the IED method used the MATLAB routine fminsearch for this minimization; the second used NFDNA, an algorithm tailored to unconstrained nonsmooth nonconvex problems. However, the results obtained with fminsearch and NFDNA were not satisfactory. The version of the IED method presented in this work employs a Genetic Algorithm (GA) instead, which frees the method from any constraint-handling strategy, a difficult task when a metaheuristic such as a GA is applied on its own to constrained optimization problems. Two sets of constrained optimization problems, from mathematics and from mechanical engineering, were solved and the results compared with the literature. The proposed hybrid algorithm is shown to solve problems on which fminsearch and NFDNA fail.
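
A minimal sketch of the dual-function evaluation described above, assuming a classical (quadratic) augmented Lagrangian and using SciPy's differential_evolution as a stand-in for the authors' Genetic Algorithm; the toy objective, constraint, bounds, and parameter values are illustrative assumptions, not taken from the paper:

# Illustrative sketch of one dual-function evaluation in an IED-like scheme.
# The augmented Lagrangian form, the toy problem, and the use of SciPy's
# differential_evolution (as a stand-in for a Genetic Algorithm) are
# assumptions for illustration, not the authors' exact implementation.
import numpy as np
from scipy.optimize import differential_evolution

def f(x):                      # toy objective
    return (x[0] - 2.0) ** 2 + (x[1] + 1.0) ** 2

def g(x):                      # toy inequality constraint, g(x) <= 0
    return np.array([x[0] ** 2 + x[1] ** 2 - 4.0])

def aug_lagrangian(x, lam, r):
    """Classical augmented Lagrangian for inequality constraints."""
    viol = np.maximum(0.0, lam / r + g(x))
    return f(x) + 0.5 * r * np.sum(viol ** 2) - np.sum(lam ** 2) / (2.0 * r)

def dual_value(lam, r, bounds):
    """Dual-function value: minimize the Lagrangian over x with an
    evolutionary (GA-like) solver, so no constraint handling is needed."""
    res = differential_evolution(aug_lagrangian, bounds, args=(lam, r), seed=0)
    return res.fun, res.x

if __name__ == "__main__":
    q, x_min = dual_value(lam=np.array([0.5]), r=10.0,
                          bounds=[(-5.0, 5.0), (-5.0, 5.0)])
    print("dual value:", q, "minimizer:", x_min)

The point of the evolutionary inner solver is the one the abstract makes: the Lagrangian minimization is unconstrained (up to box bounds), so the metaheuristic needs no constraint-handling strategy of its own.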


Author(s):  
Francisco Facchinei ◽  
Vyacheslav Kungurtsev ◽  
Lorenzo Lampariello ◽  
Gesualdo Scutari

We consider nonconvex constrained optimization problems and propose a new approach to the convergence analysis based on penalty functions. We use classical penalty functions in an unconventional way, in that they enter only the theoretical analysis of convergence, while the algorithm itself is penalty-free. Based on this idea, we establish several new results, including the first general analysis of diminishing-stepsize methods in nonconvex constrained optimization, showing convergence to generalized stationary points, and a complexity study for sequential quadratic programming-type algorithms.
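
For reference (standard definitions, not the paper's own constructions), a classical exact penalty function of the kind used only in such analyses, together with the usual diminishing-stepsize conditions, reads

\[
  W_{\rho}(x) \;=\; f(x) + \rho \sum_{i=1}^{m} \max\{0,\, g_i(x)\},
  \qquad
  \gamma_k > 0, \quad \sum_{k=0}^{\infty} \gamma_k = \infty, \quad \sum_{k=0}^{\infty} \gamma_k^{2} < \infty ,
\]

where $\rho > 0$ is the penalty parameter and $\gamma_k$ the stepsize at iteration $k$; in the approach described above, $W_{\rho}$ appears only as an analytical tool, and the iterates themselves never evaluate it.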

