Fundamentals of Nonlinear Optimization

Author(s):  
Christodoulos A. Floudas

This chapter discusses the fundamentals of nonlinear optimization. Section 3.1 focuses on optimality conditions for unconstrained nonlinear optimization; Section 3.2 presents the first-order and second-order optimality conditions for constrained nonlinear optimization problems. This section presents the formulation and basic definitions of unconstrained nonlinear optimization along with the necessary, sufficient, and necessary and sufficient optimality conditions. An unconstrained nonlinear optimization problem deals with the search for a minimum of a nonlinear function f(x) of n real variables x = (x1, x2, ..., xn) and is denoted as

    min f(x),  x ∈ R^n.

Each of the n variables x1, x2, ..., xn is allowed to take any value from -∞ to +∞. Unconstrained nonlinear optimization problems arise in several science and engineering applications, ranging from the simultaneous solution of nonlinear equations (e.g., chemical phase equilibrium) to parameter estimation and identification problems (e.g., nonlinear least squares).
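As a minimal numerical sketch (not drawn from the chapter itself), the optimality conditions for the unconstrained problem above can be checked directly: the gradient must vanish at a candidate point (first-order necessary condition), and a positive definite Hessian there is sufficient for a strict local minimum. The function f and the point x_star below are hypothetical examples chosen for illustration.

```python
import numpy as np

# A simple smooth function of two real variables (hypothetical example):
# f(x) = (x1 - 1)^2 + 2*(x2 + 3)^2
def f(x):
    return (x[0] - 1.0) ** 2 + 2.0 * (x[1] + 3.0) ** 2

def grad_f(x):
    return np.array([2.0 * (x[0] - 1.0), 4.0 * (x[1] + 3.0)])

# The Hessian of this quadratic is constant.
hess_f = np.array([[2.0, 0.0],
                   [0.0, 4.0]])

# Candidate minimizer obtained by setting the gradient to zero.
x_star = np.array([1.0, -3.0])

# First-order necessary condition: the gradient vanishes at x_star.
assert np.allclose(grad_f(x_star), 0.0)

# Second-order sufficient condition: the Hessian is positive definite.
assert np.all(np.linalg.eigvalsh(hess_f) > 0)

print("x* =", x_star, "with f(x*) =", f(x_star))
```

Since both conditions hold, x_star is a strict local (here also global) minimum of f.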

Mathematics, 2018, Vol 7 (1), pp. 12
Author(s):  
Xiangkai Sun
Hongyong Fu
Jing Zeng

This paper deals with robust quasi approximate optimal solutions for nonsmooth semi-infinite optimization problems with uncertain data. By virtue of the epigraphs of the conjugates of the constraint functions, we first introduce a robust-type closed convex constraint qualification. Then, using this constraint qualification together with robust optimization techniques, we obtain necessary and sufficient optimality conditions for robust quasi approximate optimal solutions and exact optimal solutions of the nonsmooth uncertain semi-infinite optimization problem. Moreover, the results obtained in this paper are applied to a nonsmooth uncertain optimization problem with cone constraints.


2011, Vol 18 (1), pp. 53-66
Author(s):  
Najia Benkenza
Nazih Gadhi
Lahoussine Lafhim

Abstract: Using a special scalarization, employed for the first time in the study of necessary optimality conditions in vector optimization by Ciligot-Travain [Numer. Funct. Anal. Optim. 15: 689–693, 1994], we give necessary optimality conditions for a set-valued optimization problem by establishing the existence of Lagrange–Fritz–John multipliers. Sufficient optimality conditions are also given without any Lipschitz assumption.


Author(s):  
Tadeusz Antczak
Gabriel Ruiz-Garzón

In this paper, a new class of nonconvex nonsmooth multiobjective programming problems with directionally differentiable functions is considered. The so-called G-V-type I objective and constraint functions and their generalizations are introduced for such nonsmooth vector optimization problems. Based upon these generalized invex functions, necessary and sufficient optimality conditions are established for directionally differentiable multiobjective programming problems. Thus, new Fritz John type and Karush-Kuhn-Tucker type necessary optimality conditions are proved for the considered directionally differentiable multiobjective programming problem. Further, weak, strong and converse duality theorems are also derived for Mond-Weir type vector dual programs.
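The Karush-Kuhn-Tucker conditions mentioned above can be illustrated numerically. The sketch below uses a smooth, convex (hence invex) scalar problem rather than the paper's directionally differentiable multiobjective setting; the problem, the point x_star, and the multiplier lam are hypothetical choices made for the example.

```python
import numpy as np

# Minimize f(x) = x1^2 + x2^2  subject to  g(x) = 1 - x1 - x2 <= 0.
# The optimum is the closest point to the origin on the line x1 + x2 = 1.
x_star = np.array([0.5, 0.5])   # candidate KKT point (hypothetical example)
lam = 1.0                       # KKT multiplier for the single constraint

grad_f = 2.0 * x_star                 # gradient of the objective: (1, 1)
grad_g = np.array([-1.0, -1.0])       # gradient of the constraint
g_val = 1.0 - x_star.sum()            # constraint value: 0 (active)

# Stationarity: grad f + lam * grad g = 0.
assert np.allclose(grad_f + lam * grad_g, 0.0)

# Dual feasibility and complementary slackness: lam >= 0, lam * g(x*) = 0.
assert lam >= 0.0 and np.isclose(lam * g_val, 0.0)

print("KKT conditions hold at x* =", x_star)
```

Because the objective is convex and the constraint is affine, these KKT conditions are also sufficient here, so x_star is a global minimizer.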


Author(s):  
Jutamas Kerdkaew
Rabian Wangkeeree
Rattanaporn Wangkeereee

Abstract: In this paper, we investigate an uncertain multiobjective optimization problem involving nonsmooth and nonconvex functions. The notion of a (local/global) robust weak sharp efficient solution is introduced. We then establish necessary and sufficient optimality conditions for local and/or global robust weak sharp efficient solutions of the considered problem. These optimality conditions are presented in terms of multipliers and Mordukhovich (limiting) subdifferentials of the related functions.


Mathematics, 2020, Vol 8 (7), pp. 1152
Author(s):  
Gabriel Ruiz-Garzón
Jaime Ruiz-Zapatero
Rafaela Osuna-Gómez
Antonio Rufián-Lizana

This work is intended to lead a study of necessary and sufficient optimality conditions for scalar optimization problems on Hadamard manifolds. In the context of this geometry, we obtain and present new function types characterized by the property of having all their second-order stationary points be global minimums. In order to do so, we extend the concept of convexity in Euclidean space to a more general notion of invexity on Hadamard manifolds. This is done by employing notions of second-order directional derivatives, second-order pseudoinvex functions, and the second-order Karush–Kuhn–Tucker-pseudoinvexity problem. Thus, we prove that every second-order stationary point is a global minimum if and only if the problem is either second-order pseudoinvex or second-order KKT-pseudoinvex, depending on whether the problem regards unconstrained or constrained scalar optimization, respectively. This result has not been presented in the literature before. Finally, examples of these new characterizations are provided in the context of "Higgs Boson like" potentials, among others.
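Euclidean space R^n is itself a (flat) Hadamard manifold, so the distinction between first- and second-order stationary points can be sketched there with a "Higgs Boson like" (Mexican hat) potential f(x) = (||x||^2 - v^2)^2. This is a loose illustration of the concepts only, not the manifold-valued analysis of the paper; the potential and the value of v are assumptions made for the example.

```python
import numpy as np

v = 1.0  # hypothetical parameter of the Higgs-like potential

def f(x):
    return (x @ x - v**2) ** 2

def grad(x):
    return 4.0 * (x @ x - v**2) * x

def hess(x):
    n = x.size
    return 4.0 * (x @ x - v**2) * np.eye(n) + 8.0 * np.outer(x, x)

origin = np.zeros(2)            # the "hilltop" at the center
minimum = np.array([v, 0.0])    # a point on the circle of global minima

# Both points are first-order stationary (vanishing gradient).
assert np.allclose(grad(origin), 0.0)
assert np.allclose(grad(minimum), 0.0)

# The Hessian at the origin has a negative eigenvalue, so the origin
# is NOT second-order stationary (it is a local maximum along every ray).
assert np.linalg.eigvalsh(hess(origin)).min() < 0

# On the circle ||x|| = v the Hessian is positive semidefinite, so these
# second-order stationary points are indeed the global minima, with f = 0.
assert np.linalg.eigvalsh(hess(minimum)).min() >= -1e-12
assert np.isclose(f(minimum), 0.0)
```

Here every second-order stationary point of f is a global minimum, which is exactly the property the paper's second-order pseudoinvexity characterizes.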

