Necessary and Sufficient Optimality Conditions for Fractional Interval-Valued Optimization Problems

Author(s):  
Indira P. Debnath ◽  
S. K. Gupta
Mathematics ◽  
2018 ◽  
Vol 7 (1) ◽  
pp. 12 ◽  
Author(s):  
Xiangkai Sun ◽  
Hongyong Fu ◽  
Jing Zeng

This paper deals with robust quasi approximate optimal solutions for a nonsmooth semi-infinite optimization problem with uncertain data. By virtue of the epigraphs of the conjugates of the constraint functions, we first introduce a robust-type closed convex constraint qualification. Then, using this constraint qualification together with the robust optimization technique, we obtain necessary and sufficient optimality conditions for robust quasi approximate optimal solutions and exact optimal solutions of this nonsmooth uncertain semi-infinite optimization problem. Moreover, the obtained results are applied to a nonsmooth uncertain optimization problem with cone constraints.
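For orientation, the uncertain semi-infinite program considered here and its robust counterpart can be written in the standard form (a generic sketch; the precise assumptions on the data are those of the paper):
\[
\mathrm{(USIP)}\quad \min_{x \in \mathbb{R}^{n}} f(x) \quad \text{s.t.}\quad g_{t}(x, v_{t}) \le 0,\ t \in T,
\]
where each uncertain parameter $v_{t}$ ranges over an uncertainty set $\mathcal{V}_{t}$, and the robust counterpart enforces every constraint for all admissible realizations of the data:
\[
\mathrm{(RSIP)}\quad \min_{x \in \mathbb{R}^{n}} f(x) \quad \text{s.t.}\quad g_{t}(x, v_{t}) \le 0 \ \ \text{for all } v_{t} \in \mathcal{V}_{t},\ t \in T.
\]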


2014 ◽  
Vol 2014 ◽  
pp. 1-5 ◽  
Author(s):  
Vasile Preda

We consider an interval-valued multiobjective problem. Necessary and sufficient optimality conditions for weakly efficient solutions are established under new generalized convexities defined via the right upper Dini derivative, which extends the usual directional derivative. Duality results are also proved for Wolfe and Mond-Weir type duals.
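As a reminder of this tool (stated for a real-valued function; the interval-valued setting applies it to the endpoint functions), the right upper Dini derivative of $f$ at $x$ in a direction $d$ is
\[
D^{+} f(x; d) = \limsup_{t \to 0^{+}} \frac{f(x + t d) - f(x)}{t},
\]
which coincides with the ordinary directional derivative whenever the limit exists.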


Author(s):  
Christodoulos A. Floudas

This chapter discusses the fundamentals of nonlinear optimization. Section 3.1 focuses on optimality conditions for unconstrained nonlinear optimization. Section 3.2 presents the first-order and second-order optimality conditions for constrained nonlinear optimization problems. This section presents the formulation and basic definitions of unconstrained nonlinear optimization along with the necessary, sufficient, and necessary and sufficient optimality conditions. An unconstrained nonlinear optimization problem deals with the search for a minimum of a nonlinear function f(x) of n real variables x = (x1, x2, . . . , xn) and is denoted as min_{x ∈ R^n} f(x). Each of the n variables x1, x2, . . . , xn is allowed to take any value from −∞ to +∞. Unconstrained nonlinear optimization problems arise in several science and engineering applications, ranging from the simultaneous solution of nonlinear equations (e.g., chemical phase equilibrium) to parameter estimation and identification problems (e.g., nonlinear least squares).
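For a twice continuously differentiable f, the classical conditions covered by such a treatment take the following form (a standard statement, not quoted from the chapter): if x* is a local minimum, then
\[
\nabla f(x^{*}) = 0 \quad \text{and} \quad d^{\top} \nabla^{2} f(x^{*})\, d \ge 0 \ \ \text{for all } d \in \mathbb{R}^{n}
\]
(necessary conditions), while
\[
\nabla f(x^{*}) = 0 \quad \text{and} \quad d^{\top} \nabla^{2} f(x^{*})\, d > 0 \ \ \text{for all } d \ne 0
\]
is sufficient for x* to be a strict local minimum.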


2011 ◽  
Vol 18 (1) ◽  
pp. 53-66
Author(s):  
Najia Benkenza ◽  
Nazih Gadhi ◽  
Lahoussine Lafhim

Abstract: Using a special scalarization, employed for the first time for the study of necessary optimality conditions in vector optimization by Ciligot-Travain [Numer. Funct. Anal. Optim. 15: 689–693, 1994], we give necessary optimality conditions for a set-valued optimization problem by establishing the existence of Lagrange–Fritz–John multipliers. Sufficient optimality conditions are also given without any Lipschitz assumption.


Author(s):  
Mohsine Jennane ◽  
Lhoussain El Fadil ◽  
El Mostafa Kalmoun

Interval-valued functions have been widely used to accommodate data inexactness in optimization and decision theory. In this paper, we study interval-valued vector optimization problems and derive their relationships to interval variational inequality problems of both Stampacchia and Minty types. Using the concept of interval approximate convexity, we establish necessary and sufficient optimality conditions for local strong quasi and approximate $LU$-efficient solutions to nonsmooth optimization problems with interval-valued multiobjective functions.
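For context, the $LU$ order on compact intervals underlying $LU$-efficiency is commonly defined as follows (a standard convention; the paper's quasi and approximate efficiency notions refine it):
\[
[a^{L}, a^{U}] \preceq_{LU} [b^{L}, b^{U}] \iff a^{L} \le b^{L} \ \text{and} \ a^{U} \le b^{U},
\]
with the strict order obtained by additionally requiring at least one of the two inequalities to be strict.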


Author(s):  
Tadeusz Antczak ◽  
Gabriel Ruiz-Garzón

In this paper, a new class of nonconvex nonsmooth multiobjective programming problems with directionally differentiable functions is considered. The so-called G-V-type I objective and constraint functions and their generalizations are introduced for such nonsmooth vector optimization problems. Based upon these generalized invex functions, necessary and sufficient optimality conditions are established for directionally differentiable multiobjective programming problems. In particular, new Fritz John type and Karush-Kuhn-Tucker type necessary optimality conditions are proved for the considered directionally differentiable multiobjective programming problem. Further, weak, strong and converse duality theorems are derived for Mond-Weir type vector dual programs.
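For reference, the smooth Karush-Kuhn-Tucker conditions that such results generalize (written here with gradients; the paper replaces them by directional derivatives under G-V-type I assumptions) read: there exist multipliers $\lambda \in \mathbb{R}^{p}$ and $\mu \in \mathbb{R}^{m}$ such that
\[
\sum_{i=1}^{p} \lambda_{i} \nabla f_{i}(x^{*}) + \sum_{j=1}^{m} \mu_{j} \nabla g_{j}(x^{*}) = 0, \qquad \mu_{j} g_{j}(x^{*}) = 0, \qquad \lambda \ge 0,\ \lambda \ne 0,\ \mu \ge 0,
\]
for the vector problem of minimizing $(f_{1},\dots,f_{p})$ subject to $g_{j}(x) \le 0$, $j = 1,\dots,m$.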


Author(s):  
Jutamas Kerdkaew ◽  
Rabian Wangkeeree ◽  
Rattanaporn Wangkeeree

Abstract: In this paper, we investigate an uncertain multiobjective optimization problem involving nonsmooth and nonconvex functions. The notion of a (local/global) robust weak sharp efficient solution is introduced. Then, we establish necessary and sufficient optimality conditions for local and/or global robust weak sharp efficient solutions of the considered problem. These optimality conditions are presented in terms of multipliers and Mordukhovich/limiting subdifferentials of the related functions.
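In the scalar case, the weak sharp minimum property behind these notions is usually expressed as follows (a standard formulation; the robust multiobjective version studied here is more involved): $\bar{x}$ is a weak sharp minimizer of $f$ over $X$ with solution set $S$ if there exists $\tau > 0$ such that
\[
f(x) - f(\bar{x}) \ge \tau \, \mathrm{dist}(x, S) \quad \text{for all } x \in X,
\]
with the local version requiring the inequality only for $x$ near $\bar{x}$.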

