Second-order sequence-based necessary optimality conditions in constrained nonsmooth vector optimization and applications

Positivity ◽  
2017 ◽  
Vol 22 (1) ◽  
pp. 159-190 ◽  
Author(s):  
Nguyen Dinh Tuan
2020 ◽  
Vol 9 (2) ◽  
pp. 383-398
Author(s):  
Sunila Sharma ◽  
Priyanka Yadav

Recently, Suneja et al. [26] introduced new classes of second-order cone-(η; ξ)-convex functions, along with their generalizations, and used them to prove second-order Karush–Kuhn–Tucker (KKT) type optimality conditions and duality results for a vector optimization problem involving first-order differentiable and second-order directionally differentiable functions. In this paper, we go one step further and study a nonsmooth vector optimization problem in which the functions involved are first- and second-order directionally differentiable. We introduce new classes of nonsmooth second-order cone-semipseudoconvex and nonsmooth second-order cone-semiquasiconvex functions, defined in terms of second-order directional derivatives. Using these functions, we prove second-order KKT type sufficient optimality conditions and duality results for this problem.
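For orientation, the classical smooth scalar-valued setting that these second-order KKT conditions generalize can be sketched as follows; the multiplier vector λ, the Lagrangian L, and the critical cone C(x*) below are the standard textbook objects, not notation taken from the papers above:

```latex
% Classical KKT conditions for the smooth scalar problem
%   min f(x)  subject to  g_i(x) <= 0,  i = 1, ..., m.
% First-order conditions (stationarity, dual feasibility, complementarity):
\nabla f(x^*) + \sum_{i=1}^{m} \lambda_i \nabla g_i(x^*) = 0,
\qquad \lambda_i \ge 0,
\qquad \lambda_i \, g_i(x^*) = 0,
% with Lagrangian
L(x, \lambda) = f(x) + \sum_{i=1}^{m} \lambda_i g_i(x).
% Second-order necessary condition, stated on the critical cone
% C(x^*) = \{ d : \nabla f(x^*)^{\top} d \le 0,\;
%              \nabla g_i(x^*)^{\top} d \le 0 \text{ for all active } i \}:
d^{\top} \nabla^2_{xx} L(x^*, \lambda) \, d \ \ge\ 0
\quad \text{for all } d \in C(x^*).
```

The vector-valued, nonsmooth results discussed in these abstracts replace the gradients and Hessian above with (second-order) directional derivatives and replace the inequality ordering with a cone ordering; the sketch is only the smooth baseline case.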


2003 ◽  
Vol 8 (2) ◽  
pp. 165-174 ◽  
Author(s):  
Davide La Torre

In this paper we introduce a notion of generalized derivative for nonsmooth vector functions in order to obtain necessary optimality conditions for vector optimization problems. This definition generalizes to the vector case the notion introduced by Michel and Penot and extended by Yang and Jeyakumar. This generalized derivative is contained in the Clarke subdifferential, so the corresponding optimality conditions are sharper than Clarke's.


2018 ◽  
Vol 52 (2) ◽  
pp. 567-575 ◽  
Author(s):  
Do Sang Kim ◽  
Nguyen Van Tuyen

The aim of this note is to present some second-order Karush–Kuhn–Tucker necessary optimality conditions for vector optimization problems, which correct an erroneous result in [10, Theorem 3.2].

