We consider necessary optimality conditions of minimum-principle type, that is, for optimization problems having, besides the usual inequality and/or equality constraints, a set constraint. The first part of the paper is concerned with scalar optimization problems; the second part deals with vector optimization problems.
In this paper we introduce a notion of generalized derivative for nonsmooth vector functions in order to obtain necessary optimality conditions for vector optimization problems. This definition generalizes to the vector case the notion introduced by Michel and Penot and extended by Yang and Jeyakumar. This generalized derivative is contained in the Clarke subdifferential, and hence the corresponding optimality conditions are sharper than Clarke's.
The aim of this note is to present some second-order Karush–Kuhn–Tucker necessary optimality conditions for vector optimization problems, which correct the erroneous result in ([10], Thm. 3.2).