Sufficient Optimality and Sensitivity Analysis of a Parameterized Min-Max Programming

2012 ◽  
Vol 2012 ◽  
pp. 1-9
Author(s):  
Huijuan Xiong ◽  
Yu Xiao ◽  
Chaohong Song

Sufficient optimality and sensitivity of a parameterized min-max program with a fixed feasible set are analyzed. Based on Clarke's subdifferential and Chaney's second-order directional derivative, sufficient optimality conditions for the parameterized min-max program are discussed first. Then, under a convexity assumption on the objective function, a computation formula for the subdifferential of the marginal function is obtained. These assumptions hold naturally for a number of application problems, and the resulting formulae are concise and convenient for algorithmic purposes.
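For intuition about the setting, here is a hypothetical toy instance (not taken from the paper): a min-max program whose feasible set [-2, 2] is fixed while only the objective depends on the parameter p, so the marginal function v(p) can be sampled and differentiated numerically.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Toy parameterized min-max program (illustrative assumption):
#   v(p) = min_{x in [-2, 2]} max( (x - p)^2, x^2 + 1 - p )
# The feasible set [-2, 2] is fixed and the inner max of two smooth
# convex pieces is convex in x, matching the convexity assumption
# under which the marginal function v is analyzed.

def phi(x, p):
    """Inner max of the two smooth pieces."""
    return max((x - p) ** 2, x ** 2 + 1 - p)

def v(p):
    """Marginal (optimal-value) function of the min-max program."""
    res = minimize_scalar(lambda x: phi(x, p),
                          bounds=(-2.0, 2.0), method="bounded")
    return res.fun

# At p = 0.5 the inner minimizer is x* = 0, where only the piece
# x^2 + 1 - p is active, so v(0.5) = 0.5 and v is differentiable
# there with slope -1 (the partial derivative of the active piece).
print(v(0.5))
slope = (v(0.5 + 1e-6) - v(0.5 - 1e-6)) / 2e-6
print(slope)
```

When the active piece is unique at the inner solution, the marginal function inherits its parameter derivative, as the finite-difference slope above illustrates; the subdifferential formula in the paper covers the general case where several pieces are active.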

Author(s):  
Gabriel Ruiz-Garzón ◽  
Jaime Ruiz-Zapatero ◽  
Rafaela Osuna-Gómez ◽  
Antonio Rufián-Lizana

This work presents a study of necessary and sufficient optimality conditions for scalar optimization problems on Hadamard manifolds. In the context of this geometry, we obtain and present new function types characterized by the property that all their second-order stationary points are global minima. To do so, we extend the concept of convexity in Euclidean space to a more general notion of invexity on Hadamard manifolds, employing the notions of second-order directional derivative, second-order pseudoinvex functions, and the second-order Karush-Kuhn-Tucker-pseudoinvexity problem. Thus, we prove that every second-order stationary point is a global minimum if and only if the problem is second-order pseudoinvex or second-order KKT-pseudoinvex, depending on whether the problem is unconstrained or constrained, respectively. This result has not been presented in the literature before. Finally, examples of these new characterizations are provided in the context of "Higgs boson like" potentials, among others.
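The unconstrained characterization can be sketched schematically as follows (notation hedged: grad f denotes the Riemannian gradient, f''(p; v) a second-order directional derivative along v in the tangent space T_pM; the precise invexity kernel and derivative definitions are those of the paper):

```latex
% Schematic statement of the unconstrained equivalence:
% f is second-order pseudoinvex exactly when second-order
% stationarity forces global optimality.
\[
\operatorname{grad} f(p) = 0
\quad\text{and}\quad
f''(p; v) \ge 0 \ \ \forall v \in T_p M
\;\Longrightarrow\;
f(p) \le f(q) \ \ \forall q \in M .
\]
```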


Author(s):  
X.Q. Yang

We study certain types of composite nonsmooth minimization problems by introducing a general smooth approximation method. Under various conditions we derive error bounds on the values of the original objective function at an approximate optimal solution and at the optimal solution. Finally, we obtain second-order necessary optimality conditions for the smooth approximation problems using a recently introduced generalized second-order directional derivative.
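The paper's smoothing scheme is general; as one standard concrete instance (an assumption here, not the paper's exact construction), a finite max of smooth pieces can be smoothed by log-sum-exp, which obeys the uniform error bound 0 <= f_mu - f <= mu log m for m pieces:

```python
import numpy as np

def hard_max(a):
    """The original nonsmooth objective: a finite max."""
    return float(np.max(a))

def smooth_max(a, mu):
    """Log-sum-exp smoothing  mu * log(sum_i exp(a_i / mu)),
    computed stably by shifting out the largest entry."""
    a = np.asarray(a, dtype=float)
    m = a.max()
    return float(m + mu * np.log(np.exp((a - m) / mu).sum()))

a = [1.0, 2.0, 3.0]
for mu in (1.0, 0.1, 0.01):
    gap = smooth_max(a, mu) - hard_max(a)
    # Uniform error bound: 0 <= gap <= mu * log(m), m = 3 pieces here
    print(mu, gap)
```

Shrinking mu tightens the approximation at the cost of worse conditioning of the smoothed problem, which is the trade-off the error estimates quantify.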


2013 ◽  
Vol 23 (1) ◽  
pp. 59-71
Author(s):  
Nada Djuranovic-Milicic ◽  
Milanka Gardasevic-Filipovic

In this paper an algorithm for the minimization of a nondifferentiable function is presented. The algorithm uses the Moreau-Yosida regularization of the objective function and its second-order Dini upper directional derivative. The purpose of the paper is to establish general hypotheses under which this algorithm converges to optimal points. A convergence proof is given, together with an estimate of the rate of convergence.
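For intuition about the regularization step (a standard illustrative case, not the paper's algorithm): the Moreau-Yosida envelope of f(y) = |y| with parameter lam has the closed-form Huber function, which is smooth even though |.| is not, and its proximal point is soft-thresholding.

```python
import numpy as np

def envelope_abs(x, lam):
    """Moreau-Yosida envelope of f(y) = |y|:
    e(x) = min_y |y| + (x - y)^2 / (2*lam), i.e. the Huber function."""
    ax = abs(x)
    return ax * ax / (2 * lam) if ax <= lam else ax - lam / 2

def prox_abs(x, lam):
    """The minimizer in the envelope definition: soft-thresholding."""
    return float(np.sign(x) * max(abs(x) - lam, 0.0))

# Cross-check the closed form against the definition on a fine grid.
x, lam = 0.7, 0.25
grid = np.linspace(-3.0, 3.0, 20001)
numeric = min(abs(y) + (x - y) ** 2 / (2 * lam) for y in grid)
print(envelope_abs(x, lam), numeric, prox_abs(x, lam))
```

The envelope is continuously differentiable with Lipschitz gradient, which is what makes second-order tools such as the Dini upper directional derivative applicable to an otherwise nondifferentiable objective.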


2016 ◽  
Vol 37 (9) ◽  
pp. 1142-1151
Author(s):  
赵爱罡 ZHAO Ai-gang ◽  
王宏力 WANG Hong-li ◽  
杨小冈 YANG Xiao-gang ◽  
陆敬辉 LU Jing-hui ◽  
姜伟 JIANG Wei ◽  
...  
