Directional Derivatives for Extremal-Value Functions with Applications to the Completely Convex Case

1973, Vol 21 (1), pp. 188-209. Author(s): William Hogan
Optimization, 2013, Vol 64 (2), pp. 389-407. Author(s): L. Minchenko, A. Tarakanov

Author(s): Kamil A. Khan, Yingwei Yuan

For any scalar-valued bivariate function that is locally Lipschitz continuous and directionally differentiable, it is shown that a subgradient may always be constructed from the function's directional derivatives in the four compass directions, arranged in a so-called "compass difference". When the original function is nonconvex, the obtained subgradient is an element of Clarke's generalized gradient, but the result appears to be novel even for convex functions. The function is not required to be represented in any particular form, and no further assumptions are required, though the result is strengthened when the function is additionally L-smooth in the sense of Nesterov. For certain optimal-value functions and certain parametric solutions of differential equation systems, these new results appear to provide the only known way to compute a subgradient. These results also imply that centered finite differences will converge to a subgradient for bivariate nonsmooth functions. As a dual result, we find that any compact convex set in two dimensions contains the midpoint of its interval hull. Examples are included for illustration, and it is demonstrated that these results do not extend directly to functions of more than two variables or sets in higher dimensions.
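To make the construction concrete, here is a minimal numerical sketch (not the authors' code) of the compass-difference idea described above: for a locally Lipschitz, directionally differentiable function of two variables, the centered finite differences along the two coordinate axes approximate a subgradient as the step size shrinks. The test function and step size are illustrative choices, not taken from the paper.

def compass_difference(f, x, y, h=1e-6):
    # Centered differences in the east/west and north/south compass directions;
    # per the abstract above, these converge to a subgradient as h -> 0 for
    # bivariate locally Lipschitz, directionally differentiable functions.
    s1 = (f(x + h, y) - f(x - h, y)) / (2.0 * h)
    s2 = (f(x, y + h) - f(x, y - h)) / (2.0 * h)
    return (s1, s2)

# Example: f(x, y) = max(x, y) is convex but nonsmooth along the line x == y.
f = lambda x, y: max(x, y)
print(compass_difference(f, 0.0, 0.0))  # approximately (0.5, 0.5)

At the kink (0, 0), a one-sided forward difference would return (1, 1), which is not a subgradient of max(x, y) there; the centered version returns (0.5, 0.5), which lies in the subdifferential conv{(1, 0), (0, 1)}.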


1985, Vol 37 (6), pp. 1074-1084. Author(s): Jay S. Treiman

In the study of optimization problems it is necessary to consider functions that are not differentiable. This has led to the consideration of generalized gradients and a corresponding calculus for certain classes of functions. Rockafellar [16] and others have developed a very strong and elegant theory of subgradients for convex functions. This convex theory gives point-wise criteria for the existence of extrema in optimization problems. There are, however, many optimization problems that involve functions which are neither differentiable nor convex. Such functions arise in many settings, including optimal value functions [15]. In order to deal with such problems, Clarke [3] defined a type of subgradient for nonconvex functions. This definition was initially for Lipschitz functions on Rⁿ. Clarke extended this definition to include lower semicontinuous (l.s.c.) functions on Banach spaces through the use of a directional derivative, the distance function from a closed set, and tangent and normal cones to closed sets.
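For reference, the standard definitions underlying Clarke's construction for a function f that is locally Lipschitz near a point x in Rⁿ (the usual textbook formulas, not quoted from [3]) are

\[
f^{\circ}(x; v) \;=\; \limsup_{\substack{y \to x \\ t \downarrow 0}} \frac{f(y + t v) - f(y)}{t},
\qquad
\partial f(x) \;=\; \bigl\{\, \xi \in \mathbb{R}^n : \langle \xi, v \rangle \le f^{\circ}(x; v) \ \text{for all } v \in \mathbb{R}^n \,\bigr\},
\]

where f° is the generalized directional derivative and ∂f(x) is the generalized gradient.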


Author(s): Alain B. Zemkoho

We consider the optimal value function of a parametric optimization problem. A large number of publications have been dedicated to the study of its continuity and differentiability properties. However, the differentiability analysis in the current literature has mostly been limited to first-order results, with a focus on estimates of the function's directional derivatives and subdifferentials, given that it is typically nonsmooth. With the progress made in the last two to three decades in major subfields of optimization such as robust, minmax, semi-infinite and bilevel optimization, and their connection to the optimal value function, there is a need for a second-order analysis of the generalized differentiability properties of this function. Such an analysis could enable the development of robust solution algorithms, such as the Newton method. The main goal of this paper is to provide estimates of the generalized Hessian of the optimal value function. Our results are based on two handy tools from parametric optimization, namely the optimal solution and Lagrange multiplier mappings, for which detailed estimates of their generalized derivatives are either well known or can easily be obtained.
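For orientation, a standard parametric setting behind this discussion, written in generic notation rather than the paper's, is

\[
\varphi(x) \;=\; \min_{y} \bigl\{\, f(x, y) : g(x, y) \le 0 \,\bigr\},
\qquad
L(x, y, \lambda) \;=\; f(x, y) + \lambda^{\top} g(x, y),
\]

where S(x) denotes the optimal solution mapping and Λ(x, y) the Lagrange multiplier mapping. A classical first-order formula, valid for instance in the completely convex case under Slater-type regularity (cf. the first entry above), expresses the directional derivative of the value function through these two mappings:

\[
\varphi'(x; d) \;=\; \min_{y \in S(x)} \; \max_{\lambda \in \Lambda(x, y)} \nabla_x L(x, y, \lambda)^{\top} d .
\]

The generalized Hessian estimates developed in the paper build on generalized derivatives of the mappings S and Λ themselves.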

