On derivative based bounding for simplicial branch and bound

Author(s):  
Eligius M.T. Hendrix ◽  
Boglarka G.-Tóth ◽
Frederic Messine ◽  
Leocadio G. Casado

Simplicial-based Global Optimization branch and bound methods require tight bounds on the objective function value. Recently, renewed interest has appeared in bound calculation based on Interval Arithmetic, by Karhbet and Kearfott (2017), and in exploiting second-derivative bounds, by Mohand (2021). The question investigated here is how partial derivative ranges can be used to bound the objective function value over a simplex. Moreover, we provide theoretical properties of how this information can be used, from a monotonicity perspective, to reduce the search space in simplicial branch and bound.
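As a minimal illustration of the kind of bound in question (a sketch, not the paper's exact construction), the following Python snippet computes a first-order centered-form lower bound over a simplex, assuming an enclosure `grad_range` of each partial derivative over the simplex is available (e.g. from Interval Arithmetic); the function names and the use of the simplex's enclosing box are illustrative assumptions.

```python
import numpy as np

def centered_lower_bound(f, grad_range, vertices):
    """Lower bound of f over a simplex from partial derivative ranges.

    vertices   : (n+1, n) array, the simplex vertices
    grad_range : (n, 2) array, grad_range[i] = [lo, hi] encloses df/dx_i
                 over the simplex (e.g. computed by Interval Arithmetic)
    """
    V = np.asarray(vertices, dtype=float)
    G = np.asarray(grad_range, dtype=float)
    c = V.mean(axis=0)                       # centroid as expansion point
    lo = V.min(axis=0) - c                   # x_i - c_i stays within the
    hi = V.max(axis=0) - c                   # enclosing box of the simplex
    g_lo, g_hi = G[:, 0], G[:, 1]
    # interval product [g_lo, g_hi] * [lo, hi]: take the smallest corner
    corners = np.stack([g_lo * lo, g_lo * hi, g_hi * lo, g_hi * hi])
    return f(c) + corners.min(axis=0).sum()
```

For f(x) = x1^2 + x2^2 on the unit simplex with grad_range = [[0, 2], [0, 2]], this returns 2/9 - 4/3 ≈ -1.11, a valid (if loose) lower bound on the true minimum 0.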

2011 ◽  
Vol 08 (03) ◽  
pp. 535-544 ◽  
Author(s):  
BOUDJEHEM DJALIL ◽  
BOUDJEHEM BADREDDINE ◽  
BOUKAACHE ABDENOUR

In this paper, we propose an idea that makes global optimization an easier and lower-cost task. The main idea is to reduce the dimension of the optimization problem at hand to a one-dimensional one using variable coding. At this level, the algorithm looks for the global optimum of a one-dimensional cost function. The new algorithm is able to avoid local optima, reduces the number of function evaluations, and improves the speed of convergence. The method is suitable for functions that have many extrema. Our algorithm can determine a narrow region around the global optimum in very restricted time, based on stochastic tests and an adaptive partition of the search space. Illustrative examples are presented to show the efficiency of the proposed idea. The algorithm was able to locate the global optimum even when the objective function had a large number of optima.
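As a rough sketch of what such a variable coding can look like (the abstract does not specify the exact coding, so the digit-interleaving scheme below is an illustrative assumption), this Python snippet de-interleaves the binary digits of a scalar t ∈ [0, 1) into a point of [0, 1)^n, so that a one-dimensional search over t stands in for the n-dimensional search.

```python
import numpy as np

def decode(t, dim, depth=16):
    """Map a scalar t in [0, 1) to a point in [0, 1)^dim by dealing its
    base-2 digits to the coordinates in turn (a Z-order-style coding)."""
    x = np.zeros(dim)
    weight = 0.5                       # value of the next digit per axis
    for k in range(depth * dim):
        t *= 2.0
        bit, t = divmod(t, 1.0)        # pop the leading binary digit
        x[k % dim] += weight * bit
        if k % dim == dim - 1:
            weight /= 2.0              # the next round of digits is finer
    return x

def g(t, f, dim):
    """One-dimensional stand-in for the original objective f."""
    return f(decode(t, dim))
```

Any one-dimensional global method applied to g then implicitly explores the n-dimensional space, at the cost of a highly oscillatory one-dimensional landscape, which is consistent with the abstract's focus on functions with many extrema.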


2008 ◽  
Vol 13 (1) ◽  
pp. 145-159 ◽  
Author(s):  
J. Žilinskas

Branch and bound methods for global optimization are considered in this paper. Advantages and disadvantages of simplicial partitions for branch and bound are shown. A new general combinatorial approach for the vertex triangulation of hyper-rectangular feasible regions is presented. Simplicial partitions may be used to vertex-triangulate feasible regions of non-rectangular shape defined by linear inequality constraints. Linear inequality constraints may also be used to avoid symmetries in optimization problems.
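One classical combinatorial scheme of this kind (a well-known baseline, not the paper's new approach) is the Kuhn/Freudenthal triangulation, which splits a box into n! simplices, one per permutation of the coordinate axes; a Python sketch:

```python
from itertools import permutations
import numpy as np

def kuhn_triangulation(lo, hi):
    """Vertex-triangulate the box [lo, hi] into n! simplices: each
    permutation of the axes gives a monotone path of box vertices from
    lo to hi, and those n+1 vertices form one simplex."""
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    for perm in permutations(range(lo.size)):
        verts = [lo.copy()]
        for axis in perm:              # flip one coordinate at a time
            v = verts[-1].copy()
            v[axis] = hi[axis]
            verts.append(v)
        yield np.array(verts)          # shape (n+1, n)

# the unit square splits into 2! = 2 triangles sharing the main diagonal
for simplex in kuhn_triangulation([0, 0], [1, 1]):
    print(simplex.tolist())
```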


2011 ◽  
Vol 16 (3) ◽  
pp. 451-460 ◽  
Author(s):  
Antanas Žilinskas ◽  
Julius Žilinskas

A simplicial statistical model of multimodal functions is used to construct a global optimization algorithm. The search for the global minimum in the multidimensional space is reduced to a search over the edges of simplices covering the feasible region, combined with a refinement of the cover. The refinement is performed by subdividing selected simplices, taking into account the point where the objective function value has been computed at the current iteration. For the search over the edges, the one-dimensional P-algorithm, based on the statistical model of smooth functions, is adapted. Unlike the recently proposed algorithm, here the statistical model is used to model the behaviour of the objective function not over the whole simplex but only over its edges. Testing results of the proposed algorithm are included.
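For intuition, here is a minimal sketch of one step of the one-dimensional P-algorithm under a Wiener-process model (an assumed concrete instance; the paper adapts the idea to simplex edges and uses a smooth-function model): the next trial point maximizes the modelled probability of dropping below a target level.

```python
import numpy as np
from scipy.stats import norm

def p_algorithm_step(xs, ys, y_target, sigma=1.0):
    """One step of the 1-D P-algorithm: among candidate points inside the
    observed intervals, pick the one maximizing P(f(x) <= y_target) under
    a Wiener (Brownian-bridge) model. xs must be sorted; ys = f(xs)."""
    best_p, best_x = -np.inf, None
    for a, b, ya, yb in zip(xs, xs[1:], ys, ys[1:]):
        for t in np.linspace(0.1, 0.9, 9):              # probe interior points
            m = ya + t * (yb - ya)                      # conditional mean
            s = sigma * np.sqrt(t * (1 - t) * (b - a))  # conditional std
            p = norm.cdf((y_target - m) / s)
            if p > best_p:
                best_p, best_x = p, a + t * (b - a)
    return best_x
```

Here y_target is typically the current best value minus a small ε; restricting xs, ys to the points sampled along a single simplex edge gives the edge-wise variant described above.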


Author(s):  
B. G.-Tóth ◽  
L. G. Casado ◽  
E. M. T. Hendrix ◽  
F. Messine

Branch and Bound (B&B) algorithms in Global Optimization are used to perform an exhaustive search over the feasible area. One choice is to use simplicial partition sets. Obtaining sharp and cheap bounds of the objective function over a simplex is very important for constructing efficient Global Optimization B&B algorithms. Although enclosing a simplex in a box implies an overestimation, boxes are more natural when dealing with individual coordinate bounds, and bounding ranges with Interval Arithmetic (IA) is computationally cheap. This paper introduces several linear relaxations using gradient information and Affine Arithmetic and experimentally studies their efficiency compared to traditional lower bounds obtained by natural and centered IA forms and their adaptation to simplices. A Global Optimization B&B algorithm with a monotonicity test over a simplex is used to compare their efficiency on a set of low-dimensional test problems, with instances that either have a box-constrained search region or whose feasible set is a simplex. Numerical results show that it is possible to obtain tight lower bounds over simplicial subsets.
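A minimal sketch of the monotonicity test mentioned above, assuming enclosures of the partial derivatives over the current partition set are available from IA or Affine Arithmetic (variable names are illustrative):

```python
import numpy as np

def monotone_directions(grad_range):
    """First-order monotonicity test: grad_range[i] = [lo, hi] encloses
    df/dx_i over the current set. If an enclosure excludes zero, f is
    strictly monotone in x_i there, so the minimum lies on the face where
    x_i is extremal and the rest of the set can be discarded."""
    G = np.asarray(grad_range, dtype=float)
    lo, hi = G[:, 0], G[:, 1]
    return lo > 0, hi < 0   # (increasing in x_i, decreasing in x_i)

inc, dec = monotone_directions(np.array([[0.5, 2.0], [-1.0, 1.0]]))
# inc = [True, False]: keep only the face minimizing x_0; x_1 is undecided
```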


2020 ◽  
Author(s):  
Alberto Bemporad ◽  
Dario Piga

This paper proposes a method for solving optimization problems in which the decision maker cannot evaluate the objective function, but rather can only express a preference such as "this is better than that" between two candidate decision vectors. The algorithm described in this paper aims at reaching the global optimizer by iteratively proposing to the decision maker a new comparison to make, based on actively learning a surrogate of the latent (unknown and perhaps unquantifiable) objective function from past sampled decision vectors and pairwise preferences. A radial-basis-function surrogate is fit via linear or quadratic programming, satisfying, if possible, the preferences expressed by the decision maker on existing samples. The surrogate is used to propose a new sample of the decision vector for comparison with the current best candidate, based on one of two possible criteria: minimize a combination of the surrogate and an inverse distance weighting function, to balance exploitation of the surrogate against exploration of the decision space, or maximize a function related to the probability that the new candidate will be preferred. Compared to active preference learning based on Bayesian optimization, we show that our approach is competitive in that, within the same number of comparisons, it usually approaches the global optimum more closely and is computationally lighter. Applications of the proposed algorithm to a set of benchmark global optimization problems, to multi-objective optimization, and to the optimal tuning of a cost-sensitive neural network classifier for object recognition from images are described in the paper. MATLAB and Python implementations of the algorithms described in the paper are available at http://cse.lab.imtlucca.it/~bemporad/glis.
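To make the acquisition step concrete, here is a hedged numpy sketch of an RBF surrogate combined with an inverse-distance-weighting exploration term; the inverse-multiquadric kernel, the arctan form of z(x), and the parameters eps and delta are illustrative choices, not necessarily the exact definitions used in the paper (see the GLIS code at the URL above for the authors' implementation).

```python
import numpy as np

def rbf_surrogate(X, centers, coef, eps=1.0):
    """Inverse-multiquadric RBF surrogate:
    f_hat(x) = sum_k coef_k / sqrt(1 + eps^2 * ||x - c_k||^2),
    with coef fit elsewhere (the paper uses an LP/QP fit to the
    expressed pairwise preferences)."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return (coef / np.sqrt(1.0 + eps**2 * d2)).sum(-1)

def acquisition(X, centers, coef, delta=1.0):
    """Surrogate minus an exploration bonus z(x): z is ~0 at sampled
    points and grows with distance from all of them, balancing
    exploitation of f_hat against exploration of the decision space."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    w = 1.0 / np.maximum(d2, 1e-12)          # inverse-distance weights
    z = (2.0 / np.pi) * np.arctan(1.0 / w.sum(-1))
    return rbf_surrogate(X, centers, coef) - delta * z
```

Minimizing this acquisition over the feasible set yields the next decision vector to compare against the current best candidate.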

