The method of feasible directions for optimization problems with subdifferentiable objective function

Author(s):  
K. Beer
Author(s):  
Pengfei (Taylor) Li ◽  
Peirong (Slade) Wang ◽  
Farzana Chowdhury ◽  
Li Zhang

Traditional formulations for transportation optimization problems mostly build complicating attributes into the constraints while keeping the objective functions succinct. A popular solution is Lagrangian decomposition, which relaxes the complicating constraints and then solves the problem iteratively. Although this approach is effective for many problems, it leads to intractability in others. To address this issue, this paper presents an alternative formulation for transportation optimization problems in which the complicating attributes of the target problems are partially or entirely built into the objective function instead of into the constraints. Many complicating mathematical constraints in transportation problems, such as various road or vehicle capacity constraints or "IF–THEN" type constraints, can be efficiently modeled in dynamic network loading (DNL) models based on the demand–supply equilibrium. After "pre-building" the complicating constraints into the objective function, the objective function can be approximated well with customized high-fidelity DNL models. Three types of computing benefits can be achieved with the alternative formulation: (a) the original problem is preserved; (b) the computing complexity of the new formulation may be significantly reduced because the hard constraints disappear; (c) the efficiency loss on the objective-function side can be mitigated via multiple high-performance computing techniques. Under this new framework, high-fidelity and problem-specific DNL models are critical for maintaining the attributes of the original problems. Therefore, the authors' recent efforts in enhancing the DNL's fidelity and computing efficiency are also described in the second part of the paper. Finally, a demonstration case study is conducted to validate the new approach.
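As a rough illustration of the contrast drawn above, the following sketch solves one small constrained problem twice: once by Lagrangian relaxation with subgradient updates of the multiplier, and once with the complicating constraint built into the objective as a penalty (standing in here for a high-fidelity DNL model). The objective, constraint, and all tuning values are illustrative, not from the paper.

```python
# Toy contrast between (a) Lagrangian relaxation of a complicating
# constraint and (b) building that constraint into the objective.
# Objective, constraint, step sizes and penalty weight are illustrative.
import numpy as np
from scipy.optimize import minimize

def f(x):                         # stand-in for a travel-cost objective
    return (x[0] - 3.0) ** 2 + (x[1] - 2.0) ** 2

def g(x):                         # complicating constraint g(x) <= 0
    return x[0] + x[1] - 4.0

# (a) Lagrangian relaxation: dualize g and iterate on the multiplier.
lam = 0.0
for _ in range(50):
    res = minimize(lambda x: f(x) + lam * g(x), np.zeros(2))
    lam = max(0.0, lam + 0.1 * g(res.x))        # subgradient step on the dual

# (b) Constraint built into the objective (a smooth penalty standing in
# for a high-fidelity loading model); a single unconstrained solve.
rho = 100.0
res2 = minimize(lambda x: f(x) + rho * max(0.0, g(x)) ** 2, np.zeros(2))

print(res.x, res2.x)              # both near the constrained optimum (2.5, 1.5)
```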


2020 ◽  
Author(s):  
Alberto Bemporad ◽  
Dario Piga

Abstract This paper proposes a method for solving optimization problems in which the decision maker cannot evaluate the objective function, but rather can only express a preference such as "this is better than that" between two candidate decision vectors. The algorithm described in this paper aims at reaching the global optimizer by iteratively proposing a new comparison to the decision maker, based on actively learning a surrogate of the latent (unknown and perhaps unquantifiable) objective function from past sampled decision vectors and pairwise preferences. A radial basis function surrogate is fit via linear or quadratic programming, satisfying, if possible, the preferences expressed by the decision maker on existing samples. The surrogate is used to propose a new sample of the decision vector for comparison with the current best candidate based on two possible criteria: minimize a combination of the surrogate and an inverse distance weighting function to balance exploitation of the surrogate against exploration of the decision space, or maximize a function related to the probability that the new candidate will be preferred. Compared to active preference learning based on Bayesian optimization, we show that our approach is competitive in that, within the same number of comparisons, it usually approaches the global optimum more closely and is computationally lighter. Applications of the proposed algorithm to solving a set of benchmark global optimization problems, to multi-objective optimization, and to the optimal tuning of a cost-sensitive neural network classifier for object recognition from images are described in the paper. MATLAB and Python implementations of the algorithms described in the paper are available at http://cse.lab.imtlucca.it/~bemporad/glis.
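As a simplified illustration of the surrogate-fitting step, the sketch below fits the weights of an inverse-quadratic RBF surrogate to a handful of pairwise preferences with a linear program, softening each preference with a slack variable. The sample points, preference list, margin, and bounds are illustrative choices, not the GLIS implementation.

```python
# Fit RBF surrogate weights to pairwise preferences via linear programming.
import numpy as np
from scipy.optimize import linprog

X = np.array([[0.0], [0.5], [1.0], [1.5]])    # sampled decision vectors
prefs = [(1, 0), (1, 2), (2, 3)]              # (i, j): x_i preferred to x_j

def phi(r, eps=1.0):                          # inverse-quadratic RBF
    return 1.0 / (1.0 + (eps * r) ** 2)

Phi = phi(np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2))

# Variables: [w (n weights), s (m slacks)]; minimize the total slack.
n, m = len(X), len(prefs)
c = np.concatenate([np.zeros(n), np.ones(m)])
A_ub, b_ub, margin = [], [], 1.0
for k, (i, j) in enumerate(prefs):
    # enforce surrogate(x_i) <= surrogate(x_j) - margin + s_k
    row = np.zeros(n + m)
    row[:n] = Phi[i] - Phi[j]
    row[n + k] = -1.0
    A_ub.append(row)
    b_ub.append(-margin)

res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
              bounds=[(-10.0, 10.0)] * n + [(0.0, None)] * m)
w = res.x[:n]
surrogate = lambda x: phi(np.linalg.norm(X - x, axis=1)) @ w
print(surrogate(np.array([0.75])))            # predicted latent score at a new point
```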


2021 ◽  
Vol 8 (1) ◽  
Author(s):  
Yaoxin Li ◽  
Jing Liu ◽  
Guozheng Lin ◽  
Yueyuan Hou ◽  
Muyun Mou ◽  
...  

Abstract In computer science there exist a large number of optimization problems defined on graphs, that is, problems of finding the best node-state configuration or network structure such that a designed objective function is optimized under some constraints. However, these problems are notoriously hard to solve, because most of them are NP-hard or NP-complete. Although traditional general methods such as simulated annealing (SA), genetic algorithms (GA), and so forth have been applied to these hard problems, their accuracy and time consumption are not satisfying in practice. In this work, we propose a simple, fast, and general algorithm framework based on the advanced automatic differentiation techniques provided by deep learning frameworks. By introducing the Gumbel-softmax technique, we can optimize the objective function directly by gradient descent, regardless of the discrete nature of the variables. We also introduce an evolution strategy into a parallel version of our algorithm. We test our algorithm on four representative optimization problems on graphs: modularity optimization from network science, the Sherrington–Kirkpatrick (SK) model from statistical physics, the maximum independent set (MIS) and minimum vertex cover (MVC) problems from combinatorial optimization, and the influence maximization problem from computational social science. High-quality solutions can be obtained in much less time than with the traditional approaches.
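The core trick is straightforward to reproduce. The sketch below, our own illustration in PyTorch rather than the authors' code, relaxes minimum vertex cover on a toy graph: node states are sampled with torch.nn.functional.gumbel_softmax so that the cover size plus a constraint penalty can be minimized by plain gradient descent. The graph, temperature, and penalty weight are arbitrary.

```python
# Gumbel-softmax relaxation of minimum vertex cover on a toy graph.
import torch
import torch.nn.functional as F

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]   # small example graph
n = 4
logits = torch.zeros(n, 2, requires_grad=True)     # per-node class logits
opt = torch.optim.Adam([logits], lr=0.1)

for _ in range(500):
    # differentiable "soft" sample of binary node states in [0, 1]
    x = F.gumbel_softmax(logits, tau=0.5)[:, 1]
    # penalty: every edge must have at least one covered endpoint
    violation = sum((1 - x[u]) * (1 - x[v]) for u, v in edges)
    loss = x.sum() + 5.0 * violation               # cover size + edge penalty
    opt.zero_grad()
    loss.backward()
    opt.step()

print(logits.argmax(dim=1).tolist())               # e.g. [1, 0, 1, 0]: nodes {0, 2}
```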


Author(s):  
T. E. Potter ◽  
K. D. Willmert ◽  
M. Sathyamoorthy

Abstract Mechanism path generation problems which use link deformations to improve the design lead to optimization problems involving a nonlinear sum-of-squares objective function subject to a set of linear and nonlinear constraints. Inclusion of the deformation analysis makes the objective function evaluation computationally expensive. An optimization method is presented which requires relatively few objective function evaluations. The algorithm, based on the Gauss method for unconstrained problems, is developed as an extension of the Gauss constrained technique for linear constraints and revises the Gauss nonlinearly constrained method for quadratic constraints. The derivation of the algorithm, using a Lagrange multiplier approach, is based on the Kuhn–Tucker conditions, so that when the iteration process terminates these conditions are automatically satisfied. Although the technique was developed for mechanism problems, it is applicable to any optimization problem having the form of a sum-of-squares objective function subject to nonlinear constraints.
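For reference, here is a minimal sketch of the unconstrained Gauss (Gauss-Newton) iteration that the algorithm extends: for F(x) = sum_i r_i(x)^2, the Hessian is approximated by J^T J, so each step solves J^T J p = -J^T r using only first derivatives of the residuals. The residual function below is a standard test example, not one from the paper.

```python
# Minimal unconstrained Gauss-Newton iteration for a sum-of-squares objective.
import numpy as np

def residuals(x):                              # residual vector r(x)
    return np.array([x[0] - 1.0, 10.0 * (x[1] - x[0] ** 2)])

def jacobian(x):                               # J[i, j] = d r_i / d x_j
    return np.array([[1.0, 0.0],
                     [-20.0 * x[0], 10.0]])

x = np.array([-1.0, 1.0])
for _ in range(20):
    r, J = residuals(x), jacobian(x)
    x = x + np.linalg.solve(J.T @ J, -J.T @ r)  # Gauss-Newton step

print(x)                                       # converges to (1, 1) here
```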


2015 ◽  
Vol 6 (3) ◽  
pp. 55-60
Author(s):  
Pritibhushan Sinha

Abstract We consider the median solution of the Newsvendor Problem. Some properties of such a solution are shown through a theoretical analysis and a numerical experiment. Sometimes, though not often, the median solution may be better, according to some criteria, than solutions that maximize expected profit, or that maximize the minimum possible expected profit over all distributions with the same mean and standard deviation. We discuss the practical suitability of the chosen objective function and the derived solution for the Newsvendor Problem and other such stochastic optimization problems.
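For concreteness, the snippet below contrasts the two order quantities for an illustrative normal demand (price, cost, and distribution parameters are ours, not the paper's): the classical solution orders at the critical fractile (p - c)/p, while the median solution simply orders the median demand.

```python
# Median vs. classical expected-profit solution of the newsvendor problem.
from scipy.stats import norm

p, c = 10.0, 6.0                       # unit selling price and unit cost
demand = norm(loc=100.0, scale=20.0)   # illustrative demand distribution

q_classical = demand.ppf((p - c) / p)  # critical fractile (10 - 6)/10 = 0.4
q_median = demand.ppf(0.5)             # median solution: order the median demand

print(q_classical)                     # ~94.9: the fractile lies below the median
print(q_median)                        # 100.0
```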


2016 ◽  
Vol 38 (4) ◽  
pp. 307-317
Author(s):  
Pham Hoang Anh

In this paper, the optimal sizing of truss structures is solved using a novel evolutionary optimization algorithm. The efficiency of the proposed method lies in the combination of global search and local search, in which the global move is applied to a set of random solutions whereas the local move is performed on the other solutions in the search population. Three truss sizing benchmark problems with discrete variables are used to examine the performance of the proposed algorithm. The objective function of each optimization problem is the minimum weight of the whole truss structure, and the constraints are the stresses in members and the displacements at nodes. Here, the constraints and objective function are treated separately so that both function and constraint evaluations can be saved. The results show that the new algorithm can find optimal solutions effectively and is competitive with some recent metaheuristic algorithms in terms of the number of structural analyses required.
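The separate treatment of constraints and objective can be pictured with a feasibility-first comparison in the style of Deb's rules. This is a guess at the flavor of the scheme, not the paper's exact procedure, and the violation and weight functions below are toy stand-ins for a structural analysis.

```python
# Feasibility-first comparison: an infeasible pair is ranked purely by
# constraint violation, so the weight objective (the expensive structural
# analysis) is never evaluated for it.
def violation(design):
    # toy stand-in for stress/displacement checks from a structural analysis
    return max(0.0, sum(design) - 10.0)

def weight(design):
    # toy stand-in for the truss-weight objective
    return sum(2.5 * a for a in design)

def better(a, b):
    va, vb = violation(a), violation(b)
    if va > 0.0 or vb > 0.0:           # at least one candidate is infeasible:
        return a if va <= vb else b    # rank by violation, skip the objective
    return a if weight(a) <= weight(b) else b

print(better([2.0, 3.0], [6.0, 7.0]))  # [2.0, 3.0]: the feasible candidate wins
```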


Fresa implements a nature-inspired plant propagation algorithm for the solution of single- and multiple-objective optimization problems. The method is population based and evolutionary. Treating the objective function as a black box, the implementation is able to solve problems exhibiting behaviour that is challenging for mathematical programming methods. Fresa is easily adapted to new problems, which may benefit from bespoke representations of solutions, by taking advantage of the dynamic typing and multiple dispatch capabilities of the Julia language. Further, the support for threads in Julia enables an efficient implementation on multi-core computers.
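Fresa itself is written in Julia; purely for illustration, here is a generic plant propagation loop in Python that captures the idea described above, with fitter plants sending more runners over shorter distances. All parameter choices are arbitrary.

```python
# Generic plant propagation loop for a 1-D black-box objective.
import random

def f(x):                                   # black-box objective to minimize
    return (x - 1.0) ** 2

pop = [random.uniform(-5.0, 5.0) for _ in range(10)]
for _ in range(50):
    pop.sort(key=f)
    offspring = []
    for rank, x in enumerate(pop):
        fit = (len(pop) - rank) / (len(pop) + 1)  # rank-based fitness in (0, 1)
        for _ in range(max(1, int(5 * fit))):     # fitter plants: more runners...
            # ...sent over shorter distances
            offspring.append(x + (1 - fit) * random.uniform(-1.0, 1.0))
    pop = sorted(pop + offspring, key=f)[:10]     # keep the best plants

print(pop[0])                               # close to the minimizer x = 1
```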

