Multiwinner Voting in Genetic Algorithms for Solving Ill-Posed Global Optimization Problems

Author(s):  
Piotr Faliszewski ◽  
Jakub Sawicki ◽  
Robert Schaefer ◽  
Maciej Smołka

2013 ◽  
Vol 310 ◽  
pp. 609-613
Author(s):  
Ioana D. Balea ◽  
Radu Hulea ◽  
Georgios E. Stavroulakis

This paper presents an implementation of Eurocode load cases in a discrete global optimization algorithm for planar structures, based on the principles of finite element methods and genetic algorithms. The final optimal design is obtained using IPE sections selected as feasible by the algorithm from the steel sections available in industry. The algorithm is tested on an asymmetric planar steel frame, with promising results.
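The discrete sizing step described above can be sketched as a small genetic algorithm over a catalogue of candidate sections. This is an illustrative toy, not the paper's implementation: the IPE areas, member count, and the feasibility check (a simple area threshold standing in for the finite element analysis under Eurocode load cases) are all hypothetical.

```python
import random

# Hypothetical catalogue of IPE sections: (name, cross-section area in cm^2).
# Values are illustrative placeholders, not taken from the paper.
IPE_SECTIONS = [("IPE100", 10.3), ("IPE140", 16.4), ("IPE180", 23.9),
                ("IPE220", 33.4), ("IPE270", 45.9), ("IPE330", 62.6)]

N_MEMBERS = 4          # number of frame members to size (assumed)
REQUIRED_AREA = 20.0   # stand-in feasibility check; real code would run an FEM analysis

def fitness(design):
    """Total area (a proxy for weight); heavy penalty for each infeasible member."""
    total = sum(IPE_SECTIONS[i][1] for i in design)
    penalty = sum(1000 for i in design if IPE_SECTIONS[i][1] < REQUIRED_AREA)
    return total + penalty

def evolve(pop_size=30, generations=60, seed=0):
    rng = random.Random(seed)
    # Each individual is a list of catalogue indices, one per member.
    pop = [[rng.randrange(len(IPE_SECTIONS)) for _ in range(N_MEMBERS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[:pop_size // 2]          # truncation selection keeps the lighter half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, N_MEMBERS)  # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:             # mutation: reassign one member's section
                child[rng.randrange(N_MEMBERS)] = rng.randrange(len(IPE_SECTIONS))
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = evolve()
```

Because infeasible designs are only penalized rather than discarded, the search can still cross through them while converging to the lightest all-feasible combination of sections.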


2015 ◽  
Vol 11 (11) ◽  
pp. 1025-1031
Author(s):  
Siew Mooi Lim ◽  
Md. Nasir Sulaiman ◽  
Norwati Mustapha ◽  
Abu Bakar Md. Sultan

2010 ◽  
Vol 07 (02) ◽  
pp. 279-297 ◽  
Author(s):  
Pandian Vasant

This paper describes the origin and development of the hybrid simulated annealing and genetic algorithm (HSAGA) approach to global optimization. HSAGA provides an insightful way to solve complex optimization problems: it combines the metaheuristics of simulated annealing and genetic algorithms to optimize a nonlinear objective function with uncertain technical coefficients in industrial production management problems. The proposed hybrid method searches for the global optimum of the nonlinear objective function and for the best feasible values of the decision variables. Rigorous simulated experiments were carried out to demonstrate the advantages of the method. A description of the method and of the computational experiments, performed with the Matlab® technical tool, is presented. An industrial production management optimization problem is solved using the HSAGA technique, and the results are very promising.
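One common way to hybridize the two metaheuristics named above is to run a genetic algorithm whose offspring pass through a simulated-annealing acceptance test. The sketch below is a minimal illustration of that idea under assumed settings (Rastrigin as a stand-in nonlinear objective, arithmetic crossover, Gaussian mutation); it is not the paper's HSAGA formulation, whose objective and coefficients are not reproduced here.

```python
import math
import random

def objective(x):
    # Illustrative nonlinear multimodal objective (Rastrigin); the paper's
    # industrial production-management objective is not reproduced here.
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def hsaga(dim=2, pop_size=20, generations=100, t0=10.0, cooling=0.95, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(-5.12, 5.12) for _ in range(dim)] for _ in range(pop_size)]
    temp = t0
    for _ in range(generations):
        pop.sort(key=objective)
        elite = pop[:pop_size // 2]            # elitism: best half survives
        new_pop = elite[:]
        while len(new_pop) < pop_size:
            a, b = rng.sample(elite, 2)
            child = [(ai + bi) / 2 for ai, bi in zip(a, b)]    # arithmetic crossover
            mutant = [ci + rng.gauss(0, 0.3) for ci in child]  # Gaussian mutation
            # Simulated-annealing acceptance: a worse mutant still survives with
            # probability exp(-delta / temp), which shrinks as the temperature cools.
            delta = objective(mutant) - objective(child)
            if delta < 0 or rng.random() < math.exp(-delta / temp):
                new_pop.append(mutant)
            else:
                new_pop.append(child)
        pop = new_pop
        temp *= cooling
    return min(pop, key=objective)
```

Early on, the high temperature lets the population accept worse moves and escape local basins; as it cools, the acceptance rule degenerates into plain hill-climbing around the best individuals.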


2013 ◽  
Vol 32 (4) ◽  
pp. 981-985
Author(s):  
Ya-fei Huang ◽  
Xi-ming Liang ◽  
Yi-xiong Chen

1995 ◽  
Vol 29 (4) ◽  
pp. 39-56 ◽  
Author(s):  
S. Hurley ◽  
L. Moutinho ◽  
N.M. Stephens

2020 ◽  
Author(s):  
Alberto Bemporad ◽  
Dario Piga

This paper proposes a method for solving optimization problems in which the decision-maker cannot evaluate the objective function, but can only express a preference such as "this is better than that" between two candidate decision vectors. The algorithm aims at reaching the global optimizer by iteratively proposing a new comparison to the decision maker, based on actively learning a surrogate of the latent (unknown and perhaps unquantifiable) objective function from past sampled decision vectors and pairwise preferences. A radial basis function surrogate is fit via linear or quadratic programming, satisfying, if possible, the preferences expressed by the decision maker on existing samples. The surrogate is used to propose a new sample of the decision vector for comparison with the current best candidate, based on two possible criteria: minimize a combination of the surrogate and an inverse-distance weighting function, to balance exploitation of the surrogate against exploration of the decision space, or maximize a function related to the probability that the new candidate will be preferred. Compared to active preference learning based on Bayesian optimization, the approach is competitive: within the same number of comparisons, it usually approaches the global optimum more closely and is computationally lighter. Applications of the proposed algorithm to a set of benchmark global optimization problems, to multi-objective optimization, and to optimal tuning of a cost-sensitive neural network classifier for object recognition from images are described in the paper. MATLAB and Python implementations of the algorithms are available at http://cse.lab.imtlucca.it/~bemporad/glis.
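The preference-driven loop described in the abstract can be sketched in miniature. Everything here is a simplified stand-in: a synthetic decision maker ranks by a hidden one-dimensional function, the RBF surrogate is fit by a margin-based iteration instead of the paper's linear/quadratic program, and the acquisition combines the surrogate with an inverse-distance exploration bonus. Names and parameter values are hypothetical.

```python
import math
import random

def latent_objective(x):
    # Hidden function the synthetic decision maker implicitly ranks by (illustrative).
    return (x - 0.3) ** 2

def preference(x1, x2):
    # Synthetic decision maker: -1 if x1 is preferred, +1 if x2 is preferred.
    return -1 if latent_objective(x1) < latent_objective(x2) else 1

def rbf(x, c, eps=2.0):
    # Gaussian radial basis function centered at a past sample c.
    return math.exp(-(eps * (x - c)) ** 2)

def fit_surrogate(samples, prefs, iters=300, lr=0.05, margin=0.1):
    # Stand-in for the paper's LP/QP fit: nudge RBF weights until the surrogate
    # ranks each stored (winner, loser) index pair correctly by at least `margin`.
    w = [0.0] * len(samples)
    def s(x):
        return sum(wi * rbf(x, ci) for wi, ci in zip(w, samples))
    for _ in range(iters):
        for win, lose in prefs:
            if s(samples[lose]) - s(samples[win]) < margin:
                for k, ck in enumerate(samples):
                    w[k] += lr * (rbf(samples[lose], ck) - rbf(samples[win], ck))
    return s

def propose(surrogate, samples, rng, n_cand=200, beta=1.0):
    # Acquisition: surrogate value minus an inverse-distance-weighting exploration
    # bonus that grows far from existing samples (exploitation vs. exploration).
    best_x, best_a = None, float("inf")
    for _ in range(n_cand):
        x = rng.uniform(0.0, 1.0)
        idw = 1.0 / sum(1.0 / ((x - si) ** 2 + 1e-9) for si in samples)
        a = surrogate(x) - beta * math.sqrt(idw)
        if a < best_a:
            best_x, best_a = x, a
    return best_x

rng = random.Random(0)
samples = [0.0, 1.0]
best = 0 if preference(samples[0], samples[1]) == -1 else 1
prefs = [(best, 1 - best)]
for _ in range(12):
    s = fit_surrogate(samples, prefs)
    x_new = propose(s, samples, rng)
    samples.append(x_new)
    new_idx = len(samples) - 1
    if preference(x_new, samples[best]) == -1:
        prefs.append((new_idx, best))   # new sample beats the incumbent
        best = new_idx
    else:
        prefs.append((best, new_idx))
```

Note that the loop only ever queries pairwise comparisons, never function values, yet the incumbent `samples[best]` can only improve, mirroring the active preference learning scheme the abstract describes.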

