Multi-Agent Normal Sampling Technique (MANST) for Global Optimization

Author(s):
Alireza Saremi
Nasr Al-Hinai
G. Gary Wang
Tarek ElMekkawy

The current work discusses a novel global optimization method called the Multi-Agent Normal Sampling Technique (MANST). MANST is based on systematic sampling of points around agents; each agent in MANST represents a candidate solution of the problem. All agents compete with each other for a larger share of the available resources. The performance of all agents is periodically evaluated, a specific number of agents that show no promising achievement are deleted, and new agents are generated in the proximity of the promising ones. This process continues until the agents converge to the global optimum. MANST is a standalone global optimization technique. It is benchmarked with six well-known test cases, and the results are compared with those obtained from the Matlab™ 7.1 GA Toolbox. The test results show that MANST outperforms the Matlab™ 7.1 GA Toolbox on the benchmark problems in terms of accuracy, number of function evaluations, and CPU time.

Author(s):
Alireza Saremi
Amir H. Birjandi
G. Gary Wang
Tarek ElMekkawy
Eric Bibeau

This paper describes an enhanced version of a new global optimization method, the Multi-Agent Normal Sampling Technique (MANST), described in reference [1]. Each agent in MANST comprises a number of points sampled around a mean point with a certain standard deviation. At each step, the point with the minimum function value within an agent is chosen as the center for that agent's next round of normal sampling. The chosen points of all agents are then compared, and each agent receives a share of the resources for the next step according to its lowest function value at the current step. The performance of all agents is periodically evaluated; a specific number of agents that show no promising achievement are deleted, and new agents are generated in the proximity of the promising ones. This process continues until the agents converge to the global optimum. MANST is a standalone global optimization technique and does not require equations or prior knowledge of the objective function. The unique feature of this method in comparison with other global optimization methods is its dynamic normal-distribution search. This work presents our recent research in enhancing MANST to handle variable boundaries and constraints. Moreover, a lean group sampling approach is implemented to prevent different agents from sampling in the same region. As a result, the overall capability and efficiency of MANST have been improved in the newer version. The enhanced MANST is highly competitive with other stochastic methods such as the Genetic Algorithm (GA); in most of the test cases, its performance is significantly better than that of the Matlab™ GA Toolbox.
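The sample-compare-cull loop described in this abstract lends itself to a compact illustration. The following is a minimal, hedged Python sketch, not the authors' implementation: all parameter values (number of agents, samples per agent, the contraction factor, the cull schedule) are illustrative assumptions, and the resource-sharing step is simplified to an equal sampling budget per agent.

```python
import numpy as np

def manst_sketch(f, bounds, n_agents=5, samples_per_agent=8,
                 sigma_scale=0.2, cull_every=10, n_iter=100, seed=0):
    """Minimal MANST-style loop: normal sampling around each agent's best
    point, periodic culling of weak agents, respawning near the leader.
    All parameter values here are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    lo = np.asarray(bounds[0], dtype=float)
    hi = np.asarray(bounds[1], dtype=float)
    means = rng.uniform(lo, hi, size=(n_agents, lo.size))
    sigma = sigma_scale * (hi - lo)
    best_x, best_f = None, np.inf
    for it in range(n_iter):
        scores = np.empty(n_agents)
        for i in range(n_agents):
            # each agent samples normally around its current center point
            pts = np.clip(rng.normal(means[i], sigma,
                                     size=(samples_per_agent, lo.size)), lo, hi)
            vals = np.array([f(p) for p in pts])
            j = vals.argmin()
            # the minimum point becomes the center for the next sampling step
            means[i], scores[i] = pts[j], vals[j]
            if vals[j] < best_f:
                best_x, best_f = pts[j].copy(), float(vals[j])
        # periodically delete the weakest agents; respawn them near the leader
        if (it + 1) % cull_every == 0:
            order = scores.argsort()
            for k in order[-(n_agents // 2):]:
                means[k] = rng.normal(means[order[0]], sigma)
        sigma = sigma * 0.95  # gradually contract the sampling spread
    return best_x, best_f
```

On a simple 2-D sphere function this sketch typically converges toward the origin; the actual MANST additionally redistributes the sampling budget among agents according to their current best values.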


Author(s):
J. Gu
G. Y. Li
Z. Dong

Metamodeling techniques are increasingly used to solve computation-intensive design optimization problems. In this work, the issue of automatically identifying appropriate metamodeling techniques in global optimization is addressed. A new, generic hybrid metamodel-based global optimization method, particularly suitable for design problems involving computation-intensive, black-box analyses and simulations, is introduced. The method employs three representative metamodels concurrently in the search process and selects sample data points adaptively, according to the values calculated from the three metamodels, to improve modeling accuracy. The global optimum is identified when the metamodels become reasonably accurate. The new method is tested on various benchmark global optimization problems and applied to a real industrial design optimization problem involving vehicle crash simulation, demonstrating the superior performance of the new algorithm over existing search methods. Present limitations of the proposed method are also discussed.


Author(s):
Liqun Wang
Songqing Shan
G. Gary Wang

The presence of black-box functions in engineering design, which are usually computation-intensive, demands efficient global optimization methods. This work proposes a new global optimization method for black-box functions. It is based on a novel mode-pursuing sampling (MPS) method, which systematically generates more sample points in the neighborhood of the function mode while statistically covering the entire search space. Quadratic regression is performed to detect the region containing the global optimum. The sampling and detection process iterates until the global optimum is obtained. Through intensive testing, this method is found to be effective, efficient, robust, and applicable to both continuous and discontinuous functions. It supports simultaneous computation and applies to both unconstrained and constrained optimization problems. Because it does not call any existing global optimization tool, it can also be used as a standalone global optimization method for inexpensive problems. Limitations of the method are also identified and discussed.
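The mode-pursuing idea (densifying samples near the function mode while still covering the whole space) can be illustrated with a minimal sketch. Two hedges apply: the actual MPS evaluates an inexpensive guidance function rather than the expensive objective on every candidate, and it adds the quadratic-regression detection step described above; both are simplified away here, and all parameter values are illustrative assumptions.

```python
import numpy as np

def mps_sketch(f, bounds, n_iter=30, pool=200, keep=10, seed=0):
    """Sketch of mode-pursuing sampling: candidates drawn over the whole
    domain are kept with probability proportional to how close their value
    is to the pool's best, so kept points cluster near the function mode
    while every region retains a nonzero chance of being sampled."""
    rng = np.random.default_rng(seed)
    lo = np.asarray(bounds[0], dtype=float)
    hi = np.asarray(bounds[1], dtype=float)
    X = rng.uniform(lo, hi, size=(keep, lo.size))
    y = np.array([f(x) for x in X])
    for _ in range(n_iter):
        cand = rng.uniform(lo, hi, size=(pool, lo.size))
        fc = np.array([f(c) for c in cand])
        g = fc.max() - fc                  # large near the mode (minimum)
        p = g / g.sum() if g.sum() > 0 else np.full(pool, 1.0 / pool)
        idx = rng.choice(pool, size=keep, replace=False, p=p)
        X = np.vstack([X, cand[idx]])      # sample set densifies near the mode
        y = np.concatenate([y, fc[idx]])
    i = int(y.argmin())
    return X[i], float(y[i])
```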


Author(s):
Martin Macaš
Lenka Lhotská

A novel binary optimization technique called the Social Impact Theory based Optimizer (SITO) is introduced, which is based on a social-psychology model of social interactions. The algorithm operates on a society of individuals. Each individual holds a set of attitudes, which encodes a candidate solution of a binary optimization problem. Individuals change their attitudes according to their spatial neighbors and their fitness, which leads to convergence to a local (or global) optimum. This chapter also demonstrates different aspects of SITO's behavior and gives some suggestions for potential users. Further, a comparison with similar techniques, the genetic algorithm and the binary particle swarm optimizer, is discussed, and some possibilities of formal analysis are briefly presented.


1993
Vol 115 (4)
pp. 770-775
Author(s):
P. Jain
A. M. Agogino

Multistart is a stochastic global optimization method for finding the global optimum of highly nonlinear mechanical problems. In this paper we introduce and develop a variant of the multistart method in which the fraction of sample points in the feasible region with the smallest function values are clustered using the vector quantization technique. The theories of lattices and sphere packing are used to define lattices that are optimal with respect to quantization error; these lattices serve as code points for vector quantization. The implementation of these ideas has resulted in the VQ-multistart algorithm for finding the global optimum with substantial reductions in both in-core memory requirements and computation time. We solve several mathematical test problems and a mechanical optimal design problem using the VQ-multistart algorithm.
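The sample-cluster-refine pipeline of VQ-multistart can be sketched as follows. This is a hedged illustration, not the paper's algorithm: plain k-means stands in for the optimal-lattice vector quantizer, a simple accept-if-better random search stands in for the local optimizer, and all parameter values are assumptions.

```python
import numpy as np

def vq_multistart_sketch(f, bounds, n_sample=400, frac=0.1, k=4, seed=0):
    """Sketch of VQ-multistart: sample the feasible region, keep the fraction
    of points with the smallest function values, quantize them to k code
    points (plain k-means stands in for the paper's optimal-lattice vector
    quantizer), then run a local search from each code point."""
    rng = np.random.default_rng(seed)
    lo = np.asarray(bounds[0], dtype=float)
    hi = np.asarray(bounds[1], dtype=float)
    X = rng.uniform(lo, hi, size=(n_sample, lo.size))
    y = np.array([f(x) for x in X])
    best = X[np.argsort(y)[: int(frac * n_sample)]]  # lowest-value fraction
    centers = best[rng.choice(len(best), size=k, replace=False)].copy()
    for _ in range(20):                               # k-means quantization
        d = np.linalg.norm(best[:, None, :] - centers[None, :, :], axis=2)
        lbl = d.argmin(axis=1)
        for j in range(k):
            if np.any(lbl == j):
                centers[j] = best[lbl == j].mean(axis=0)
    # local refinement from each code point: accept-if-better random steps
    best_x, best_f = None, np.inf
    for c in centers:
        x, fx, step = c.copy(), f(c), 0.5
        for _ in range(200):
            cand = np.clip(x + rng.normal(0.0, step, size=lo.size), lo, hi)
            fc = f(cand)
            if fc < fx:
                x, fx = cand, fc
            else:
                step *= 0.98              # shrink the step on failure
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, float(best_f)
```

Quantizing the promising points before starting local searches is what cuts the memory and time cost: only k local searches run instead of one per promising sample.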


Author(s):
Mingjun Ji
Jacek Klinowski

We introduce taboo evolutionary programming, a very efficient global optimization method that combines features of single-point mutation evolutionary programming (SPMEP) and taboo search. As demonstrated by solving 18 benchmark problems, the algorithm is not trapped in local minima and quickly approaches the global minimum. The results are superior to those from SPMEP, fast evolutionary programming, and generalized evolutionary programming. The method is easily applicable to real-world problems, and its central idea may be introduced into other algorithms.
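The combination of the two ingredients named above can be sketched compactly. This is a hedged illustration under assumptions, not the authors' algorithm: the mutation scale, taboo radius, and list length are invented for the example, and the taboo rule here simply rejects offspring that land near recently abandoned solutions.

```python
import numpy as np

def taboo_ep_sketch(f, bounds, pop=10, n_iter=300, tabu_radius=0.1,
                    tabu_len=50, seed=0):
    """Sketch combining single-point-mutation EP with a taboo list: each
    individual mutates one randomly chosen coordinate, and offspring that
    land too close to recently abandoned solutions are rejected, discouraging
    revisits of explored regions. Radii and list length are illustrative."""
    rng = np.random.default_rng(seed)
    lo = np.asarray(bounds[0], dtype=float)
    hi = np.asarray(bounds[1], dtype=float)
    d = lo.size
    X = rng.uniform(lo, hi, size=(pop, d))
    y = np.array([f(x) for x in X])
    tabu = []
    for _ in range(n_iter):
        for i in range(pop):
            child = X[i].copy()
            j = int(rng.integers(d))               # single-point mutation
            child[j] += rng.normal(0.0, 0.3 * (hi[j] - lo[j]))
            child = np.clip(child, lo, hi)
            if any(np.linalg.norm(child - t) < tabu_radius for t in tabu):
                continue                           # taboo region: reject move
            fc = f(child)
            if fc < y[i]:                          # EP-style selection
                tabu.append(X[i].copy())           # old position becomes taboo
                tabu = tabu[-tabu_len:]
                X[i], y[i] = child, fc
    i = int(y.argmin())
    return X[i], float(y[i])
```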


2017
Vol 145 (4)
pp. 1275-1294
Author(s):
Pascal Horton
Michel Jaboyedoff
Charles Obled

Analog methods are based on a statistical relationship between synoptic meteorological variables (predictors) and local weather (predictand, to be predicted). This relationship is defined by several parameters, which are often calibrated by means of a semiautomatic sequential procedure. This calibration approach is fast, but has strong limitations. It proceeds through successive steps, and thus cannot handle all parameter dependencies. Furthermore, it cannot automatically optimize some parameters, such as the selection of pressure levels and temporal windows (hours of the day) at which the predictors are compared. To overcome these limitations, the global optimization technique of genetic algorithms is considered, which can jointly optimize all parameters of the method and get closer to a global optimum by taking into account the dependencies of the parameters. Moreover, it can objectively calibrate parameters that were previously assessed manually and can take into account new degrees of freedom. However, genetic algorithms must be tailored to the problem under consideration. Multiple combinations of algorithms were assessed, and new algorithms were developed (e.g., the chromosome of adaptive search radius, which is found to be very robust), in order to provide recommendations regarding the use of genetic algorithms for optimizing several variants of analog methods. A global optimization approach provides new perspectives for the improvement of analog methods, and for their application to new regions or new predictands.


2016
Vol 25 (02)
pp. 1550030
Author(s):
Gai-Ge Wang
Amir H. Gandomi
Amir H. Alavi
Suash Deb

A multi-stage krill herd (MSKH) algorithm is presented to fully exploit the global and local search abilities of the standard krill herd (KH) optimization method. The proposed method involves exploration and exploitation stages. The exploration stage uses the basic KH algorithm to select a good candidate solution set. This phase is followed by fine-tuning the good candidate solutions in the exploitation stage with a focused local mutation and crossover (LMC) operator, in order to enhance the reliability of the method for solving global numerical optimization problems. Moreover, an elitism scheme is introduced into the MSKH method to preserve the best solution found. The performance of MSKH is verified on twenty-five standard, rotated, and shifted benchmark problems. The results show the superiority of the proposed algorithm over the standard KH and other well-known optimization methods.
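The two-stage structure (exploration, then LMC-style exploitation with elitism) can be sketched generically. To be clear about the assumptions: a plain accept-if-better population walk stands in here for the krill herd movement rules, the blend-and-perturb operator stands in for the LMC operator, and all parameter values are invented for the example.

```python
import numpy as np

def mskh_like_sketch(f, bounds, pop=20, explore_iters=40,
                     exploit_iters=60, seed=0):
    """Two-stage sketch in the spirit of MSKH: an exploration stage selects
    good candidates, then an exploitation stage refines them with a focused
    local mutation-and-crossover operator; elitism ensures the best solution
    found is never lost."""
    rng = np.random.default_rng(seed)
    lo = np.asarray(bounds[0], dtype=float)
    hi = np.asarray(bounds[1], dtype=float)
    X = rng.uniform(lo, hi, size=(pop, lo.size))
    y = np.array([f(x) for x in X])
    # Stage 1: exploration over the whole domain (accept-if-better walk)
    for _ in range(explore_iters):
        cand = np.clip(X + rng.normal(0.0, 0.5, size=X.shape), lo, hi)
        fc = np.array([f(c) for c in cand])
        better = fc < y
        X[better], y[better] = cand[better], fc[better]
    # Stage 2: exploitation with local mutation and crossover, plus elitism
    elite_i = int(y.argmin())
    elite_x, elite_y = X[elite_i].copy(), float(y[elite_i])
    top = X[np.argsort(y)[:4]].copy()
    for _ in range(exploit_iters):
        a, b = top[rng.choice(4, size=2, replace=False)]
        w = rng.uniform()
        child = np.clip(w * a + (1.0 - w) * b
                        + rng.normal(0.0, 0.05, size=lo.size), lo, hi)
        fc = f(child)
        if fc < elite_y:                  # elitism: keep the best ever found
            elite_x, elite_y = child.copy(), float(fc)
            top[int(rng.integers(4))] = child
    return elite_x, elite_y
```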


Author(s):
Moslem Kazemi
G. Gary Wang
Shahryar Rahnamayan
Kamal Gupta

Many engineering design problems deal with global optimization of constrained black-box problems, which is usually computation-intensive. Ref. [1] proposed a mode-pursuing sampling (MPS) method for global optimization based on a sampling technique that systematically generates more sample points in the neighborhood of the function mode while statistically covering the entire problem domain. In this paper, we propose a novel and more efficient sampling technique that greatly enhances the performance of the MPS method, especially in the presence of expensive constraints. Effective sampling of the search space is attained by biasing the sample points toward feasible regions and away from forbidden regions. This is achieved by utilizing incrementally obtained information about the constraints; hence, the method is called Constraint-importance Mode Pursuing Sampling (CiMPS). According to intensive comparisons and experimental verification, the new sampling technique is found to be more efficient in solving constrained optimization problems than the original MPS method. To the best of our knowledge, this is the first metamodel-based global optimization method that directly aims at reducing the number of function evaluations for both expensive objective functions and constraints.
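The constraint-biasing idea can be shown in a minimal sketch. Important hedge: the sketch evaluates the constraints directly on every candidate, whereas the actual CiMPS reuses incrementally gathered constraint information precisely to avoid such expensive evaluations; the weighting scheme and all parameter values are assumptions for illustration.

```python
import numpy as np

def cimps_sketch(f, constraints, bounds, n_iter=25, pool=300, keep=8, seed=0):
    """Sketch of constraint-importance sampling: candidates are weighted both
    by low objective value and by low constraint violation, pulling samples
    toward feasible, promising regions and away from forbidden ones.
    Constraints follow the convention g(x) <= 0 for feasibility."""
    rng = np.random.default_rng(seed)
    lo = np.asarray(bounds[0], dtype=float)
    hi = np.asarray(bounds[1], dtype=float)
    best_x, best_f = None, np.inf
    for _ in range(n_iter):
        cand = rng.uniform(lo, hi, size=(pool, lo.size))
        fc = np.array([f(c) for c in cand])
        viol = np.array([sum(max(float(g(c)), 0.0) for g in constraints)
                         for c in cand])
        # favor low objective values and penalize constraint violation
        w = (fc.max() - fc) * np.exp(-5.0 * viol)
        if w.sum() <= 0:
            continue
        idx = rng.choice(pool, size=keep, replace=False, p=w / w.sum())
        for c, fv, v in zip(cand[idx], fc[idx], viol[idx]):
            if v == 0.0 and fv < best_f:  # track the best feasible sample
                best_x, best_f = c.copy(), float(fv)
    return best_x, best_f
```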


2014
Vol 5 (3)
pp. 14-41
Author(s):
Marwa Elhajj
Rafic Younes
Sebastien Charles
Eric Padiolleau

Model calibration is one of the most important steps in the development of models of engineering systems. A new approach is presented in this study to calibrate a complex multi-domain system. This approach respects the real characteristics of the circuit, preserves the accuracy of the results, and minimizes the cost of the experimental phase. This paper proposes a complete method, the Global Optimization Method for Parameter Calibration (GOMPC). This method uses an optimization technique coupled with the simulated model in simulation software. In this paper, two optimization techniques, the Genetic Algorithm (GA) and the two-level Genetic Algorithm, are applied and then compared on two case studies: a theoretical and a real hydro-electromechanical circuit. In order to optimize the number of measured outputs, a sensitivity analysis is used to identify the objective function (OBJ) of the two studied optimization techniques. Finally, the results show that applying GOMPC by combining the two-level GA with the simulated model is an efficient solution, as it proves accurate and efficient with less computation time. It is believed that this approach is able to converge to the expected results and to find the system's unknown parameters faster and more accurately than GA.

