Improved Modified Symbiosis Organisms Search (IMSOS): A New and Adaptive Approach for Determining Model Parameters from Geoelectrical Data

2021, Vol. 53 (5), pp. 210505
Author(s): Sungkono Sungkono, Hendra Grandis

Symbiotic Organisms Search (SOS) is a global optimization algorithm inspired by the natural synergy between organisms in an ecosystem. The interactive behavior among organisms simulated in SOS consists of mutualism, commensalism, and parasitism strategies for finding the global optimum solution in the search space. The SOS algorithm does not require a tuning parameter, which is usually needed to balance explorative and exploitative search, while providing posterior sampling of the model parameters. This paper proposes an improvement of the Modified SOS (MSOS) algorithm, called IMSOS, that enhances the exploitation as well as the exploration strategy via a modified parasitism vector. This improves the search efficiency in finding the global minimum of two multimodal test functions. Furthermore, the algorithm is proposed for solving inversion problems in geophysics. The performance of IMSOS was tested on the inversion of synthetic and field data sets from self-potential (SP) and vertical electrical sounding (VES) measurements. The IMSOS results were comparable to those of other global optimization algorithms, including Particle Swarm Optimization, Differential Evolution, and the Black Hole Algorithm. IMSOS accurately determined the model parameters and their uncertainties, and it can be adapted to solve the inversion of other geophysical data as well.
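The mutualism, commensalism, and parasitism phases that SOS simulates can be illustrated with a short sketch. The Python below is a minimal, generic SOS loop on an assumed sphere test function; it is not the authors' IMSOS code, and the benefit factors, bounds, and population size are illustrative assumptions.

```python
import numpy as np

def sphere(x):
    """Assumed test objective for the sketch; any black-box function works."""
    return float(np.sum(x ** 2))

def sos(obj, dim=5, pop_size=20, iters=200, lo=-10.0, hi=10.0, seed=0):
    rng = np.random.default_rng(seed)
    eco = rng.uniform(lo, hi, (pop_size, dim))          # ecosystem of organisms
    fit = np.array([obj(x) for x in eco])

    def try_replace(i, cand):
        """Greedy selection: keep the candidate only if it is fitter."""
        cand = np.clip(cand, lo, hi)
        f = obj(cand)
        if f < fit[i]:
            eco[i], fit[i] = cand, f

    for _ in range(iters):
        best = eco[np.argmin(fit)].copy()
        for i in range(pop_size):
            j = rng.choice([k for k in range(pop_size) if k != i])
            # Mutualism: organisms i and j both benefit from moving toward best.
            mutual = (eco[i] + eco[j]) / 2.0
            bf1, bf2 = rng.integers(1, 3, size=2)        # benefit factors in {1, 2}
            try_replace(i, eco[i] + rng.random(dim) * (best - mutual * bf1))
            try_replace(j, eco[j] + rng.random(dim) * (best - mutual * bf2))
            # Commensalism: i benefits from j, j is unaffected.
            try_replace(i, eco[i] + rng.uniform(-1, 1, dim) * (best - eco[j]))
            # Parasitism: a randomly mutated copy of i tries to displace j.
            parasite = eco[i].copy()
            mask = rng.random(dim) < 0.5
            parasite[mask] = rng.uniform(lo, hi, mask.sum())
            try_replace(j, parasite)

    k = np.argmin(fit)
    return eco[k], fit[k]

best_x, best_f = sos(sphere)
print(best_x, best_f)
```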

2019, Vol. 77, pp. 567-583
Author(s): Khoa H. Truong, Perumal Nallagownden, Zuhairi Baharudin, Dieu N. Vo

2011, Vol. 08 (03), pp. 535-544
Author(s): Djalil Boudjehem, Badreddine Boudjehem, Abdenour Boukaache

In this paper, we propose an approach that makes global optimization an easier and lower-cost task. The main idea is to reduce the dimension of the optimization problem at hand to a mono-dimensional one using variable coding. At this level, the algorithm looks for the global optimum of a mono-dimensional cost function. The new algorithm is able to avoid local optima, reduces the number of function evaluations, and improves the convergence speed. This method is suitable for functions that have many extrema. Our algorithm can determine a narrow space around the global optimum within a very restricted time, based on stochastic tests and an adaptive partition of the search space. Illustrative examples are presented to show the efficiency of the proposed idea. The algorithm was able to locate the global optimum even when the objective function had a large number of optima.
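The adaptive-partition idea can be sketched as follows. This toy Python reproduces only the "stochastic tests plus shrinking partition" step on an assumed one-dimensional multimodal function; it does not reproduce the authors' variable-coding scheme or their safeguards against discarding the region of the global optimum.

```python
import numpy as np

def multimodal_1d(x):
    """Assumed 1-D test function with many local optima (Rastrigin-like)."""
    return 10.0 + x ** 2 - 10.0 * np.cos(2.0 * np.pi * x)

def adaptive_partition_search(obj, lo=-5.12, hi=5.12, n_tests=50,
                              shrink=0.5, iters=30, seed=1):
    """Repeatedly run stochastic tests in the current interval, then keep a
    narrower partition centered on the best sample found so far."""
    rng = np.random.default_rng(seed)
    best_x, best_f = None, np.inf
    for _ in range(iters):
        xs = rng.uniform(lo, hi, n_tests)        # stochastic tests
        fs = obj(xs)
        k = np.argmin(fs)
        if fs[k] < best_f:
            best_x, best_f = xs[k], fs[k]
        half = (hi - lo) * shrink / 2.0          # adaptively shrunken partition
        lo, hi = best_x - half, best_x + half
    return best_x, best_f

x_star, f_star = adaptive_partition_search(multimodal_1d)
print(x_star, f_star)
```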


Author(s): Liqun Wang, Songqing Shan, G. Gary Wang

The presence of black-box functions in engineering design, which are usually computation-intensive, demands efficient global optimization methods. This work proposes a new global optimization method for black-box functions based on a novel mode-pursuing sampling (MPS) method, which systematically generates more sample points in the neighborhood of the function mode while statistically covering the entire search space. Quadratic regression is performed to detect the region containing the global optimum. The sampling and detection process iterates until the global optimum is obtained. Through intensive testing, this method is found to be effective, efficient, robust, and applicable to both continuous and discontinuous functions. It supports simultaneous computation and applies to both unconstrained and constrained optimization problems. Because it does not call any existing global optimization tool, it can also be used as a standalone global optimization method for inexpensive problems. Limitations of the method are also identified and discussed.
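A highly simplified sketch of the sampling-plus-quadratic-detection loop is given below. The cheap test function, the rank-proportional selection rule, and the separable per-dimension quadratic fit are assumptions chosen for brevity and do not reproduce the authors' MPS sampling scheme exactly.

```python
import numpy as np

def cheap_objective(x):
    """Assumed 2-D black-box stand-in; the real use case is computation-intensive."""
    return (x[..., 0] - 1.0) ** 2 + 2.0 * (x[..., 1] + 0.5) ** 2 + np.sin(3.0 * x[..., 0])

def mps_like(obj, dim=2, lo=-5.0, hi=5.0, iters=20, n_cand=500, n_keep=10, seed=2):
    rng = np.random.default_rng(seed)
    X = rng.uniform(lo, hi, (n_keep, dim))
    y = obj(X)
    for _ in range(iters):
        # Candidates cover the whole box; keep more of them where f is low.
        cand = rng.uniform(lo, hi, (n_cand, dim))
        f = obj(cand)
        w = (f.max() - f) + 1e-12                     # lower f -> higher weight
        idx = rng.choice(n_cand, size=n_keep, replace=False, p=w / w.sum())
        X, y = np.vstack([X, cand[idx]]), np.concatenate([y, f[idx]])
        # Quadratic regression on the best points to detect the optimum region.
        top = np.argsort(y)[: 3 * (dim + 1)]
        A = np.hstack([X[top] ** 2, X[top], np.ones((len(top), 1))])
        coef, *_ = np.linalg.lstsq(A, y[top], rcond=None)
        a, b = coef[:dim], coef[dim:2 * dim]
        safe_a = np.where(np.abs(a) > 1e-12, a, 1.0)
        x_hat = np.where(a > 0, -b / (2.0 * safe_a), X[np.argmin(y)])
        x_hat = np.clip(x_hat, lo, hi)
        X, y = np.vstack([X, x_hat]), np.append(y, obj(x_hat))
    k = np.argmin(y)
    return X[k], y[k]

x_best, f_best = mps_like(cheap_objective)
print(x_best, f_best)
```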


2021, Vol. 2021, pp. 1-16
Author(s): Xiang Yu, Yu Qiao

Comprehensive learning particle swarm optimization (CLPSO) and enhanced CLPSO (ECLPSO) are two metaheuristics from the literature for global optimization. ECLPSO significantly improves the exploitation and convergence performance of CLPSO through perturbation-based exploitation and adaptive learning probabilities. However, ECLPSO still cannot locate the global optimum or find a near-optimum solution for a number of problems. In this paper, we study how to further improve the exploration performance of ECLPSO. We propose to assign an independent inertia weight and an independent acceleration coefficient to each dimension of the search space, as well as an independent learning probability for each particle on each dimension. As in ECLPSO, a normative interval bounded by the minimum and maximum personal best positions is determined with respect to each dimension in each generation. The dimension-wise maximum velocities, inertia weights, acceleration coefficients, and learning probabilities are adaptively updated based on the dimensional normative intervals in order to facilitate exploration, exploitation, and convergence, particularly exploration. Our proposed metaheuristic, called adaptive CLPSO (ACLPSO), is evaluated on various benchmark functions. Experimental results demonstrate that the dimension-wise independent and adaptive maximum velocities, inertia weights, acceleration coefficients, and learning probabilities significantly improve ECLPSO's exploration performance, and that ACLPSO is able to find the global optimum or a near-optimum solution on all the benchmark functions in all runs with appropriately set parameters.
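To make the dimension-wise parameterization concrete, here is a minimal Python sketch of a comprehensive-learning style velocity update in which the inertia weight, acceleration coefficient, and maximum velocity are indexed per dimension, and the learning probability per particle and dimension. The adaptation of these arrays from the normative intervals is not reproduced; all values shown are placeholder assumptions, not the authors' ACLPSO settings.

```python
import numpy as np

def acl_step(pos, vel, pbest, w, c, P, vmax, rng):
    """One comprehensive-learning style update with per-dimension inertia
    weights w[d], acceleration coefficients c[d], velocity limits vmax[d],
    and per-particle, per-dimension learning probabilities P[i, d]."""
    n, dim = pos.shape
    # With probability P[i, d], dimension d of particle i learns from another
    # particle's personal best; otherwise it learns from its own.
    follow_other = rng.random((n, dim)) < P
    donors = rng.integers(0, n, size=(n, dim))
    exemplar = np.where(follow_other, pbest[donors, np.arange(dim)], pbest)
    vel = w * vel + c * rng.random((n, dim)) * (exemplar - pos)
    vel = np.clip(vel, -vmax, vmax)                 # dimension-wise velocity limit
    return pos + vel, vel

# Minimal usage with placeholder sizes and parameter values.
rng = np.random.default_rng(3)
n, dim = 30, 10
pos = rng.uniform(-5.0, 5.0, (n, dim))
vel = np.zeros((n, dim))
pbest = pos.copy()
w = np.full(dim, 0.7)            # would be adapted from the normative intervals
c = np.full(dim, 1.5)
vmax = np.full(dim, 1.0)
P = np.full((n, dim), 0.3)
pos, vel = acl_step(pos, vel, pbest, w, c, P, vmax, rng)
print(pos.shape, vel.shape)
```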


1997, Vol. 36 (5), pp. 53-60
Author(s): V. A. Cooper, V. T. V. Nguyen, J. A. Nicell

The calibration of conceptual rainfall-runoff (CRR) models is an optimization problem whose objective is to determine the values of the model parameters that provide the best fit between observed and estimated flows. This study investigated the performance of three probabilistic optimization techniques for calibrating the Tank model, a hydrologic model typical of CRR models: the Shuffled Complex Evolution (SCE), genetic algorithm (GA), and simulated annealing (SA) methods. It was found that performance depended on the choice of objective function and also on the position of the start of the optimization search relative to the global optimum. Of the three global optimization methods (GOMs) in the study, the SCE method provided better estimates of the optimal solution than the GA and SA methods. Regarding the efficiency of the GOMs, as expressed by the number of iterations required for convergence, the ranking in order of decreasing performance was SCE, GA, and SA.
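As a concrete illustration of this kind of calibration problem, the sketch below fits two parameters of a deliberately simplified single-storage model by minimizing the RMSE between observed and simulated flows. The Tank model in the study has more storages and parameters, and SciPy's differential evolution stands in here for the SCE, GA, and SA optimizers compared in the paper; the synthetic data, model form, and bounds are assumptions.

```python
import numpy as np
from scipy.optimize import differential_evolution

def toy_tank(params, rain):
    """Deliberately simplified single-storage runoff model (illustrative only):
    k is the outlet coefficient, f is a constant infiltration loss."""
    k, f = params
    storage, flow = 0.0, []
    for r in rain:
        storage += max(r - f, 0.0)
        q = k * storage
        storage -= q
        flow.append(q)
    return np.array(flow)

def rmse(params, rain, observed):
    """Objective: root-mean-square error between observed and simulated flows."""
    return float(np.sqrt(np.mean((toy_tank(params, rain) - observed) ** 2)))

# Synthetic "observed" flows generated from assumed true parameters (0.3, 1.0).
rng = np.random.default_rng(4)
rain = rng.gamma(2.0, 2.0, size=200)
observed = toy_tank([0.3, 1.0], rain) + rng.normal(0.0, 0.05, 200)

# The study compares SCE, GA, and SA; SciPy's differential evolution stands in
# here as a readily available probabilistic global optimizer.
result = differential_evolution(rmse, bounds=[(0.01, 1.0), (0.0, 5.0)],
                                args=(rain, observed), seed=4)
print(result.x, result.fun)
```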


2012, Vol. 2012, pp. 1-36
Author(s): Jui-Yu Wu

This work presents a hybrid real-coded genetic algorithm with particle swarm optimization (RGA-PSO) and a hybrid artificial immune algorithm with PSO (AIA-PSO) for solving 13 constrained global optimization (CGO) problems, comprising six nonlinear programming and seven generalized polynomial programming problems. External RGA and AIA approaches are used to optimize the constriction coefficient, cognitive parameter, social parameter, penalty parameter, and mutation probability of an internal PSO algorithm, and the CGO problems are then solved using the internal PSO algorithm. The performances of the proposed RGA-PSO and AIA-PSO algorithms are evaluated on the 13 CGO problems, and the numerical results are compared with those obtained using published individual GA and AIA approaches. Experimental results indicate that the proposed RGA-PSO and AIA-PSO algorithms converge to a global optimum solution of a CGO problem and that the optimum parameter settings of the internal PSO algorithm can be obtained using the external RGA and AIA approaches. The proposed algorithms also outperform some published individual GA and AIA approaches, making them highly promising stochastic global optimization methods for solving CGO problems.
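The two-level structure, with an external algorithm tuning the parameters of an internal PSO that handles constraints through a penalty term, can be sketched as follows. A plain random search stands in for the external RGA/AIA, and the toy constrained problem, parameter ranges, and penalty form are assumptions for illustration only.

```python
import numpy as np

def penalized_obj(x, rho):
    """Toy constrained problem (assumed): minimize sum(x^2) subject to
    x0 + x1 >= 1, handled with a quadratic penalty weighted by rho."""
    violation = np.maximum(1.0 - (x[..., 0] + x[..., 1]), 0.0)
    return np.sum(x ** 2, axis=-1) + rho * violation ** 2

def inner_pso(chi, c1, c2, rho, dim=5, n=20, iters=100, seed=0):
    """Internal constriction-type PSO whose settings come from the outer search."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.0, 5.0, (n, dim))
    v = np.zeros((n, dim))
    p, pf = x.copy(), penalized_obj(x, rho)
    for _ in range(iters):
        g = p[np.argmin(pf)]
        v = chi * (v + c1 * rng.random((n, dim)) * (p - x)
                     + c2 * rng.random((n, dim)) * (g - x))
        x = x + v
        f = penalized_obj(x, rho)
        better = f < pf
        p[better], pf[better] = x[better], f[better]
    return pf.min()

# Outer level: the paper evolves these settings with an RGA or AIA; a plain
# random search over assumed ranges is used here purely for illustration.
rng = np.random.default_rng(5)
best_score, best_params = np.inf, None
for _ in range(20):
    params = dict(chi=rng.uniform(0.4, 0.9), c1=rng.uniform(1.0, 2.5),
                  c2=rng.uniform(1.0, 2.5), rho=10.0 ** rng.uniform(1.0, 4.0))
    score = inner_pso(**params)
    if score < best_score:
        best_score, best_params = score, params
print(best_score, best_params)
```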


2019, Vol. 2019, pp. 1-17
Author(s): Meiji Cui, Li Li, Miaojing Shi

Biogeography-based optimization (BBO), a recently proposed metaheuristic algorithm, has been successfully applied to many optimization problems due to its simplicity and efficiency. However, BBO is sensitive to the curse of dimensionality; its performance degrades rapidly as the dimensionality of the search space increases. In this paper, a selective migration operator is proposed to scale up the performance of BBO, and we name the resulting algorithm selective BBO (SBBO). A differential migration operator is selected heuristically to explore the global area as far as possible, while a normally distributed migration operator is chosen to exploit the local area. By means of this heuristic selection, an appropriate migration operator can be used to search for the global optimum efficiently. Moreover, the strategy of cooperative coevolution (CC) is adopted to solve large-scale global optimization problems (LSOPs). To deal with the imbalanced contributions of subgroups to the whole solution in the context of CC, a more efficient computing resource allocation is proposed. Extensive experiments are conducted on the CEC 2010 benchmark suite for large-scale global optimization, and the results show the effectiveness and efficiency of SBBO compared with BBO variants and other representative algorithms for LSOPs. The results also confirm that the proposed computing resource allocation is vital to large-scale optimization within a limited computation budget.
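The selective migration idea can be sketched as follows: each immigrating dimension receives either a differential-style donor step (more exploratory) or a normally perturbed donor value (more exploitative). The coin flip below is a placeholder for the paper's heuristic operator selection, and the rates, sigma, and F values are illustrative assumptions rather than the SBBO settings.

```python
import numpy as np

def selective_migration(pop, fit, rng, sigma=0.1, F=0.5):
    """One BBO-style migration sweep. Immigration/emigration rates come from
    fitness ranks; each immigrating dimension copies from a donor chosen by
    emigration rate, using either a differential step (exploratory) or a
    normal perturbation (exploitative)."""
    n, dim = pop.shape
    rank = np.argsort(np.argsort(fit))              # 0 = best (lowest fitness)
    mu = 1.0 - rank / (n - 1)                       # emigration rate
    lam = 1.0 - mu                                  # immigration rate
    new = pop.copy()
    for i in range(n):
        for d in range(dim):
            if rng.random() < lam[i]:
                j = rng.choice(n, p=mu / mu.sum())  # donor picked by emigration rate
                if rng.random() < 0.5:              # placeholder for heuristic selection
                    a, b = rng.choice(n, size=2, replace=False)
                    new[i, d] = pop[j, d] + F * (pop[a, d] - pop[b, d])
                else:
                    new[i, d] = rng.normal(pop[j, d], sigma)
    return new

rng = np.random.default_rng(6)
pop = rng.uniform(-5.0, 5.0, (10, 4))
fit = np.sum(pop ** 2, axis=1)                      # assumed objective values
pop = selective_migration(pop, fit, rng)
print(pop.shape)
```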

