A Modified NM-PSO Method for Parameter Estimation Problems of Models

2012 ◽  
Vol 2012 ◽  
pp. 1-12 ◽  
Author(s):  
An Liu ◽  
Erwie Zahara ◽  
Ming-Ta Yang

Ordinary differential equations usefully describe the behavior of a wide range of dynamic physical systems. The particle swarm optimization (PSO) method has been considered an effective tool for solving engineering optimization problems involving ordinary differential equations. This paper proposes a modified hybrid Nelder-Mead simplex search and particle swarm optimization (M-NM-PSO) method for solving parameter estimation problems. The M-NM-PSO method improves on the efficiency of the PSO method and the conventional NM-PSO method through faster convergence and better objective function values. Three well-known cases are studied, and the solutions of the M-NM-PSO method are compared with those obtained by other methods published in the literature. The results demonstrate that the proposed M-NM-PSO method yields better estimates than the genetic algorithm, the modified genetic algorithm (real-coded GA (RCGA)), the conventional particle swarm optimization (PSO) method, and the conventional NM-PSO method.
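As a rough illustration of the hybrid idea (not the authors' M-NM-PSO implementation), the sketch below runs a small global PSO pass over an assumed logistic-growth model and then refines the best particle with a local Nelder-Mead search from SciPy; the model, bounds, swarm settings, and synthetic data are all illustrative assumptions.

```python
# Minimal sketch, assuming a logistic-growth model and synthetic data:
# a crude global PSO pass followed by a local Nelder-Mead refinement of the
# sum-of-squares error between the simulated ODE solution and observations.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def model(t, y, r, K):
    # logistic growth: dy/dt = r * y * (1 - y / K)
    return r * y * (1.0 - y / K)

# synthetic observations standing in for measured data
t_obs = np.linspace(0.0, 10.0, 25)
y_true = solve_ivp(model, (0.0, 10.0), [2.0], t_eval=t_obs, args=(0.8, 50.0)).y[0]
y_obs = y_true + rng.normal(0.0, 1.0, y_true.size)

def sse(theta):
    # objective: sum of squared residuals for parameters theta = (r, K)
    r, K = theta
    if r <= 0.0 or K <= 0.0:
        return 1e12
    y = solve_ivp(model, (0.0, 10.0), [2.0], t_eval=t_obs, args=(r, K)).y[0]
    return float(np.sum((y - y_obs) ** 2))

# global PSO pass over the parameter box
lo, hi = np.array([0.01, 1.0]), np.array([5.0, 200.0])
n, iters, w, c1, c2 = 30, 60, 0.7, 1.5, 1.5
x = rng.uniform(lo, hi, size=(n, 2))
v = np.zeros_like(x)
pbest, pval = x.copy(), np.array([sse(p) for p in x])
gbest = pbest[pval.argmin()].copy()
for _ in range(iters):
    r1, r2 = rng.random((n, 1)), rng.random((n, 1))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = np.clip(x + v, lo, hi)
    f = np.array([sse(p) for p in x])
    better = f < pval
    pbest[better], pval[better] = x[better], f[better]
    gbest = pbest[pval.argmin()].copy()

# local Nelder-Mead refinement of the PSO result
res = minimize(sse, gbest, method="Nelder-Mead")
print("PSO estimate:", gbest, "refined estimate:", res.x)
```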

2012 ◽  
Vol 510 ◽  
pp. 472-477
Author(s):  
Jian Hui Zhou ◽  
Shu Zhong Zhao ◽  
Li Xi Yue ◽  
Yan Nan Lu ◽  
Xin Yi Si

In fluid mechanics, finding the multiple solutions of ordinary differential equations is a long-standing and difficult problem. A particle swarm optimization algorithm combined with a direct search method (DSPO) is proposed for solving the parameter estimation problems associated with these multiple solutions. The algorithm greatly improves both precision and success rate. In this paper, multiple solutions are found by varying the accuracy and the search range over repeated runs. Parameter estimation problems for the multiple solutions of ordinary differential equations are solved, and the results show that the method is accurate and practical.
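The multi-start idea used to recover several distinct solutions can be sketched as follows, assuming a toy one-dimensional objective with minima at x = -2 and x = +2; the tolerance and restart count are likewise illustrative, not the fluid-mechanics problem treated in the paper.

```python
# Minimal sketch of the multi-start idea, assuming a toy objective with two
# minima: restart a local search from random points in the search range and
# keep every converged point that is not already recorded.
import numpy as np
from scipy.optimize import minimize

def objective(x):
    # toy residual with two distinct minimizers, at x = -2 and x = +2
    return float((x[0] ** 2 - 4.0) ** 2)

rng = np.random.default_rng(1)
solutions = []
for _ in range(20):
    x0 = rng.uniform(-5.0, 5.0, size=1)              # random start in the search range
    res = minimize(objective, x0, method="Nelder-Mead")
    if res.fun < 1e-8 and not any(abs(res.x[0] - s) < 1e-3 for s in solutions):
        solutions.append(res.x[0])                   # keep only genuinely new solutions

print(sorted(solutions))                             # roughly [-2.0, 2.0]
```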


2018 ◽  
Vol 2018 ◽  
pp. 1-9 ◽  
Author(s):  
Devin Akman ◽  
Olcay Akman ◽  
Elsa Schaefer

Researchers using ordinary differential equations to model phenomena face two main challenges, among others: implementing the appropriate model and optimizing the parameters of the selected model. The latter often proves difficult or computationally expensive. Here, we implement Particle Swarm Optimization, which draws inspiration from the optimizing behavior of insect swarms in nature, as it is a simple and efficient method for fitting models to data. We demonstrate its efficacy by showing that it outstrips the evolutionary computing methods previously used to analyze an epidemic model.
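The kind of objective a swarm would minimize here can be sketched with the classic SIR equations and a least-squares error against observed infection counts; the model choice, parameter names (beta, gamma), and synthetic data below are assumptions for illustration, not the study's actual epidemic model.

```python
# Minimal sketch, assuming the classic SIR model and synthetic data: the
# least-squares objective that a particle swarm (or any other optimizer)
# would minimize over the transmission and recovery rates (beta, gamma).
import numpy as np
from scipy.integrate import solve_ivp

def sir(t, y, beta, gamma):
    s, i, r = y
    return [-beta * s * i, beta * s * i - gamma * i, gamma * i]

t_obs = np.linspace(0.0, 60.0, 30)
# synthetic "observed" infected fraction generated with beta = 0.3, gamma = 0.1
i_obs = solve_ivp(sir, (0.0, 60.0), [0.99, 0.01, 0.0],
                  t_eval=t_obs, args=(0.3, 0.1)).y[1]

def fit_error(theta):
    beta, gamma = theta
    if beta <= 0.0 or gamma <= 0.0:
        return 1e12
    sol = solve_ivp(sir, (0.0, 60.0), [0.99, 0.01, 0.0],
                    t_eval=t_obs, args=(beta, gamma))
    return float(np.sum((sol.y[1] - i_obs) ** 2))

# any swarm or evolutionary optimizer can now minimize fit_error over (beta, gamma)
print(fit_error([0.25, 0.12]))
```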


Author(s):  
Shailendra Aote ◽  
Mukesh M. Raghuwanshi

Various methods have been proposed in different domains to solve optimization problems, and evolutionary computing (EC) is one of them. Widely used EC techniques include Particle Swarm Optimization (PSO), Genetic Algorithm (GA), and Differential Evolution (DE). Although these techniques differ in their formulations, their inner working structure is essentially the same: different names and formulae are used for each task, but ultimately they all do the same thing. Here we identify the similarities among these techniques and describe their working structure step by step; each step is illustrated with an example and MATLAB code for better understanding. We begin with an introduction to optimization and to solving optimization problems with PSO, GA, and DE, and conclude with a brief comparison of the three.
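As a hedged illustration of that shared "perturb candidates, evaluate, keep the better one" loop (the chapter's own examples are in MATLAB), the Python sketch below shows one differential evolution run with textbook settings F = 0.8 and CR = 0.9 on the Rastrigin test function; all values are assumptions for illustration.

```python
# One differential evolution loop, assuming textbook settings (F = 0.8,
# CR = 0.9) and the Rastrigin test function; PSO and GA follow the same
# overall cycle of perturbing candidates, evaluating, and keeping the better.
import numpy as np

rng = np.random.default_rng(3)
dim, pop_size, F, CR = 5, 20, 0.8, 0.9

def rastrigin(x):
    return float(10 * x.size + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x)))

pop = rng.uniform(-5.12, 5.12, size=(pop_size, dim))
fit = np.array([rastrigin(p) for p in pop])

for _ in range(200):                                     # generations
    for i in range(pop_size):
        idx = [j for j in range(pop_size) if j != i]
        a, b, c = pop[rng.choice(idx, 3, replace=False)]
        mutant = a + F * (b - c)                         # mutation (perturbation)
        cross = rng.random(dim) < CR
        cross[rng.integers(dim)] = True                  # guarantee at least one gene crosses
        trial = np.where(cross, mutant, pop[i])          # crossover (recombination)
        f_trial = rastrigin(trial)
        if f_trial < fit[i]:                             # greedy selection
            pop[i], fit[i] = trial, f_trial

print("best Rastrigin value found:", fit.min())
```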


Author(s):  
Rongrong Li ◽  
Linrun Qiu ◽  
Dongbo Zhang

In this article, a hierarchical cooperative algorithm based on the genetic algorithm and particle swarm optimization (HCGA-PSO) is proposed, combining the global searching ability of the genetic algorithm with the fast convergence of particle swarm optimization. The proposed algorithm organizes individuals into subgroups and takes full advantage of the merits of both methods. It uses a two-layer structure: the bottom layer consists of a series of genetic-algorithm subgroups, which provide the algorithm's global searching ability, while the upper layer is an elite group composed of the best individuals of each subgroup, on which the particle swarm algorithm performs a precise local search. The experimental results demonstrate that the HCGA-PSO algorithm has better convergence and stronger continuous search capability, which makes it suitable for solving complex optimization problems.
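A structural sketch of the two-layer idea follows; it is not the authors' HCGA-PSO code. GA subgroups evolve at the bottom layer, their best members form an elite group, and a simplified social-only PSO pull refines that elite layer before the refined elites are fed back. Population sizes, the mutation scale, and the sphere objective are illustrative assumptions.

```python
# Structural sketch, assuming a sphere objective and small illustrative
# settings: GA subgroups at the bottom layer, a PSO-refined elite layer on top.
import numpy as np

rng = np.random.default_rng(7)
dim, n_groups, group_size = 5, 4, 20

def f(x):
    return float(np.sum(x ** 2))                 # placeholder objective

def ga_step(pop):
    # tournament selection + blend crossover + Gaussian mutation
    fit = np.array([f(p) for p in pop])
    new = []
    for _ in range(len(pop)):
        i, j = rng.integers(len(pop), size=2)
        p1 = pop[i] if fit[i] < fit[j] else pop[j]
        i, j = rng.integers(len(pop), size=2)
        p2 = pop[i] if fit[i] < fit[j] else pop[j]
        new.append(0.5 * (p1 + p2) + rng.normal(0.0, 0.1, dim))
    return np.array(new)

groups = [rng.uniform(-5.0, 5.0, (group_size, dim)) for _ in range(n_groups)]
elite_v = np.zeros((n_groups, dim))
for _ in range(100):
    groups = [ga_step(g) for g in groups]                          # bottom layer: GA subgroups
    elites = np.array([g[np.argmin([f(p) for p in g])] for g in groups])
    gbest = elites[np.argmin([f(e) for e in elites])]
    r = rng.random((n_groups, 1))
    elite_v = 0.7 * elite_v + 1.5 * r * (gbest - elites)           # top layer: social PSO pull
    elites = elites + elite_v
    for k, g in enumerate(groups):                                 # feed refined elites back
        g[np.argmax([f(p) for p in g])] = elites[k]

print("best value found:", min(f(e) for e in elites))
```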


Author(s):  
Hrvoje Markovic ◽  
Fangyan Dong ◽  
Kaoru Hirota

A parallel multi-population metaheuristic optimization framework, called Concurrent Societies, inspired by human intellectual evolution, is proposed. It uses population-based metaheuristics to evolve its populations and fitness function approximations as representations of knowledge. By utilizing iteratively refined approximations, it reduces the number of required evaluations and, as a byproduct, produces models of the fitness function. The proposed framework is implemented as two Concurrent Societies: one based on the genetic algorithm and one based on particle swarm optimization, both using k-nearest neighbor regression as the fitness approximation. The performance is evaluated on 10 standard test problems and compared to other commonly used metaheuristics. Results show that using the framework considerably increases efficiency (by a factor of 7.6 to 977) and effectiveness (absolute error reduced by more than a few orders of magnitude). The proposed framework is intended for optimization problems with expensive fitness functions, such as optimization in design and interactive optimization.
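The fitness-approximation ingredient can be sketched as follows: candidates are pre-screened on a cheap k-nearest-neighbour model of the expensive objective, and only the most promising ones receive exact evaluations. The random-search proposal step and all settings below are assumptions for illustration, not the Concurrent Societies implementation.

```python
# Minimal sketch of kNN-based fitness approximation, assuming a random-search
# proposal step: pre-screen candidates on a cheap surrogate model and spend
# exact (expensive) evaluations only on the most promising ones.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(5)
dim = 4

def expensive(x):
    # stand-in for an expensive fitness function
    return float(np.sum(x ** 2))

# seed the evaluation archive with a few true evaluations
X = rng.uniform(-5.0, 5.0, (15, dim))
y = np.array([expensive(x) for x in X])

for _ in range(30):
    surrogate = KNeighborsRegressor(n_neighbors=3).fit(X, y)
    candidates = rng.uniform(-5.0, 5.0, (200, dim))        # cheap candidate pool
    predicted = surrogate.predict(candidates)
    promising = candidates[np.argsort(predicted)[:3]]      # pre-screen on the model
    for x in promising:                                    # true evaluations, used sparingly
        X = np.vstack([X, x])
        y = np.append(y, expensive(x))

print("true evaluations used:", len(y), "best found:", y.min())
```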


Author(s):  
Jenn-Long Liu

Particle swarm optimization (PSO) is a promising evolutionary approach in which each particle moves over the search space with a velocity that is adjusted according to the flying experience of the particle and its neighbors, so that the swarm flies towards better and better search areas over the course of the search process. Although PSO is effective for global optimization problems, several crucial user-supplied parameters, such as the cognitive and social learning rates, affect the performance of the algorithm because the search process of a PSO algorithm is nonlinear and complex. Consequently, a PSO with well-selected parameter settings can deliver good performance. This work develops an evolving PSO that uses Clerc's PSO to evaluate the fitness of the objective function and a genetic algorithm (GA) to evolve the optimal design parameters for the PSO. The crucial design parameters studied herein include the cognitive and social learning rates as well as the constriction factor of Clerc's PSO. Several benchmark cases are run to derive a set of optimal parameters via the evolving PSO, and the resulting parameters are then applied to the engineering optimization of a pressure vessel design.
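For reference, Clerc's constriction factor is chi = 2 / |2 - phi - sqrt(phi^2 - 4*phi)| with phi = c1 + c2 > 4; the short sketch below shows the constricted velocity update with the commonly cited defaults c1 = c2 = 2.05, which are not necessarily the values this work evolves with its GA.

```python
# Clerc's constriction factor and the constricted velocity update, assuming
# the commonly cited defaults c1 = c2 = 2.05 (phi = c1 + c2 must exceed 4).
import numpy as np

def constriction(c1, c2):
    phi = c1 + c2
    return 2.0 / abs(2.0 - phi - np.sqrt(phi * phi - 4.0 * phi))

c1 = c2 = 2.05
chi = constriction(c1, c2)                     # approximately 0.7298

rng = np.random.default_rng(0)
x, v = rng.uniform(-1.0, 1.0, 3), np.zeros(3)
pbest, gbest = x.copy(), x.copy()
# constricted update: v <- chi * (v + c1*r1*(pbest - x) + c2*r2*(gbest - x))
v = chi * (v + c1 * rng.random(3) * (pbest - x) + c2 * rng.random(3) * (gbest - x))
x = x + v
print("constriction factor:", chi)
```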


2014 ◽  
Vol 989-994 ◽  
pp. 2621-2624
Author(s):  
Shao Song Wan ◽  
Jian Cao ◽  
Qun Song Zhu

To resolve these problems, we put forward a new design for an intelligent lock based mainly on wireless sensor network technology. Particle swarm optimization (PSO) is a recently proposed intelligent algorithm motivated by swarm intelligence. Although PSO has been shown to perform well on many benchmark and real-world optimization problems, it easily falls into local optima when solving complex multimodal problems. To avoid such local optima, the algorithm renews the population and enhances its diversity by using the density calculation of immune theory and by adjusting a new chaos sequence. The paper gives the circuit diagram of the single-chip hardware components and describes the software design. The experimental results show that the immune genetic algorithm based on chaos theory can find the optimization result and evidently improves the convergence speed and stability.
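The chaos-based re-seeding ingredient can be sketched with a logistic map: the map generates a chaotic sequence in (0, 1) that is scaled into the search range and used to replace the most crowded individuals, restoring diversity. The density measure and thresholds below are illustrative assumptions, not the paper's exact scheme.

```python
# Minimal sketch of chaos-based re-seeding, assuming a crude radius-based
# density measure for deciding which individuals to replace.
import numpy as np

rng = np.random.default_rng(9)
lo, hi = -5.0, 5.0
pop = rng.uniform(lo, hi, size=(20, 3))

def logistic_chaos(n, x0=0.7, mu=4.0):
    # logistic map x_{k+1} = mu * x_k * (1 - x_k), chaotic for mu = 4
    seq, x = [], x0
    for _ in range(n):
        x = mu * x * (1.0 - x)
        seq.append(x)
    return np.array(seq)

# density: number of neighbours within a fixed radius; re-seed the most crowded
dists = np.linalg.norm(pop[:, None, :] - pop[None, :, :], axis=-1)
density = (dists < 1.0).sum(axis=1)
crowded = np.argsort(density)[-5:]                       # five most crowded individuals
chaos = logistic_chaos(crowded.size * pop.shape[1]).reshape(crowded.size, -1)
pop[crowded] = lo + (hi - lo) * chaos                    # chaotic re-initialisation

print("re-seeded individuals:", crowded)
```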


2014 ◽  
Vol 2014 ◽  
pp. 1-6 ◽  
Author(s):  
Hamid Reza Mohammadi ◽  
Ali Akhavan

A cost-effective off-line method for equivalent circuit parameter estimation of an induction motor using a hybrid of the genetic algorithm and particle swarm optimization (HGAPSO) is proposed. HGAPSO inherits the advantages of both the genetic algorithm (GA) and particle swarm optimization (PSO). The methodology estimates the steady-state equivalent circuit parameters from motor performance characteristics that are normally available from nameplate data or experimental tests. In this paper, the problem formulation uses the starting torque, the full-load torque, the maximum torque, and the full-load power factor, which are normally available from the manufacturer data. The proposed method is used to estimate the stator and rotor resistances, the stator and rotor leakage reactances, and the magnetizing reactance in the steady-state equivalent circuit. The optimization problem is formulated to minimize an objective function containing the error between the estimated and the manufacturer data. The validity of the proposed method is demonstrated for a preset induction motor model in MATLAB/Simulink. The performance of the proposed method is also evaluated by comparing the results of HGAPSO, GA, and PSO.
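The kind of objective such a hybrid would minimize can be sketched from the standard per-phase steady-state equivalent circuit: relative errors between nameplate targets and the torques and power factor predicted by the Thevenin-equivalent torque formula. The nameplate numbers, supply data, full-load slip, and equal weighting below are illustrative assumptions, not the paper's data.

```python
# Sketch of the objective, assuming illustrative nameplate targets: squared
# relative errors between nameplate values and the torques/power factor
# predicted by the standard per-phase steady-state equivalent circuit.
import numpy as np

V = 230.0                          # assumed per-phase supply voltage
w_s = 2.0 * np.pi * 50.0           # synchronous speed, rad/s (50 Hz, 2-pole basis)
s_fl = 0.04                        # assumed full-load slip
# assumed nameplate targets: starting torque, full-load torque, max torque, full-load pf
T_st_n, T_fl_n, T_max_n, pf_n = 60.0, 40.0, 90.0, 0.85

def predicted(theta):
    R1, X1, R2, X2, Xm = theta
    # Thevenin equivalent seen by the rotor branch
    Zth = 1j * Xm * (R1 + 1j * X1) / (R1 + 1j * (X1 + Xm))
    Vth = V * Xm / np.hypot(R1, X1 + Xm)
    Rth, Xth = Zth.real, Zth.imag

    def torque(s):
        return 3.0 * Vth**2 * (R2 / s) / (w_s * ((Rth + R2 / s)**2 + (Xth + X2)**2))

    s_max = R2 / np.hypot(Rth, Xth + X2)                     # slip at maximum torque
    Zin = R1 + 1j * X1 + 1j * Xm * (R2 / s_fl + 1j * X2) / (R2 / s_fl + 1j * (X2 + Xm))
    return torque(1.0), torque(s_fl), torque(s_max), np.cos(np.angle(Zin))

def objective(theta):
    # sum of squared relative errors against the nameplate data
    if np.any(np.asarray(theta) <= 0.0):
        return 1e12
    T_st, T_fl, T_max, pf = predicted(theta)
    return ((T_st - T_st_n) / T_st_n) ** 2 + ((T_fl - T_fl_n) / T_fl_n) ** 2 \
         + ((T_max - T_max_n) / T_max_n) ** 2 + ((pf - pf_n) / pf_n) ** 2

# a GA, PSO, or hybrid such as HGAPSO can now minimize objective over (R1, X1, R2, X2, Xm)
print(objective([0.5, 1.2, 0.4, 1.2, 30.0]))
```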

