Particle Swarm Optimization and Differential Evolution Methods Hybridized with Pattern Search for Solving Optimization Problems

2018 ◽  
Author(s):  
Viviane J. Galvão ◽  
Helio J. C. Barbosa ◽  
Heder S. Bernardino


2014 ◽  
Vol 2014 ◽  
pp. 1-16 ◽  
Author(s):  
Xiaobing Yu ◽  
Jie Cao ◽  
Haiyan Shan ◽  
Li Zhu ◽  
Jun Guo

Particle swarm optimization (PSO) and differential evolution (DE) are both efficient and powerful population-based stochastic search techniques for solving optimization problems, and they have been widely applied in many scientific and engineering fields. Unfortunately, both can easily become trapped in local optima and lack the ability to escape them. A novel adaptive hybrid algorithm based on PSO and DE (HPSO-DE) is formulated by developing a balanced parameter between PSO and DE. Adaptive mutation is carried out on the current population when the population clusters around local optima. HPSO-DE enjoys the advantages of PSO and DE and maintains the diversity of the population. Compared with PSO, DE, and their variants, the performance of HPSO-DE is competitive. The sensitivity of the balanced parameter is discussed in detail.
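A minimal sketch of how such a hybrid step can be organized is given below, assuming a single balance parameter `beta` that decides whether each individual is moved by a PSO velocity update or a DE/rand/1 mutation, plus a crude perturbation when the swarm has clustered; the parameter names and the stagnation test are illustrative, and the published HPSO-DE details differ.

```python
# Sketch of a PSO/DE hybrid step with a balance parameter `beta`;
# illustrative only, not the published HPSO-DE.
import numpy as np

def sphere(x):
    return float(np.sum(x**2))  # toy objective

def hybrid_pso_de(f, dim=10, pop=30, iters=200, beta=0.5,
                  w=0.7, c1=1.5, c2=1.5, F=0.5, CR=0.9,
                  bounds=(-5.0, 5.0), seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, (pop, dim))        # positions
    V = np.zeros((pop, dim))                   # velocities
    pbest = X.copy()
    pbest_f = np.array([f(x) for x in X])
    g = pbest[np.argmin(pbest_f)].copy()       # global best

    for _ in range(iters):
        for i in range(pop):
            if rng.random() < beta:
                # PSO velocity/position update
                r1, r2 = rng.random(dim), rng.random(dim)
                V[i] = w*V[i] + c1*r1*(pbest[i]-X[i]) + c2*r2*(g-X[i])
                cand = np.clip(X[i] + V[i], lo, hi)
            else:
                # DE/rand/1 mutation with binomial crossover
                a, b, c = rng.choice([j for j in range(pop) if j != i],
                                     3, replace=False)
                mutant = np.clip(X[a] + F*(X[b]-X[c]), lo, hi)
                mask = rng.random(dim) < CR
                mask[rng.integers(dim)] = True
                cand = np.where(mask, mutant, X[i])
            fc = f(cand)
            X[i] = cand
            if fc < pbest_f[i]:                # greedy personal-best update
                pbest[i], pbest_f[i] = cand.copy(), fc
        g = pbest[np.argmin(pbest_f)].copy()
        # crude stagnation check: if personal bests have clustered, perturb
        if np.std(pbest_f) < 1e-12:
            X = np.clip(X + rng.normal(0, 0.1*(hi-lo), X.shape), lo, hi)
    return g, pbest_f.min()

if __name__ == "__main__":
    best_x, best_f = hybrid_pso_de(sphere)
    print("best f:", best_f)
```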


2014 ◽  
Vol 31 (7) ◽  
pp. 1198-1220 ◽  
Author(s):  
Gai-Ge Wang ◽  
Amir Hossein Gandomi ◽  
Xin-She Yang ◽  
Amir Hossein Alavi

Purpose – Meta-heuristic algorithms are efficient in achieving the optimal solution for engineering problems. Hybridization of different algorithms may enhance the quality of the solutions and improve the efficiency of the algorithms. The purpose of this paper is to propose a novel, robust hybrid meta-heuristic optimization approach by adding a differential evolution (DE) mutation operator to the accelerated particle swarm optimization (APSO) algorithm to solve numerical optimization problems.

Design/methodology/approach – The improvement consists of adding the DE mutation operator to the APSO updating equations so as to speed up convergence.

Findings – A new optimization method is proposed by introducing DE-type mutation into APSO, and the hybrid algorithm is called differential evolution accelerated particle swarm optimization (DPSO). The difference between DPSO and APSO is that the mutation operator is employed to fine-tune the newly generated solution for each particle, rather than the random walks used in APSO.

Originality/value – A novel hybrid method is proposed and used to optimize 51 functions. It is compared with other methods to show its effectiveness. The effect of the DPSO parameters on convergence and performance is also studied and analyzed through detailed parameter sensitivity studies.
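The core idea, replacing the APSO random-walk term with a DE-style difference vector, can be sketched as follows; the exact update rule, parameter values, and the greedy selection step shown here are assumptions for illustration and may not match the published DPSO.

```python
# Sketch of an accelerated-PSO update whose random-walk term is replaced
# by a DE-style difference vector, in the spirit of DPSO (illustrative only).
import numpy as np

def rastrigin(x):
    return float(10*len(x) + np.sum(x**2 - 10*np.cos(2*np.pi*x)))

def dpso(f, dim=10, pop=40, iters=300, beta=0.5, F=0.5,
         bounds=(-5.12, 5.12), seed=1):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, (pop, dim))
    fvals = np.array([f(x) for x in X])
    g = X[np.argmin(fvals)].copy()

    for _ in range(iters):
        for i in range(pop):
            # pick two distinct partners for the DE-type difference term
            a, b = rng.choice([j for j in range(pop) if j != i],
                              2, replace=False)
            # APSO-like pull toward the global best, with a DE mutation term
            # standing in for the usual random walk
            cand = np.clip((1 - beta)*X[i] + beta*g + F*(X[a] - X[b]), lo, hi)
            fc = f(cand)
            if fc < fvals[i]:                 # keep the better solution
                X[i], fvals[i] = cand, fc
        g = X[np.argmin(fvals)].copy()
    return g, fvals.min()

if __name__ == "__main__":
    print("best f:", dpso(rastrigin)[1])
```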


2018 ◽  
Vol 7 (2.6) ◽  
pp. 88
Author(s):  
S. R. Sujatha ◽  
M. Siddappa

An original learning algorithm for solving global numerical optimization problems is proposed. The proposed algorithm is a strong stochastic search method based on the evaluation and optimization of a hypercube, and it is called the hypercube optimization (HO) algorithm. The hypercube optimization algorithm comprises an initialization and evaluation process and a searching-space process. The designed HO algorithm is tested on specific benchmark functions. A comparative performance analysis is made against other approaches, namely dynamic-weight particle swarm optimization and the self-adaptive differential evolution algorithm. The convergence characteristics of the self-adaptive differential evolution algorithm deliver a much better function value than dynamic-weight-based particle swarm optimization.
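The abstract describes the HO algorithm only at a high level, so the following is a heavily simplified illustration of a hypercube-style search (sample inside a hypercube, recentre on the best point found, contract the edge length); the shrink factor and sampling scheme are assumptions, not the published method.

```python
# Simplified hypercube-style search: sample, recentre, contract.
# An illustration of the general idea only, not the published HO algorithm.
import numpy as np

def ackley(x):
    n = len(x)
    return float(-20*np.exp(-0.2*np.sqrt(np.sum(x**2)/n))
                 - np.exp(np.sum(np.cos(2*np.pi*x))/n) + 20 + np.e)

def hypercube_search(f, dim=5, samples=50, iters=100, shrink=0.9,
                     bounds=(-5.0, 5.0), seed=2):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    center = rng.uniform(lo, hi, dim)         # hypercube centre
    half = (hi - lo) / 2.0                    # half edge length
    best_x, best_f = center.copy(), f(center)

    for _ in range(iters):
        # evaluation process: uniform samples inside the current hypercube
        pts = np.clip(center + rng.uniform(-half, half, (samples, dim)),
                      lo, hi)
        vals = np.array([f(p) for p in pts])
        k = np.argmin(vals)
        if vals[k] < best_f:
            best_x, best_f = pts[k].copy(), vals[k]
        # searching-space process: recentre and contract the hypercube
        center = best_x.copy()
        half *= shrink
    return best_x, best_f

if __name__ == "__main__":
    print("best f:", hypercube_search(ackley)[1])
```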


Author(s):  
Shailendra Aote ◽  
Mukesh M. Raghuwanshi

Various methods have been proposed in different domains to solve optimization problems, and evolutionary computing (EC) is one of them. Commonly used EC techniques include Particle Swarm Optimization (PSO), the Genetic Algorithm (GA), and Differential Evolution (DE). These techniques differ in their outward working structures, but their inner working structure is the same: different names and formulae are given to different tasks, yet ultimately they all do the same thing. Here we identify the similarities among these techniques and describe the working structure at each step. Each step is accompanied by an example and MATLAB code for better understanding. We begin with an introduction to optimization and to solving optimization problems with PSO, GA, and DE, and finally give a brief comparison of the three.
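Since the chapter's argument is that PSO, GA, and DE share one outer loop and differ only in how new candidates are generated, a compact illustration of that shared template is sketched below; the original examples are in MATLAB, so this Python version is an assumption-laden paraphrase, shown with a DE/rand/1/bin variation operator that a GA or PSO operator could replace.

```python
# Shared EC template: initialise, evaluate, vary, select, repeat.
# The variation operator is the only technique-specific part.
import numpy as np

def sphere(x):
    return float(np.sum(x**2))

def de_variation(X, i, rng, F=0.5, CR=0.9):
    """DE/rand/1/bin: one technique-specific way to propose a candidate."""
    pop, dim = X.shape
    a, b, c = rng.choice([j for j in range(pop) if j != i], 3, replace=False)
    mutant = X[a] + F*(X[b] - X[c])
    mask = rng.random(dim) < CR
    mask[rng.integers(dim)] = True
    return np.where(mask, mutant, X[i])

def ec_loop(f, variation, dim=10, pop=30, iters=200,
            bounds=(-5.0, 5.0), seed=3):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, (pop, dim))            # step 1: initialise
    fvals = np.array([f(x) for x in X])            # step 2: evaluate
    for _ in range(iters):
        for i in range(pop):
            cand = np.clip(variation(X, i, rng), lo, hi)  # step 3: vary
            fc = f(cand)
            if fc < fvals[i]:                      # step 4: select (greedy)
                X[i], fvals[i] = cand, fc
    k = np.argmin(fvals)
    return X[k], fvals[k]

if __name__ == "__main__":
    print("best f:", ec_loop(sphere, de_variation)[1])
```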


Author(s):  
Dhafar Al-Ani ◽  
Saeid Habibi

Real-world problems are often complex and frequently take the form of constrained optimization problems (COPs). This has led to a growing interest in optimization techniques that optimize more than one objective function simultaneously. Accordingly, at the end of the multi-objective optimization process there is more than one solution to consider, which enables a trade-off among high-quality solutions and lets the decision-maker choose a solution based on qualitative preferences. Particle Swarm Optimization (PSO) algorithms are increasingly being used to solve NP-hard and constrained optimization problems with multi-objective mathematical representations by finding accurate and robust solutions. PSO is currently used in many real-world applications, including (but not limited to) medical diagnosis, image processing, speech recognition, chemical reactors, weather forecasting, system identification, reactive power control, stock exchange markets, and economic power generation. In this paper, a new version of Multi-objective PSO and Differential Evolution (MOPSO-DE) is proposed to solve constrained optimization problems (COPs). The proposed MOPSO-DE scheme incorporates a new leader-updating mechanism that is invoked when the system is at risk of converging to premature solutions, a parallel-islands mechanism, and adaptive mutation, and it is integrated with DE in order to update the particles' best positions in the search space. A series of experiments is conducted using 12 well-known benchmark test problems collected from the 2006 IEEE Congress on Evolutionary Computation (CEC2006) to verify the feasibility, performance, and effectiveness of the proposed MOPSO-DE algorithm. The simulation results show that the proposed MOPSO-DE is highly competitive and obtains the optimal solutions for all test problems.
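The multi-objective machinery such an algorithm relies on, a Pareto-dominance test and an external archive of non-dominated solutions from which leaders are drawn, can be sketched as follows; constraint handling, the parallel-islands mechanism, and the adaptive mutation of MOPSO-DE are omitted, and the random leader selection is an illustrative simplification.

```python
# Pareto-dominance test and a non-dominated archive for leader selection.
# Illustrative building blocks only, not the full MOPSO-DE algorithm.
import numpy as np

def dominates(f1, f2):
    """True if objective vector f1 Pareto-dominates f2 (minimisation)."""
    return np.all(f1 <= f2) and np.any(f1 < f2)

def update_archive(archive, candidate):
    """Insert candidate (x, f) if non-dominated; drop members it dominates."""
    x, fx = candidate
    if any(dominates(fa, fx) for _, fa in archive):
        return archive                         # candidate is dominated
    kept = [(xa, fa) for xa, fa in archive if not dominates(fx, fa)]
    kept.append((x.copy(), fx.copy()))
    return kept

def pick_leader(archive, rng):
    """Leader selection: here simply a random archive member."""
    return archive[rng.integers(len(archive))][0]

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    # two toy objectives over a scalar x: f1 = x^2, f2 = (x - 2)^2
    objs = lambda x: np.array([x[0]**2, (x[0] - 2.0)**2])
    archive = []
    for _ in range(200):
        x = rng.uniform(-1.0, 3.0, 1)
        archive = update_archive(archive, (x, objs(x)))
    leader = pick_leader(archive, rng)
    print("archive size:", len(archive), "one leader:", leader)
```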


Author(s):  
Sotirios K. Goudos

Antenna and microwave design problems are, in general, multi-objective. Multi-objective Evolutionary Algorithms (MOEAs) are suitable optimization techniques for solving such problems. Particle Swarm Optimization (PSO) and Differential Evolution (DE) have received increased interest from the electromagnetics community. The fact that both algorithms can efficiently handle arbitrary optimization problems has made them popular for solving antenna and microwave design problems. This chapter presents three different state-of-the-art MOEAs based on PSO and DE: the Multi-objective Particle Swarm Optimization (MOPSO), the Multi-objective Particle Swarm Optimization with fitness sharing (MOPSO-fs), and the Generalized Differential Evolution (GDE3). Their applications to different antenna and microwave design cases are reported, including microwave absorbers, microwave filters, and Yagi-Uda antenna design. The algorithms are compared and evaluated against other evolutionary multi-objective algorithms such as the Nondominated Sorting Genetic Algorithm-II (NSGA-II). The results show the advantages of using each algorithm.

