Clustering Using Genetic Algorithm-Based Self-Organising Map

2015 ◽  
Vol 1115 ◽  
pp. 573-577
Author(s):  
Azmi Hassan ◽  
Muhammad Ridwan Andi Purnomo ◽  
Putri Dwi Annisa

This paper presents a comparative study of clustering using Artificial Intelligence (AI) techniques. Three methods are compared: two pure methods, the Self-Organising Map (SOM), a branch of Artificial Neural Networks (ANN), and the Genetic Algorithm (GA), and one hybrid of the two, the GA-based SOM. SOM is one of the most popular methods for cluster analysis; it groups objects according to the nearest distance between each object and updateable cluster centres. However, SOM has disadvantages: solution quality depends on the randomly generated initial cluster centres, and the centre-update algorithm relies only on a delta value without considering the search direction. Clustering can be modelled as an optimisation problem whose objective is to minimise the total distance from all data points to their cluster centres, so GA is a natural candidate for clustering. GA's advantages are its multiple search points and its stochastic movement from one phase to the next, which raise the chance of finding the global optimum solution; nevertheless, GA may still return only a near-optimum solution. SOM's advantage is its smooth iterative procedure for improving existing cluster centres. Hybridising GA and SOM is therefore believed to yield better solutions. In this study, two data sets are used to test the performance of the three techniques. The results show that when the solution domain is very wide, SOM and GA-based SOM outperform GA, whereas when the solution domain is not very wide, GA performs better.
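
As a minimal sketch of the idea described above (not the authors' implementation), the following code poses clustering as the minimisation of the total distance from each point to its nearest centre, evolves candidate centre sets with a simple GA, and refines each child with a SOM-style move of every centre toward its assigned points. All parameter values (population size, learning rate, mutation scale) are illustrative assumptions.

```python
import numpy as np

def total_distance(centres, data):
    """Objective: sum of distances from each data point to its nearest centre."""
    d = np.linalg.norm(data[:, None, :] - centres[None, :, :], axis=2)
    return d.min(axis=1).sum()

def ga_based_som(data, k=3, pop_size=30, n_gen=100, lr=0.1, seed=None):
    rng = np.random.default_rng(seed)
    n, _ = data.shape
    # Each GA individual is a set of k candidate cluster centres.
    pop = data[rng.integers(0, n, size=(pop_size, k))]
    for _ in range(n_gen):
        fitness = np.array([total_distance(ind, data) for ind in pop])
        parents = pop[np.argsort(fitness)[: pop_size // 2]]       # selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(0, len(parents), 2)]
            child = np.where(rng.random((k, 1)) < 0.5, a, b)      # crossover
            child = child + rng.normal(0, 0.05, child.shape)      # mutation
            # SOM-style refinement: pull each centre toward its assigned points.
            assign = np.linalg.norm(
                data[:, None, :] - child[None, :, :], axis=2).argmin(axis=1)
            for j in range(k):
                pts = data[assign == j]
                if len(pts):
                    child[j] += lr * (pts.mean(axis=0) - child[j])
            children.append(child)
        pop = np.concatenate([parents, np.array(children)])
    return pop[np.argmin([total_distance(ind, data) for ind in pop])]
```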

Author(s):  
Foo Fong Yeng ◽  
Soo Kum Yoke ◽  
Azrina Suhaimi

The Genetic Algorithm is an algorithm that imitates the natural evolution process to solve optimization problems. All feasible (candidate) solutions are encoded as chromosomes, which undergo genetic operators during evolution. The evolution itself is a search for the optimum solution, and the search stops when a stopping criterion is met; the fittest chromosome of the last generation is then declared the optimum solution. However, this solution might be only a local optimum rather than the global optimum. Hence, an appropriate stopping criterion is important so that the search is not ended before a global optimum solution is found. In this paper, saturation of population fitness is proposed as the stopping criterion for ending the search. The proposed criterion was compared with a conventional stopping criterion, repetition of the fittest chromosome, under various parameter settings. The results show that the proposed stopping criterion outperforms the conventional one.
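
As a minimal sketch of one plausible reading of the two stopping rules compared above (assumptions, not the paper's exact definitions): "fittest chromosome repetition" stops when the best fitness has not changed for a fixed number of generations, while "population fitness saturation" stops when the population's mean fitness has stabilised. The patience and tolerance values below are illustrative.

```python
def best_repetition_stop(best_history, patience=20, tol=1e-8):
    """Conventional rule: stop when the best fitness has been (numerically)
    constant for the last `patience` generations."""
    if len(best_history) < patience:
        return False
    recent = best_history[-patience:]
    return max(recent) - min(recent) < tol

def population_saturation_stop(mean_history, patience=20, tol=1e-8):
    """Proposed rule (as read from the abstract): stop when the population's
    mean fitness has saturated over the last `patience` generations."""
    if len(mean_history) < patience:
        return False
    recent = mean_history[-patience:]
    return max(recent) - min(recent) < tol

# Inside a GA loop one would record, per generation,
#     best_history.append(fitness.min()); mean_history.append(fitness.mean())
# and terminate when the chosen criterion returns True.
```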


Author(s):  
Hiroyuki Kawagishi ◽  
Kazuhiko Kudo

A new optimization method that can search for the global optimum solution while reducing the number of iterations was developed. The method was found to be effective in locating the optimum of single- and multi-peaked test functions whose global optima were known in advance. When applied to the optimum design of turbine stages, the method found the global optimum solution in approximately one seventh of the iterations required by a Genetic Algorithm (GA) or Simulated Annealing (SA).


2016 ◽  
pp. 450-475
Author(s):  
Dipti Singh ◽  
Kusum Deep

Due to their wide applicability and easy implementation, Genetic Algorithms (GAs) are often preferred over other techniques for solving optimization problems. When a local search (LS) is included in a GA, the result is known as a Memetic Algorithm. In this chapter, a new variant of a single-meme Memetic Algorithm is proposed to improve the efficiency of GA. Although GAs are effective at finding the global optimum solution of nonlinear optimization problems, they usually converge slowly and sometimes suffer from premature convergence. LS algorithms, on the other hand, are fast but are poor global searchers. To exploit the strengths of both techniques, they are combined so that the benefits of each approach are retained: the population of individuals evolves using the GA, and LS is then applied to obtain the optimal solution. To validate these claims, the method is tested on five benchmark problems of dimension 10, 30 and 50, and a comparison between GA and MA is made.
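
A minimal single-meme memetic sketch of the scheme described above, not the chapter's exact algorithm: a plain real-coded GA evolves the population, and a local search, here SciPy's Nelder-Mead as an assumed stand-in for whichever LS the chapter uses, polishes the best individual found. All parameter values are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def memetic_minimise(f, bounds, pop_size=50, n_gen=200, seed=None):
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, float).T
    dim = len(lo)
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    for _ in range(n_gen):
        fit = np.apply_along_axis(f, 1, pop)
        parents = pop[np.argsort(fit)[: pop_size // 2]]     # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(0, len(parents), 2)]
            w = rng.random(dim)
            child = w * a + (1 - w) * b                      # blend crossover
            child += rng.normal(0, 0.1 * (hi - lo), dim)     # Gaussian mutation
            children.append(np.clip(child, lo, hi))
        pop = np.vstack([parents, children])
    best = pop[np.argmin(np.apply_along_axis(f, 1, pop))]
    # Local search phase: refine the GA's best solution.
    res = minimize(f, best, method="Nelder-Mead")
    return res.x, res.fun

# Example usage: 10-dimensional sphere function.
# x, fx = memetic_minimise(lambda x: np.sum(x**2), [(-5, 5)] * 10)
```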


Author(s):  
Bo-Suk Yang

This chapter describes a hybrid artificial life optimization algorithm (ALRT) based on emergent colonization for solving global function optimization problems. In the ALRT, the emergent colony is the fundamental mechanism for searching for the optimum solution; colonies form through the metabolism, movement and reproduction of artificial organisms, which gather at the optimum locations in the artificial world. These optimum locations correspond to the optimum solutions of the optimization problem. The ALRT therefore concentrates its search on the locations of emergent colonies and can achieve a more accurate global optimum. Optimization results on different types of test functions demonstrate that the approach successfully achieves optimum performance. The algorithm is also applied to test function optimization and to the optimum design of a short journal bearing as a practical application, and the results are compared with those of a genetic algorithm and successive quadratic programming to assess its optimizing ability.


Author(s):  
Masao Arakawa ◽  
Tomoyuki Miyashita ◽  
Hiroshi Ishikawa

When developing a new product, the response surface of an objective function is not always single-peaked; it is often multi-peaked. In that case, designers would like to obtain not only the global optimum solution but also as many local optimum and/or quasi-optimum solutions as possible, so that one of them can be selected in light of conditions that were not taken into account prior to optimization. Although this information is quite useful, it is not easy to obtain in a single optimization run. In this study, we propose a screening of the fitness function in genetic algorithms (GA), which changes the fitness function during the search; the GA therefore needs greater flexibility in searching. Genetic Range Genetic Algorithms maintain a number of search ranges within a single generation, much as there are many species in the wild, and can thus combine a global search range with local search ranges that use different fitness functions. In this paper, we demonstrate the effectiveness of the proposed method on simple benchmark test problems.
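
The following is a heavily simplified sketch of the multi-range idea only: one wide global range plus several narrow local ranges are sampled in each generation so that several peaks can be tracked at once. It omits the fitness-function screening and the genetic operators, and the range counts, widths and shrink rate are illustrative assumptions, not the Genetic Range GA itself. `lo` and `hi` are per-dimension bound vectors.

```python
import numpy as np

def multi_range_search(f, lo, hi, n_local=3, per_range=20, n_gen=200, seed=None):
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    dim = len(lo)
    centres = rng.uniform(lo, hi, size=(n_local, dim))   # tracked candidate peaks
    width = 0.1 * (hi - lo)                              # width of each local range
    for _ in range(n_gen):
        # One wide global range over the whole domain ...
        cand = [rng.uniform(lo, hi, size=(per_range, dim))]
        # ... plus a narrow local range around every tracked centre.
        for c in centres:
            cand.append(rng.uniform(np.maximum(lo, c - width),
                                    np.minimum(hi, c + width),
                                    size=(per_range, dim)))
        cand = np.vstack(cand)
        fit = np.apply_along_axis(f, 1, cand)
        centres = cand[np.argsort(fit)[:n_local]]        # keep the best as new centres
        width *= 0.97                                    # slowly narrow the local ranges
    return centres                                       # candidate global/local optima
```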


2015 ◽  
Vol 713-715 ◽  
pp. 1491-1494 ◽  
Author(s):  
Zhi Qiang Gao ◽  
Li Xia Liu ◽  
Wei Wei Kong ◽  
Xiao Hong Wang

A novel composite framework of Cuckoo Search (CS) and Particle Swarm Optimization (PSO), called CS-PSO, is proposed in this paper. In CS-PSO, initialization is replaced by a chaotic system, and the cuckoos then share their optima with the PSO particles through a global-best solutions pool, improving parallel cooperation and social interaction. Furthermore, the Cloud Model, known for transforming qualitative concepts into sets of quantitative numerical values, is adopted to exploit the neighbourhood of the local solutions drawn from the global-best solution pool. Benchmark test results show that CS-PSO converges to the global optimum solution rapidly and accurately compared with other algorithms, especially on high-dimensional problems.
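
A minimal sketch of the CS-PSO idea as read from the abstract, not the paper's algorithm: logistic-map "chaotic" initialisation, a standard PSO velocity update, and a heavy-tailed cuckoo-style step (a Cauchy draw stands in for the Levy flight) around the shared global best. The Cloud Model stage is omitted, `lo`/`hi` are scalar bounds, and all parameters are assumptions.

```python
import numpy as np

def cs_pso(f, lo, hi, dim, n=30, n_iter=300, w=0.7, c1=1.5, c2=1.5, seed=None):
    rng = np.random.default_rng(seed)
    # "Chaotic" initialisation: iterate the logistic map x <- 4x(1-x).
    x = rng.random((n, dim))
    for _ in range(50):
        x = 4.0 * x * (1.0 - x)
    pos = lo + (hi - lo) * x
    vel = np.zeros_like(pos)
    pbest, pbest_f = pos.copy(), np.apply_along_axis(f, 1, pos)
    for _ in range(n_iter):
        g = pbest[np.argmin(pbest_f)]                    # shared global best
        # Standard PSO velocity/position update.
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
        pos = np.clip(pos + vel, lo, hi)
        # Cuckoo-style heavy-tailed step around the shared global best.
        trial = np.clip(g + 0.01 * (hi - lo) * rng.standard_cauchy((n, dim)),
                        lo, hi)
        pos_f = np.apply_along_axis(f, 1, pos)
        trial_f = np.apply_along_axis(f, 1, trial)
        take = trial_f < pos_f                           # keep the better of the two
        pos[take], pos_f[take] = trial[take], trial_f[take]
        improve = pos_f < pbest_f
        pbest[improve], pbest_f[improve] = pos[improve], pos_f[improve]
    return pbest[np.argmin(pbest_f)]
```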


2017 ◽  
Vol 14 (1) ◽  
pp. 161-176
Author(s):  
Maja Rosic ◽  
Mirjana Simic ◽  
Predrag Pejovic ◽  
Milan Bjelica

Determining the location of an emitting source from time of arrival (TOA) measurements is an important problem in Wireless Sensor Networks (WSNs). The nonlinear least-squares (NLS) estimation technique is employed to obtain the source location; the optimization problem is formulated as the minimization of the sum of squared residuals between estimated and measured data. This paper presents a hybridization of a Genetic Algorithm (GA), used to determine the global optimum solution, with the local Newton-Raphson (NR) search method. The corresponding Cramer-Rao lower bound (CRLB) on the localization error is derived, giving a lower bound on the variance of any unbiased estimator. Simulation results under different signal-to-noise-ratio (SNR) conditions show that the proposed hybrid Genetic Algorithm-Newton-Raphson (GA-NR) method improves the accuracy and efficiency of the solution compared to the regular GA.
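
A minimal sketch of the hybrid localization idea, not the paper's GA-NR code: the NLS objective sums the squared residuals between measured TOA ranges and the ranges predicted from a candidate source position; a coarse global stage (a plain selection-and-mutation search stands in for the GA here) is then polished with a few Gauss-Newton iterations, used as a simple stand-in for the paper's Newton-Raphson refinement. The sensor layout and all parameters in the usage comment are illustrative assumptions.

```python
import numpy as np

def toa_cost(x, sensors, ranges):
    """NLS objective: sum of squared residuals between measured and predicted ranges."""
    r = ranges - np.linalg.norm(sensors - x, axis=1)
    return np.sum(r**2)

def ga_nr_locate(sensors, ranges, lo, hi, pop=200, n_gen=50, seed=None):
    rng = np.random.default_rng(seed)
    # Coarse global stage (selection + Gaussian mutation, standing in for the GA).
    cand = rng.uniform(lo, hi, size=(pop, 2))
    for _ in range(n_gen):
        cost = np.array([toa_cost(c, sensors, ranges) for c in cand])
        elite = cand[np.argsort(cost)[: pop // 4]]
        cand = elite[rng.integers(0, len(elite), pop)] + rng.normal(0, 0.5, (pop, 2))
    x = cand[np.argmin([toa_cost(c, sensors, ranges) for c in cand])]
    # Local stage: Gauss-Newton iterations on the residual vector.
    for _ in range(20):
        diff = x - sensors
        d = np.linalg.norm(diff, axis=1)
        r = ranges - d
        J = -diff / d[:, None]                           # Jacobian of the residuals
        x = x + np.linalg.lstsq(J, -r, rcond=None)[0]
    return x

# Toy usage (illustrative geometry): 4 sensors, true source at (3, 4), noiseless ranges.
# sensors = np.array([[0, 0], [10, 0], [0, 10], [10, 10]], float)
# ranges = np.linalg.norm(sensors - np.array([3.0, 4.0]), axis=1)
# print(ga_nr_locate(sensors, ranges, lo=0.0, hi=10.0))
```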


2021 ◽  
Vol 36 (1) ◽  
pp. 35-40
Author(s):  
Shanshan Tu ◽  
Obaid Rehman ◽  
Sadaqat Rehman ◽  
Shafi Khan ◽  
Muhammad Waqas ◽  
...  

The particle swarm optimizer is a search-based stochastic technique that suffers from being trapped in local optima. Thus, to trade off between local and global search and to avoid premature convergence in PSO, a new dynamic quantum-based particle swarm optimization (DQPSO) method is proposed in this work. In the proposed method, a beta probability distribution technique is used to mutate the particle holding the global best position of the swarm, which helps the particles escape from local optima and reach the global optimum solution more easily. In addition, to enhance the global search capability of the proposed method, a dynamic update formula is proposed that keeps a good balance between local and global search. To evaluate the merit and efficiency of the proposed DQPSO method, it is tested on several well-known mathematical test functions and on a standard benchmark problem, Loney's solenoid design.
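
A minimal QPSO-style sketch written from the abstract's description, not the paper's DQPSO: a beta-distributed perturbation mutates the global best each iteration, and a contraction-expansion coefficient that shrinks over time plays the role of a dynamic update formula. The exact mutation rule and all parameter values are assumptions; `lo`/`hi` are scalar bounds.

```python
import numpy as np

def dqpso(f, lo, hi, dim, n=30, n_iter=500, seed=None):
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, size=(n, dim))
    pbest, pbest_f = x.copy(), np.apply_along_axis(f, 1, x)
    for t in range(n_iter):
        g = pbest[np.argmin(pbest_f)].copy()
        # Beta-distributed mutation of the global best (assumed form).
        g += (rng.beta(2.0, 2.0, dim) - 0.5) * 0.1 * (hi - lo)
        # Contraction-expansion coefficient shrinking over time ("dynamic" part).
        alpha = 1.0 - 0.5 * t / n_iter
        mbest = pbest.mean(axis=0)                       # mean of personal bests
        phi = rng.random((n, dim))
        p = phi * pbest + (1.0 - phi) * g                # local attractors
        u = rng.random((n, dim))
        sign = np.where(rng.random((n, dim)) < 0.5, 1.0, -1.0)
        x = np.clip(p + sign * alpha * np.abs(mbest - x)
                    * np.log(1.0 / np.maximum(u, 1e-12)), lo, hi)
        fx = np.apply_along_axis(f, 1, x)
        improve = fx < pbest_f
        pbest[improve], pbest_f[improve] = x[improve], fx[improve]
    return pbest[np.argmin(pbest_f)]
```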


2014 ◽  
Vol 543-547 ◽  
pp. 1822-1826 ◽  
Author(s):  
Yi Ge Xue ◽  
Hui Wen Deng

The cuckoo search (CS) algorithm is a very efficient swarm optimization algorithm. Based on CS, a cuckoo search algorithm with dynamic grouping to adjust the flight scale (DGCS) is proposed: all cuckoos are divided into three groups according to each individual's fitness and the average fitness of the population, and a different flight scale is then adopted dynamically for each group. Simulation experiments show that DGCS converges quickly to the global optimum solution and has better optimization performance.
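
A minimal sketch of the grouping idea only, not the paper's DGCS: each iteration the cuckoos are split into three groups by comparing their fitness with the population's best and average fitness, and each group flies with a different step scale (small steps for good solutions, large steps for poor ones). The grouping thresholds, scales and the Cauchy approximation of the Levy step are illustrative assumptions.

```python
import numpy as np

def dgcs(f, lo, hi, dim, n=30, n_iter=300, pa=0.25, seed=None):
    rng = np.random.default_rng(seed)
    nests = rng.uniform(lo, hi, size=(n, dim))
    fit = np.apply_along_axis(f, 1, nests)
    for _ in range(n_iter):
        best = nests[np.argmin(fit)]
        best_f, mean_f = fit.min(), fit.mean()
        mid = 0.5 * (best_f + mean_f)
        # Three groups (minimisation): well below the mean, near it, above it.
        scale = np.where(fit <= mid, 0.01, np.where(fit <= mean_f, 0.05, 0.2))
        # Flight toward the best nest; Cauchy draws approximate the Levy step.
        levy = rng.standard_cauchy((n, dim))
        new = np.clip(nests + scale[:, None] * levy * (nests - best), lo, hi)
        new_f = np.apply_along_axis(f, 1, new)
        better = new_f < fit
        nests[better], fit[better] = new[better], new_f[better]
        # Abandon a fraction pa of the worst nests, as in standard cuckoo search.
        n_drop = max(1, int(pa * n))
        worst = np.argsort(fit)[-n_drop:]
        nests[worst] = rng.uniform(lo, hi, size=(n_drop, dim))
        fit[worst] = np.apply_along_axis(f, 1, nests[worst])
    return nests[np.argmin(fit)]
```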

