Hardware Self-Organizing Map Based on Digital Frequency-Locked Loop and Triangular Neighborhood Function

Author(s):  
Hiroomi Hikawa


Author(s):  
Le Anh Tu

This chapter presents a study on improving the quality of the self-organizing map (SOM). We synthesize recent research on assessing and improving SOM quality, and then propose a solution that improves the quality of the feature map by adjusting the parameters of the Gaussian neighborhood function. Quantization error and topographic error are used to evaluate the quality of the resulting feature map. Experiments were conducted on 12 published datasets, and the results were compared with several other methods that improve the neighborhood function. The proposed method produced feature maps of better quality than the other solutions.
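The abstract does not spell out the implementation, but the quantities it relies on are standard. A minimal NumPy sketch of a Gaussian neighborhood function on a 2-D lattice, together with the quantization error and topographic error used to score the resulting map, might look as follows; the function names, the 4-connected adjacency test, and the width parameter `sigma` are illustrative assumptions, not the authors' code.

```python
import numpy as np

def gaussian_neighborhood(grid, winner, sigma):
    """Gaussian neighborhood weights for every neuron on the map lattice.

    grid   : (M, 2) integer coordinates of the M neurons on the 2-D lattice
    winner : (2,) coordinates of the best-matching unit (BMU)
    sigma  : neighborhood width -- the parameter the chapter adjusts
    """
    d2 = np.sum((grid - winner) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def quantization_error(data, weights):
    """Mean distance from each sample to its best-matching unit."""
    dists = np.linalg.norm(data[:, None, :] - weights[None, :, :], axis=2)
    return dists.min(axis=1).mean()

def topographic_error(data, weights, grid):
    """Fraction of samples whose first and second BMUs are not adjacent
    on the lattice (4-connected adjacency assumed)."""
    dists = np.linalg.norm(data[:, None, :] - weights[None, :, :], axis=2)
    best_two = np.argsort(dists, axis=1)[:, :2]
    gap = np.linalg.norm(grid[best_two[:, 0]] - grid[best_two[:, 1]], axis=1)
    return float(np.mean(gap > 1.0))
```

Lower values of both measures indicate a better map: quantization error reflects how closely the codebook fits the data, topographic error how well the lattice preserves neighborhood relations.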


Author(s):  
Robert Tatoian ◽  
Lutz Hamel

Self-organizing maps are artificial neural networks designed for unsupervised machine learning. In this article, the authors introduce a new quality measure called the convergence index. The convergence index is a linear combination of map embedding accuracy and estimated topographic accuracy, and since it reports a single statistically meaningful number it is perhaps more intuitive to use than other quality measures. The convergence index is evaluated on clustering problems proposed by Ultsch as part of his fundamental clustering problem suite, as well as on real-world datasets. It is first demonstrated that the convergence index captures the notion that a SOM has learned the multivariate distribution of a training data set by looking at the convergence of the marginals. The convergence index is then used to study the convergence of SOMs with respect to the different parameters that govern self-organizing map learning. One result is that the constant neighborhood function produces better self-organizing map models than the popular Gaussian neighborhood function.
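As a rough illustration of how such a measure could be assembled: the abstract only states that the index is a linear combination of the two accuracies, so the per-feature two-sample t-test and the equal weighting in the sketch below are assumptions, not the authors' exact definitions.

```python
import numpy as np
from scipy import stats

def embedding_accuracy(data, weights, alpha=0.05):
    """Fraction of input features whose marginal distribution in the
    codebook (neuron weights) is statistically indistinguishable from the
    training data -- a rough per-feature two-sample t-test, used here only
    as an illustrative stand-in for the authors' embedding accuracy."""
    agree = 0
    for j in range(data.shape[1]):
        _, p = stats.ttest_ind(data[:, j], weights[:, j], equal_var=False)
        agree += p > alpha          # fail to reject: this marginal has converged
    return agree / data.shape[1]

def convergence_index(embed_acc, topo_acc, weight=0.5):
    """Convergence index as a linear combination of map embedding accuracy
    and estimated topographic accuracy; equal weighting is assumed here
    purely for illustration."""
    return weight * embed_acc + (1.0 - weight) * topo_acc
```

Checking convergence of the marginals in this way is what lets a single number summarize whether the map has absorbed the multivariate structure of the training data.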


2012 ◽  
Vol 25 ◽  
pp. 146-160 ◽  
Author(s):  
Marta Kolasa ◽  
Rafał Długosz ◽  
Witold Pedrycz ◽  
Michał Szulc

2007 ◽  
Vol 19 (9) ◽  
pp. 2515-2535 ◽  
Author(s):  
Takaaki Aoki ◽  
Toshio Aoyagi

The self-organizing map (SOM) is an unsupervised learning method as well as a type of nonlinear principal component analysis that forms a topologically ordered mapping from the high-dimensional data space to a low-dimensional representation space. It has recently found wide applications in such areas as visualization, classification, and mining of various data. However, when the data sets to be processed are very large, a copious amount of time is often required to train the map, which seems to restrict the range of putative applications. One of the major culprits for this slow ordering time is that a kind of topological defect (e.g., a kink in one dimension or a twist in two dimensions) gets created in the map during training. Once such a defect appears, the ordered map cannot be obtained until the defect is eliminated, for which the number of iterations required is typically several times larger than in the absence of the defect. To overcome this weakness, we propose that an asymmetric neighborhood function be used in the SOM algorithm. Compared with the commonly used symmetric neighborhood function, we found that an asymmetric neighborhood function accelerates the ordering process of the SOM algorithm, though this asymmetry tends to distort the generated ordered map. We demonstrate that the distortion of the map can be suppressed by improving the SOM algorithm with the asymmetric neighborhood function. The number of learning steps required for perfect ordering in the case of the one-dimensional SOM is numerically shown to be reduced from O(N³) to O(N²) with an asymmetric neighborhood function, even when the improved algorithm is used to obtain the final map without distortion.
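For intuition only, here is a minimal sketch of a one-dimensional SOM whose Gaussian neighborhood is stretched on one side of the winner. The specific asymmetric form, the stretch factor `beta`, and all other parameters are illustrative assumptions rather than the exact function used in the paper.

```python
import numpy as np

def asymmetric_gaussian(idx, winner, sigma, beta=2.0):
    """Gaussian neighborhood on a 1-D chain, stretched by a factor `beta`
    on one side of the winner; beta = 1 recovers the usual symmetric
    function. (Illustrative asymmetry, not necessarily the paper's form.)"""
    d = idx - winner
    d = np.where(d > 0, d / beta, d)    # widen the neighborhood on the right
    return np.exp(-d ** 2 / (2.0 * sigma ** 2))

def train_1d_som(data, n_units=50, epochs=20, eta=0.1, sigma=5.0, beta=2.0,
                 seed=0):
    """Minimal 1-D SOM trained with the asymmetric neighborhood above."""
    rng = np.random.default_rng(seed)
    idx = np.arange(n_units)
    w = rng.uniform(data.min(), data.max(), size=n_units)  # random codebook
    for _ in range(epochs):
        for x in rng.permutation(data):
            winner = int(np.argmin(np.abs(w - x)))          # best-matching unit
            h = asymmetric_gaussian(idx, winner, sigma, beta)
            w += eta * h * (x - w)                          # neighborhood-weighted update
    return w
```

The improved variant described in the abstract would additionally schedule the asymmetry (for example, reducing or alternating it over training) so that the final map is undistorted; the sketch above omits that schedule.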

