Fuzzy Neural Gas for Unsupervised Vector Quantization

2013 ◽  
Vol 2013 ◽  
pp. 1-10 ◽  
Author(s):  
Tina Geweniger ◽  
Lydia Fischer ◽  
Marika Kaden ◽  
Mandy Lange ◽  
Thomas Villmann

We consider some modifications of the neural gas algorithm. First, fuzzy assignments as known from fuzzy c-means and neighborhood cooperativeness as known from self-organizing maps and neural gas are combined to obtain a basic Fuzzy Neural Gas. Further, a kernel variant and a simulated annealing approach are derived. Finally, we introduce a fuzzy extension of the ConnIndex to obtain an evaluation measure for clusterings based on fuzzy vector quantization.
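To make the combination concrete, here is a minimal sketch of one possible online update in which the rank-based neural gas neighborhood is scaled by fuzzy c-means style memberships. All function names, the fuzzifier `m`, and the exact way the two factors are combined are our illustration under stated assumptions, not the paper's formulation.

```python
import numpy as np

def fuzzy_memberships(dists, m=2.0, tiny=1e-12):
    """Fuzzy c-means style memberships computed from squared distances."""
    d = np.maximum(dists, tiny)               # avoid division by zero
    inv = d ** (-1.0 / (m - 1.0))
    return inv / inv.sum()

def fuzzy_neural_gas_step(W, x, lr=0.1, lam=1.0, m=2.0):
    """One online step: rank-based NG neighborhood scaled by fuzzy memberships."""
    d = np.sum((W - x) ** 2, axis=1)          # squared distances to prototypes
    ranks = np.argsort(np.argsort(d))         # NG rank of each prototype (0 = winner)
    h = np.exp(-ranks / lam)                  # neighborhood cooperation
    u = fuzzy_memberships(d, m)               # fuzzy assignments
    W += lr * (h * u)[:, None] * (x - W)      # pull prototypes toward x
    return W

rng = np.random.default_rng(0)
W = rng.normal(size=(5, 2))                   # 5 prototypes in the plane
for x in rng.normal(size=(200, 2)):
    W = fuzzy_neural_gas_step(W, x)
```

The double `argsort` yields each prototype's distance rank, which is the quantity the neural gas neighborhood function acts on; the fuzzy memberships then reweight that neighborhood per input.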


2011 ◽  
Vol 23 (5) ◽  
pp. 1343-1392 ◽  
Author(s):  
Thomas Villmann ◽  
Sven Haase

Supervised and unsupervised vector quantization methods for classification and clustering traditionally use dissimilarities, frequently taken as Euclidean distances. In this article, we investigate the applicability of divergences instead, focusing on online learning. We deduce the mathematical fundamentals for their use in gradient-based online vector quantization algorithms. This relies on the generalized derivatives of the divergences, known as Fréchet derivatives in functional analysis, which in finite-dimensional problems reduce naturally to partial derivatives. We demonstrate the application of this methodology for widely used supervised and unsupervised online vector quantization schemes, including self-organizing maps, neural gas, and learning vector quantization. Additionally, principles for hyperparameter optimization and relevance learning for parameterized divergences in the case of supervised vector quantization are given to achieve improved classification accuracy.
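As a concrete instance of this reduction, the generalized Kullback-Leibler divergence D(x‖w) = Σᵢ (xᵢ log(xᵢ/wᵢ) − xᵢ + wᵢ) has the partial derivatives ∂D/∂wᵢ = 1 − xᵢ/wᵢ, which slot directly into a gradient-based online update. The sketch below (function names and parameters are ours, not the article's) shows a winner-takes-all vector quantization step driven by this divergence; it assumes strictly positive vectors.

```python
import numpy as np

def gen_kl(x, w):
    """Generalized Kullback-Leibler divergence D(x || w) for positive vectors."""
    return np.sum(x * np.log(x / w) - x + w)

def grad_gen_kl_w(x, w):
    """Partial derivatives of D(x || w) w.r.t. w -- the finite-dimensional
    reduction of the Frechet derivative: dD/dw_i = 1 - x_i / w_i."""
    return 1.0 - x / w

def online_vq_step(W, x, lr=0.05):
    """Winner-takes-all online VQ step with a divergence instead of a
    Euclidean distance: move the best-matching prototype downhill."""
    j = int(np.argmin([gen_kl(x, w) for w in W]))  # best-matching prototype
    W[j] -= lr * grad_gen_kl_w(x, W[j])            # gradient descent on D(x || w_j)
    return W
```

Swapping in another divergence only requires replacing `gen_kl` and its gradient; the update rule itself is unchanged, which is the practical point of the Fréchet-derivative machinery.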


2008 ◽  
Vol 71 (7-9) ◽  
pp. 1210-1219 ◽  
Author(s):  
Aree Witoelar ◽  
Michael Biehl ◽  
Anarta Ghosh ◽  
Barbara Hammer

2006 ◽  
Vol 18 (2) ◽  
pp. 446-469 ◽  
Author(s):  
Thomas Villmann ◽  
Jens Christian Claussen

We consider different ways to control the magnification in self-organizing maps (SOM) and neural gas (NG). Starting from early approaches to magnification control in vector quantization, we then concentrate on different approaches for SOM and NG. We show that three structurally similar approaches can be applied to both algorithms: localized learning, concave-convex learning, and winner-relaxing learning. The approach of concave-convex learning in SOM is thereby extended to a more general description, whereas concave-convex learning for NG is new. In general, the control mechanisms generate only slightly different behavior in the two neural algorithms. However, we emphasize that the NG results are valid for any data dimension, whereas the SOM results hold only in the one-dimensional case.
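The localized-learning idea can be caricatured in a few lines: an input-dependent factor scales the plain neural gas step, with an exponent `alpha` playing the role of a magnification-control knob. The particular modulation chosen here (distance to the winner raised to `alpha`), the function name, and all parameter values are our illustration only, not the exact rules analyzed in the article.

```python
import numpy as np

def ng_step_localized(W, x, lr=0.1, lam=1.0, alpha=0.5):
    """Neural gas step with a localized, input-dependent learning-rate factor;
    alpha acts as a magnification-control exponent (illustrative sketch)."""
    d = np.linalg.norm(W - x, axis=1)
    ranks = np.argsort(np.argsort(d))         # rank 0 = winner
    h = np.exp(-ranks / lam)                  # neighborhood function
    winner = int(np.argmin(d))
    local = d[winner] ** alpha                # local modulation of the step size
    W += lr * local * h[:, None] * (x - W)
    return W
```

With `alpha = 0` this collapses to the plain NG rule; a nonzero exponent makes the effective learning rate depend on the local data geometry, which is the mechanism all three control schemes exploit in different ways.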


2016 ◽  
Vol 2016 ◽  
pp. 1-7 ◽  
Author(s):  
Iveta Dirgová Luptáková ◽  
Marek Šimon ◽  
Ladislav Huraj ◽  
Jiří Pospíchal

Clustering algorithms are a major topic in big data analysis. Their main goal is to separate an unlabelled dataset into several subsets, each ideally characterized by some unique property of its data structure. Common clustering approaches cannot impose constraints on the sizes of clusters. However, in many applications, cluster sizes are bounded or known in advance. One of the more recent robust clustering algorithms is neural gas, which is popular, for example, for data compression and the vector quantization used in speech recognition and signal processing. In this paper, we introduce an adapted neural gas algorithm able to accommodate requirements on the size of clusters. The convergence of the algorithm toward an optimum is tested on simple illustrative examples. The proposed algorithm yields better statistical results than its direct counterpart, the balanced k-means algorithm, and, moreover, unlike balanced k-means, the quality of its results can be controlled straightforwardly by user-defined parameters.
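A simple way to get a feel for size constraints is a greedy capacity-bounded assignment step: points are handed out in order of assignment confidence, and a full cluster defers to the next-nearest prototype. This sketch is our illustration of the general idea, not the adapted neural gas algorithm from the paper.

```python
import numpy as np

def capacity_constrained_assign(X, W, capacity):
    """Greedily assign points to prototypes with a hard per-cluster size bound.
    Points with the smallest nearest-prototype distance are placed first;
    a full cluster falls back to the next-nearest prototype with room."""
    n, k = len(X), len(W)
    D = np.linalg.norm(X[:, None, :] - W[None, :, :], axis=2)  # (n, k) distances
    order = np.argsort(D.min(axis=1))        # most confident points first
    counts = np.zeros(k, dtype=int)
    labels = np.full(n, -1)
    for i in order:
        for j in np.argsort(D[i]):           # nearest prototype with capacity left
            if counts[j] < capacity:
                labels[i] = j
                counts[j] += 1
                break
    return labels
```

If there are more points than the total capacity, the leftover points keep the label `-1`; a real implementation would interleave such an assignment step with the neural gas prototype updates.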

