An Improved E-Learner Communities Self-organizing Algorithm Based on Hebbian Learning Law

Author(s):  
LingNing Li ◽  
Peng Han ◽  
Fan Yang

2010 ◽  
Vol 22 (3) ◽  
pp. 689-729 ◽  
Author(s):  
Vilson Luiz Dalle Mole ◽  
Aluizio Fausto Ribeiro Araújo

The growing self-organizing surface map (GSOSM) is a novel map model that learns a folded surface immersed in a 3D space. Starting from a dense point cloud, the surface is reconstructed through an incremental mesh composed of approximately equilateral triangles. Unlike other models such as neural meshes (NM), the GSOSM builds a surface topology while accepting any sequence of sample presentation. The GSOSM model introduces a novel connection learning rule called competitive connection Hebbian learning (CCHL), which produces a complete triangulation. GSOSM reconstructions are accurate and often free of false or overlapping faces. This letter presents and discusses the GSOSM model, analyzes a set of reconstruction results, and compares GSOSM with several other models.
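CCHL is the paper's own rule; its exact formulation is not given in this abstract. As a point of reference, the classic competitive Hebbian learning rule it extends connects the best- and second-best-matching units for each input sample. A minimal sketch of that base rule, with hypothetical names:

```python
import numpy as np

def competitive_hebbian_connect(sample, units, edges):
    """Classic competitive Hebbian learning step (not CCHL itself):
    for each input sample, create an edge between the best- and
    second-best-matching units.  GSOSM's CCHL adds further
    competition among candidate connections; this sketch shows
    only the base rule."""
    # Distances from the sample to every unit's reference vector
    dists = np.linalg.norm(units - sample, axis=1)
    first, second = np.argsort(dists)[:2]
    # Store the connection as an undirected edge
    edges.add((min(first, second), max(first, second)))
    return first, second

# Toy 3D point cloud units and one sample presentation
units = np.array([[0.0, 0.0, 0.0],
                  [1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0],
                  [2.0, 2.0, 2.0]])
edges = set()
bmu, sbmu = competitive_hebbian_connect(np.array([0.2, 0.1, 0.0]), units, edges)
```

Repeated over a dense point cloud, this edge-creation step is what incrementally induces the mesh topology; GSOSM's contribution is constraining it so the result is a complete, approximately equilateral triangulation.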


2004 ◽  
Vol 16 (3) ◽  
pp. 535-561 ◽  
Author(s):  
Reiner Schulz ◽  
James A. Reggia

We examine the extent to which modified Kohonen self-organizing maps (SOMs) can learn unique representations of temporal sequences while still supporting map formation. Two biologically inspired extensions are made to traditional SOMs: selection of multiple simultaneous rather than single “winners” and the use of local intramap connections that are trained according to a temporally asymmetric Hebbian learning rule. The extended SOM is then trained with variable-length temporal sequences that are composed of phoneme feature vectors, with each sequence corresponding to the phonetic transcription of a noun. The model transforms each input sequence into a spatial representation (final activation pattern on the map). Training improves this transformation by, for example, increasing the uniqueness of the spatial representations of distinct sequences, while still retaining map formation based on input patterns. The closeness of the spatial representations of two sequences is found to correlate significantly with the sequences' similarity. The extended model presented here raises the possibility that SOMs may ultimately prove useful as visualization tools for temporal sequences and as preprocessors for sequence pattern recognition systems.
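The two extensions described above can be sketched concretely. The following is a minimal illustration, assuming a rate-based formulation; the paper's exact update equations may differ. `k_winners` and `asymmetric_hebbian_update` are hypothetical names for the two mechanisms:

```python
import numpy as np

def k_winners(activation, k):
    """Multiple simultaneous winners: keep only the k most
    active units instead of a single best-matching unit."""
    winners = np.argsort(activation)[-k:]
    mask = np.zeros_like(activation)
    mask[winners] = activation[winners]
    return mask

def asymmetric_hebbian_update(W, prev_act, curr_act, lr=0.1):
    """Temporally asymmetric Hebbian update for the intramap
    connections: the connection i -> j is strengthened when unit i
    was active on the previous step and unit j is active now
    (pre-before-post), and weakened for the reverse order."""
    W += lr * (np.outer(prev_act, curr_act) - np.outer(curr_act, prev_act))
    return W

# Two consecutive map activation patterns over 3 units
W = np.zeros((3, 3))
prev_act = np.array([1.0, 0.0, 0.0])
curr_act = np.array([0.0, 1.0, 0.0])
W = asymmetric_hebbian_update(W, prev_act, curr_act, lr=0.1)

top2 = k_winners(np.array([0.2, 0.9, 0.5]), 2)
```

Because the update is antisymmetric in time, the learned lateral weights encode the order in which map regions were activated, which is what lets the final activation pattern serve as a unique spatial signature of a temporal sequence.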


Author(s):  
Nikola Kasabov ◽  
Robert Kozma
This special issue is devoted to one of the important topics of current intelligent information systems: their ability to adapt to the environment they operate in, as adaptation is one of the most important features of intelligence. Several milestones in the literature on adaptive systems mark the development of this area. The Hebbian learning rule,1) self-organizing maps,2,3) and adaptive resonance theory4) have influenced research in this area a great deal. Some current work suggests methods for building adaptive neuro-fuzzy systems5) and adaptive self-organizing systems based on principles from biological brains.6) The papers in this issue are organized as follows. The first two papers present material on organization and adaptation in the human brain. The third paper, by Kasabov, presents a novel approach to building open, structured adaptive systems for on-line adaptation, called evolving connectionist systems. The fourth paper, by Kawahara and Saito, suggests a method for building virtually connected adaptive cell structures. Papers 5 and 6 discuss the use of genetic algorithms and evolutionary computation for optimizing and adapting the structure of an intelligent system. The last two papers suggest methods for adaptive learning of a sequence of data in a feed-forward neural network with a fixed structure.

References:
1) D. O. Hebb, "The Organization of Behavior," John Wiley & Sons, New York (1949).
2) T. Kohonen, "Self-Organization and Associative Memory," Springer-Verlag, Berlin (1988).
3) T. Kohonen, "Self-Organizing Maps," second edition, Springer-Verlag (1997).
4) G. Carpenter and S. Grossberg, "Pattern Recognition by Self-Organizing Neural Networks," The MIT Press, Cambridge, MA (1991).
5) N. Kasabov, "Foundations of Neural Networks, Fuzzy Systems and Knowledge Engineering," The MIT Press, Cambridge, MA (1996).
6) S. Amari and N. Kasabov, "Brain-like Computing and Intelligent Information Systems," Springer-Verlag, Singapore (1997).
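For readers unfamiliar with the first of these milestones, the classic Hebbian rule (Hebb, 1949, cited above) grows a weight in proportion to the correlation of pre- and post-synaptic activity. A minimal sketch, assuming a single linear unit; real systems add normalization or decay to keep the weights bounded:

```python
import numpy as np

def hebbian_step(w, x, lr=0.01):
    """One step of the classic Hebbian rule: with post-synaptic
    activity y = w . x, update the weights as w <- w + lr * y * x,
    strengthening connections between co-active units."""
    y = float(w @ x)          # post-synaptic activation
    return w + lr * y * x     # correlation-driven weight growth

# One presentation of an input pattern
w = np.array([0.5, -0.2])
x = np.array([1.0, 0.0])
w_new = hebbian_step(w, x, lr=0.1)
```

Self-organizing maps and adaptive resonance theory, the other milestones listed, can both be viewed as constrained, competitive variants of this basic correlational update.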


2019 ◽  
Author(s):  
Anthony Marinac ◽  
Brian Simpson ◽  
Caroline Hart ◽  
Rhianna Chisholm ◽  
Jennifer Nielsen ◽  
...  

1993 ◽  
Author(s):  
Steven A. Harp ◽  
Tariq Samad ◽  
Michael Villano

1998 ◽  
Author(s):  
Svetlana Apenova ◽  
Igor Yevin
