ACTIVATION HEBBIAN LEARNING RULE FOR FUZZY COGNITIVE MAPS

2002 ◽  
Vol 35 (1) ◽  
pp. 319-324 ◽  
Author(s):  
Elpiniki Papageorgiou ◽  
Chrysostomos D. Stylios ◽  
Peter P. Groumpos

2004 ◽  
Vol 37 (3) ◽  
pp. 219-249 ◽  
Author(s):  
E.I. Papageorgiou ◽  
C.D. Stylios ◽  
P.P. Groumpos

2019 ◽  
Vol 6 (4) ◽  
pp. 181098 ◽  
Author(s):  
Le Zhao ◽  
Jie Xu ◽  
Xiantao Shang ◽  
Xue Li ◽  
Qiang Li ◽  
...  

Non-volatile memristors are promising for future hardware-based neurocomputation applications because they are capable of emulating biological synaptic functions. Various material strategies have been studied in pursuit of better device performance, such as lower energy cost and better biological plausibility. In this work, we show a novel design for a non-volatile memristor based on a CoO/Nb:SrTiO3 heterojunction. We found that the memristor intrinsically exhibits resistive switching behaviour, which can be ascribed to the migration of oxygen vacancies and to charge trapping and detrapping at the heterojunction interface. The carrier trapping/detrapping level can be finely adjusted by regulating the voltage amplitude. Gradual conductance modulation can therefore be realized by applying appropriate voltage pulses. Spike-timing-dependent plasticity, an important Hebbian learning rule, has also been implemented in the device. Our results indicate the possibility of achieving artificial synapses with the CoO/Nb:SrTiO3 heterojunction. Compared with filamentary synaptic devices, our device has the potential to reduce energy consumption, enable large-scale neuromorphic systems and operate more reliably, since no structural distortion occurs.
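
As a point of reference for the learning rule mentioned in this abstract, below is a minimal software sketch of an exponential spike-timing-dependent plasticity window of the kind such a synaptic device is expected to emulate. The amplitudes and time constants are illustrative assumptions, not parameters extracted from the CoO/Nb:SrTiO3 device.

```python
import numpy as np

def stdp_update(w, dt, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Exponential STDP window: potentiate when the presynaptic spike precedes
    the postsynaptic one (dt = t_post - t_pre > 0), depress otherwise.
    All constants are illustrative, not measured device parameters."""
    if dt > 0:
        dw = a_plus * np.exp(-dt / tau_plus)      # long-term potentiation branch
    else:
        dw = -a_minus * np.exp(dt / tau_minus)    # long-term depression branch
    return float(np.clip(w + dw, 0.0, 1.0))       # keep the conductance bounded

# A pre-before-post pairing at +5 ms slightly strengthens the synapse.
print(stdp_update(0.5, dt=5.0))
```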


1989 ◽  
Vol 03 (07) ◽  
pp. 555-560 ◽  
Author(s):  
M.V. TSODYKS

We consider the Hopfield model with the simplest form of the Hebbian learning rule, in which only simultaneous activity of the pre- and post-synaptic neurons leads to modification of the synapse. An extra inhibition proportional to the total network activity is needed. Both symmetric nondiluted and asymmetric diluted networks are considered. The model performs well at extremely low levels of activity, p < K^(-1/2), where K is the mean number of synapses per neuron.
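
A minimal sketch of this kind of model follows, assuming binary {0,1} patterns, a simple activity threshold and a uniform inhibition term proportional to the total activity. The pattern statistics, threshold and inhibition strength are illustrative choices, not the values analysed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P, p = 1000, 20, 0.01                     # neurons, stored patterns, activity level

# Binary {0,1} patterns with mean activity p (the sparse regime of the paper).
xi = (rng.random((P, N)) < p).astype(float)

# Simplest Hebbian rule: a synapse is strengthened only when pre- and
# post-synaptic neurons are simultaneously active in a stored pattern.
W = xi.T @ xi
np.fill_diagonal(W, 0.0)

def step(state, W, threshold=0.5, inhibition=1.0):
    """One synchronous update with uniform inhibition proportional to the
    total network activity (illustrative constants)."""
    h = W @ state - inhibition * state.sum()
    return (h > threshold).astype(float)

# Recall pattern 0 from a corrupted cue.
state = xi[0].copy()
flipped = rng.choice(N, size=3, replace=False)
state[flipped] = 1.0 - state[flipped]
for _ in range(10):
    state = step(state, W)
overlap = (state * xi[0]).sum() / max(xi[0].sum(), 1.0)
```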


Author(s):  
M. Shamim Khan ◽  
Alex Chong ◽  
Tom Gedeon

Differential Hebbian Learning (DHL) was proposed by Kosko as an unsupervised learning scheme for Fuzzy Cognitive Maps (FCMs). DHL can be used with a sequence of state vectors to adapt the causal link strengths of an FCM. However, it does not guarantee that the FCM learns the sequence, and no concrete procedures for the use of DHL have been developed. In this paper a formal methodology is proposed for using DHL in the development of FCMs in a decision support context. The four steps in the methodology are: (1) creation of a crisp cognitive map; (2) identification of event sequences for use in DHL; (3) event sequence encoding using DHL; (4) revision of the trained FCM. The feasibility of the proposed methodology is demonstrated with an example involving a dynamic system with feedback, based on a real-life scenario.
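
As a rough illustration of step (3), the sketch below applies a DHL-style update in the spirit of Kosko's rule to an FCM weight matrix while replaying a sequence of state vectors. The learning rate, change tolerance, clipping and the toy state sequence are assumptions made for the example, not part of the proposed methodology.

```python
import numpy as np

def dhl_step(E, prev_state, state, lr=0.1):
    """One Differential Hebbian Learning update in the spirit of Kosko's rule:
    edge e_ij is nudged toward the product of the concept changes
    delta_i * delta_j, but only for edges whose source concept i changed.
    Learning rate, tolerance and clipping are illustrative choices."""
    delta = state - prev_state                   # changes of the concept values
    target = np.outer(delta, delta)              # correlated change as causal evidence
    changed = np.abs(delta) > 1e-9               # rows whose source concept changed
    E_new = E.copy()
    E_new[changed, :] += lr * (target[changed, :] - E[changed, :])
    np.fill_diagonal(E_new, 0.0)                 # FCMs normally have no self-loops
    return np.clip(E_new, -1.0, 1.0)

# Event sequence encoding: replay successive state vectors and update after
# every transition (the three-concept sequence here is purely hypothetical).
states = [np.array([1.0, 0.0, 0.0]),
          np.array([1.0, 0.7, 0.0]),
          np.array([1.0, 0.7, 0.5])]
E = np.zeros((3, 3))
for prev, cur in zip(states[:-1], states[1:]):
    E = dhl_step(E, prev, cur)
```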


1996 ◽  
Vol 8 (3) ◽  
pp. 545-566 ◽  
Author(s):  
Christopher W. Lee ◽  
Bruno A. Olshausen

An intrinsic limitation of linear, Hebbian networks is that they are capable of learning only from the linear pairwise correlations within an input stream. To explore what higher forms of structure could be learned with a nonlinear Hebbian network, we constructed a model network containing a simple form of nonlinearity and we applied it to the problem of learning to detect the disparities present in random-dot stereograms. The network consists of three layers, with nonlinear sigmoidal activation functions in the second-layer units. The nonlinearities allow the second layer to transform the pixel-based representation in the input layer into a new representation based on coupled pairs of left-right inputs. The third layer of the network then clusters patterns occurring on the second-layer outputs according to their disparity via a standard competitive learning rule. Analysis of the network dynamics shows that the second-layer units' nonlinearities interact with the Hebbian learning rule to expand the region over which pairs of left-right inputs are stable. The learning rule is neurobiologically inspired and plausible, and the model may shed light on how the nervous system learns to use coincidence detection in general.
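
The sketch below shows a generic nonlinear Hebbian update in which a sigmoidal nonlinearity enters the outer-product learning term, so the weights can pick up more than linear pairwise correlations. It is a simplified stand-in under an assumed Oja-style normalisation, not a reproduction of the paper's three-layer network or its competitive third layer.

```python
import numpy as np

def nonlinear_hebbian_step(W, x, lr=0.01):
    """Generic nonlinear Hebbian update: unit outputs pass through a sigmoid
    before entering the outer-product learning term; the Oja-style decay term
    keeps the weights bounded. Simplified illustration only."""
    y = 1.0 / (1.0 + np.exp(-(W @ x)))           # sigmoidal unit activations
    dW = lr * (np.outer(y, x) - (y ** 2)[:, None] * W)
    return W + dW

# Example: 4 units learning from random 8-dimensional inputs.
rng = np.random.default_rng(0)
W = 0.1 * rng.standard_normal((4, 8))
for _ in range(1000):
    W = nonlinear_hebbian_step(W, rng.standard_normal(8))
```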


1991 ◽  
Vol 3 (2) ◽  
pp. 201-212 ◽  
Author(s):  
Peter J. B. Hancock ◽  
Leslie S. Smith ◽  
William A. Phillips

We show that a form of synaptic plasticity recently discovered in slices of the rat visual cortex (Artola et al. 1990) can support an error-correcting learning rule. The rule increases weights when both pre- and postsynaptic units are highly active, and decreases them when presynaptic activity is high and postsynaptic activation is less than the threshold for weight increment but greater than a lower threshold. We show that this rule corrects false-positive outputs in a feedforward associative memory, that in an appropriate opponent-unit architecture it corrects misses, and that it performs better than the optimal Hebbian learning rule reported by Willshaw and Dayan (1990).
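
The following sketch implements a two-threshold rule of the kind described above. The threshold values, the learning rate and the reuse of the same threshold for "high" pre- and postsynaptic activity are illustrative assumptions rather than the paper's exact formulation.

```python
import numpy as np

def two_threshold_update(W, pre, post, lr=0.05, theta_plus=0.8, theta_minus=0.3):
    """Two-threshold plasticity in the spirit of Artola et al. (1990):
    a weight grows when pre- and postsynaptic activity are both high, and
    shrinks when presynaptic activity is high but postsynaptic activation
    lies between the lower threshold and the potentiation threshold.
    Thresholds and learning rate are illustrative assumptions."""
    pre_hi = (pre > theta_plus)[None, :]                             # (1, n_pre)
    post_hi = (post > theta_plus)[:, None]                           # (n_post, 1)
    post_mid = ((post > theta_minus) & (post <= theta_plus))[:, None]
    dW = lr * (pre_hi & post_hi) - lr * (pre_hi & post_mid)          # LTP minus LTD regions
    return W + dW

# Example with one postsynaptic unit and three presynaptic units: postsynaptic
# activation of 0.5 falls in the depression band, so active inputs are weakened.
W = np.zeros((1, 3))
W = two_threshold_update(W, pre=np.array([0.9, 0.9, 0.1]), post=np.array([0.5]))
```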

