XOR Problem
Recently Published Documents


TOTAL DOCUMENTS: 40 (five years: 4)

H-INDEX: 7 (five years: 0)

2021 ◽  
Vol 17 (5) ◽  
pp. e1009015
Author(s):  
Toviah Moldwin ◽  
Menachem Kalmenson ◽  
Idan Segev

Synaptic clustering on neuronal dendrites has been hypothesized to play an important role in implementing pattern recognition. Neighboring synapses on a dendritic branch can interact in a synergistic, cooperative manner via nonlinear voltage-dependent mechanisms, such as NMDA receptors. Inspired by the NMDA receptor, the single-branch clusteron learning algorithm takes advantage of location-dependent multiplicative nonlinearities to solve classification tasks by randomly shuffling the locations of “under-performing” synapses on a model dendrite during learning (“structural plasticity”), eventually resulting in synapses with correlated activity being placed next to each other on the dendrite. We propose an alternative model, the gradient clusteron, or G-clusteron, which uses an analytically-derived gradient descent rule where synapses are “attracted to” or “repelled from” each other in an input- and location-dependent manner. We demonstrate the classification ability of this algorithm by testing it on the MNIST handwritten digit dataset and show that, when using a softmax activation function, the accuracy of the G-clusteron on the all-versus-all MNIST task (~85%) approaches that of logistic regression (~93%). In addition to the location update rule, we also derive a learning rule for the synaptic weights of the G-clusteron (“functional plasticity”) and show that a G-clusteron that utilizes the weight update rule can achieve ~89% accuracy on the MNIST task. We also show that a G-clusteron with both the weight and location update rules can learn to solve the XOR problem from arbitrary initial conditions.
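The location-dependent interaction at the heart of this model can be sketched compactly. Below is a minimal toy version, assuming a Gaussian proximity kernel and a finite-difference stand-in for the paper's analytic location gradient; the kernel width sigma, the error-driven step size, and the function names are illustrative assumptions of this sketch, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_syn = 8
loc = rng.uniform(0.0, 1.0, n_syn)   # synaptic locations on a 1-D dendrite
w = rng.normal(0.0, 0.5, n_syn)      # synaptic weights
sigma = 0.15                         # assumed width of a Gaussian proximity kernel

def activation(x, loc, w):
    """Clusteron-style output: each synapse's drive (w * x) is amplified
    by the summed drive of nearby synapses via a distance-dependent kernel."""
    drive = w * x
    dist = loc[:, None] - loc[None, :]
    kernel = np.exp(-dist**2 / (2 * sigma**2))
    return float(drive @ (kernel @ drive))

def location_grad(x, loc, w, eps=1e-5):
    """Gradient of the activation w.r.t. each synaptic location. The paper
    derives this analytically; a finite difference stands in here."""
    g = np.zeros_like(loc)
    base = activation(x, loc, w)
    for i in range(len(loc)):
        bumped = loc.copy()
        bumped[i] += eps
        g[i] = (activation(x, bumped, w) - base) / eps
    return g

# One illustrative "attraction/repulsion" step for a binary input x with
# target y: locations move along the gradient, scaled by the output error.
x = rng.integers(0, 2, n_syn).astype(float)
y = 1.0
err = y - 1.0 / (1.0 + np.exp(-activation(x, loc, w)))
loc += 0.05 * err * location_grad(x, loc, w)
```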



2020 ◽  
Author(s):  
Toviah Moldwin ◽  
Menachem Kalmenson ◽  
Idan Segev

Synaptic clustering on neuronal dendrites has been hypothesized to play an important role in implementing pattern recognition. Neighboring synapses on a dendritic branch can interact in a synergistic, cooperative manner via the nonlinear voltage-dependence of NMDA receptors. Inspired by the NMDA receptor, the single-branch clusteron learning algorithm (Mel 1991) takes advantage of location-dependent multiplicative nonlinearities to solve classification tasks by randomly shuffling the locations of “under-performing” synapses on a model dendrite during learning (“structural plasticity”), eventually resulting in synapses with correlated activity being placed next to each other on the dendrite. We propose an alternative model, the gradient clusteron, or G-clusteron, which uses an analytically-derived gradient descent rule where synapses are “attracted to” or “repelled from” each other in an input- and location-dependent manner. We demonstrate the classification ability of this algorithm by testing it on the MNIST handwritten digit dataset and show that, when using a softmax activation function, the accuracy of the G-clusteron on the All-vs-All MNIST task (85.9%) approaches that of logistic regression (92.6%). In addition to the synaptic location update rule, we also derive a learning rule for the synaptic weights of the G-clusteron (“functional plasticity”) and show that the G-clusteron with both plasticity rules can achieve 89.5% accuracy on the MNIST task and can learn to solve the XOR problem from arbitrary initial conditions.
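The weight (“functional plasticity”) rule mentioned here has a simple closed form in the toy activation sketched after the 2021 entry above: that activation is quadratic in the drive d = w∘x with a symmetric kernel K, so the gradient with respect to w_i is 2·x_i·(Kd)_i. Continuing that illustrative sketch (same loc, w, sigma, x, and err; all toy assumptions, not the paper's derivation), a weight step looks like this:

```python
def weight_grad(x, loc, w):
    """Analytic gradient of the toy activation A = d^T K d (d = w * x)
    w.r.t. the weights: dA/dw_i = 2 * x_i * (K d)_i for symmetric K."""
    drive = w * x
    dist = loc[:, None] - loc[None, :]
    kernel = np.exp(-dist**2 / (2 * sigma**2))
    return 2.0 * x * (kernel @ drive)

# Error-driven weight step ("functional plasticity"), mirroring the
# location step above; both rules can run together during training.
w += 0.05 * err * weight_grad(x, loc, w)
```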



2019 ◽  
Vol 32 (14) ◽  
pp. 9965-9973
Author(s):  
André Cyr ◽  
Frédéric Thériault ◽  
Sylvain Chartier


2019 ◽  
Vol 305 ◽  
pp. 154-168
Author(s):  
Pascal Caron ◽  
Edwin Hamel-de le Court ◽  
Jean-Gabriel Luque


2019 ◽  
Vol 9 (15) ◽  
pp. 3176
Author(s):  
Kang-moon Park ◽  
Donghoon Shin ◽  
Sung-do Chi

This paper proposes the variable chromosome genetic algorithm (VCGA) for structure learning in neural networks. Currently, the structural parameters of neural networks, i.e., the number of neurons, the coupling relations, the number of layers, etc., are mostly designed on the basis of the heuristic knowledge of an artificial intelligence (AI) expert. To overcome this limitation, this study uses an evolutionary approach (EA) to automatically generate proper artificial neural network (ANN) structures. VCGA introduces a new genetic operation called chromosome attachment. By applying the VCGA, initial ANN structures can be flexibly evolved toward the proper structure. A case study on the typical exclusive-or (XOR) problem shows the feasibility of our methodology. Our approach differs from others in that it uses a variable chromosome in the genetic algorithm, which lets a neural network structure vary naturally, both constructively and destructively. It has been shown that the XOR problem is successfully solved using a VCGA with chromosome attachment to learn the structure of the network. Structure learning for more complex problems is the topic of our future work.
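As a concrete illustration of the idea (not the paper's implementation), a variable-length genotype can encode one hidden unit per gene block, with an attachment-style mutation that appends a block and a destructive one that drops a block. The 2-h-1 network shape, gene layout, mutation rates, and population sizes below are all assumptions of this sketch:

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
Y = np.array([0, 1, 1, 0], float)                 # XOR targets

def decode(chrom):
    """A chromosome is the flat weights of a 2-h-1 network; its length
    determines h, so the genotype (and the structure) is variable."""
    genes = np.asarray(chrom).reshape(-1, 4)      # per hidden unit:
    return genes[:, :2], genes[:, 2], genes[:, 3] # 2 in-weights, bias, out-weight

def forward(chrom, x):
    W, b, v = decode(chrom)
    return 1.0 / (1.0 + np.exp(-(v @ np.tanh(W @ x + b))))

def fitness(chrom):
    preds = np.array([forward(chrom, x) for x in X])
    return -np.mean((preds - Y) ** 2)             # higher is better

def mutate(chrom):
    child = chrom + rng.normal(0.0, 0.3, len(chrom))
    r = rng.random()
    if r < 0.10:                                  # constructive: "attach" a unit
        child = np.concatenate([child, rng.normal(0.0, 1.0, 4)])
    elif r < 0.15 and len(child) > 4:             # destructive: drop a unit
        child = child[:-4]
    return child

pop = [rng.normal(0.0, 1.0, 4) for _ in range(30)]   # start with 1 hidden unit
for _ in range(300):
    pop.sort(key=fitness, reverse=True)
    pop = pop[:10] + [mutate(pop[rng.integers(10)]) for _ in range(20)]

best = max(pop, key=fitness)
print([round(forward(best, x)) for x in X])       # ideally [0, 1, 1, 0]
```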





Author(s):  
Mohammed Sarhan Al-Duais ◽  
Fatma Susilawati Mohamad

The main problem with the batch back propagation (BBP) algorithm is slow training: several parameters need to be adjusted manually, and the algorithm also suffers from training saturation. The learning rate and momentum factor are significant parameters for increasing the efficiency of BBP. In this study, we created a new dynamic function for each of the learning rate and momentum factor. We present the DBBPLM algorithm, which trains with a dynamic function for each of the learning rate and momentum factor. A sigmoid function is used as the activation function. The XOR problem and the balance, breast cancer, and iris datasets were used as benchmarks for testing the effects of the dynamic DBBPLM algorithm. All experiments were performed in MATLAB 2012a. The training stopping criterion was set to 10^-5. The experimental results show that the DBBPLM algorithm provides superior performance in terms of training: faster training with higher accuracy compared to the BBP algorithm and to existing works.
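The abstract does not give the dynamic functions themselves, but the overall scheme can be sketched: batch backpropagation on XOR with a 2-2-1 sigmoid network where the learning rate and momentum factor are recomputed each epoch from the current error, and training stops when the MSE falls below 10^-5. The specific schedules (lr, mom) below are placeholder assumptions of this sketch, not the paper's functions:

```python
import numpy as np

rng = np.random.default_rng(2)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
Y = np.array([[0], [1], [1], [0]], float)
sig = lambda z: 1.0 / (1.0 + np.exp(-z))

# 2-2-1 sigmoid network, trained in batch mode.
W1, b1 = rng.normal(0, 1, (2, 2)), np.zeros(2)
W2, b2 = rng.normal(0, 1, (2, 1)), np.zeros(1)
vW1 = vb1 = vW2 = vb2 = 0.0                    # momentum buffers

for epoch in range(100_000):
    H = sig(X @ W1 + b1)                       # hidden activations (4, 2)
    out = sig(H @ W2 + b2)                     # outputs (4, 1)
    err = out - Y
    E = float(np.mean(err ** 2))
    if E < 1e-5:                               # stopping criterion from the abstract
        break
    # Placeholder dynamic schedules: bigger steps while the error is large,
    # momentum approaching 0.9 as training converges.
    lr = 0.5 + 2.0 * E
    mom = 0.9 / (1.0 + E)

    d_out = err * out * (1 - out)              # sigmoid derivative at the output
    d_hid = (d_out @ W2.T) * H * (1 - H)       # backpropagated hidden error
    gW2, gb2 = H.T @ d_out / len(X), d_out.mean(0)
    gW1, gb1 = X.T @ d_hid / len(X), d_hid.mean(0)
    vW2 = mom * vW2 - lr * gW2; W2 += vW2
    vb2 = mom * vb2 - lr * gb2; b2 += vb2
    vW1 = mom * vW1 - lr * gW1; W1 += vW1
    vb1 = mom * vb1 - lr * gb1; b1 += vb1

print(epoch, np.round(out.ravel(), 3))         # epochs used and final outputs
```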



Author(s):  
Lorenzo Grassi ◽  
María Naya-Plasencia ◽  
André Schrottenloher

