Non-linear Memristive Synaptic Dynamics for Efficient Unsupervised Learning in Spiking Neural Networks

2021 ◽  
Vol 15 ◽  
Author(s):  
Stefano Brivio ◽  
Denys R. B. Ly ◽  
Elisa Vianello ◽  
Sabina Spiga

Spiking neural networks (SNNs) are a computational tool in which information is coded into spikes, as in some parts of the brain, unlike conventional neural networks (NNs), which compute over real numbers. SNNs can therefore implement intelligent information extraction in real time at the edge of data acquisition, and thus complement conventional NNs, which target cloud computing. Both NN classes face hardware constraints due to limited computing parallelism and the separation of logic and memory. Emerging memory devices, such as resistive switching memories, phase change memories, and memristive devices in general, are strong candidates to remove these hurdles for NN applications. The well-established training procedures of conventional NNs have helped define the desiderata for memristive device dynamics implementing synaptic units. The generally agreed requirements are a linear evolution of the memristive conductance upon stimulation with trains of identical pulses and a symmetric change for conductance increase and decrease. Conversely, little work has been done to understand the main properties of memristive devices supporting efficient SNN operation. The reason lies in the lack of a background theory for their training. As a consequence, the requirements for NNs have been taken as a reference to develop memristive devices for SNNs. In the present work, we show that, for efficient CMOS/memristive SNNs, the requirements for synaptic memristive dynamics are very different from the needs of a conventional NN. System-level simulations of an SNN trained to classify hand-written digit images through a spike-timing-dependent plasticity protocol are performed considering various linear and non-linear plausible synaptic memristive dynamics. We consider memristive dynamics bounded by artificial hard conductance values and dynamics limited by the natural evolution toward asymptotic values (soft boundaries). We quantitatively analyze the impact of the resolution and non-linearity properties of the synapses on network training and classification performance. Finally, we demonstrate that non-linear synapses with hard boundary values enable higher classification performance and realize the best trade-off between classification accuracy and required training time. With reference to the obtained results, we discuss how memristive devices with non-linear dynamics constitute a technologically convenient solution for the development of online SNN training.
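As a rough illustration of the two bounding behaviors discussed in this abstract, the following Python sketch contrasts a linear, hard-bounded conductance update with a non-linear, soft-bounded one. The bounds, step size, and non-linearity factor are assumed values for illustration, not the device parameters used in the paper.

```python
import numpy as np

# Minimal sketch (not the authors' exact model): two common ways to update a
# memristive conductance G with identical potentiation/depression pulses.
# G_MIN, G_MAX, DG, and ALPHA are illustrative values, not device parameters.
G_MIN, G_MAX = 1e-6, 1e-4   # conductance bounds in siemens (assumed)
ALPHA = 5.0                  # non-linearity of the soft-bounded update

def update_hard(G, n_pulses, dG=2e-6, potentiate=True):
    """Linear update with artificial hard boundaries: equal steps, then clipping."""
    step = dG if potentiate else -dG
    return float(np.clip(G + n_pulses * step, G_MIN, G_MAX))

def update_soft(G, n_pulses, potentiate=True):
    """Non-linear (soft-bounded) update: each pulse moves G a fraction of the
    remaining distance to the asymptote, so steps shrink near the boundary."""
    for _ in range(n_pulses):
        if potentiate:
            G += (G_MAX - G) / ALPHA
        else:
            G -= (G - G_MIN) / ALPHA
    return G

if __name__ == "__main__":
    g0 = 1e-5
    print("hard-bounded after 20 pulses:", update_hard(g0, 20))
    print("soft-bounded after 20 pulses:", update_soft(g0, 20))
```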

2021 ◽  
Vol 15 ◽  
Author(s):  
Aditi Anand ◽  
Sanchari Sen ◽  
Kaushik Roy

Quantifying the similarity between artificial neural networks (ANNs) and their biological counterparts is an important step toward building more brain-like artificial intelligence systems. Recent efforts in this direction use neural predictivity, or the ability to predict the responses of a biological brain given the information in an ANN (such as its internal activations), when both are presented with the same stimulus. We propose a new approach to quantifying neural predictivity by explicitly mapping the activations of an ANN to brain responses with a non-linear function, and measuring the error between the predicted and actual brain responses. Further, we propose to use a neural network to approximate this mapping function by training it on a set of neural recordings. The proposed method was implemented within the TensorFlow framework and evaluated on a suite of 8 state-of-the-art image recognition ANNs. Our experiments suggest that the use of a non-linear mapping function leads to higher neural predictivity. Our findings also reaffirm the observation that the latest advances in classification performance of image recognition ANNs are not matched by improvements in their neural predictivity. Finally, we examine the impact of pruning, a widely used ANN optimization, on neural predictivity, and demonstrate that network sparsity leads to higher neural predictivity.
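The pipeline described above can be sketched without any particular framework (the paper used TensorFlow): fit a small non-linear regressor from ANN activations to recorded brain responses, then score the held-out prediction error. The data, dimensions, and network size below are synthetic placeholders, not the paper's setup.

```python
import numpy as np

# Minimal sketch of the idea: fit a small non-linear network that maps ANN
# activations X to recorded brain responses Y, then report prediction error on
# held-out stimuli. Data and sizes are synthetic stand-ins.
rng = np.random.default_rng(0)
n_stim, n_units, n_sites, n_hidden = 200, 512, 64, 128

X = rng.standard_normal((n_stim, n_units))                        # ANN activations per stimulus
Y = np.tanh(X @ rng.standard_normal((n_units, n_sites)) * 0.05)   # stand-in "brain" responses

# One-hidden-layer regressor trained with plain gradient descent on MSE.
W1 = rng.standard_normal((n_units, n_hidden)) * 0.02
W2 = rng.standard_normal((n_hidden, n_sites)) * 0.02
lr = 1e-2
train, test = slice(0, 160), slice(160, 200)

for epoch in range(500):
    H = np.tanh(X[train] @ W1)
    pred = H @ W2
    err = pred - Y[train]
    # Backpropagate the MSE loss through the two layers.
    dW2 = H.T @ err / err.shape[0]
    dH = err @ W2.T * (1 - H ** 2)
    dW1 = X[train].T @ dH / err.shape[0]
    W2 -= lr * dW2
    W1 -= lr * dW1

pred_test = np.tanh(X[test] @ W1) @ W2
mse = np.mean((pred_test - Y[test]) ** 2)
print("held-out prediction error (lower = higher neural predictivity):", mse)
```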


2020 ◽  
Author(s):  
Sumedha Gandharava Dahl

In this dissertation, memristor-based spiking neural networks (SNNs) are used to analyze the effect of radiation on the spatio-temporal pattern recognition (STPR) capability of the networks. Two-terminal resistive memory devices (memristors) are used as synapses to manipulate conductivity paths in the network. Spike-timing-dependent plasticity (STDP) learning behavior results in pattern learning and is achieved using biphasic-shaped pre- and post-synaptic spikes. A TiO2-based non-linear drift memristor model, designed in Verilog-A, implements the synaptic behavior and is modified to include experimentally observed effects of state-altering, ionizing, and off-state degradation radiation on the device. The impact of neuron "death" (disabled neuron circuits) due to radiation is also examined. In general, radiation interaction events distort the STDP learning curve undesirably, favoring synaptic potentiation. At lower short-term flux, the network is able to recover and relearn the pattern with consistent training, although some pixels may be affected by stability issues. As the radiation flux and duration increase, they can overwhelm the leaky integrate-and-fire (LIF) post-synaptic neuron circuit, and the network does not learn the pattern. On the other hand, in the absence of the pattern, the radiation effects accumulate and the system never regains stability. Neuron-death simulation results emphasize the importance of non-participating neurons during the learning process, indicating that non-participating afferents help improve the learning ability of the network. Instantaneous neuron death proves more detrimental to the network than afferents dying gradually over time, in which case the network retains its pattern-learning capability.
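For readers unfamiliar with the underlying device model, here is a plain-Python sketch of a generic TiO2-style non-linear drift memristor with a Joglekar window, standing in for the dissertation's Verilog-A implementation; all parameter values are illustrative textbook numbers, not fitted to the device used in the work.

```python
import numpy as np

# Generic HP-style non-linear drift memristor with a Joglekar window function.
# This is a stand-in for the Verilog-A model described in the dissertation;
# the parameters below are illustrative, not device-fitted.
R_ON, R_OFF = 100.0, 16e3     # bounding resistances (ohm)
D = 10e-9                     # oxide thickness (m)
MU_V = 1e-14                  # dopant mobility (m^2 V^-1 s^-1)
P = 2                         # window-function exponent

def window(x, p=P):
    """Joglekar window: suppresses ionic drift near the boundaries x = 0 and x = 1."""
    return 1.0 - (2.0 * x - 1.0) ** (2 * p)

def step(x, v, dt):
    """Advance the normalized doped-region width x in [0, 1] under voltage v for dt seconds."""
    resistance = R_ON * x + R_OFF * (1.0 - x)
    current = v / resistance
    dx = (MU_V * R_ON / D**2) * current * window(x) * dt
    return float(np.clip(x + dx, 0.0, 1.0)), current

if __name__ == "__main__":
    x, dt = 0.1, 1e-4
    for _ in range(10_000):            # 1 s of a +1 V potentiating bias
        x, _ = step(x, 1.0, dt)
    print("normalized state after stimulation:", round(x, 3))
```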


2021 ◽  
Vol 17 (4) ◽  
pp. 1-26
Author(s):  
Md Musabbir Adnan ◽  
Sagarvarma Sayyaparaju ◽  
Samuel D. Brown ◽  
Mst Shamim Ara Shawkat ◽  
Catherine D. Schuman ◽  
...  

Spiking neural networks (SNNs) offer a power-efficient, biologically plausible learning paradigm by encoding information into spikes. The discovery of the memristor has accelerated the progress of spiking neuromorphic systems, as the intrinsic plasticity of the device makes it an ideal candidate to mimic a biological synapse. Despite providing a nanoscale form factor, non-volatility, and low-power operation, memristors suffer from device-level non-idealities, which impact system-level performance. To address these issues, this article presents a memristive crossbar-based neuromorphic system using unsupervised learning with twin-memristor synapses, fully digital pulse-width-modulated spike-timing-dependent plasticity, and homeostasis neurons. The implemented single-layer SNN was applied to a pattern-recognition task of classifying handwritten digits. The performance of the system was analyzed by varying design parameters such as the number of training epochs, neurons, and capacitors. Furthermore, the impact of memristor device non-idealities, such as device-switching mismatch, aging, failure, and process variations, was investigated, and the resilience of the proposed system was demonstrated.
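The twin-memristor synapse mentioned above is commonly realized as a differential pair whose signed weight is the difference of two positive conductances; a minimal sketch of that idea follows, with illustrative values rather than the article's circuit parameters.

```python
# Minimal sketch of a twin-memristor (differential) synapse: a signed weight is
# obtained from two positive conductances. Values are illustrative, not the
# article's circuit parameters.
G_MIN, G_MAX, DG = 1e-6, 1e-4, 2e-6   # conductance bounds and step (siemens, assumed)

class TwinSynapse:
    def __init__(self):
        self.g_pos = G_MIN   # conductance of the "positive" memristor
        self.g_neg = G_MIN   # conductance of the "negative" memristor

    @property
    def weight(self):
        # Effective signed weight is the conductance difference of the pair.
        return self.g_pos - self.g_neg

    def potentiate(self):
        self.g_pos = min(self.g_pos + DG, G_MAX)

    def depress(self):
        self.g_neg = min(self.g_neg + DG, G_MAX)

if __name__ == "__main__":
    s = TwinSynapse()
    for _ in range(5):
        s.potentiate()
    s.depress()
    print("effective weight (S):", s.weight)
```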


Author(s):  
Xiumin Li ◽  
Qing Chen ◽  
Fangzheng Xue

In recent years, an increasing number of studies have demonstrated that networks in the brain can self-organize into a critical state where the dynamics exhibit a mixture of ordered and disordered patterns; this critical branching phenomenon manifests as neuronal avalanches. It has been hypothesized that this critical state, homeostatically balanced between stability and plasticity, may be the optimal state for performing diverse neural computational tasks. However, the critical region for high performance is narrow and sensitive in spiking neural networks (SNNs). In this paper, we investigated the role of the critical state in neural computation based on liquid-state machines, a biologically plausible computational neural network model for real-time computing. We examined the computational performance of an SNN operating at the critical state, in particular when spike-timing-dependent plasticity is used to update the synaptic weights. The network shows the best computational performance when it operates in critical dynamic states. Moreover, the active-neuron-dominant structure refined through synaptic learning can remarkably enhance the robustness of the critical state and further improve computational accuracy. These results may have important implications for the modelling of spiking neural networks with optimal computational performance. This article is part of the themed issue ‘Mathematical methods in medicine: neuroscience, cardiology and pathology’.
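One standard way to check whether a network sits near the critical state discussed here is to estimate the branching parameter from binned activity; the sketch below does so on a synthetic spike raster, since the measure, not the data, is the point.

```python
import numpy as np

# Minimal sketch: estimate the branching parameter sigma from binned network
# activity; sigma close to 1 is the usual signature of the critical state.
# The spike raster here is synthetic, just to make the script runnable.
rng = np.random.default_rng(1)
n_neurons, n_bins = 200, 5000
rates = rng.uniform(0.01, 0.05, size=n_neurons)
spikes = rng.random((n_neurons, n_bins)) < rates[:, None]   # boolean raster

counts = spikes.sum(axis=0)                 # active neurons per time bin
prev, nxt = counts[:-1], counts[1:]
mask = prev > 0                             # only bins with at least one "ancestor"
sigma = np.mean(nxt[mask] / prev[mask])     # average descendants per ancestor

print(f"estimated branching parameter sigma = {sigma:.2f} (critical ~ 1)")
```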


2020 ◽  
Vol 34 (02) ◽  
pp. 1316-1323
Author(s):  
Zuozhu Liu ◽  
Thiparat Chotibut ◽  
Christopher Hillar ◽  
Shaowei Lin

Motivated by the celebrated discrete-time model of nervous activity outlined by McCulloch and Pitts in 1943, we propose a novel continuous-time model, the McCulloch-Pitts network (MPN), for sequence learning in spiking neural networks. Our model has a local learning rule, such that the synaptic weight updates depend only on the information directly accessible by the synapse. By exploiting asymmetry in the connections between binary neurons, we show that MPN can be trained to robustly memorize multiple spatiotemporal patterns of binary vectors, generalizing the ability of the symmetric Hopfield network to memorize static spatial patterns. In addition, we demonstrate that the model can efficiently learn sequences of binary pictures as well as generative models for experimental neural spike-train data. Our learning rule is consistent with spike-timing-dependent plasticity (STDP), thus providing a theoretical ground for the systematic design of biologically inspired networks with large and robust long-range sequence storage capacity.
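To illustrate the general principle of asymmetric connections storing a sequence of binary patterns, the sketch below uses the classic discrete-time asymmetric Hebbian rule rather than the paper's continuous-time MPN learning rule; sizes and patterns are synthetic.

```python
import numpy as np

# Illustration of sequence storage via asymmetric connections, using the
# classic discrete-time asymmetric Hebbian rule (not the MPN's local,
# continuous-time rule described in the paper).
rng = np.random.default_rng(2)
n, seq_len = 100, 5
patterns = rng.choice([-1, 1], size=(seq_len, n))           # sequence of binary vectors

# Asymmetric Hebbian weights: each pattern points to its successor in the cycle.
W = sum(np.outer(patterns[(t + 1) % seq_len], patterns[t]) for t in range(seq_len)) / n

state = patterns[0].copy().astype(float)
for t in range(1, seq_len):
    state = np.sign(W @ state)                               # synchronous update
    state[state == 0] = 1
    overlap = np.mean(state == patterns[t])
    print(f"step {t}: overlap with stored pattern = {overlap:.2f}")
```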


2020 ◽  
Vol 26 (1) ◽  
pp. 130-151 ◽  
Author(s):  
Atsushi Masumori ◽  
Lana Sinapayen ◽  
Norihiro Maruyama ◽  
Takeshi Mita ◽  
Douglas Bakkum ◽  
...  

Living organisms must actively maintain themselves in order to continue existing. Autopoiesis is a key concept in the study of living organisms, in which the boundaries of the organism are not static but dynamically regulated by the system itself. To study the autonomous regulation of a self-boundary, we focus on neural homeodynamic responses to environmental changes using both biological and artificial neural networks. Previous studies showed that embodied cultured neural networks and spiking neural networks with spike-timing-dependent plasticity (STDP) learn an action that avoids external stimulation. In this article, as a result of our experiments using embodied cultured neurons, we find that there is also a second property allowing the network to avoid stimulation: if the agent cannot learn an action to avoid the external stimuli, it tends to decrease the stimulus-evoked spikes, as if to ignore the uncontrollable input. We also show that this behavior is reproduced by spiking neural networks with asymmetric STDP. We regard these properties as autonomous regulation of self and nonself by the network, in which a controllable neuron is treated as self and an uncontrollable neuron as nonself. Finally, we introduce neural autopoiesis by proposing the principle of stimulus avoidance.
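The asymmetric STDP referred to above can be summarized by its pairwise weight-update window; the sketch below uses a standard double-exponential form with illustrative constants, not the parameters of the study.

```python
import numpy as np

# Minimal sketch of an asymmetric STDP window: pre-before-post spike pairs
# potentiate, post-before-pre pairs depress, with different amplitudes.
# All constants are illustrative, not taken from the paper.
A_PLUS, A_MINUS = 0.01, 0.012
TAU_PLUS, TAU_MINUS = 20.0, 20.0   # time constants in ms

def stdp(dt_ms):
    """Weight change for a spike pair with dt = t_post - t_pre (in ms)."""
    if dt_ms > 0:
        return A_PLUS * np.exp(-dt_ms / TAU_PLUS)      # potentiation
    return -A_MINUS * np.exp(dt_ms / TAU_MINUS)        # depression

if __name__ == "__main__":
    for dt in (-40, -10, 10, 40):
        print(f"dt = {dt:+d} ms -> dw = {stdp(dt):+.4f}")
```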


Electronics ◽  
2018 ◽  
Vol 7 (12) ◽  
pp. 396 ◽  
Author(s):  
Errui Zhou ◽  
Liang Fang ◽  
Binbin Yang

Neuromorphic computing systems are promising alternatives in fields such as pattern recognition and image processing, especially as conventional von Neumann architectures face several bottlenecks. Memristors play vital roles in neuromorphic computing systems and are usually used as synaptic devices. Memristive spiking neural networks (MSNNs) are considered to be more efficient and biologically plausible than other systems due to their spike-based working mechanism. In contrast to previous SNNs with complex architectures, we propose a hardware-friendly architecture and an unsupervised spike-timing-dependent plasticity (STDP) learning method for MSNNs in this paper. The architecture includes an input layer, a feature-learning layer, and a voting circuit. To reduce hardware complexity, several constraints are enforced: the architecture has no lateral inhibition and is purely feedforward; it uses the voting circuit as its only classifier; every neuron generates at most one spike, so firing rates and refractory periods need not be considered; and all neurons share the same fixed threshold voltage for classification. The presented unsupervised STDP learning method is time-dependent and uses no homeostatic mechanism. The MNIST dataset is used to demonstrate the proposed architecture and learning method. Simulation results show that our proposed architecture with the learning method achieves a classification accuracy of 94.6%, which outperforms other unsupervised SNNs that use time-based encoding schemes.
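The voting circuit described above can be sketched in software: assign each feature neuron to the class it fires for most often during training, then classify a test sample by a vote of the neurons that spike for it. The spike responses below are synthetic stand-ins for the MSNN outputs, and all sizes are illustrative.

```python
import numpy as np

# Minimal sketch of a voting-based readout: each feature neuron is assigned to
# the class it fires for most often during training, and test samples are
# classified by a vote of the neurons that spike. Spike data are synthetic.
rng = np.random.default_rng(3)
n_neurons, n_classes, n_train, n_test = 50, 10, 500, 100

train_labels = rng.integers(0, n_classes, n_train)
pref = rng.integers(0, n_classes, n_neurons)   # hidden "preferred class" per neuron (synthetic)
train_spikes = (pref[None, :] == train_labels[:, None]) | (rng.random((n_train, n_neurons)) < 0.05)

# Assign each neuron to the class for which it fired most during training.
votes = np.zeros((n_neurons, n_classes))
for c in range(n_classes):
    votes[:, c] = train_spikes[train_labels == c].sum(axis=0)
assignment = votes.argmax(axis=1)

def classify(spike_vector):
    """Majority vote over the assigned classes of the neurons that spiked."""
    fired = np.flatnonzero(spike_vector)
    if fired.size == 0:
        return -1
    return int(np.bincount(assignment[fired], minlength=n_classes).argmax())

test_labels = rng.integers(0, n_classes, n_test)
test_spikes = (pref[None, :] == test_labels[:, None]) | (rng.random((n_test, n_neurons)) < 0.05)
acc = np.mean([classify(s) == y for s, y in zip(test_spikes, test_labels)])
print(f"voting-readout accuracy on synthetic spikes: {acc:.2f}")
```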


2018 ◽  
Vol 108 ◽  
pp. 365-378 ◽  
Author(s):  
Zhengzhong Liang ◽  
David Schwartz ◽  
Gregory Ditzler ◽  
O. Ozan Koyluoglu
