POETS: A Parallel Cluster Architecture for Spiking Neural Network

2021 ◽  
Vol 11 (4) ◽  
pp. 281-285
Author(s):  
Mahyar Shahsavari ◽  
Jonathan Beaumont ◽  
David Thomas ◽  
Andrew D. Brown

Spiking Neural Networks (SNNs) are a branch of neuromorphic computing and are currently used in neuroscience applications to understand and model the biological brain. SNNs could also potentially be used in many other application domains such as classification, pattern recognition, and autonomous control. This work presents a highly scalable hardware platform called POETS, and uses it to implement SNNs on a very large number of parallel and reconfigurable FPGA-based processors. The current system consists of 48 FPGAs, providing 3072 processing cores and 49152 threads. We use this hardware to implement up to four million neurons, each with one thousand synapses. Comparison to other similar platforms shows that the current POETS system is twenty times faster than the Brian simulator, and at least two times faster than SpiNNaker.
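The abstract does not specify the neuron model mapped onto the POETS threads, so purely as an illustrative sketch, here is the kind of per-neuron update each hardware thread would repeatedly execute, assuming a simple leaky integrate-and-fire (LIF) model (all parameter values below are assumptions, not values from the paper):

```python
def lif_step(v, i_syn, dt=1.0, tau=20.0, v_rest=0.0, v_th=1.0, v_reset=0.0):
    """One Euler step of a leaky integrate-and-fire neuron.
    Returns (new membrane potential, whether the neuron spiked)."""
    v = v + dt * ((v_rest - v) / tau + i_syn)
    if v >= v_th:
        return v_reset, True  # fire and reset
    return v, False

# Drive a single neuron with constant synaptic input for 100 steps.
v, spikes = 0.0, 0
for _ in range(100):
    v, fired = lif_step(v, i_syn=0.1)
    spikes += fired
```

With this constant drive the membrane potential charges toward tau * i_syn = 2.0, crossing the threshold and firing repeatedly; on POETS-scale hardware, millions of such updates would run concurrently across the 49152 threads.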

2021 ◽  
Vol 23 (6) ◽  
pp. 317-326
Author(s):  
E.A. Ryndin ◽  
N.V. Andreeva ◽  
V.V. Luchinin ◽  
K.S. Goncharov ◽  
...  

In the current era, the design and development of artificial neural networks that exploit the architecture of the human brain have evolved rapidly. Artificial neural networks effectively solve a wide range of tasks common to artificial intelligence, including data classification and recognition, prediction, forecasting, and adaptive control of object behavior. The biologically inspired principles underlying ANN operation have certain advantages over the conventional von Neumann architecture, including unsupervised learning, architectural flexibility, adaptability to environmental change, and high performance at significantly reduced power consumption thanks to highly parallel and asynchronous data processing. In this paper, we present the circuit design of the main functional blocks (neurons and synapses) intended for the hardware implementation of a perceptron-based feedforward spiking neural network. As the third generation of artificial neural networks, spiking neural networks process data using spikes, which are discrete events (or functions) that take place at points in time. Neurons in spiking neural networks emit precisely timed spikes and communicate with each other via spikes transmitted through synaptic connections, or synapses, with adaptable weights. One prospective approach to emulating synaptic behavior in hardware-implemented spiking neural networks is to use non-volatile memory devices with analog conductance modulation (memristive structures). Here we propose a circuit design for functional analogues of a memristive structure to mimic synaptic plasticity, together with pre- and postsynaptic neurons, which could be used to develop circuit designs for spiking neural network architectures with different training algorithms, including the spike-timing-dependent plasticity learning rule. Two different electronic synapse circuits were developed.
The first is an analog synapse in which a photoresistive optocoupler provides the tunable conductivity needed to emulate synaptic plasticity. The second is a digital synapse, in which the synaptic weight is stored as a digital code and converted directly into conductivity (without a digital-to-analog converter or photoresistive optocoupler). The results of prototyping the developed circuits for electronic analogues of synapses and pre- and postsynaptic neurons, together with a study of their transient processes, are presented. The developed approach could provide a basis for ASIC designs of spiking neural networks based on CMOS (complementary metal oxide semiconductor) technology.
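As a software reference point for the spike-timing-dependent plasticity rule these circuits are meant to support, a minimal pair-based STDP weight update can be sketched as follows (the amplitudes and time constants are illustrative assumptions, not values from the paper):

```python
import math

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP weight change for spike timing difference
    dt = t_post - t_pre (ms). Pre-before-post (dt > 0) potentiates;
    post-before-pre (dt < 0) depresses, each decaying exponentially."""
    if dt > 0:
        return a_plus * math.exp(-dt / tau_plus)
    return -a_minus * math.exp(dt / tau_minus)

w = 0.5
w += stdp_dw(5.0)    # causal pairing strengthens the synapse
w += stdp_dw(-5.0)   # anti-causal pairing weakens it
```

In the hardware analogue, this signed weight change corresponds to nudging the synapse's conductance up or down, whether through the optocoupler's tunable resistance or the digital weight code.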


2020 ◽  
Vol 9 (1) ◽  
pp. 319-325
Author(s):  
Fadilla ‘Atyka Nor Rashid ◽  
Nor Surayahani Suriani

Classifying gestures or movements has become a demanding task as sensor technologies have advanced. This has attracted many researchers to investigate the area of computer vision actively. Rehabilitation exercises are among the most popular gestures or movements studied by researchers today. A rehab session usually involves experts who monitor the patients, but a shortage of such experts makes sessions longer and less productive. This work adopted a dataset from UI-PRMD, assembled from 10 rehabilitation movements. The data were encoded into spike trains for spike pattern analysis. Next, we trained Spiking Neural Networks on the spike trains, obtaining promising results. In future work, this method will be tested on other data to validate its performance and to improve the accuracy.
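The abstract does not say which encoding scheme was used, so purely as an illustration, one common way to turn normalized movement features into spike trains is probabilistic rate coding, where a larger feature value yields a higher spike probability per time step:

```python
import random

def rate_encode(values, t_window=100, rate_max=0.1, seed=0):
    """Encode normalized features (0..1) as Poisson-like spike trains:
    at each time step, a spike fires with probability value * rate_max."""
    rng = random.Random(seed)
    return [[1 if rng.random() < v * rate_max else 0
             for _ in range(t_window)]
            for v in values]

trains = rate_encode([0.0, 0.5, 1.0])
counts = [sum(t) for t in trains]  # spikes per feature channel
```

A feature of 0.0 produces a silent channel, while larger features produce proportionally denser spike trains that an SNN can then classify.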


2019 ◽  
Author(s):  
Faramarz Faghihi ◽  
Hossein Molhem ◽  
Ahmed A. Moustafa

Abstract: Conventional deep neural networks capture essential information processing stages in perception. Deep neural networks often require very large volumes of training examples, whereas children can learn concepts such as hand-written digits from few examples. The goal of this project is to develop a deep spiking neural network that can learn from few training trials. Using known neuronal mechanisms, a spiking neural network model is developed and trained to recognize hand-written digits by presenting one to four training examples for each digit taken from the MNIST database. The model detects and learns geometric features of the images from the MNIST database. In this work, a novel biologically inspired backpropagation-based learning rule is developed and used to train the network to detect basic features of different digits. For this purpose, randomly initialized synaptic weights between the layers are updated. Using a neuroscience-inspired mechanism named 'synaptic pruning' and a predefined threshold, some synapses are deleted during training. Hence, information channels are constructed that are highly specific for each digit, as matrices of synaptic connections between two layers of the spiking neural network. These connection matrices, named 'information channels', are used in the test phase to assign a digit class to each test image. Similar to humans' ability to learn from few training trials, the developed spiking neural network needs a very small training dataset compared to conventional deep learning methods evaluated on the MNIST dataset.
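The threshold-based pruning step can be sketched abstractly: weights below a fixed cutoff are zeroed, leaving a sparse connection matrix that acts as a digit-specific information channel (the threshold and weight values below are illustrative assumptions, not the paper's):

```python
def prune(weights, threshold=0.05):
    """Threshold-based synaptic pruning: delete (zero out) synapses whose
    weight magnitude falls below the threshold, so only strong, specific
    connections survive as the 'information channel' for a class."""
    return [[w if abs(w) >= threshold else 0.0 for w in row]
            for row in weights]

w = [[0.9, 0.01],
     [0.03, 0.4]]
pruned = prune(w)  # weak synapses 0.01 and 0.03 are deleted
```

At test time, an image would be assigned to whichever digit's surviving connection matrix responds most strongly.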


2019 ◽  
Author(s):  
Sushrut Thorat

The dynamics of a quadcopter are unstable and non-linear. As a result, the quadcopter's flight relies heavily on its flight controller. Here we present a robust control scheme which can act as the flight controller for the quadcopter. We then describe a scheme to translate this controller into a Spiking Neural Network using a modular approach, in order to control quadcopter flight in realistic environmental conditions (noisy wind, IMU noise, and delayed signals). (The final part was left incomplete as I graduated and shifted my focus to other questions.)


2011 ◽  
Vol 23 (6) ◽  
pp. 1503-1535 ◽  
Author(s):  
Romain Brette ◽  
Dan F. M. Goodman

High-level languages (Matlab, Python) are popular in neuroscience because they are flexible and accelerate development. However, for simulating spiking neural networks, the cost of interpretation is a bottleneck. We describe a set of algorithms to simulate large spiking neural networks efficiently with high-level languages using vector-based operations. These algorithms constitute the core of Brian, a spiking neural network simulator written in the Python language. Vectorized simulation makes it possible to combine the flexibility of high-level languages with the computational efficiency usually associated with compiled languages.
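The core idea, advancing the state of every neuron with whole-vector operations instead of a per-neuron loop over objects, can be sketched in plain Python (Brian itself performs these element-wise updates on NumPy arrays, which is where the speedup over interpreted per-neuron code comes from; the parameters here are illustrative):

```python
def step_all(v, i_ext, dt=0.1, tau=10.0, v_th=1.0, v_reset=0.0):
    """Advance all neurons' membrane potentials in one pass of
    element-wise vector operations, then detect and reset spikers."""
    v = [vi + dt * (-vi / tau + ii) for vi, ii in zip(v, i_ext)]
    spiked = [vi >= v_th for vi in v]
    v = [v_reset if s else vi for vi, s in zip(v, spiked)]
    return v, spiked

v = [0.0, 0.99, 2.0]
v, spiked = step_all(v, i_ext=[0.0, 0.0, 0.0])
# Only the neuron that started above threshold fires and is reset.
```

In Brian, `v` would be a single NumPy array and each list comprehension a compiled array expression, so the interpreter overhead is paid once per time step rather than once per neuron.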


Author(s):  
Taki Hasan Rafi

Recent advances in deep learning have elevated the multifaceted nature of its many applications. Artificial neural networks are now a genuinely old technique in the vast area of computer science; the principal ideas and models are more than fifty years old. In this modern computing era, however, scientists have introduced third-generation intelligent models. In a biological neuron, membrane ion channels control the flow of ions across the membrane by opening and closing in response to voltage changes caused by intrinsic currents and externally applied signals. The comprehensive third generation, the Spiking Neural Network (SNN), is narrowing the gap between deep learning, machine learning, and neuroscience in a biologically inspired manner. It also connects neuroscience and machine learning to establish high-level, efficient computing. Spiking Neural Networks operate using spikes, which are discrete events that occur at points in time, as opposed to continuous values. This paper is a review of biologically inspired spiking neural networks and their applications in different areas. The author aims to present a brief introduction to SNNs, incorporating their mathematical structure, applications, and implementation. This paper also presents an overview of machine learning, deep learning, and reinforcement learning. This review can help advanced artificial intelligence researchers gain a compact intuition of spiking neural networks.


2020 ◽  
Vol 20 (8) ◽  
pp. 4735-4739 ◽  
Author(s):  
Chae Soo Kim ◽  
Taehyung Kim ◽  
Kyung Kyu Min ◽  
Sungjun Kim ◽  
Byung-Gook Park

In this paper, we address the reverse leakage current issue that occurs when resistive random access memory (RRAM) is used as a synapse for spiking neural networks (SNNs). To prevent this problem, a 1-diode-1-RRAM (1D1R) synapse is suggested and simulated to examine its current-rectifying characteristics. Furthermore, a high-density 1K 3D 1D1R synapse array structure and its process flow are proposed.


Author(s):  
Xiang Cheng ◽  
Yunzhe Hao ◽  
Jiaming Xu ◽  
Bo Xu

Spiking Neural Networks (SNNs) are considered more biologically plausible and more energy-efficient on emerging neuromorphic hardware. Recently, the backpropagation algorithm has been utilized for training SNNs, which allows SNNs to go deeper and achieve higher performance. However, most existing SNN models for object recognition are mainly convolutional or fully connected structures, which only have inter-layer connections but no intra-layer connections. Inspired by lateral interactions in neuroscience, we propose a high-performance and noise-robust Spiking Neural Network (dubbed LISNN). Based on the convolutional SNN, we model the lateral interactions between spatially adjacent neurons, integrate them into the spiking neuron membrane potential formula, and then build a multi-layer SNN on a popular deep learning framework, i.e., PyTorch. We utilize the pseudo-derivative method to solve the non-differentiability problem when applying backpropagation to train LISNN, and test LISNN on multiple standard datasets. Experimental results demonstrate that the proposed model can achieve competitive or better performance compared to current state-of-the-art spiking neural networks on the MNIST, Fashion-MNIST, and N-MNIST datasets. Besides, thanks to lateral interactions, our model possesses stronger noise robustness than other SNNs. Our work brings a biologically plausible mechanism into SNNs, hoping that it can help us understand visual information processing in the brain.
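The pseudo-derivative trick can be sketched in isolation: the forward pass keeps the non-differentiable spike function, while backpropagation substitutes a smooth surrogate gradient around the threshold (the triangular window below is one common choice; LISNN's exact form may differ):

```python
def spike(v, v_th=1.0):
    """Forward pass: non-differentiable Heaviside spike function."""
    return 1.0 if v >= v_th else 0.0

def pseudo_derivative(v, v_th=1.0, width=0.5):
    """Surrogate used in place of the Heaviside's true derivative
    during backpropagation: a triangular window peaking at the
    threshold and vanishing farther than `width` away from it."""
    return max(0.0, 1.0 - abs(v - v_th) / width)

g_at_th = pseudo_derivative(1.0)  # maximal gradient at threshold
g_far = pseudo_derivative(3.0)    # no gradient far above threshold
```

In a PyTorch implementation this pair would typically live in a custom autograd function, with `spike` as the forward pass and `pseudo_derivative` scaling the incoming gradient in the backward pass.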


2020 ◽  
Author(s):  
Xumeng Zhang ◽  
Jian Lu ◽  
Rui Wang ◽  
Jinsong Wei ◽  
Tuo Shi ◽  
...  

Abstract: A spiking neural network, consisting of spiking neurons and plastic synapses, is a promising but relatively underdeveloped neural network for neuromorphic computing. Inspired by the human brain, it provides a unique solution for highly efficient data processing. Recently, memristor-based neurons and synapses have become intriguing candidates for building spiking neural networks in hardware, owing to the close resemblance between their device dynamics and their biological counterparts. However, the functionalities of memristor-based neurons are currently very limited, and a hardware demonstration of fully memristor-based spiking neural networks supporting in situ learning is very challenging. Here, a hybrid spiking neuron combining a memristor with simple digital circuits is designed and implemented in hardware to enhance the neuron's functions. The hybrid neuron with memristive dynamics not only realizes the basic leaky integrate-and-fire neuron function but also enables in situ tuning of the connected synaptic weights. Finally, a fully hardware spiking neural network with the hybrid neurons and memristive synapses is experimentally demonstrated for the first time, with which in situ Hebbian learning is achieved. This work opens up a way towards the implementation of spiking neurons supporting in situ learning for future neuromorphic computing systems.


Author(s):  
Xiumin Li ◽  
Qing Chen ◽  
Fangzheng Xue

In recent years, an increasing number of studies have demonstrated that networks in the brain can self-organize into a critical state where dynamics exhibit a mixture of ordered and disordered patterns. This critical branching phenomenon is termed neuronal avalanches. It has been hypothesized that the homeostatic level balanced between stability and plasticity of this critical state may be the optimal state for performing diverse neural computational tasks. However, the critical region for high performance is narrow and sensitive for spiking neural networks (SNNs). In this paper, we investigated the role of the critical state in neural computations based on liquid-state machines, a biologically plausible computational neural network model for real-time computing. The computational performance of an SNN when operating at the critical state and, in particular, with spike-timing-dependent plasticity for updating synaptic weights is investigated. The network is found to show the best computational performance when it is subjected to critical dynamic states. Moreover, the active-neuron-dominant structure refined from synaptic learning can remarkably enhance the robustness of the critical state and further improve computational accuracy. These results may have important implications in the modelling of spiking neural networks with optimal computational performance. This article is part of the themed issue ‘Mathematical methods in medicine: neuroscience, cardiology and pathology’.

