Spatial Memory in a Spiking Neural Network with Robot Embodiment

Sensors ◽  
2021 ◽  
Vol 21 (8) ◽  
pp. 2678
Author(s):  
Sergey A. Lobov ◽  
Alexey I. Zharinov ◽  
Valeri A. Makarov ◽  
Victor B. Kazantsev

Cognitive maps and spatial memory are fundamental paradigms of brain functioning. Here, we present a spiking neural network (SNN) capable of generating an internal representation of the external environment and implementing spatial memory. The SNN initially has a non-specific architecture, which is then shaped by Hebbian-type synaptic plasticity. The network receives stimuli at specific loci, while memory retrieval operates as a functional SNN response in the form of population bursts. The SNN function is explored through its embodiment in a robot moving in an arena with safe and dangerous zones. We propose a measure of the global network memory using the synaptic vector field approach to validate results and calculate information characteristics, including learning curves. We show that after training, the SNN can effectively control the robot’s cognitive behavior, allowing it to avoid dangerous regions in the arena. However, the learning is not perfect, and the robot eventually revisits dangerous areas. Such behavior, also observed in animals, enables relearning in time-evolving environments. If a dangerous zone moves to another place, the SNN remaps positive and negative areas, allowing it to escape the catastrophic interference phenomenon known for some AI architectures. Thus, the robot adapts to a changing world.
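As a rough illustration of the Hebbian-type (STDP) synaptic shaping described above, a minimal pair-based STDP weight update can be sketched as follows. The constants `a_plus`, `a_minus`, `tau_plus`, `tau_minus`, and `w_max` are illustrative assumptions, not values from the paper:

```python
import numpy as np

def stdp_update(w, dt, a_plus=0.01, a_minus=0.012,
                tau_plus=20.0, tau_minus=20.0, w_max=1.0):
    """Pair-based STDP rule (illustrative): potentiate when the presynaptic
    spike precedes the postsynaptic spike (dt = t_post - t_pre > 0),
    depress otherwise; the weight is clipped to [0, w_max]."""
    if dt > 0:
        w += a_plus * np.exp(-dt / tau_plus)   # causal pairing -> LTP
    else:
        w -= a_minus * np.exp(dt / tau_minus)  # anti-causal pairing -> LTD
    return float(np.clip(w, 0.0, w_max))
```

Repeated causal pairings drive weights along the paths activated by the stimulation focus upward, which is the mechanism by which an initially non-specific architecture acquires structure.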

Author(s):  
Alexander N. BUSYGIN ◽  
Andrey N. BOBYLEV ◽  
Alexey A. GUBIN ◽  
Alexander D. PISAREV ◽  
Sergey Yu. UDOVICHENKO

This article presents the results of a numerical simulation and an experimental study of the electrical circuit of a hardware spiking perceptron based on a memristor-diode crossbar. This required developing and manufacturing a measuring bench whose electrical circuit consists of the hardware perceptron circuit and an input peripheral circuit that implements the activation functions of the neurons and ensures the operation of the memory matrix in a spiking mode. The authors studied the operation of the hardware spiking neural network with memristor synapses, arranged as a memory matrix, in the single-layer perceptron mode. The perceptron can be considered the first layer of a biomorphic neural network that performs primary processing of incoming information in a biomorphic neuroprocessor. The obtained experimental and simulated learning curves show the expected increase in the proportion of correct classifications with an increasing number of training epochs. The authors demonstrate the generation of a new association during retraining, caused by the presence of new input information. Comparing the results of modeling and of an experiment on training a small neural network with a small crossbar will make it possible to create adequate models of hardware neural networks with a large memristor-diode crossbar. The arrival of new, unknown information at the input of the hardware spiking neural network can be associated with the generation of new associations in the biomorphic neuroprocessor. With further improvement of the neural network, this information will be comprehended and will therefore allow the transition from weak to strong artificial intelligence.
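The read-out step of such a crossbar perceptron can be sketched in simulation as a thresholded matrix-vector product, with memristor conductances acting as weights and column currents summed by Kirchhoff's law. The function name, conductance values, and firing threshold below are hypothetical placeholders, not taken from the article:

```python
import numpy as np

def crossbar_forward(G, v_in, threshold=3e-4):
    """One read-out step of a single-layer spiking perceptron on a
    memristor crossbar (illustrative model): input voltage spikes v_in
    drive the rows; each output column j collects the current
    I_j = sum_i G[i, j] * v_in[i], and the neuron fires if I_j crosses
    the threshold. The series diode in the hardware suppresses sneak
    paths but is not modeled here."""
    currents = G.T @ v_in                      # analog multiply-accumulate
    return (currents >= threshold).astype(int) # 1 = output spike
```

Training then amounts to adjusting the conductances `G` (memristor states) so that the spiking pattern at the columns matches the target classes.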


Author(s):  
S.A. Lobov

We propose a memory model based on a spiking neural network with Spike-Timing-Dependent Plasticity (STDP). In the model, information is recorded using local external stimulation. Memory decoding is a functional response in the form of population bursts of action potentials synchronized with the applied stimuli. In our model, STDP-mediated weight rearrangements are able to encode the localization of the applied stimulation, while the stimulation focus forms the source of the vector field of synaptic connections. Based on the characteristics of this field, we propose a measure of generalized network memory. With repeated stimulations, we observe a decrease in the time until synchronous activity occurs. In this case, both the obtained average learning curve and the dependence of the generalized memory on the stimulation number follow a power law. We show that the maximum time to reach a functional response is determined by the generalized memory remaining from previous stimulations. Thus, the power-law learning curves are due to incomplete forgetting of previous influences. We also study the reliability of generalized network memory, determined by the storage time of memory traces after the termination of external stimulation. The reliability depends on the level of neural noise, and this dependence is also power-law. We found that hubs (neurons that can initiate the generation of population bursts in the absence of noise) play a key role in maintaining generalized network memory. The inclusion of neural noise leads to the occurrence of random bursts initiated by neurons that are not hubs. This noise activity destroys memory traces and reduces the reliability of generalized network memory.
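The power-law relations described above can be illustrated with a toy model (our own simplification; the exponent, constants, and function names are assumptions, not the paper's fitted values). The first function mimics the learning curve, and the second mimics incomplete forgetting: each stimulation leaves a trace that decays geometrically between stimuli, so residual memory grows but stays bounded:

```python
def time_to_burst(n, t1=100.0, alpha=0.5):
    """Toy power-law learning curve: time until the network answers the
    n-th stimulation with a synchronized population burst (n >= 1)."""
    return t1 * n ** (-alpha)

def residual_memory(n, retention=0.8):
    """Generalized memory remaining after n stimulations under incomplete
    forgetting: each earlier trace is attenuated by `retention` per
    inter-stimulus interval, so the sum is bounded by 1/(1-retention)."""
    return sum(retention ** k for k in range(n))
```

The qualitative point is the one made in the abstract: because `residual_memory` never resets to zero, each new stimulation starts from a higher baseline, which is what produces the decreasing `time_to_burst`.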


2018 ◽  
Vol 145 ◽  
pp. 488-494 ◽  
Author(s):  
Aleksandr Sboev ◽  
Alexey Serenko ◽  
Roman Rybka ◽  
Danila Vlasov ◽  
Andrey Filchenkov

2021 ◽  
Vol 1914 (1) ◽  
pp. 012036
Author(s):  
LI Wei ◽  
Zhu Wei-gang ◽  
Pang Hong-feng ◽  
Zhao Hong-yu

Sensors ◽  
2021 ◽  
Vol 21 (4) ◽  
pp. 1065
Author(s):  
Moshe Bensimon ◽  
Shlomo Greenberg ◽  
Moshe Haiut

This work presents a new approach to sound preprocessing and classification based on a spiking neural network. The approach is inspired by the characteristics of biological neurons, using spiking neurons and a Spike-Timing-Dependent Plasticity (STDP)-based learning rule. We propose a biologically plausible sound classification framework that uses a Spiking Neural Network (SNN) to detect the embedded frequencies contained within an acoustic signal. This work also demonstrates an efficient hardware implementation of the SNN based on the low-power Spike Continuous Time Neuron (SCTN). The proposed framework suggests direct Pulse Density Modulation (PDM) interfacing of the acoustic sensor with the SCTN-based network, avoiding the usage of costly digital-to-analog conversions. This paper also presents a new connectivity approach applied to Spiking Neuron (SN)-based neural networks, in which the SCTN is considered a basic building block for the design of programmable analog electronic circuits. Usually, a neuron is used as a repeated modular element in a neural network structure, and the connectivity between neurons located at different layers is well defined, generating a modular network structure composed of several layers with full or partial connectivity. The proposed approach instead controls the behavior of the spiking neurons and applies smart connectivity to enable the design of simple analog circuits based on SNNs. Unlike existing NN-based solutions, in which the preprocessing phase is carried out using analog circuits and analog-to-digital conversion, we suggest integrating the preprocessing phase into the network. This allows treating the basic SCTN as an analog module, enabling the design of simple analog circuits based on SNNs with unique interconnections between the neurons.
The efficiency of the proposed approach is demonstrated by implementing SCTN-based resonators for sound feature extraction and classification. The proposed SCTN-based sound classification approach achieves a classification accuracy of 98.73% on the Real-World Computing Partnership (RWCP) database.
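A single resonator channel of the kind used here for feature extraction can be sketched, under heavy simplification, as a two-pole digital resonator driven directly by a bipolar 1-bit PDM stream. This is not the SCTN circuit itself, and all parameter values below (sampling rate, Q factor, gain normalization) are illustrative assumptions:

```python
import numpy as np

def pdm_resonator_energy(pdm_bits, f0, fs=16000, q=30.0):
    """Drive a two-pole resonator tuned to f0 (Hz) with a 1-bit PDM stream
    and accumulate the output energy; a stand-in for one resonator channel
    in an SCTN-style feature-extraction bank."""
    w0 = 2 * np.pi * f0 / fs
    r = 1 - w0 / (2 * q)          # pole radius sets the bandwidth (f0 / q)
    c = 2 * r * np.cos(w0)        # feedback coefficient
    b0 = 1 - r                    # rough gain normalization
    y1 = y2 = 0.0
    energy = 0.0
    for bit in pdm_bits:
        x = 1.0 if bit else -1.0  # bipolar 1-bit sample, no DAC needed
        y = b0 * x + c * y1 - r * r * y2
        y2, y1 = y1, y
        energy += y * y
    return energy
```

A bank of such channels tuned to different `f0` yields a per-band energy vector directly from the PDM bitstream, which is the kind of frequency feature the classifier then consumes.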


2021 ◽  
pp. 127068
Author(s):  
Shuang Gao ◽  
Shuiying Xiang ◽  
Ziwei Song ◽  
Yanan Han ◽  
Yue Hao
