Sequence Learning in a Single Trial: A Spiking Neurons Model Based on Hippocampal Circuitry

Author(s):  
Simone Coppolino ◽  
Giuseppe Giacopelli ◽  
Michele Migliore

2003 ◽  
Vol 17 (1) ◽  
pp. 1-13 ◽  
Author(s):  
Terry D. Blumenthal ◽  
Dmitriy Melkonian

Abstract: A recently developed method of fragmentary decomposition (FD) of nonstationary physiological signals was extended to eyeblink EMG measurement to quantify all significant stimulus-induced components and identify their parameters. FD represents a single-trial eyeblink EMG as a nonstationary signal whose universal functional element is the generic mass potential (GMP). The current study uses this model-based signal-processing methodology to identify distinct stimulus-induced components of eyeblink EMG and to extract additional psychophysiological information from the eyeblink signal. Analysis of single-trial eyeblink EMG records from 10 normal subjects showed that the GMP is an adequate functional element of which an eyeblink EMG response is composed. In particular, both spontaneous and stimulus-induced single components of eyeblink EMG are produced by functionally similar mechanisms. However, we found that about 54% of GMPs are combined into complex patterns that respond differently to various experimental conditions. To typify characteristic patterns of eyeblink EMG component composition, we defined two fundamental categories of components: complex components (CCs), composed of multiple subcomponents (GMPs), and monolithic components (MCs), consisting of a single GMP. Given the nonstationary character of eyeblink EMG, the stimulus-related appearance of specific component patterns, such as MCs and CCs, is in essence a probabilistic problem. To characterize the probabilistic structure of eyeblink EMG, we introduce the stimulus-dependent probability diagram (SDPD), which shows the probability that defined component patterns of EMG activity appear at different times after stimulus presentation. SDPD analysis shows that the stimulus exerts strong but short-term (phasic) effects on monolithic components and moderate but long-lasting (tonic) effects on complex components.
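To make the SDPD construction concrete, here is a minimal Python sketch of how such a diagram could be tabulated, assuming per-trial component detections are already available as (onset time, pattern label) pairs; the function name, bin width, and analysis window are illustrative choices, not values from the paper:

```python
import numpy as np

def sdpd(trials, patterns=("MC", "CC"), t_max=0.5, bin_width=0.02):
    """Tabulate a stimulus-dependent probability diagram (SDPD).

    trials   : list of trials; each trial is a list of (onset_s, pattern)
               tuples for components detected after stimulus onset.
    patterns : component categories to track (monolithic vs. complex).
    Returns bin edges and a dict mapping each pattern to per-bin
    probabilities: the fraction of trials in which that pattern
    appeared within each post-stimulus time bin.
    """
    edges = np.arange(0.0, t_max + bin_width, bin_width)
    n_bins = len(edges) - 1
    prob = {p: np.zeros(n_bins) for p in patterns}
    for trial in trials:
        seen = {p: np.zeros(n_bins, dtype=bool) for p in patterns}
        for onset, pattern in trial:
            b = int(onset // bin_width)
            if pattern in seen and 0 <= b < n_bins:
                seen[pattern][b] = True  # count a pattern once per bin per trial
        for p in patterns:
            prob[p] += seen[p]
    for p in patterns:
        prob[p] /= len(trials)
    return edges, prob
```

On such a diagram, the phasic versus tonic distinction reported above would appear as a sharp early peak in the MC curve against a lower but more sustained elevation in the CC curve.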


PLoS ONE ◽  
2011 ◽  
Vol 6 (12) ◽  
pp. e28630 ◽  
Author(s):  
Marzia De Lucia ◽  
Irina Constantinescu ◽  
Virginie Sterpenich ◽  
Gilles Pourtois ◽  
Margitta Seeck ◽  
...  

2015 ◽  
Vol E98.D (11) ◽  
pp. 1976-1981
Author(s):  
Maiko SAKAMOTO ◽  
Hiromi YAMAGUCHI ◽  
Toshimasa YAMAZAKI ◽  
Ken-ichi KAMIJO ◽  
Takahiro YAMANOI

2021 ◽  
Author(s):  
Faramarz Faghihi ◽  
Siqi Cai ◽  
Ahmed Moustafa

Recently, studies have shown that alpha band (8-13 Hz) EEG signals enable the decoding of auditory spatial attention. However, deep learning methods typically require a large amount of training data. Inspired by sparse coding in cortical neurons, we propose a spiking neural network model for auditory spatial attention detection. The model is composed of three neural layers, two of which consist of spiking neurons. We formulate a new learning rule based on the firing rates of presynaptic and postsynaptic neurons in the first and second layers of spiking neurons. The third layer consists of 10 spiking neurons whose firing-rate patterns after training are used in the test phase of the method. The proposed method extracts the patterns of EEG recorded during leftward and rightward attention independently and uses them to train the network to detect auditory spatial attention. In addition, a computational approach is presented to find the best single-trial EEG data to serve as training samples of leftward and rightward attention EEG. In this model, the roles of a low inter-layer connectivity rate and of a specific range of learning parameters in sparse coding are studied. Importantly, unlike most prior models, our method requires only 10% of the EEG data for training and achieves 90% accuracy on average. This study suggests new insights into the role of sparse coding in both biological networks and brain-inspired machine learning.
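As a rough illustration of the firing-rate-based Hebbian update and sparse inter-layer connectivity the abstract describes, here is a minimal Python sketch; it uses rate variables rather than explicit spike trains, and the layer sizes, connectivity rate, learning rate, and normalization are illustrative assumptions, not the paper's actual parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

N_IN, N_HID, N_OUT = 64, 32, 10   # illustrative layer sizes (output = 10 neurons)
CONN_RATE = 0.1                    # low connectivity rate (assumed value)
ETA = 0.01                         # learning rate (assumed value)

# Sparse binary masks fix which synapses exist between layers.
mask1 = rng.random((N_IN, N_HID)) < CONN_RATE
mask2 = rng.random((N_HID, N_OUT)) < CONN_RATE
w1 = rng.random((N_IN, N_HID)) * mask1
w2 = rng.random((N_HID, N_OUT)) * mask2

def rates(drive, gain=1.0):
    """Map input drive to non-negative firing rates (rectified)."""
    return np.maximum(drive * gain, 0.0)

def train_step(x):
    """One update: propagate rates forward, then apply a Hebbian-style
    rule dw ~ eta * pre_rate * post_rate on existing (masked) synapses."""
    global w1, w2
    r_in = rates(x)
    r_hid = rates(r_in @ w1)
    r_out = rates(r_hid @ w2)
    w1 += ETA * np.outer(r_in, r_hid) * mask1
    w2 += ETA * np.outer(r_hid, r_out) * mask2
    w1 /= max(np.abs(w1).max(), 1e-9)   # crude normalization keeps rates bounded
    w2 /= max(np.abs(w2).max(), 1e-9)
    return r_out

# Training on leftward vs. rightward trials separately would yield one
# stored 10-neuron output rate pattern per attention condition.
```

Under such a scheme, a test trial would be assigned to whichever attention condition's stored output pattern its own output correlates with more strongly.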

