Oscillatory Neural Networks: Modeling Binding and Attention by Synchronization of Neural Activity

1999 ◽  
pp. 279-302
1998 ◽  
Vol 21 (6) ◽  
pp. 833-833 ◽  
Author(s):  
Roman Borisyuk ◽  
Galina Borisyuk ◽  
Yakov Kazanovich

Synchronization of neural activity in oscillatory neural networks is a general principle of information processing in the brain at both preattentional and attentional levels. This principle is supported by a model of attention based on an oscillatory neural network with a central element, and by models of feature binding and working memory based on multi-frequency oscillations.
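The attention model described in this abstract, an oscillatory network with a central element, can be caricatured as a phase-oscillator system in which peripheral oscillators interact only through a central one. This is a minimal sketch, not the authors' actual equations; the coupling strength, natural frequencies, and coherence measure are illustrative assumptions:

```python
import numpy as np

def simulate_central_oscillator(n=20, steps=2000, dt=0.01, coupling=3.0, seed=0):
    """Sketch of an attention network with a central element: peripheral
    phase oscillators couple only through the central oscillator, and those
    it can frequency-track become synchronized with it (an analogue of
    being "attended"). All parameters are illustrative."""
    rng = np.random.default_rng(seed)
    omega = rng.uniform(0.8, 1.2, n)   # peripheral natural frequencies
    omega_c = 1.0                      # central-element natural frequency
    theta = rng.uniform(0, 2 * np.pi, n)
    theta_c = 0.0
    for _ in range(steps):
        # central element is pulled toward the mean peripheral phase
        dtheta_c = omega_c + coupling / n * np.sum(np.sin(theta - theta_c))
        # each peripheral oscillator is pulled toward the central phase
        dtheta = omega + coupling * np.sin(theta_c - theta)
        theta_c += dt * dtheta_c
        theta += dt * dtheta
    # phase coherence of peripherals with the central element (1 = locked)
    return np.abs(np.mean(np.exp(1j * (theta - theta_c))))
```

With coupling strong enough to cover the frequency spread, the whole periphery locks to the central element, giving a coherence near 1.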


2010 ◽  
Vol 365 (1551) ◽  
pp. 2347-2362 ◽  
Author(s):  
Dominique M. Durand ◽  
Eun-Hyoung Park ◽  
Alicia L. Jensen

Conventional neural networks are characterized by many neurons coupled together through synapses. The activity, synchronization, plasticity and excitability of the network are then controlled by its synaptic connectivity. Neurons are surrounded by an extracellular space in which fluctuations in specific ionic concentrations can modulate neuronal excitability. Elevated extracellular concentrations of potassium ([K+]o) can generate neuronal hyperexcitability. Yet, after many years of research, it is still unknown whether an elevation of potassium is the cause or the result of the generation, propagation and synchronization of epileptiform activity. An elevation of potassium in neural tissue can be characterized by dispersion (global elevation of potassium) and lateral diffusion (local spatial gradients). Both experimental and computational studies have shown that lateral diffusion is involved in the generation and the propagation of neural activity in diffusively coupled networks. Diffusion-based coupling by potassium can therefore play an important role in neural networks, and it is reviewed here in four sections. §2 shows that potassium diffusion is responsible for the synchronization of activity across a mechanical cut in the tissue. A computer model of diffusive coupling shows that potassium diffusion can mediate communication between cells and generate abnormal and/or periodic activity in small (§3) and in large (§4) networks of cells. Finally, in §5, a study of the role of extracellular potassium in the propagation of axonal signals shows that elevated potassium concentrations can block the propagation of neural activity in axonal pathways. Taken together, these results indicate that potassium accumulation and diffusion can interfere with normal activity and generate abnormal activity in neural networks.
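The diffusive, non-synaptic coupling this review describes can be illustrated with a toy one-dimensional reaction-diffusion sketch: active cells release extracellular potassium, the potassium diffuses laterally, and elevated [K+]o pushes neighbouring cells past a firing threshold, so activity propagates without synapses. All parameter values below are illustrative placeholders, not physiological measurements:

```python
import numpy as np

def simulate_k_diffusion(n=100, steps=500, dt=0.1, d_coef=0.5,
                         k_rest=3.0, k_thresh=8.0, k_release=2.0, pump=0.05):
    """Toy 1-D model of diffusion-based coupling by extracellular potassium.
    Cells fire when local [K+]o exceeds k_thresh, firing releases more K+,
    lateral diffusion carries it to neighbours, and a pump relaxes [K+]o
    back toward rest. Returns the final activity pattern."""
    k = np.full(n, k_rest)
    k[:5] = 15.0                       # local K+ elevation seeds activity
    active = np.zeros(n, dtype=bool)
    for _ in range(steps):
        active = k > k_thresh          # elevated [K+]o makes cells fire
        # lateral diffusion: discrete Laplacian with no-flux boundaries
        lap = np.zeros(n)
        lap[1:-1] = k[:-2] - 2 * k[1:-1] + k[2:]
        lap[0] = k[1] - k[0]
        lap[-1] = k[-2] - k[-1]
        k += dt * (d_coef * lap + k_release * active - pump * (k - k_rest))
    return active
```

Running the sketch shows a front of activity spreading well beyond the initially seeded cells, the qualitative behaviour attributed to lateral diffusion in the review.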


Author(s):  
Thomas C. Jackson ◽  
Abhishek A. Sharma ◽  
James A. Bain ◽  
Jeffrey A. Weldon ◽  
Lawrence Pileggi

2020 ◽  
Vol 10 (1) ◽  
Author(s):  
Stephan Rinner ◽  
Alberto Trentino ◽  
Heike Url ◽  
Florian Burger ◽  
Julian von Lautz ◽  
...  

Cellular micromotion, a tiny movement of cell membranes on the nm-µm scale, has been proposed as a pathway for inter-cellular signal transduction and as a label-free proxy signal for neural activity. Here we harness several recent signal-processing approaches to detect such micromotion in video recordings of unlabeled cells. Our survey includes spectral filtering of the video signal, matched filtering, and 1D and 3D convolutional neural networks acting on pixel-wise time-domain data and on a whole recording, respectively.
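One of the approaches listed in this survey, matched filtering on pixel-wise time series, can be sketched as follows. The template waveform and the peak normalized-correlation score are illustrative assumptions, not the authors' exact pipeline:

```python
import numpy as np

def matched_filter_map(video, template):
    """Pixel-wise matched filtering: correlate each pixel's time series
    with an assumed micromotion template and return the peak normalized
    correlation per pixel. `video` has shape (T, H, W); len(template) <= T."""
    t = template - template.mean()
    t = t / (np.linalg.norm(t) + 1e-12)
    T, H, W = video.shape
    x = video.reshape(T, -1).astype(float)
    x = x - x.mean(axis=0)             # remove each pixel's DC offset
    n = len(t)
    best = np.zeros(x.shape[1])
    # sliding normalized cross-correlation along time for every pixel
    for s in range(T - n + 1):
        win = x[s:s + n]
        norms = np.linalg.norm(win, axis=0) + 1e-12
        best = np.maximum(best, t @ win / norms)
    return best.reshape(H, W)
```

On synthetic data with a weak oscillation embedded in one pixel, the score map peaks at that pixel, which is the basic detection idea behind matched filtering.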


2019 ◽  
Vol 31 (10) ◽  
pp. 1985-2003 ◽  
Author(s):  
Chen Beer ◽  
Omri Barak

Artificial neural networks, trained to perform cognitive tasks, have recently been used as models for neural recordings from animals performing these tasks. While some progress has been made in performing such comparisons, the evolution of network dynamics throughout learning remains unexplored. This is paralleled by an experimental focus on recording from trained animals, with few studies following neural activity throughout training. In this work, we address this gap in the realm of artificial networks by analyzing networks that are trained to perform memory and pattern generation tasks. The functional aspect of these tasks corresponds to dynamical objects in the fully trained network—a line attractor or a set of limit cycles for the two respective tasks. We use these dynamical objects as anchors to study the effect of learning on their emergence. We find that the sequential nature of learning—one trial at a time—has major consequences for the learning trajectory and its final outcome. Specifically, we show that least mean squares (LMS), a simple gradient descent suggested as a biologically plausible version of the FORCE algorithm, is constantly obstructed by forgetting, which is manifested as the destruction of dynamical objects from previous trials. The degree of interference is determined by the correlation between different trials. We show which specific ingredients of FORCE avoid this phenomenon. Overall, this difference results in convergence that is orders of magnitude slower for LMS. Learning implies accumulating information across multiple trials to form the overall concept of the task. Our results show that interference between trials can greatly affect learning in a learning-rule-dependent manner. These insights can help design experimental protocols that minimize such interference, and possibly infer underlying learning rules by observing behavior and neural activity throughout learning.
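The contrast this abstract draws between FORCE and plain LMS can be illustrated on a linear readout trained one trial at a time: recursive least squares (the core update inside FORCE) maintains an inverse input-correlation estimate and converges in roughly one pass, while LMS takes fixed-size gradient steps and converges far more slowly. This is a synthetic sketch, not the paper's recurrent-network setup; the data and parameters are assumptions:

```python
import numpy as np

def train_readout(x_trials, y_trials, rule="rls", lr=0.005, alpha=0.01):
    """Train a linear readout w trial by trial with either recursive least
    squares ('rls', as used inside FORCE) or plain LMS gradient descent.
    x_trials: list of (T, d) input arrays; y_trials: matching targets."""
    d = x_trials[0].shape[1]
    w = np.zeros(d)
    P = np.eye(d) / alpha              # RLS inverse-correlation estimate
    for X, y in zip(x_trials, y_trials):
        for x_t, y_t in zip(X, y):
            err = w @ x_t - y_t
            if rule == "rls":
                Px = P @ x_t
                k = Px / (1.0 + x_t @ Px)   # RLS gain vector
                P -= np.outer(k, Px)
                w -= err * k
            else:                      # plain LMS: fixed-step gradient
                w -= lr * err * x_t
    return w
```

After a single pass over a few trials of noiseless linear data, the RLS readout sits essentially at the generating weights while the LMS readout is still far away, mirroring the orders-of-magnitude convergence gap described above.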


2012 ◽  
Vol 24 (2) ◽  
pp. 523-540 ◽  
Author(s):  
Dimitrije Marković ◽  
Claudius Gros

A massively recurrent neural network responds to input stimuli on the one hand and, on the other, is autonomously active in the absence of sensory input. Stimulus and information processing depend crucially on the qualia of the autonomous-state dynamics of the ongoing neural activity. This default neural activity may be dynamically structured in time and space, showing regular, synchronized, bursting, or chaotic activity patterns. We study the influence of nonsynaptic plasticity on the default dynamical state of recurrent neural networks. The nonsynaptic adaption considered acts on intrinsic neural parameters, such as the threshold and the gain, and is driven by the optimization of the information entropy. We observe, in the presence of the intrinsic adaptation processes, three distinct and globally attracting dynamical regimes: a regular synchronized, an overall chaotic, and an intermittent bursting regime. The intermittent bursting regime is characterized by intervals of regular flows, which are quite insensitive to external stimuli, interceded by chaotic bursts that respond sensitively to input signals. We discuss these findings in the context of self-organized information processing and critical brain dynamics.
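A well-known example of entropy-driven nonsynaptic adaptation of gain and threshold is Triesch's intrinsic-plasticity rule, which drives a sigmoidal neuron's output distribution toward a maximum-entropy exponential with fixed mean. The paper's own adaptation rule is related but not identical, so treat this as an illustrative sketch of the mechanism, with all constants chosen for demonstration:

```python
import numpy as np

def intrinsic_plasticity(inputs, mu=0.2, eta=0.01):
    """Adapt the gain a and threshold b of a sigmoidal neuron
    y = 1 / (1 + exp(-(a*x + b))) with Triesch's intrinsic-plasticity
    rule, which maximizes output entropy subject to a fixed mean mu
    (the maximum-entropy target is an exponential distribution)."""
    a, b = 1.0, 0.0
    for x in inputs:
        z = np.clip(a * x + b, -60.0, 60.0)   # guard against overflow
        y = 1.0 / (1.0 + np.exp(-z))
        # gradient-derived online updates for threshold and gain
        db = eta * (1.0 - (2.0 + 1.0 / mu) * y + y * y / mu)
        da = eta / a + db * x
        a += da
        b += db
    return a, b
```

Driven by Gaussian input, the adapted neuron's mean firing rate settles near the target mu, which is the homeostatic effect that shapes the network's default dynamical state in the study above.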


2018 ◽  
Vol 139 ◽  
pp. 8-14 ◽  
Author(s):  
Andrey Velichko ◽  
Maksim Belyaev ◽  
Vadim Putrolaynen ◽  
Valentin Perminov ◽  
Alexander Pergament
