Meta-learning local synaptic plasticity for continual familiarity detection

2021
Author(s):
Danil Tyulmankov
Guangyu Robert Yang
LF Abbott

Over the course of a lifetime, a continual stream of information is encoded and retrieved from memory. To explore the synaptic mechanisms that enable this ongoing process, we consider a continual familiarity detection task in which a subject must report whether an image has been previously encountered. We design a class of feedforward neural network models endowed with biologically plausible synaptic plasticity dynamics, the parameters of which are meta-learned to optimize familiarity detection over long delay intervals. After training, we find that anti-Hebbian plasticity leads to better performance than Hebbian plasticity and replicates experimental results from the inferotemporal cortex, including repetition suppression. Unlike previous models, this network both operates continuously without requiring any synaptic resets and generalizes to intervals it has not been trained on. We demonstrate this not only for uncorrelated random stimuli but also for images of real-world objects. Our work suggests a biologically plausible mechanism for continual learning, and demonstrates an effective application of machine learning for neuroscience discovery.
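As a rough illustration of the mechanism described above, the following sketch applies an anti-Hebbian update to the weights of a single readout neuron so that repeated stimuli evoke a suppressed response; the learning rate, decay, and threshold are illustrative placeholders, not the meta-learned parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_inputs = 200
eta = 0.8 / n_inputs          # learning rate (illustrative)
decay = 0.999                 # slow forgetting of old traces (illustrative)

w = rng.normal(0.0, 1.0 / np.sqrt(n_inputs), n_inputs)  # plastic weights

def present(x, threshold=0.25):
    """Report familiarity, then apply an anti-Hebbian weight update.

    Repetition drives the readout toward zero, so a small response is
    taken as 'familiar' -- a simple analogue of repetition suppression."""
    global w
    y = float(w @ x)
    familiar = abs(y) < threshold
    w = decay * w - eta * y * x   # anti-Hebbian: weaken co-active pre/post pairs
    return familiar, y

x = rng.choice([-1.0, 1.0], size=n_inputs)   # uncorrelated random stimulus
print(present(x))   # first presentation: typically reported as novel
print(present(x))   # repeat: readout is suppressed, reported as familiar
```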

2020
Author(s):
Thomas Limbacher
Robert Legenstein

The ability to base current computations on memories from the past is critical for many cognitive tasks such as story understanding. Hebbian-type synaptic plasticity is believed to underlie the retention of memories over medium and long time scales in the brain. However, it is unclear how such plasticity processes are integrated with computations in cortical networks. Here, we propose Hebbian Memory Networks (H-Mems), a simple neural network model that is built around a core hetero-associative network subject to Hebbian plasticity. We show that the network can be optimized to utilize the Hebbian plasticity processes for its computations. H-Mems can one-shot memorize associations between stimulus pairs and use these associations for decisions later on. Furthermore, they can solve demanding question-answering tasks on synthetic stories. Our study shows that neural network models are able to enrich their computations with memories through simple Hebbian plasticity processes.
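A minimal sketch of the core hetero-associative idea, assuming simple vector embeddings for stimuli: an association is written in one shot as a Hebbian outer product and read back by a matrix-vector product. The dimensionality, write strength, and decay below are illustrative and not the authors' settings.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 64                       # embedding dimensionality (illustrative)
eta, lam = 1.0, 0.95         # write strength and per-write decay (illustrative)

M = np.zeros((d, d))         # hetero-associative memory, initially empty

def write(key, value):
    """One-shot Hebbian write: add the outer product of value and key."""
    global M
    M = lam * M + eta * np.outer(value, key)

def read(key):
    """Recall the value associated with a (possibly noisy) key."""
    return M @ key

# Memorize an association between two stimulus embeddings in a single shot.
a = rng.normal(size=d); a /= np.linalg.norm(a)
b = rng.normal(size=d); b /= np.linalg.norm(b)
write(a, b)

recalled = read(a)
print(float(recalled @ b) / np.linalg.norm(recalled))  # ~1.0: b is recovered
```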


1994
Vol 1 (1)
pp. 1-33
Author(s):
P R Montague
T J Sejnowski

Some forms of synaptic plasticity depend on the temporal coincidence of presynaptic activity and postsynaptic response. This requirement is consistent with the Hebbian, or correlational, type of learning rule used in many neural network models. Recent evidence suggests that synaptic plasticity may depend in part on the production of a membrane-permeant, diffusible signal, so that spatial volume may also be involved in correlational learning rules. This latter form of synaptic change has been called volume learning. In both Hebbian and volume learning rules, interaction among synaptic inputs depends on the degree of coincidence of the inputs and is otherwise insensitive to their exact temporal order. Conditioning experiments and psychophysical studies have shown, however, that most animals are highly sensitive to the temporal order of the sensory inputs. Although these experiments assay the behavior of the entire animal or perceptual system, they raise the possibility that nervous systems may be sensitive to temporally ordered events at many spatial and temporal scales. We suggest here the existence of a new class of learning rule, called a predictive Hebbian learning rule, that is sensitive to the temporal ordering of synaptic inputs. We show how this predictive learning rule could act at single synaptic connections and through diffuse neuromodulatory systems.
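One common way to make such a rule concrete is a temporal-difference-style update in which earlier presynaptic activity is paired with a later, diffusely delivered prediction-error signal. The sketch below follows that standard formulation; the variable names and parameters are illustrative rather than the paper's exact equations.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50
eta, gamma = 0.02, 1.0            # learning rate and discount (illustrative)

w = np.zeros(n)                   # synaptic weights onto a prediction unit

def predictive_hebbian_step(x_prev, x_now, r_now):
    """Temporal-difference-style predictive Hebbian update.

    The weight change pairs earlier presynaptic activity (x_prev) with a
    later prediction-error signal, so the rule depends on the temporal
    order of inputs, unlike a purely correlational Hebbian rule."""
    global w
    v_prev = w @ x_prev           # prediction made at the earlier time step
    v_now = w @ x_now             # prediction at the current time step
    delta = r_now + gamma * v_now - v_prev   # diffusely broadcast error
    w += eta * delta * x_prev     # presynaptic activity gated by the error
    return delta

# A sensory cue reliably precedes a reward; training transfers the
# prediction to the cue, as in classical conditioning.
cue = rng.choice([0.0, 1.0], size=n)
blank = np.zeros(n)
for _ in range(200):
    predictive_hebbian_step(cue, blank, r_now=1.0)
print(w @ cue)    # the cue now predicts a value close to the reward (1.0)
```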


2020
Vol 5
pp. 140-147
Author(s):
T.N. Aleksandrova
E.K. Ushakov
A.V. Orlova
...

A series of neural network models used in the development of an aggregated digital twin of equipment as a cyber-physical system is presented. The twins of machining accuracy, chip formation, and tool wear are examined in detail. On their basis, systems for stabilizing the chip formation process during cutting and for diagnosing cutting tool wear are developed.
Keywords: cyber-physical system; neural network model of equipment; big data; digital twin of chip formation; digital twin of tool wear; digital twin of nanostructured coating choice


Author(s):  
Ann-Sophie Barwich

How much does stimulus input shape perception? The common-sense view is that our perceptions are representations of objects and their features and that the stimulus structures the perceptual object. The problem for this view is that perceptual biases are responsible for distortions and for the subjectivity of perceptual experience. In recent neuroscience, these biases are increasingly studied as constitutive factors of brain processes. In neural network models, the brain is said to cope with the plethora of sensory information by predicting stimulus regularities on the basis of previous experiences. Drawing on this development, this chapter analyses perceptions as processes. Looking at olfaction as a model system, it argues for the need to abandon a stimulus-centred perspective, where smells are thought of as stable percepts, computationally linked to external objects such as odorous molecules. Perception here is presented as a measure of changing signal ratios in an environment informed by expectancy effects from top-down processes.


Energies
2021
Vol 14 (14)
pp. 4242
Author(s):
Fausto Valencia
Hugo Arcos
Franklin Quilumba

The purpose of this research is the evaluation of artificial neural network models in the prediction of stresses in a 400 MVA power transformer winding conductor caused by the circulation of fault currents. The models were compared considering the behavior of the training, validation, and test data errors. Different combinations of hyperparameters were analyzed based on the variation of architectures, optimizers, and activation functions. The data for the process were created from finite element simulations performed in the FEMM software. The design of the artificial neural network was performed using the Keras framework. As a result, a model with one hidden layer was the best-suited architecture for the problem at hand, with the Adam optimizer and the ReLU activation function. The final artificial neural network model predictions were compared with the finite element method results, showing good agreement but with a much shorter solution time.
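For orientation, a minimal Keras sketch of the reported configuration (a single hidden layer, ReLU activation, Adam optimizer) is given below; the number of input features, hidden units, and training epochs are placeholders, since they are not stated in the abstract, and random arrays stand in for the FEMM simulation data.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Illustrative shapes: the abstract does not state the number of input
# features or hidden units, so these values are placeholders.
n_features, n_hidden = 6, 32

model = keras.Sequential([
    layers.Input(shape=(n_features,)),
    layers.Dense(n_hidden, activation="relu"),   # single hidden layer, ReLU
    layers.Dense(1),                             # predicted conductor stress
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])

# Random arrays stand in for the FEMM finite-element simulation results.
X = np.random.rand(1000, n_features).astype("float32")
y = np.random.rand(1000, 1).astype("float32")

model.fit(X, y, validation_split=0.2, epochs=10, batch_size=32, verbose=0)
print(model.evaluate(X, y, verbose=0))   # [loss, mae] on the placeholder data
```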


2021
Vol 11 (3)
pp. 908
Author(s):
Jie Zeng
Panagiotis G. Asteris
Anna P. Mamou
Ahmed Salih Mohammed
Emmanuil A. Golias
...

Buried pipes are extensively used for oil transportation from offshore platforms. Under unfavorable loading combinations, the pipe’s uplift resistance may be exceeded, which may result in excessive deformations and significant disruptions. This paper presents findings from a series of small-scale tests performed on pipes buried in geogrid-reinforced sands, with the measured peak uplift resistance being used to calibrate advanced numerical models employing neural networks. Multilayer perceptron (MLP) and Radial Basis Function (RBF) primary structure types have been used to train two neural network models, which were then further developed using bagging and boosting ensemble techniques. Correlation coefficients in excess of 0.954 between the measured and predicted peak uplift resistance have been achieved. The results show that the design of pipelines can be significantly improved using the proposed novel, reliable and robust soft computing models.
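As a rough sketch of the ensemble approach, the following example bags a multilayer-perceptron regressor and reports the correlation between measured and predicted values; it assumes a recent scikit-learn release, uses synthetic placeholder data in place of the small-scale test measurements, and omits the RBF and boosting variants for brevity.

```python
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

# Synthetic placeholder data standing in for the small-scale test inputs
# (e.g. burial depth, geogrid configuration) and peak uplift resistance.
rng = np.random.default_rng(3)
X = rng.random((200, 4))
y = X @ np.array([2.0, -1.0, 0.5, 1.5]) + 0.05 * rng.standard_normal(200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

base_mlp = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model = BaggingRegressor(estimator=base_mlp, n_estimators=10, random_state=0)
model.fit(X_tr, y_tr)

pred = model.predict(X_te)
r = np.corrcoef(y_te, pred)[0, 1]   # measured vs. predicted correlation
print(f"correlation coefficient: {r:.3f}")
```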

