Evaluation of a Neural Network Model of Amnesia in Diffuse Cerebral Atrophy

1993 ◽  
Vol 163 (2) ◽  
pp. 217-222 ◽  
Author(s):  
James R. G. Carrie

A digital computer program generating a simulated neural network was used to construct a model which can show behaviour resembling human associative memory. The experimental network uses distributed storage, and, in this respect, its functional organisation resembles that suggested by reported observations of neuronal activity in the human temporal lobe during memory storage and recall. Inactivation of increasing numbers of randomly distributed network units simulated advancing cerebral atrophy. This caused progressive impairment of performance, resembling the gradual deterioration of memory function observed in chronic diffuse cerebral degeneration. Unit inactivation had similar effects on recall whether the same units were inactivated before or after learning, which differs from most relevant observations of amnesia resulting from diffuse cerebral disease. This discrepancy suggests that, while the model may functionally resemble long-term information storage sites in the brain, other cerebral mechanisms participating in learning and remembering are also damaged by diffuse cerebral atrophy.
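
The kind of simulation described above can be made concrete with a minimal sketch. The paper's exact architecture is not given here, so the code below assumes a Hopfield-style associative network as a stand-in: patterns are stored in distributed Hebbian weights, "atrophy" is modelled by silencing a random fraction of units either before or after learning, and recall quality is measured against the stored pattern.

```python
import numpy as np

rng = np.random.default_rng(0)


def train(patterns, active):
    """Hebbian outer-product learning; storage is distributed across all weights.
    Units with active == 0 contribute nothing (i.e. inactivated before learning)."""
    masked = patterns * active
    w = masked.T @ masked / len(patterns)
    np.fill_diagonal(w, 0.0)
    return w


def recall(w, cue, active, steps=20):
    """Iterative recall with the given units inactivated (clamped to 0)."""
    s = cue * active
    for _ in range(steps):
        s = np.sign(w @ s)
        s[s == 0] = 1          # break ties
        s *= active            # silenced units stay off
    return s


n_units, n_patterns = 120, 6
patterns = rng.choice([-1.0, 1.0], size=(n_patterns, n_units))
cue = patterns[0] * np.where(rng.random(n_units) < 0.1, -1, 1)   # noisy retrieval cue

for frac_lost in (0.0, 0.2, 0.4, 0.6):
    active = (rng.random(n_units) >= frac_lost).astype(float)
    # Inactivation AFTER learning: train on the intact network, recall on the damaged one.
    w_after = train(patterns, np.ones(n_units))
    # Inactivation BEFORE learning: the same units are silent during training too.
    w_before = train(patterns, active)
    for label, w in (("after learning", w_after), ("before learning", w_before)):
        out = recall(w, cue, active)
        # Fraction of the stored pattern correctly reconstructed (lost units count as errors).
        overlap = abs(out @ patterns[0]) / n_units
        print(f"{frac_lost:.0%} units lost, inactivated {label}: overlap = {overlap:.2f}")
```

In this toy network, recall degrades gradually as more units are lost, and the two conditions coincide exactly because a silenced unit contributes nothing to the active units whether its weights were formed or not, which mirrors the paper's observation that pre- and post-learning inactivation had similar effects.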

2013 ◽  
Vol 380-384 ◽  
pp. 421-424
Author(s):  
Jing Liu ◽  
Yu Chi Zhao ◽  
Xiao Hua Shi ◽  
Su Juan Liu

In recent years, using neural networks to control computers has become a very active research direction. The neural network is an emerging interdisciplinary field, and the way it processes information differs from traditional symbolic-logic systems. It has several distinctive properties: distributed storage and parallel processing of information, the unification of information storage with information processing, and the capacity for self-organization and self-learning. Neural networks have been applied widely in pattern recognition, signal processing, knowledge processing, expert systems, optimization, intelligent control, and other areas. They can handle problems involving complex environmental information, fuzzy background knowledge, and ill-defined inference rules, and they tolerate samples with relatively large defects and distortions, which makes neural-network-based recognition a very good choice. This thesis discusses the application of neural networks in computer control.
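
The claim about tolerance to defects and distortion is easy to demonstrate with a small sketch (not from the thesis; the 5x5 binary "templates", network size, and training setup are invented for illustration): a one-hidden-layer network trained on lightly corrupted patterns still recognizes heavily corrupted test samples, degrading gracefully rather than failing outright.

```python
import numpy as np

rng = np.random.default_rng(1)

# Three random 5x5 binary "template" patterns standing in for simple symbols.
templates = rng.integers(0, 2, size=(3, 25)).astype(float)
labels = np.eye(3)


def noisy(p, flip):
    """Return a distorted copy of pattern p with a fraction of bits flipped."""
    mask = rng.random(p.size) < flip
    return np.where(mask, 1 - p, p)


# One-hidden-layer network trained by plain gradient descent on cross-entropy.
W1 = rng.normal(0, 0.3, (25, 16))
b1 = np.zeros(16)
W2 = rng.normal(0, 0.3, (16, 3))
b2 = np.zeros(3)


def forward(x):
    h = np.tanh(x @ W1 + b1)
    z = h @ W2 + b2
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return h, e / e.sum(axis=-1, keepdims=True)


lr = 0.2
for _ in range(3000):
    x = np.array([noisy(t, 0.1) for t in templates])   # train on lightly distorted samples
    h, y = forward(x)
    dz = (y - labels) / len(x)
    dW2, db2 = h.T @ dz, dz.sum(0)
    dh = dz @ W2.T * (1 - h ** 2)
    dW1, db1 = x.T @ dh, dh.sum(0)
    W2 -= lr * dW2; b2 -= lr * db2; W1 -= lr * dW1; b1 -= lr * db1

# Evaluate on increasingly distorted samples: recognition degrades gracefully.
for flip in (0.0, 0.2, 0.3):
    test = np.array([noisy(t, flip) for t in templates for _ in range(50)])
    truth = np.repeat(np.arange(3), 50)
    _, y = forward(test)
    print(f"bit-flip rate {flip:.0%}: accuracy = {(y.argmax(1) == truth).mean():.2f}")
```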


2017 ◽  
Vol 372 (1715) ◽  
pp. 20160328 ◽  
Author(s):  
Kang K. L. Liu ◽  
Michael F. Hagan ◽  
John E. Lisman

Memory storage involves activity-dependent strengthening of synaptic transmission, a process termed long-term potentiation (LTP). The late phase of LTP is thought to encode long-term memory and involves structural processes that enlarge the synapse. Hence, understanding how synapse size is graded provides fundamental information about the information storage capability of synapses. Recent work using electron microscopy (EM) to quantify synapse dimensions has suggested that synapses may structurally encode as many as 26 functionally distinct states, which correspond to a series of proportionally spaced synapse sizes. Other recent evidence using super-resolution microscopy has revealed that synapses are composed of stereotyped nanoclusters of α-amino-3-hydroxy-5-methyl-4-isoxazolepropionic acid (AMPA) receptors and scaffolding proteins; furthermore, synapse size varies linearly with the number of nanoclusters. Here we have sought to develop a model of synapse structure and growth that is consistent with both the EM and super-resolution data. We argue that synapses are composed of modules consisting of matrix material and potentially one nanocluster. LTP induction can add a trans-synaptic nanocluster to a module, thereby converting a silent module to an AMPA functional module. LTP can also add modules by a linear process, thereby producing an approximately 10-fold gradation in synapse size and strength. This article is part of the themed issue ‘Integrating Hebbian and homeostatic plasticity’.
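
The modular growth rule can be caricatured in a few lines of Python (a toy sketch of the scheme described above, not the authors' quantitative model; treating each AMPA-functional module as contributing one fixed unit of strength is an assumption made here purely for illustration).

```python
import random

random.seed(0)


class Synapse:
    """Toy modular synapse: each module holds matrix material and at most one
    trans-synaptic nanocluster. Size tracks the module count; strength tracks
    the number of AMPA-functional (nanocluster-bearing) modules."""

    def __init__(self):
        self.modules = [False]   # start with one silent module

    def ltp_event(self):
        """One induction event either converts a silent module to a functional
        one, or, if none is silent, appends a new silent module (linear growth)."""
        silent = [i for i, functional in enumerate(self.modules) if not functional]
        if silent:
            self.modules[random.choice(silent)] = True
        else:
            self.modules.append(False)

    @property
    def size(self):
        return len(self.modules)

    @property
    def strength(self):
        return sum(self.modules)


syn = Synapse()
history = []
for _ in range(25):
    syn.ltp_event()
    history.append((syn.size, syn.strength))

print("(size, strength) after successive LTP events:")
print(history)
```

Because modules are added one at a time and each new nanocluster contributes a fixed increment, the reachable strengths form a discrete, finely graded ladder spanning roughly an order of magnitude in this toy, in the spirit of the EM and super-resolution observations cited above.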


1997 ◽  
Vol 10 (4) ◽  
pp. 109-115
Author(s):  
B. Gordon

Recent developments in the functional and neural bases of several aspects of memory are described including long term cortical memory storage, the transition from immediate to permanent memory mediated by medial temporal structures, working memory, memory retrieval, and implicit memory. These are linked to current data on the nature of anterograde and retrograde amnesia in the degenerative diseases, and also to issues in the clinical diagnosis of memory impairments. Understanding the bases of memory can inform the diagnosis of memory impairments in degenerative diseases, and the patterns of impairment seen in the degenerative diseases can help contribute to knowledge of the mechanisms of normal memory.


2017 ◽  
Vol 13 (2) ◽  
Author(s):  
Marc Ebner

Abstract. The human brain is able to learn language by processing written or spoken language. Recently, several deep neural networks have been successfully used for natural language generation. Although it is possible to train such networks, it remains unknown how these networks (or the brain) actually process language. A scalable method for distributed storage and recall of sentences within a neural network is presented. A corpus of 59 million words was used for training. A system using this method can efficiently identify sentences that can be considered reasonable replies to an input sentence. The system first selects a small number of seed words which occur with low frequency in the corpus. These seed words are then used to generate answer sentences. Possible answers are scored using statistical data also obtained from the corpus. A number of sample answers generated by the system are shown to illustrate how the method works.
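
The frequency-based seed selection and corpus-statistics scoring can be illustrated with a toy sketch. This is not the paper's neural implementation or its 59-million-word corpus; the miniature corpus, the stop-word list, and the co-occurrence score below are invented stand-ins for the general idea.

```python
from collections import Counter

# Miniature stand-in corpus (the real system used ~59 million words and a
# neural distributed store; this only illustrates the seed-word idea).
corpus_sentences = [
    "the cat sat on the mat",
    "the dog chased the cat across the garden",
    "a telescope reveals distant galaxies",
    "astronomers study galaxies with a large telescope",
    "the garden was full of flowers",
]
STOP = {"the", "a", "of", "on", "was", "with"}
freq = Counter(w for s in corpus_sentences for w in s.split())


def seed_words(sentence, k=2):
    """Pick up to k of the lowest-frequency known content words as seeds."""
    words = [w for w in sentence.lower().split() if w in freq and w not in STOP]
    return sorted(set(words), key=lambda w: freq[w])[:k]


def score_reply(seeds, reply):
    """Score a candidate reply by how often its words co-occur in the corpus
    with the seed words, down-weighting frequent words."""
    score = 0.0
    for s in corpus_sentences:
        sentence_words = set(s.split())
        if any(seed in sentence_words for seed in seeds):
            for w in set(reply.split()):
                if w in sentence_words and w not in STOP:
                    score += 1.0 / freq[w]
    return score


question = "what does a telescope show"
seeds = seed_words(question)
candidates = [
    "astronomers study galaxies with a large telescope",
    "the dog chased the cat across the garden",
]
print("seed words:", seeds)
for c in candidates:
    print(f"{score_reply(seeds, c):5.2f}  {c}")
```

Low-frequency words carry the most information about the topic of the input, which is why they make natural anchors for retrieving and scoring candidate replies.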


2021 ◽  
Vol 15 ◽  
Author(s):  
Minoo Sisakhti ◽  
Perminder S. Sachdev ◽  
Seyed Amir Hossein Batouli

One of the less well-understood aspects of memory function is the mechanism by which the brain responds to an increasing memory load, either during encoding or retrieval. Identifying the brain structures which manage this increasing cognitive demand would enhance our knowledge of human memory. Despite numerous studies of the effect of cognitive load on working memory processes, it is unclear whether these findings can be applied to long-term memory processes. We asked 32 healthy young volunteers to memorize all possible details of 24 images over a 12-day period ending 2 days before the fMRI scan. The images were of 12 categories relevant to daily events, with each category including a high-load and a low-load image. Behavioral assessments in a separate group of participants (n = 22) provided the average load of each image. During the fMRI scan, the participants had to retrieve these previously memorized images within 15 s, with their eyes closed. We observed seven brain structures showing the highest activation with increasing load of the retrieved images, viz. the parahippocampus, cerebellum, superior lateral occipital cortex, fusiform and lingual gyri, precuneus, and posterior cingulate gyrus. Some structures showed reduced activation when retrieving higher-load images, such as the anterior cingulate, insula, and supramarginal and postcentral gyri. The findings of this study revealed that a difficult-to-retrieve memory is handled mainly by elevating the activation of the responsible brain areas rather than by recruiting additional brain regions, which helps to clarify the long-term memory (LTM) retrieval process in the human brain.


2016 ◽  
Author(s):  
Adam Henry Marblestone ◽  
Greg Wayne ◽  
Konrad P Kording

Neuroscience has focused on the detailed implementation of computation, studying neural codes, dynamics and circuits. In machine learning, however, artificial neural networks tend to eschew precisely designed codes, dynamics or circuits in favor of brute force optimization of a cost function, often using simple and relatively uniform initial architectures. Two recent developments have emerged within machine learning that create an opportunity to connect these seemingly divergent perspectives. First, structured architectures are used, including dedicated systems for attention, recursion and various forms of short- and long-term memory storage. Second, cost functions and training procedures have become more complex and are varied across layers and over time. Here we think about the brain in terms of these ideas. We hypothesize that (1) the brain optimizes cost functions, (2) these cost functions are diverse and differ across brain locations and over development, and (3) optimization operates within a pre-structured architecture matched to the computational problems posed by behavior. Such a heterogeneously optimized system, enabled by a series of interacting cost functions, serves to make learning data-efficient and precisely targeted to the needs of the organism. We suggest directions by which neuroscience could seek to refine and test these hypotheses.
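
Hypothesis (2), that different cost functions act at different places and times, can be illustrated with a deliberately small sketch. The architecture, data, and two-phase schedule below are invented for illustration, not the authors' proposal in code: an encoder is first shaped by an unsupervised reconstruction cost, after which a readout on top of it is trained with a supervised cost.

```python
import numpy as np

rng = np.random.default_rng(2)

# Invented data: 20-dimensional inputs whose first five dimensions carry more
# variance, and a label that depends only on that high-variance subspace.
X = rng.normal(size=(500, 20))
X[:, :5] *= 2.0
y = (X[:, :5].sum(axis=1) > 0).astype(float)

d_in, d_hid = 20, 8
W_enc = rng.normal(0, 0.1, (d_in, d_hid))
W_dec = rng.normal(0, 0.1, (d_hid, d_in))

# Phase 1: an unsupervised cost (reconstruction error) shapes the encoder.
for _ in range(1000):
    H = X @ W_enc
    err = H @ W_dec - X
    W_dec -= 0.01 * H.T @ err / len(X)
    W_enc -= 0.01 * X.T @ (err @ W_dec.T) / len(X)

# Phase 2: a supervised cost (logistic loss) shapes only the readout weights.
w_out = np.zeros(d_hid)
H = X @ W_enc
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(H @ w_out)))
    w_out -= 0.1 * H.T @ (p - y) / len(X)

pred = (1.0 / (1.0 + np.exp(-(X @ W_enc @ w_out)))) > 0.5
print(f"readout accuracy after two-phase, two-cost training: {(pred == y).mean():.2f}")
```

The point is only the mechanics: two different objectives, applied to different parts of the system at different phases, jointly yield a useful representation and readout.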


2015 ◽  
Vol 26 (3) ◽  
Author(s):  
Zareen Amtul ◽  
Atta-ur-Rahman

Abstract. Deciphering the cellular and molecular mechanisms of memory has been an important topic encompassing the learning and memory domain as well as the neurodegenerative disorders. Synapses accumulate cognitive information through long-lasting alterations of their molecular and structural composition. Current memory storage models identify posttranslational modification as imperative for short-term information storage and mRNA translation for long-term information storage. However, the precise account of these modifications has not been summarized at the individual-synapse level. Therefore, herein we describe the spatiotemporal reorganization of synaptic plasticity at the dendritic spine level to elucidate the mechanism through which synaptic substructures are remodeled, though at the molecular level such mechanisms are still quite unclear. It is thus concluded that the existing mechanisms do not entirely explain memory storage processes. Further efforts are therefore encouraged to delineate the mechanism of neuronal connectivity at the chemical level as well, including inter- or intramolecular bonding patterns at the synaptic level, which may be a permissive and vital step of memory storage.


2000 ◽  
Vol 7 (1-2) ◽  
pp. 1-8 ◽  
Author(s):  
Teresa Montiel ◽  
Daniel Almeida ◽  
Iván Arango ◽  
Eduardo Calixto ◽  
César Casasola ◽  
...  

In electrophysiological terms, experimental models of durable information storage in the brain include long-term potentiation (LTP), long-term depression, and kindling. Protein synthesis correlates with these enduring processes. We propose a fourth example of long-lasting information storage in the brain, which we call the GABA-withdrawal syndrome (GWS). In rats, withdrawal of a chronic intracortical infusion of GABA, a ubiquitous inhibitory neurotransmitter, induced epileptogenesis at the infusion site. This overt GWS lasted for days. Anisomycin, a protein synthesis inhibitor, prevented the appearance of GWS in vivo. Hippocampal and neocortical slices showed a similar post-GABA hyperexcitability in vitro and an enhanced susceptibility to LTP induction. One to four months after the epileptic behavior disappeared, systemic administration of a subconvulsant dose of pentylenetetrazol produced the reappearance of paroxysmal activity. The long-lasting effects of tonic GABAA receptor stimulation may be involved in long-term information storage processes at the cortical level, whereas the cessation of GABAA receptor stimulation may be involved in chronic pathological conditions, such as epilepsy. Furthermore, we propose that GWS may represent a common key factor in the addiction to GABAergic agents (for example, barbiturates, benzodiazepines, and ethanol). GWS represents a novel form of neurono-glial plasticity. The mechanisms of this phenomenon remain to be understood.


1986 ◽  
Vol 113 (2_Suppla) ◽  
pp. S85-S94 ◽  
Author(s):  
Tj. B. van Wimersma Greidanus ◽  
J. P. H. Burbach ◽  
H. D. Veldhuis

Abstract. Vasopressin and oxytocin exert pronounced effects on behaviour by a direct action on the brain. A single injection of vasopressin results in a long-term inhibition of extinction of a conditioned avoidance response, suggesting that vasopressin triggers a long-term effect on the maintenance of a learned response, probably by facilitation of memory processes. In addition, vasopressin improves passive avoidance behaviour, delays extinction of appetitive discrimination tasks, affects approach behaviour to an imprinting stimulus in ducklings, improves copulation-rewarded behaviour of male rats in a T-maze, and prevents or reverses amnesia induced by electroconvulsive shock, CO2 inhalation, pentylenetetrazol or puromycin. The majority of these effects of vasopressin in the various and sometimes relatively complex tasks may be explained by stimulatory influences of this neuropeptide on memory processes. Generally, oxytocin exerts effects which are opposite to those of vasopressin, and it has been suggested that oxytocin may be an amnesic neuropeptide. Various limbic system structures seem to act as the anatomical substrate for the behavioural effects of vasopressin; in particular the amygdala, the dentate gyrus of the hippocampal complex, the ventral hippocampus and the dorsal septum seem to be involved. Evidence has been obtained from experiments with homozygous diabetes insipidus rats and from experiments in which antisera were applied that endogenous vasopressin and oxytocin play a physiological role in brain processes related to memory. Highly active fragments can be generated from vasopressin, and experiments in which a fragment of vasopressin ([pGlu4, Cyt6]AVP-(4–8)) as well as an AVP antagonist were used reveal that the vasopressin receptors mediating the behavioural effects are situated in the brain and differ in specificity from the peripheral (blood pressure) vasopressin receptors. Generally, the clinical data obtained so far with vasopressin treatment are in agreement with the results from animal experiments, and they support the notion of the involvement of vasopressin in memory function. The sometimes conflicting results reported on vasopressin effects in certain patients (Korsakoff or Alzheimer) may be due to the widespread pathology in these diseases.

