Translating Global Memory across Colonial Divides

Author(s):  
Lisa K. Taylor
2021 ◽  
Vol 25 (4) ◽  
pp. 1031-1045


Author(s):  
Helang Lai ◽  
Keke Wu ◽  
Lingli Li

Emotion recognition in conversations is crucial, as there is an urgent need to improve the overall experience of human-computer interactions. A promising direction in this field is to develop a model that can effectively extract adequate context for a test utterance. We introduce a novel model, termed hierarchical memory networks (HMN), to address the problem of recognizing utterance-level emotions. HMN divides the context into different aspects and employs different step lengths to represent the weights of these aspects. To model self-dependencies, HMN uses independent local memory networks for each aspect. To capture interpersonal dependencies, HMN employs global memory networks that integrate the local outputs into global storages. These storages generate contextual summaries and help to find the emotionally dependent utterance that is most relevant to the test utterance. With an attention-based multi-hop scheme, the storages are then merged with the test utterance by an addition operation at each iteration. Experiments on the IEMOCAP dataset show that our model outperforms the compared methods in accuracy.
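
To illustrate the attention-based multi-hop read-and-add mechanism described in the abstract (this is a minimal sketch, not the authors' released HMN code), the following NumPy snippet attends over a bank of context summaries and merges the attended result with the query by addition at each hop. The memory matrix, query vector, and hop count are hypothetical placeholders; a real HMN would build the memories with its local and global recurrent networks and learn projection weights.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def multi_hop_read(memory, query, hops=3):
    """Attention-based multi-hop reading of a memory bank.

    memory : (n_slots, d) array of context summaries (e.g. global storages)
    query  : (d,) representation of the test utterance
    hops   : number of attention iterations

    At each hop the query attends over all memory slots, the attended
    summary is added to the query, and the updated query drives the
    next hop -- mirroring the 'merge by addition' step in the abstract.
    """
    for _ in range(hops):
        scores = memory @ query        # relevance of each slot to the query
        weights = softmax(scores)      # attention distribution over slots
        summary = weights @ memory     # weighted contextual summary
        query = query + summary        # addition-based merge
    return query

# Toy usage with random context representations (dimensions are arbitrary).
rng = np.random.default_rng(0)
mem = rng.normal(size=(10, 64))        # 10 context utterances, 64-d each
q = rng.normal(size=64)                # test utterance representation
fused = multi_hop_read(mem, q, hops=3)
print(fused.shape)                     # (64,)
```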


2021 ◽  
Vol 0 (0) ◽  
Author(s):  
Hamish Patel ◽  
Reza Zamani

Long-term memories are thought to be stored in neurones and synapses that undergo physical changes, such as long-term potentiation (LTP), and these changes can be maintained for long periods of time. A candidate enzyme for the maintenance of LTP is protein kinase M zeta (PKMζ), a constitutively active protein kinase C isoform that is elevated during LTP and long-term memory maintenance. This paper reviews the evidence and controversies surrounding the role of PKMζ in the maintenance of long-term memory. PKMζ maintains synaptic potentiation by preventing AMPA receptor endocytosis and promoting stabilisation of dendritic spine growth. Inhibition of PKMζ with zeta-inhibitory peptide (ZIP) can reverse LTP and impair established long-term memories, although a deficit of memory retrieval cannot be ruled out. Furthermore, ZIP, and at high enough doses the scrambled ZIP control peptide, were recently shown to be neurotoxic, which may explain some of ZIP's memory-impairing effects. PKMζ knockout mice show normal learning and memory; however, this is likely due to compensation by protein kinase C iota/lambda (PKCι/λ), which is normally responsible for the induction of LTP. It is not clear how, or whether, this compensatory mechanism is activated under normal conditions. Future research should utilise inducible PKMζ knockdown in adult rodents to investigate whether PKMζ maintains memory in specific parts of the brain or whether it represents a global memory-maintenance molecule. These insights may inform future therapeutic targets for disorders of memory loss.


1996 ◽  
Vol 30 (5) ◽  
pp. 258-267 ◽  
Author(s):  
Hervé A. Jamrozik ◽  
Michael J. Feeley ◽  
Geoffrey M. Voelker ◽  
James Evans ◽  
Anna R. Karlin ◽  
...  

1992 ◽  
Vol 6 (1) ◽  
pp. 98-111 ◽  
Author(s):  
S. K. Kim ◽  
A. T. Chronopoulos

Main memory accesses in shared-memory systems and global communications (synchronizations) in message-passing systems decrease computation speed. In this paper, the standard Arnoldi algorithm for approximating a small number of eigenvalues with largest (or smallest) real parts of nonsymmetric large sparse matrices is restructured so that only one synchronization point is required per iteration: one global communication on a message-passing distributed-memory machine, or one global memory sweep on a shared-memory machine. We also introduce an s-step Arnoldi method for finding a few eigenvalues of nonsymmetric large sparse matrices. This method generates reduction matrices that are similar to those generated by the standard method. One iteration of the s-step Arnoldi algorithm corresponds to s iterations of the standard Arnoldi algorithm. The s-step method has improved data locality, minimized global communication, and superior parallel properties. These algorithms are implemented on a 64-node NCUBE/7 hypercube and a CRAY-2, and performance results are presented.
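
For readers unfamiliar with the baseline being restructured, here is a minimal NumPy sketch of the standard Arnoldi iteration in modified Gram-Schmidt form. It is not the paper's one-synchronization or s-step variant; it is the textbook algorithm, shown so the synchronization cost (one inner product per orthogonalization step) is visible. The matrix sizes and tolerance are arbitrary choices for illustration.

```python
import numpy as np

def arnoldi(A, v0, m):
    """Standard Arnoldi iteration (modified Gram-Schmidt).

    Builds an orthonormal basis V of the Krylov subspace
    span{v0, A v0, ..., A^(m-1) v0} and the (m+1) x m upper
    Hessenberg reduction matrix H with A @ V[:, :m] = V @ H.
    Each inner product below is a separate global reduction --
    the synchronization cost the paper's reformulation targets.
    """
    n = A.shape[0]
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    V[:, 0] = v0 / np.linalg.norm(v0)
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):
            H[i, j] = V[:, i] @ w           # one inner product = one sync
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:             # breakdown: invariant subspace found
            return V[:, :j + 1], H[:j + 1, :j + 1]
        V[:, j + 1] = w / H[j + 1, j]
    return V, H

# Toy usage: approximate a few eigenvalues with largest real parts.
rng = np.random.default_rng(1)
A = rng.normal(size=(200, 200))
V, H = arnoldi(A, rng.normal(size=200), m=30)
ritz = np.linalg.eigvals(H[:-1, :])         # Ritz values from the m x m block
print(sorted(ritz, key=lambda z: -z.real)[:3])
```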


2007 ◽  
Vol 37 (9) ◽  
pp. 1281-1291 ◽  
Author(s):  
Stella W. Y. Chan ◽  
Guy M. Goodwin ◽  
Catherine J. Harmer

Background: Cognitive theories associate depression with negative biases in information processing. Although negatively biased cognitions are well documented in depressed patients and to some extent in recovered patients, it remains unclear whether these abnormalities are present before the first depressive episode. Method: High neuroticism (N) is a well-recognized risk factor for depression. The current study therefore compared different aspects of emotional processing in 33 high-N never-depressed and 32 low-N matched volunteers. Awakening salivary cortisol, which is often elevated in severely depressed patients, was measured to explore the neurobiological substrate of neuroticism. Results: High-N volunteers showed increased processing of negative and/or decreased processing of positive information in emotional categorization and memory, facial expression recognition and emotion-potentiated startle (EPS), in the absence of global memory or executive deficits. By contrast, there was no evidence for effects of neuroticism on attentional bias (as measured with the dot-probe task), over-general autobiographical memory, or awakening cortisol levels. Conclusions: These results suggest that certain negative processing biases precede depression rather than arising as a result of depressive experience per se, and as such could in part mediate the vulnerability of high-N subjects to depression. Longitudinal studies are required to confirm that such cognitive vulnerabilities predict subsequent depression in individual subjects.


1992 ◽  
Vol 99 (3) ◽  
pp. 518-535 ◽  
Author(s):  
Roger Ratcliff ◽  
Ching-fan Sheu ◽  
Scott D. Gronlund
