Spontaneous Temporal Grouping Neural Network for Long-Term Memory Modeling

Author(s):  
Dongjing Shan ◽  
Xiongwei Zhang ◽  
Chao Zhang
2021 ◽  
Author(s):  
Juntao Han ◽  
Junwei Sun ◽  
Xiao Xiao ◽  
Peng Liu

2012 ◽  
Vol 23 (6) ◽  
pp. 971-983 ◽  
Author(s):  
V. A. Nguyen ◽  
J. A. Starzyk ◽  
Wooi-Boon Goh ◽  
D. Jachyra

2021 ◽  
pp. 174702182110308
Author(s):  
Dominic Guitard ◽  
Jean Saint-Aubin ◽  
Nelson Cowan

One commonly acknowledged role of working memory is to set up conditions for new learning. Yet, it has long been understood that there is not a perfect correspondence between conditions leading to good immediate recall from working memory and conditions leading to good delayed recall from long-term memory. Here, in six experiments, we investigated the relation between grouping effects in immediate and delayed reconstruction of order for word lists. There has been a striking absence of tests of grouping effects in long-term memory. In the first four experiments, items within groups were presented concurrently, which encourages associations between items in a group. Despite that presumably favorable situation for group learning, in Experiments 1 and 2 we found effects of grouping only in immediate order reconstruction and not in delayed reconstruction. When more processing time was allowed (Experiments 3 and 4), grouping effects were obtained in both immediate and delayed order reconstruction. Experiment 5 showed that, with items presented one at a time but with roughly the same amount of processing time and spatial separation as in the previous two experiments, grouping effects were obtained neither in immediate order reconstruction nor in delayed reconstruction. However, in Experiment 6, with a more salient manipulation of grouping, effects of grouping were obtained in immediate order reconstruction but not in delayed reconstruction. In sum, we demonstrated for the first time that there are mechanisms of temporal grouping that assist working memory but are relatively ineffective for long-term learning, in contrast to the more effective concurrent presentation.


2021 ◽  
Author(s):  
Haomei Duan ◽  
Jinghua Zhu

When user profiles are not available, recommendation based on anonymous sessions becomes particularly important; the goal is to predict the item a user will click next from the user's access sequence within a session. In recent years, with the development of recurrent neural networks, attention mechanisms, and graph neural networks, the performance of session-based recommendation has improved greatly. However, previous methods did not comprehensively consider a session's context dependencies or give priority to its short-term interest. We therefore propose a context-aware short-term interest first model (CASIF), which aims to improve recommendation accuracy by combining context with short-term interest. In CASIF, we dynamically construct a graph structure for each session sequence and capture rich context dependencies via a graph neural network (GNN); the resulting latent feature vectors serve as inputs to the next step. We then build the short-term interest first module, which captures the user's general interest from the session in the context of long-term memory while extracting the user's current interest from the last-clicked item. The short-term and long-term interests are then combined into a final interest representation and multiplied by the candidate item vectors to obtain recommendation probabilities. Finally, extensive experiments on two real-world datasets demonstrate the effectiveness of the proposed method.
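The pipeline described above (GNN-refined session item vectors, a last-click short-term interest, an attention-derived long-term interest, and candidate scoring by a dot product followed by softmax) can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' CASIF implementation: the GNN step is stood in for by a single linear refinement, and all class names, layer sizes, and tensor shapes are assumptions.

```python
# Minimal sketch (assumed, not the CASIF code) of fusing long-term and
# short-term session interest to score candidate items.
import torch
import torch.nn as nn
import torch.nn.functional as F

class InterestFusionScorer(nn.Module):
    def __init__(self, num_items: int, dim: int = 64):
        super().__init__()
        self.item_emb = nn.Embedding(num_items, dim)
        self.refine = nn.Linear(dim, dim)      # stand-in for GNN message passing
        self.attn_q = nn.Linear(dim, dim)      # attention query from the last click
        self.attn_k = nn.Linear(dim, dim)      # attention keys from session items
        self.fuse = nn.Linear(2 * dim, dim)    # combine long- and short-term interest

    def forward(self, session: torch.Tensor) -> torch.Tensor:
        # session: (batch, seq_len) item ids of one anonymous session
        h = torch.tanh(self.refine(self.item_emb(session)))   # contextualized item vectors
        short_term = h[:, -1, :]                               # current interest = last click
        scores = torch.einsum('bd,bld->bl',
                              self.attn_q(short_term), self.attn_k(h))
        alpha = F.softmax(scores, dim=-1)                      # attention over the session
        long_term = torch.einsum('bl,bld->bd', alpha, h)       # general session interest
        interest = self.fuse(torch.cat([long_term, short_term], dim=-1))
        # recommendation probabilities over all candidate items
        return F.softmax(interest @ self.item_emb.weight.T, dim=-1)

model = InterestFusionScorer(num_items=1000)
probs = model(torch.randint(0, 1000, (8, 5)))   # 8 sessions of length 5
```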


2020 ◽  
Vol 34 (05) ◽  
pp. 7822-7829
Author(s):  
Fengyu Guo ◽  
Ruifang He ◽  
Jianwu Dang ◽  
Jian Wang

Recognizing implicit discourse relations is a challenging task in discourse analysis, which aims to understand and infer the latent relations (e.g., temporal, comparison) between two discourse arguments. Most existing models largely focus on learning-based methods that utilize only intra-sentence textual information to identify discourse relations, ignoring the wider context beyond the discourse. Moreover, people comprehend the meanings and relations of discourses by relying heavily on their interconnected working memories (e.g., instant memory, long-term memory). Inspired by this, we propose a Knowledge-Enhanced Attentive Neural Network (KANN) framework to address these issues. Specifically, it establishes a mutual attention matrix to capture the reciprocal information between two arguments, serving as instant memory, while knowledge implicitly stated in the arguments is retrieved from an external knowledge source and encoded as inter-word semantic connection embeddings, which further construct a knowledge matrix serving as long-term memory. We devise a novel paradigm in which the two memories collaborate to enrich the argument representations in two ways: 1) integrating the knowledge matrix into the mutual attention matrix, which implicitly maps knowledge into the process of capturing asymmetric interactions between the two discourse arguments; and 2) directly concatenating the argument representations with the semantic connection embeddings, which explicitly supplements knowledge to help discourse understanding. Experimental results on the PDTB corpus also show that our KANN model is effective.
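As a rough illustration of the two fusion routes described above, the following sketch (an assumption, not the paper's implementation) builds a mutual attention matrix from two argument representations, adds a knowledge matrix to it before computing cross-argument attention, and then concatenates knowledge-derived embeddings onto the attended representations; all tensor shapes and function names are illustrative.

```python
# Minimal sketch (assumed) of knowledge-enhanced mutual attention between
# two discourse arguments: route 1 injects a knowledge matrix into the
# attention scores, route 2 concatenates knowledge embeddings explicitly.
import torch
import torch.nn.functional as F

def knowledge_enhanced_attention(arg1, arg2, knowledge, know_emb1, know_emb2):
    # arg1: (n, d) word vectors of argument 1; arg2: (m, d) word vectors of argument 2
    # knowledge: (n, m) inter-word semantic connection scores from an external source
    mutual = arg1 @ arg2.T                      # mutual attention matrix (instant memory)
    enhanced = mutual + knowledge               # route 1: inject long-term memory
    a1 = F.softmax(enhanced, dim=1) @ arg2      # argument-1 words attended over argument 2
    a2 = F.softmax(enhanced.T, dim=1) @ arg1    # argument-2 words attended over argument 1
    # route 2: explicitly concatenate knowledge embeddings with the representations
    rep1 = torch.cat([arg1, a1, know_emb1], dim=-1)
    rep2 = torch.cat([arg2, a2, know_emb2], dim=-1)
    return rep1, rep2

# toy usage: 6- and 8-word arguments, 32-dim word vectors, 16-dim knowledge embeddings
r1, r2 = knowledge_enhanced_attention(torch.randn(6, 32), torch.randn(8, 32),
                                      torch.randn(6, 8),
                                      torch.randn(6, 16), torch.randn(8, 16))
```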


2009 ◽  
Vol 30 (2) ◽  
pp. 284-298 ◽  
Author(s):  
Katrin Poettrich ◽  
Peter H. Weiss ◽  
Annett Werner ◽  
Silke Lux ◽  
Markus Donix ◽  
et al.
