Incremental Learning Algorithm for Feedforward Neural Network with Long-Term Memory

2002, Vol 38 (9), pp. 792-799
Author(s): Masataka KOBAYASHI, Seiichi OZAWA, Shigeo ABE

2021
Author(s): Juntao Han, Junwei Sun, Xiao Xiao, Peng Liu

2012, Vol 23 (6), pp. 971-983
Author(s): V. A. Nguyen, J. A. Starzyk, Wooi-Boon Goh, D. Jachyra

2021
Author(s): Haomei Duan, Jinghua Zhu

When user profiles are unavailable, recommendation based on anonymous sessions becomes particularly important: the goal is to predict the items a user may click next based on the user's access sequence over a period of time. In recent years, with the development of recurrent neural networks, attention mechanisms, and graph neural networks, the performance of session-based recommendation has improved greatly. However, previous methods did not comprehensively consider the context dependencies within a session together with the priority of its short-term interest. Therefore, we propose a context-aware short-term interest first model (CASIF). The aim of this paper is to improve the accuracy of recommendations by combining context and short-term interest. In CASIF, we dynamically construct a graph structure for each session sequence and capture rich context dependencies via a graph neural network (GNN); the resulting latent feature vectors serve as inputs to the next step. We then build the short-term interest first module, which captures the user's general interest from the session in the context of long-term memory while also extracting the user's current interest from the last-clicked item. Finally, the short-term and long-term interests are combined into a final interest representation and multiplied by the candidate item vectors to obtain recommendation probabilities. Extensive experiments on two real-world datasets demonstrate the effectiveness of the proposed method.
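To make the final recommendation step concrete, below is a minimal sketch in PyTorch of combining a session's short-term interest (the last click) with a long-term interest obtained by attention over all item states, then scoring candidate items. All names (ShortTermInterestFirst, hidden_dim, merge) are illustrative rather than taken from the paper, and the GNN encoder that produces the per-item latent vectors is treated as given.

```python
import torch
import torch.nn as nn


class ShortTermInterestFirst(nn.Module):
    """Combine long-term interest (attention over all item states)
    with short-term interest (the state of the last click)."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.w1 = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.w2 = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.q = nn.Linear(hidden_dim, 1, bias=False)
        self.merge = nn.Linear(2 * hidden_dim, hidden_dim, bias=False)

    def forward(self, item_states: torch.Tensor, candidates: torch.Tensor):
        # item_states: (batch, seq_len, hidden) latent vectors from the GNN
        # candidates:  (num_items, hidden) candidate item embeddings
        short_term = item_states[:, -1, :]               # last-click state
        # Attend every item state against the last click -> long-term interest
        scores = self.q(torch.sigmoid(
            self.w1(short_term).unsqueeze(1) + self.w2(item_states)))
        long_term = (scores * item_states).sum(dim=1)    # (batch, hidden)
        session = self.merge(torch.cat([short_term, long_term], dim=-1))
        # Recommendation probabilities: session interest times candidate vectors
        return torch.softmax(session @ candidates.t(), dim=-1)
```

The design choice worth noting is that the short-term state appears twice: once directly in the concatenation and once as the query for the attention that forms the long-term summary, which is one common way to give the most recent click priority over the rest of the session.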


2020, Vol 34 (05), pp. 7822-7829
Author(s): Fengyu Guo, Ruifang He, Jianwu Dang, Jian Wang

Recognizing implicit discourse relations is a challenging task in discourse analysis, which aims to understand and infer the latent relations between two discourse arguments, such as temporal or comparison relations. Most present models largely focus on learning-based methods that utilize only intra-sentence textual information to identify discourse relations, ignoring the wider context beyond the discourse. Moreover, people comprehend the meanings and relations of discourses by relying heavily on their interconnected working memories (e.g., instant memory and long-term memory). Inspired by this, we propose a Knowledge-Enhanced Attentive Neural Network (KANN) framework to address these issues. Specifically, it establishes a mutual attention matrix to capture the reciprocal information between the two arguments, serving as instant memory, while knowledge implicitly stated in the arguments is retrieved from an external knowledge source and encoded as inter-word semantic connection embeddings, which in turn construct a knowledge matrix serving as long-term memory. We devise a novel paradigm in which the two memories collaborate to enrich the argument representations in two ways: 1) integrating the knowledge matrix into the mutual attention matrix, which implicitly maps knowledge into the process of capturing asymmetric interactions between the two discourse arguments; and 2) directly concatenating the argument representations with the semantic connection embeddings, which explicitly supplements knowledge to aid discourse understanding. Experimental results on the PDTB show that our KANN model is effective.
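As a rough illustration of the two collaboration routes the abstract describes, here is a hedged PyTorch sketch: route 1 adds a knowledge matrix to a bilinear mutual attention matrix; route 2 concatenates argument representations with the semantic connection embeddings. All names are illustrative assumptions, and the external knowledge retriever that yields the knowledge matrix and connection embeddings is treated as given.

```python
import torch
import torch.nn as nn


class KnowledgeEnhancedAttention(nn.Module):
    """Fuse a mutual attention matrix (instant memory) between two
    discourse arguments with a knowledge matrix (long-term memory)."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        # Bilinear form for word-by-word mutual attention
        self.bilinear = nn.Parameter(torch.randn(hidden_dim, hidden_dim) * 0.01)

    def forward(self, arg1, arg2, knowledge_matrix, connection_emb):
        # arg1: (batch, len1, hidden), arg2: (batch, len2, hidden)
        # knowledge_matrix: (batch, len1, len2) inter-word relatedness scores
        # connection_emb:   (batch, len1, hidden) semantic connection embeddings
        # Route 1: knowledge matrix integrated into the mutual attention matrix
        mutual = torch.einsum('bik,kl,bjl->bij', arg1, self.bilinear, arg2)
        attn = torch.softmax(mutual + knowledge_matrix, dim=-1)
        arg1_ctx = attn @ arg2                 # knowledge-aware view of arg1
        # Route 2: explicit concatenation with the connection embeddings
        arg1_repr = torch.cat([arg1_ctx, connection_emb], dim=-1)
        return arg1_repr                       # fed to a relation classifier
```

In this sketch the addition inside the softmax is what makes the knowledge influence the attention distribution implicitly, while the concatenation supplies the same knowledge explicitly to whatever classifier consumes the representation.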

