TLSAN: Time-aware Long- and Short-term Attention Network for Next-item Recommendation

2021 ◽  
Author(s):  
Jianqing Zhang ◽  
Dongjing Wang ◽  
Dongjin Yu
2021 ◽  
Vol 187 ◽  
pp. 106316
Author(s):  
Ling Yang ◽  
Huihui Yu ◽  
Yuelan Cheng ◽  
Siyuan Mei ◽  
Yanqing Duan ◽  
...  

2015 ◽  
Vol 26 (09) ◽  
pp. 1550102 ◽  
Author(s):  
Wen-Jun Li ◽  
Yuan-Yuan Xu ◽  
Qiang Dong ◽  
Jun-Lin Zhou ◽  
Yan Fu

Traditional recommender algorithms usually employ early and recent records indiscriminately, overlooking the change of user interests over time. In this paper, we show that a user's interests remain stable over a short-term interval but drift over a long-term period. Based on this observation, we propose a time-aware diffusion-based (TaDb) recommender algorithm, which, during the diffusion process, assigns different temporal weights to the leading links existing before the target user's collection and the following links appearing after it. Experiments on four real datasets (Netflix, MovieLens, FriendFeed, and Delicious) show that TaDb significantly improves prediction accuracy compared with algorithms that do not consider temporal effects.
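The temporal-weighting idea can be sketched as follows: in a mass-diffusion recommender on the user-item bipartite graph, each link's contribution is discounted by its age before resources spread. The exponential decay form, the `alpha` parameter, and the function name below are illustrative assumptions, not the paper's exact leading-/following-link weighting scheme.

```python
import numpy as np

def time_aware_diffusion_scores(R, T, u, t_now, alpha=0.1, eps=1e-12):
    """Mass-diffusion item scores for user u, with each link weighted by
    exp(-alpha * age). A sketch of time-aware diffusion; TaDb's exact
    treatment of leading vs. following links differs.

    R: (n_users, n_items) 0/1 interaction matrix
    T: (n_users, n_items) interaction timestamps (ignored where R == 0)
    """
    W = R * np.exp(-alpha * (t_now - T))   # temporally weighted links
    item_deg = W.sum(axis=0) + eps         # weighted item degrees
    user_deg = W.sum(axis=1) + eps         # weighted user degrees
    f0 = W[u]                              # initial resource on u's items
    user_res = (W / item_deg) @ f0         # items -> users diffusion step
    return W.T @ (user_res / user_deg)     # users -> items diffusion step
```

With `alpha = 0` all link weights equal one and the sketch reduces to a standard, time-agnostic mass-diffusion baseline, i.e. the kind of algorithm the abstract compares against.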


Author(s):  
Yuan Zhang ◽  
Xi Yang ◽  
Julie Ivy ◽  
Min Chi

Modeling patient disease progression using Electronic Health Records (EHRs) is critical for assisting clinical decision making. Long Short-Term Memory (LSTM) is an effective model for sequential data such as EHRs, but it faces two major limitations when applied to them: it cannot interpret its prediction results, and it ignores the irregular time intervals between consecutive events. To tackle these limitations, we propose ATTAIN, an attention-based time-aware LSTM network, which improves the interpretability of LSTM and identifies the critical previous events for the current diagnosis by modeling the inherent time irregularity. We validate ATTAIN on modeling the progression of an extremely challenging disease, septic shock, using real-world EHRs. Our results demonstrate that the proposed framework outperforms state-of-the-art models such as RETAIN and T-LSTM. Moreover, the generated interpretative time-aware attention weights shed some light on the progression behaviors of septic shock.
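One way to make the time-irregularity handling concrete: score past event embeddings with an attention mechanism and subtract a penalty proportional to each event's time gap, so that, content being equal, older events receive less weight. The linear-penalty form and all names below are a hypothetical simplification, not ATTAIN's learned parameterization.

```python
import numpy as np

def time_aware_attention(H, gaps, w, decay=0.5):
    """Attention over past event embeddings H (n_events, d): content score
    H @ w minus a time-gap penalty, softmax-normalized. Returns the
    attention weights and the attended context vector."""
    logits = H @ w - decay * gaps       # penalize temporally distant events
    a = np.exp(logits - logits.max())   # numerically stable softmax
    a /= a.sum()
    return a, a @ H                     # weights, context vector
```

For two events with identical content, the one with the smaller time gap receives the larger attention weight, which is the qualitative behavior the abstract describes.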


2021 ◽  
Vol 423 ◽  
pp. 580-589
Author(s):  
Chengfeng Xu ◽  
Jian Feng ◽  
Pengpeng Zhao ◽  
Fuzhen Zhuang ◽  
Deqing Wang ◽  
...  
Author(s):  
Ran Cui ◽  
Chirath Hettiarachchi ◽  
Christopher J Nolan ◽  
Elena Daskalaki ◽  
Hanna Suominen

Author(s):  
Zhitao Wang ◽  
Wenjie Li

A series of recent studies formulated the diffusion prediction problem as a sequence prediction task and proposed several sequential models based on recurrent neural networks. However, real diffusion cascades exhibit non-sequential properties that do not strictly follow the sequential assumptions of previous work. In this paper, we propose a hierarchical diffusion attention network (HiDAN), which adopts a non-sequential framework with two levels of attention, for diffusion prediction. At the user level, a dependency attention mechanism dynamically captures historical user-to-user dependencies and extracts dependency-aware user information. At the cascade (i.e., sequence) level, a time-aware influence attention infers future users' possible dependencies on historical users by considering both inherent user importance and time-decay effects. Evaluations on three real diffusion datasets demonstrate that HiDAN is significantly more effective and efficient than state-of-the-art sequential models. Further case studies illustrate that HiDAN can accurately capture diffusion dependencies.
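The cascade-level idea of combining inherent user importance with time decay can be sketched as below; the exponential decay factor and the renormalization are assumptions chosen for illustration, not HiDAN's exact formulation.

```python
import numpy as np

def influence_attention(importance, gaps, lam=0.3):
    """Time-aware influence weights over historical cascade users:
    softmax of inherent importance scores, modulated by an exponential
    time-decay factor exp(-lam * gap), then renormalized to sum to one."""
    base = np.exp(importance - importance.max())  # stable softmax numerator
    w = base * np.exp(-lam * gaps)                # apply time decay
    return w / w.sum()
```

When all importance scores are equal, the weights fall back to pure recency: the most recent historical user in the cascade gets the largest influence weight.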

