learning method
Recently Published Documents

2022, Vol 40 (4), pp. 1-28
Peng Zhang, Baoxi Liu, Tun Lu, Xianghua Ding, Hansu Gu

User-generated content (UGC) in social media is a direct expression of users’ interests, preferences, and opinions. User behavior prediction based on UGC has been increasingly investigated in recent years. Compared to learning a person’s behavioral patterns in each social media site separately, jointly predicting user behavior across multiple social media sites so that the sites complement each other (cross-site user behavior prediction) can be more accurate. However, cross-site user behavior prediction based on UGC is a challenging task due to the difficulty of cross-site data sampling, the complexity of UGC modeling, and the uncertainty of knowledge sharing among different sites. To address these problems, we propose a Cross-Site Multi-Task (CSMT) learning method that jointly predicts user behavior in multiple social media sites. CSMT derives mainly from the hierarchical attention network and multi-task learning. With this method, the UGC in each social media site obtains fine-grained representations in terms of words, topics, posts, hashtags, and time slices, as well as the relevance among them, and prediction tasks in different social media sites can be implemented jointly and complement each other. Using two cross-site datasets sampled from Weibo, Douban, Facebook, and Twitter, we validate our method’s superiority on several classification metrics compared with existing related methods.
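The multi-task idea in the abstract can be sketched minimally: a shared encoder produces one UGC representation, each social media site gets its own prediction head, and a joint loss sums the per-site losses so training on one site benefits the others. All names, shapes, and the plain linear layers below are illustrative assumptions, not the paper's actual CSMT architecture (which uses hierarchical attention):

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Numerically stable softmax over the last axis.
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

class SharedEncoder:
    """Shared UGC representation used by every site (toy linear layer)."""
    def __init__(self, in_dim, hid_dim):
        self.W = rng.normal(scale=0.1, size=(in_dim, hid_dim))
    def __call__(self, x):
        return np.tanh(x @ self.W)

class SiteHead:
    """Per-site behavior classifier on top of the shared representation."""
    def __init__(self, hid_dim, n_classes):
        self.W = rng.normal(scale=0.1, size=(hid_dim, n_classes))
    def __call__(self, h):
        return softmax(h @ self.W)

def joint_loss(probs_per_site, labels_per_site):
    # Multi-task objective: sum of per-site cross-entropy losses
    # (equal task weights, an assumption for this sketch).
    total = 0.0
    for p, y in zip(probs_per_site, labels_per_site):
        total += -np.log(p[np.arange(len(y)), y] + 1e-12).mean()
    return total

encoder = SharedEncoder(in_dim=16, hid_dim=8)
heads = [SiteHead(8, 3), SiteHead(8, 3)]       # e.g. two sites, 3 behaviors
x = rng.normal(size=(4, 16))                    # 4 toy UGC feature vectors
h = encoder(x)                                  # one shared representation
probs = [head(h) for head in heads]             # per-site predictions
labels = [np.array([0, 1, 2, 0]), np.array([1, 1, 0, 2])]
loss = joint_loss(probs, labels)
```

Because the gradient of `loss` flows through the shared encoder from every head, each site's supervision shapes the representation used by all of them, which is the "complement each other" effect the abstract describes.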

2022, Vol 74, pp. 374-382
Zhihang Li, Qian Tang, Sibao Wang, Penghui Zhang

2022, Vol 73, pp. 102227
Rong Zhang, Qibing Lv, Jie Li, Jinsong Bao, Tianyuan Liu

2022, Vol 166, pp. 108771
Zhichao Wang, Hong Xia, Jiyu Zhang, M. Annor-Nyarko, Shaomin Zhu

2022, Vol 40 (1), pp. 1-24
Seyed Ali Bahrainian, George Zerveas, Fabio Crestani, Carsten Eickhoff

Neural sequence-to-sequence models are the state-of-the-art approach to abstractive summarization of textual documents, producing condensed versions of source narratives without being restricted to words from the original text. Despite advances in abstractive summarization, custom generation of summaries (e.g., tailored to a user’s preference) remains unexplored. In this article, we present CATS, an abstractive neural summarization model that summarizes content in a sequence-to-sequence fashion while introducing a new mechanism to control the underlying latent topic distribution of the produced summaries. We empirically illustrate the efficacy of our model in producing customized summaries and present findings that facilitate the design of such systems. We use the well-known CNN/DailyMail dataset to evaluate our model. Furthermore, we present a transfer-learning method and demonstrate the effectiveness of our approach in a low-resource setting, i.e., abstractive summarization of meeting minutes, where combining the main available meeting-transcript datasets, AMI and International Computer Science Institute (ICSI), yields merely a few hundred training documents.
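The topic-control idea can be illustrated with a toy sketch: bias a decoder's next-word distribution toward a requested topic by mixing in a topic-to-word distribution. The vocabulary, topic matrix, and mixing weight `alpha` below are hypothetical; CATS's actual mechanism conditions a learned latent topic distribution inside a full seq2seq model rather than interpolating at the output layer:

```python
import numpy as np

vocab = ["game", "score", "election", "vote", "the", "a"]

# Rows: topics (0 = sports, 1 = politics); columns: P(word | topic).
# Toy numbers chosen for illustration only.
topic_word = np.array([
    [0.4, 0.4, 0.0, 0.0, 0.1, 0.1],   # sports topic
    [0.0, 0.0, 0.4, 0.4, 0.1, 0.1],   # politics topic
])

def topic_biased_probs(decoder_logits, topic_id, alpha=0.5):
    """Mix the decoder's base distribution with the chosen topic's
    word distribution; alpha controls the strength of topic steering."""
    base = np.exp(decoder_logits - decoder_logits.max())
    base /= base.sum()
    return (1 - alpha) * base + alpha * topic_word[topic_id]

logits = np.zeros(len(vocab))          # uniform base decoder for the demo
p_sports = topic_biased_probs(logits, topic_id=0)
p_politics = topic_biased_probs(logits, topic_id=1)
```

With a uniform base distribution, the sports-conditioned mixture puts most mass on "game"/"score" and the politics-conditioned one on "election"/"vote", which is the customization effect the abstract describes, reduced to one decoding step.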
