Interactive Multi-Head Attention Networks for Aspect-Level Sentiment Classification

IEEE Access, 2019, Vol 7, pp. 160017-160028
Author(s): Qiuyue Zhang, Ran Lu, Qicai Wang, Zhenfang Zhu, Peiyu Liu
2019, Vol 178, pp. 61-73
Author(s): Jie Xu, Feiran Huang, Xiaoming Zhang, Senzhang Wang, Chaozhuo Li, ...

2020
Author(s): Hao Wang, Shuai Wang, Sahisnu Mazumder, Bing Liu, Yan Yang, ...

2020, Vol 10 (6), pp. 2052
Author(s): Dianyuan Zhang, Zhenfang Zhu, Qiang Lu, Hongli Pei, Wenqing Wu, ...

Aspect-Based (also known as aspect-level) Sentiment Classification (ABSC) aims at determining the sentiment polarity of a particular target in a sentence. With the successful application of attention networks in multiple fields, attention-based ABSC has aroused great interest. However, most previous methods are difficult to parallelize and do not sufficiently obtain and fuse the interactive information between the target and its context. In this paper, we propose a Multiple Interactive Attention Network (MIN). First, we use the Bidirectional Encoder Representations from Transformers (BERT) model to pre-process the data. Then, we use a partial transformer to obtain the hidden states in parallel. Finally, we take the target words and the context words as the core to obtain and fuse the interactive information. Experimental results on different datasets show that our model is much more effective.
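The interaction-and-fusion step described in this abstract can be illustrated with a short sketch. The following PyTorch-style module is a minimal sketch under stated assumptions, not the authors' released code: it assumes context and target token representations have already been produced (e.g. by BERT followed by a transformer encoder layer), and the module name, mean-pooling, and final projection are hypothetical simplifications of the fusion the abstract describes.

# Minimal sketch of interactive attention between context and target
# representations (hypothetical names; not the authors' implementation).
import torch
import torch.nn as nn
import torch.nn.functional as F


class InteractiveAttention(nn.Module):
    """Attend from context to target and from target to context, then fuse."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.proj = nn.Linear(4 * hidden_dim, hidden_dim)

    def forward(self, context: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        # context: (batch, ctx_len, hidden), target: (batch, tgt_len, hidden)
        # Pairwise interaction scores between every context and target token.
        scores = torch.bmm(context, target.transpose(1, 2))          # (batch, ctx_len, tgt_len)

        # Context-to-target attention: each context token summarizes the target.
        ctx2tgt = torch.bmm(F.softmax(scores, dim=-1), target)       # (batch, ctx_len, hidden)
        # Target-to-context attention: each target token summarizes the context.
        tgt2ctx = torch.bmm(F.softmax(scores.transpose(1, 2), dim=-1), context)

        # Pool both views and fuse them into a single sentence-level vector,
        # together with plain mean-pooled context and target representations.
        ctx_vec = ctx2tgt.mean(dim=1)                                 # (batch, hidden)
        tgt_vec = tgt2ctx.mean(dim=1)
        fused = torch.cat([context.mean(dim=1), target.mean(dim=1),
                           ctx_vec, tgt_vec], dim=-1)
        return torch.tanh(self.proj(fused))                          # (batch, hidden)

The fused vector would then feed a small classifier over the sentiment classes; how the paper actually pools and combines the two attention directions may differ from this sketch.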


Author(s): Jingjing Wang, Jie Li, Shoushan Li, Yangyang Kang, Min Zhang, ...

Aspect sentiment classification, a challenging task in sentiment analysis, has been attracting increasing attention in recent years. In this paper, we highlight the need to incorporate the importance degrees of both words and clauses inside a sentence, and propose a hierarchical network with word-level and clause-level attention for aspect sentiment classification. Specifically, we first adopt sentence-level discourse segmentation to segment a sentence into several clauses. Then, we leverage multiple Bi-directional LSTM layers to encode all clauses and propose a word-level attention layer to capture the importance degrees of the words in each clause. Finally, we leverage another Bi-directional LSTM layer to encode the outputs of the former layers and propose a clause-level attention layer to capture the importance degrees of all clauses inside the sentence. Experimental results on the laptop and restaurant datasets from SemEval-2015 demonstrate the effectiveness of our proposed approach to aspect sentiment classification.
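The word-level and clause-level attention hierarchy described in this abstract can be sketched as follows. This is a minimal sketch with hypothetical class and parameter names, not the authors' released code: it assumes the sentence has already been segmented into clauses and padded to fixed lengths, and it omits how the aspect term itself is injected into the attention, which the paper handles but the abstract does not detail.

# Minimal sketch of hierarchical word-level and clause-level attention
# over BiLSTM encoders (hypothetical names; not the authors' implementation).
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentionPool(nn.Module):
    """Score each timestep with a learned query and return the weighted sum."""

    def __init__(self, dim: int):
        super().__init__()
        self.query = nn.Linear(dim, 1, bias=False)

    def forward(self, states: torch.Tensor) -> torch.Tensor:
        # states: (batch, steps, dim)
        weights = F.softmax(self.query(states).squeeze(-1), dim=-1)   # (batch, steps)
        return torch.bmm(weights.unsqueeze(1), states).squeeze(1)     # (batch, dim)


class HierarchicalClauseAttention(nn.Module):
    def __init__(self, emb_dim: int, hidden_dim: int, num_classes: int = 3):
        super().__init__()
        self.word_lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.word_attn = AttentionPool(2 * hidden_dim)
        self.clause_lstm = nn.LSTM(2 * hidden_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.clause_attn = AttentionPool(2 * hidden_dim)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, clauses: torch.Tensor) -> torch.Tensor:
        # clauses: (batch, n_clauses, n_words, emb_dim) word embeddings per clause.
        b, n_clauses, n_words, emb_dim = clauses.shape

        # Word-level BiLSTM + attention, applied to every clause independently.
        words = clauses.view(b * n_clauses, n_words, emb_dim)
        word_states, _ = self.word_lstm(words)                        # (b*n_clauses, n_words, 2h)
        clause_vecs = self.word_attn(word_states)                     # (b*n_clauses, 2h)

        # Clause-level BiLSTM + attention over the clause vectors.
        clause_vecs = clause_vecs.view(b, n_clauses, -1)
        clause_states, _ = self.clause_lstm(clause_vecs)              # (b, n_clauses, 2h)
        sentence_vec = self.clause_attn(clause_states)                # (b, 2h)
        return self.classifier(sentence_vec)                          # (b, num_classes)

The two attention layers correspond to the two importance degrees the abstract names: word importance within each clause and clause importance within the sentence.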

