Sentiment Analysis Based on Hybrid Bi-attention Mechanism in Mobile Application

Author(s):  
Pengcheng Zhu ◽  
Yujiu Yang ◽  
Yi Liu

Symmetry ◽  
2019 ◽  
Vol 11 (1) ◽  
pp. 115 ◽  
Author(s):  
Yaocheng Zhang ◽  
Wei Ren ◽  
Tianqing Zhu ◽  
Ehoche Faith

The development of the mobile internet has led to a massive amount of data being generated from mobile devices daily, which has become a source for analyzing human behavior and trends in public sentiment. In this paper, we build a system called MoSa (Mobile Sentiment analysis) to analyze these data. In this system, sentiment analysis is applied to news comments on the THAAD (Terminal High Altitude Area Defense) event from Toutiao, with algorithms calculating the sentiment value of each comment. The approach is based on HowNet; after comparing different sentiment dictionaries, we find that the method proposed in this paper, which uses a mixed sentiment dictionary, achieves higher accuracy in analyzing the sentiment tendency of comments. We then statistically analyze the relevant attributes of the comments and their sentiment values and find that the standard deviation of the comments' sentiment values can quickly reflect sentiment changes among the public. In addition, we derive from the data several models that reflect specific characteristics, and we find that the intrinsic characteristics of situational awareness exhibit implicit symmetry. By using our system, people can obtain practical results to guide interaction design in applications including the mobile Internet, social networks, and blockchain-based crowdsourcing.
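A minimal sketch of the kind of dictionary-based scoring this abstract describes, assuming a hypothetical mixed sentiment lexicon that maps words to polarity values; the actual HowNet-based dictionary, weighting scheme, and Chinese preprocessing are not specified here, so the entries and negation handling below are purely illustrative.

```python
import statistics

# Hypothetical mixed sentiment lexicon: word -> polarity value in [-1, 1].
# A real system would merge HowNet with other dictionaries; entries here are illustrative.
MIXED_LEXICON = {
    "good": 0.8, "great": 0.9, "bad": -0.7, "terrible": -0.9, "okay": 0.1,
}
NEGATIONS = {"not", "never", "no"}

def comment_sentiment(tokens):
    """Sum lexicon polarities, flipping the sign after a negation word."""
    score, flip = 0.0, 1.0
    for tok in tokens:
        if tok in NEGATIONS:
            flip = -1.0
            continue
        if tok in MIXED_LEXICON:
            score += flip * MIXED_LEXICON[tok]
            flip = 1.0
    return score

def sentiment_std(comments):
    """Standard deviation of per-comment sentiment values, used here as a
    quick indicator of shifts in public sentiment across a comment stream."""
    values = [comment_sentiment(c.split()) for c in comments]
    return statistics.pstdev(values) if values else 0.0

if __name__ == "__main__":
    comments = ["the decision is not good", "great news", "terrible outcome"]
    print([comment_sentiment(c.split()) for c in comments])
    print(sentiment_std(comments))
```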


2021 ◽  
Vol 2005 (1) ◽  
pp. 012010
Author(s):  
Kaixuan Yu ◽  
Yachao Li ◽  
Dongsheng Zhang

Information ◽  
2020 ◽  
Vol 11 (5) ◽  
pp. 280
Author(s):  
Shaoxiu Wang ◽  
Yonghua Zhu ◽  
Wenjing Gao ◽  
Meng Cao ◽  
Mengyao Li

The sentiment analysis of microblog text has always been a challenging research field due to its limited and complex contextual information. However, most existing sentiment analysis methods for microblogs focus on classifying the polarity of emotional keywords while ignoring both the transitional or progressive impact that words in different positions of the Chinese syntactic structure have on global sentiment and the utilization of emojis. To this end, we propose the emotion-semantic-enhanced bidirectional long short-term memory (BiLSTM) network with a multi-head attention mechanism (EBILSTM-MH) for sentiment analysis. This model uses a BiLSTM to learn feature representations of input texts, given their word embeddings. Subsequently, the attention mechanism is used to assign an attentive weight to each word based on the impact of emojis. These attentive weights are combined with the output of the hidden layer to obtain the feature representation of posts. Finally, the sentiment polarity of a microblog can be obtained through the dense connection layer. The experimental results show the feasibility of our proposed model for microblog sentiment analysis when compared with other baseline models.
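A compact PyTorch sketch of a BiLSTM encoder followed by multi-head attention over its hidden states, in the spirit of the EBILSTM-MH architecture described above; the emoji-aware weighting, vocabulary, and hyperparameters are placeholders rather than the authors' configuration.

```python
import torch
import torch.nn as nn

class BiLSTMMultiHeadAttention(nn.Module):
    """BiLSTM encoder + multi-head self-attention + dense classifier (illustrative)."""
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128, heads=4, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.attn = nn.MultiheadAttention(2 * hidden_dim, heads, batch_first=True)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids, pad_mask=None):
        # token_ids: (batch, seq_len); pad_mask: True where the position is padding.
        x = self.embedding(token_ids)
        h, _ = self.bilstm(x)                        # (batch, seq_len, 2*hidden)
        attn_out, _ = self.attn(h, h, h, key_padding_mask=pad_mask)
        pooled = attn_out.mean(dim=1)                # average the attended states
        return self.classifier(pooled)

if __name__ == "__main__":
    model = BiLSTMMultiHeadAttention(vocab_size=5000)
    logits = model(torch.randint(1, 5000, (8, 20)))
    print(logits.shape)  # torch.Size([8, 2])
```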


2020 ◽  
Vol 25 (4) ◽  
pp. 1208-1219
Author(s):  
Buqing Cao ◽  
Junjie Chen ◽  
Jianxun Liu ◽  
Yiping Wen

Author(s):  
Bowen Xing ◽  
Lejian Liao ◽  
Dandan Song ◽  
Jingang Wang ◽  
Fuzheng Zhang ◽  
...  

Aspect-based sentiment analysis (ABSA) aims to predict fine-grained sentiments of comments with respect to given aspect terms or categories. In previous ABSA methods, the importance of the aspect has been recognized and verified. Most existing LSTM-based models take the aspect into account via the attention mechanism, where the attention weights are calculated after the context is modeled in the form of contextual vectors. However, aspect-related information may already be discarded and aspect-irrelevant information may be retained in classic LSTM cells during the context modeling process, which leaves room to generate more effective context representations. This paper proposes a novel variant of LSTM, termed aspect-aware LSTM (AA-LSTM), which incorporates aspect information into the LSTM cells in the context modeling stage, before the attention mechanism. Our AA-LSTM can therefore dynamically produce aspect-aware contextual representations. We experiment with several representative LSTM-based models by replacing the classic LSTM cells with AA-LSTM cells. Experimental results on the SemEval-2014 datasets demonstrate the effectiveness of AA-LSTM.
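One way to read "incorporating aspect information into the LSTM cells" is to let a pooled aspect vector contribute to the gate computations. The sketch below adds an aspect projection to the combined gate pre-activations of a single LSTM cell; the exact gating used in AA-LSTM may differ, so treat this as an assumption for illustration rather than the paper's formulation.

```python
import torch
import torch.nn as nn

class AspectAwareLSTMCell(nn.Module):
    """LSTM cell whose gates also condition on an aspect vector (illustrative sketch)."""
    def __init__(self, input_dim, hidden_dim, aspect_dim):
        super().__init__()
        # Standard LSTM projections plus an extra projection of the aspect vector.
        self.x2gates = nn.Linear(input_dim, 4 * hidden_dim)
        self.h2gates = nn.Linear(hidden_dim, 4 * hidden_dim)
        self.a2gates = nn.Linear(aspect_dim, 4 * hidden_dim, bias=False)

    def forward(self, x_t, aspect, state):
        h_prev, c_prev = state
        gates = self.x2gates(x_t) + self.h2gates(h_prev) + self.a2gates(aspect)
        i, f, g, o = gates.chunk(4, dim=-1)
        i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
        c_t = f * c_prev + i * torch.tanh(g)
        h_t = o * torch.tanh(c_t)
        return h_t, c_t

if __name__ == "__main__":
    cell = AspectAwareLSTMCell(input_dim=100, hidden_dim=64, aspect_dim=100)
    h = c = torch.zeros(8, 64)
    x = torch.randn(8, 100)          # one time step of context embeddings
    aspect = torch.randn(8, 100)     # pooled aspect-term embedding
    h, c = cell(x, aspect, (h, c))
    print(h.shape)  # torch.Size([8, 64])
```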


Author(s):  
Yan Zhou ◽  
Longtao Huang ◽  
Tao Guo ◽  
Jizhong Han ◽  
Songlin Hu

Target-based sentiment analysis aims at extracting opinion targets and classifying the sentiment polarities expressed toward each target. Recently, token-based sequence tagging methods, which predict a tag for each token, have been successfully applied to solve the two tasks jointly. Since they do not treat a target containing several words as a whole, it can be difficult for them to use global information to identify such an opinion target, leading to incorrect extraction. Independently predicting the sentiment for each token may also lead to sentiment inconsistency among the different words of an opinion target. In this paper, inspired by span-based methods in NLP, we propose a simple and effective joint model that conducts extraction and classification at the span level rather than the token level. Our model first enumerates spans of one or more tokens and learns their representations based on the tokens inside. Then, a span-aware attention mechanism is designed to compute the sentiment information for each span. Extensive experiments on three benchmark datasets show that our model consistently outperforms state-of-the-art methods.
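A small sketch of the span-level idea: enumerate candidate spans up to a maximum width and build a representation for each span from the token states it covers. The endpoint-plus-mean recipe below is one common choice, not necessarily the one used by this model, and the span scorer and span-aware attention are not reproduced.

```python
import torch

def enumerate_spans(seq_len, max_width=4):
    """All (start, end) index pairs (inclusive) with width <= max_width."""
    return [(i, j) for i in range(seq_len)
                   for j in range(i, min(i + max_width, seq_len))]

def span_representations(token_states, spans):
    """Represent each span by concatenating its start state, end state,
    and the mean of the token states inside it (one common recipe)."""
    reps = []
    for start, end in spans:
        inside = token_states[start:end + 1]
        reps.append(torch.cat([token_states[start],
                               token_states[end],
                               inside.mean(dim=0)]))
    return torch.stack(reps)  # (num_spans, 3 * hidden_dim)

if __name__ == "__main__":
    token_states = torch.randn(10, 64)        # e.g., BiLSTM outputs for 10 tokens
    spans = enumerate_spans(10, max_width=3)
    print(span_representations(token_states, spans).shape)
```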


2020 ◽  
Vol 2020 ◽  
pp. 1-13
Author(s):  
Xiaodi Wang ◽  
Xiaoliang Chen ◽  
Mingwei Tang ◽  
Tian Yang ◽  
Zhen Wang

The aim of aspect-level sentiment analysis is to identify the sentiment polarity of a given target term in a sentence. Existing neural network models provide a useful account of how to judge this polarity. However, the relative position of context words with respect to the target term is often ignored, given the limitations of training datasets, and incorporating positional features between words into these models can improve the accuracy of sentiment classification. Hence, this study proposes an improved classification model that combines a multilevel interactive bidirectional Gated Recurrent Unit (GRU), attention mechanisms, and position features (MI-biGRU). First, the position features of the words in a sentence are initialized to enrich the word embeddings. Second, the approach extracts features of the target terms and the context using a well-constructed multilevel interactive bidirectional neural network. Third, an attention mechanism is introduced so that the model can pay greater attention to the words that are important for sentiment analysis. Finally, four classic sentiment classification datasets are used for aspect-level tasks. Experimental results indicate that there is a correlation between the multilevel interactive attention network and the position features, and that MI-biGRU clearly improves classification performance.
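A hedged PyTorch sketch of the position-enriched input described above: signed relative distances from the target term are embedded and concatenated with word embeddings before a bidirectional GRU with a simple additive attention layer. The multilevel interactive structure of MI-biGRU is not reproduced; dimensions, the clipping distance, and the attention scorer are placeholder assumptions.

```python
import torch
import torch.nn as nn

class PositionAwareBiGRU(nn.Module):
    """Word + relative-position embeddings -> BiGRU -> attention pooling (illustrative)."""
    def __init__(self, vocab_size, max_dist=50, embed_dim=100, pos_dim=20,
                 hidden_dim=64, num_classes=3):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.pos_emb = nn.Embedding(2 * max_dist + 1, pos_dim)  # distances in [-max_dist, max_dist]
        self.max_dist = max_dist
        self.bigru = nn.GRU(embed_dim + pos_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        self.attn_score = nn.Linear(2 * hidden_dim, 1)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids, rel_dist):
        # rel_dist: signed distance of each token from the target term, clipped to +/- max_dist.
        dist = rel_dist.clamp(-self.max_dist, self.max_dist) + self.max_dist
        x = torch.cat([self.word_emb(token_ids), self.pos_emb(dist)], dim=-1)
        h, _ = self.bigru(x)                                # (batch, seq_len, 2*hidden)
        weights = torch.softmax(self.attn_score(h), dim=1)  # (batch, seq_len, 1)
        context = (weights * h).sum(dim=1)
        return self.classifier(context)

if __name__ == "__main__":
    model = PositionAwareBiGRU(vocab_size=5000)
    ids = torch.randint(1, 5000, (4, 15))
    dist = torch.arange(15).repeat(4, 1) - 7                # target term at position 7
    print(model(ids, dist).shape)  # torch.Size([4, 3])
```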

