Emotion recognition based on customized smart bracelet with built-in accelerometer

PeerJ ◽  
2016 ◽  
Vol 4 ◽  
pp. e2258 ◽  
Author(s):  
Zhan Zhang ◽  
Yufei Song ◽  
Liqing Cui ◽  
Xiaoqian Liu ◽  
Tingshao Zhu

Background: Recently, emotion recognition has become a hot topic in human-computer interaction. If computers could understand human emotions, they could interact better with their users. This paper proposes a novel method to recognize human emotions (neutral, happy, and angry) using a smart bracelet with a built-in accelerometer.

Methods: In this study, a total of 123 participants were instructed to wear a customized smart bracelet with a built-in accelerometer that can track and record their movements. First, participants walked for two minutes as usual, which served as walking behavior in the neutral emotion condition. Participants then watched emotional film clips to elicit emotions (happy and angry); the time interval between watching two clips was more than four hours. After watching each film clip, they walked for one minute, which served as walking behavior in the happy or angry emotion condition. We collected raw data from the bracelet and extracted a set of features from it. Based on these features, we built classification models to distinguish the three types of emotions (neutral, happy, and angry).

Results and Discussion: For two-category classification, the accuracy reached 91.3% (neutral vs. angry), 88.5% (neutral vs. happy), and 88.5% (happy vs. angry); for differentiation among all three emotions (neutral, happy, and angry), the accuracy reached 81.2%.

Conclusions: Using wearable devices, we found that it is possible to recognize human emotions (neutral, happy, and angry) with fair accuracy. The results of this study may help improve the performance of human-computer interaction.
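The abstract does not spell out which features or classifier the authors used, so the following is only a minimal sketch of the kind of pipeline it describes, assuming simple per-window statistics over tri-axial accelerometer data and an RBF-kernel SVM; the window length, feature set, and function names are illustrative assumptions, not the paper's actual setup.

```python
# A minimal sketch: window raw tri-axial accelerometer recordings, extract
# simple statistical features, and train a classifier on emotion labels.
# All specifics (window size, features, SVM) are assumptions for illustration.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def extract_features(window: np.ndarray) -> np.ndarray:
    """Summarize one (n_samples, 3) accelerometer window with basic statistics."""
    magnitude = np.linalg.norm(window, axis=1)           # combined acceleration
    per_axis = np.concatenate([window.mean(axis=0),      # mean of x, y, z
                               window.std(axis=0)])      # std of x, y, z
    return np.concatenate([per_axis,
                           [magnitude.mean(), magnitude.std(),
                            magnitude.max() - magnitude.min()]])

def windows_to_features(recording: np.ndarray, win: int = 128) -> np.ndarray:
    """Slice a long recording into non-overlapping windows and featurize each."""
    n = len(recording) // win
    return np.array([extract_features(recording[i * win:(i + 1) * win])
                     for i in range(n)])

# X: features from labelled walking segments; y: emotion labels
# (0 = neutral, 1 = happy, 2 = angry). Synthetic data stands in here.
rng = np.random.default_rng(0)
X = np.vstack([windows_to_features(rng.normal(size=(1280, 3)) + c)
               for c in range(3)])
y = np.repeat([0, 1, 2], len(X) // 3)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
print(cross_val_score(clf, X, y, cv=5).mean())  # three-class accuracy estimate
```

A sketch like this would need the authors' actual features and tuning to approach the reported pairwise accuracies of 88.5-91.3% and three-class accuracy of 81.2%.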

2021 ◽  
Author(s):  
Zhibing Xie

Understanding human emotional states is indispensable for our daily interaction, and we can enjoy a more natural and friendly human-computer interaction (HCI) experience by fully utilizing humans' affective states. In emotion recognition applications, multimodal information fusion is widely used to discover the relationships among multiple information sources and make joint use of a number of channels, such as speech, facial expression, gesture, and physiological processes. This thesis proposes a new framework for emotion recognition using information fusion based on the estimation of information entropy. Novel techniques from information-theoretic learning are applied to feature-level fusion and score-level fusion. The most critical issues for feature-level fusion are feature transformation and dimensionality reduction. Existing methods depend on second-order statistics, which are optimal only for Gaussian-like distributions. By incorporating information-theoretic tools, a new feature-level fusion method based on kernel entropy component analysis is proposed. For score-level fusion, most previous methods rely on predefined, rule-based approaches, which are usually heuristic. In this thesis, a connection between information fusion and the maximum correntropy criterion is established for effective score-level fusion. The feature-level and score-level fusion methods are then combined into a two-stage fusion platform. The proposed methods are applied to audiovisual emotion recognition, and their effectiveness is evaluated by experiments on two publicly available audiovisual emotion databases. The experimental results demonstrate that the proposed algorithms achieve improved performance in comparison with existing methods. The work of this thesis offers a promising direction for designing more advanced emotion recognition systems based on multimodal information fusion, and it has great significance for the development of intelligent human-computer interaction systems.
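As a rough illustration of the feature-level stage, here is a minimal sketch of kernel entropy component analysis (KECA) applied to concatenated audio and visual feature vectors; the RBF kernel width, component count, and data shapes are assumptions for illustration, not the thesis's actual configuration.

```python
# A minimal KECA sketch: build an RBF Gram matrix over fused feature vectors,
# then keep the eigenpairs with the largest contribution to a Renyi entropy
# estimate (lambda_i * (1^T e_i)^2) rather than the largest eigenvalues.
import numpy as np
from scipy.spatial.distance import cdist

def keca(X: np.ndarray, n_components: int, sigma: float = 1.0) -> np.ndarray:
    """Project X onto the components with the largest entropy contribution."""
    K = np.exp(-cdist(X, X, "sqeuclidean") / (2 * sigma ** 2))  # RBF Gram matrix
    eigvals, eigvecs = np.linalg.eigh(K)                        # ascending order
    # Renyi-entropy contribution of each eigenpair: lambda * (sum of vector)^2
    contrib = eigvals * eigvecs.sum(axis=0) ** 2
    top = np.argsort(contrib)[::-1][:n_components]
    # Kernel-space projection, scaled as in kernel PCA
    return eigvecs[:, top] * np.sqrt(np.abs(eigvals[top]))

# Hypothetical fused input: audio and visual features concatenated per sample.
rng = np.random.default_rng(1)
audio, visual = rng.normal(size=(200, 20)), rng.normal(size=(200, 40))
fused = keca(np.hstack([audio, visual]), n_components=10, sigma=2.0)
print(fused.shape)  # (200, 10): low-dimensional fused representation
```

Unlike kernel PCA, which ranks components by eigenvalue alone, KECA ranks them by their contribution to the entropy estimate, which is the information-theoretic property the thesis builds its fusion framework on.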

2017 ◽  
Vol 107 ◽  
pp. 170-175 ◽  
Author(s):  
Zhimin He ◽  
Tao Chang ◽  
Siyu Lu ◽  
Hong Ai ◽  
Dong Wang ◽  
...  

2005 ◽  
Vol 18 (4) ◽  
pp. 389-405 ◽  
Author(s):  
N. Fragopanagos ◽  
J.G. Taylor

2020 ◽  
Vol 2 (3) ◽  
pp. 121-130
Author(s):  
Zequn Wang ◽  
Rui Jiao ◽  
Huiping Jiang
