Emotion Recognition Using Physiological Signals: Laboratory vs. Wearable Sensors

Author(s): Martin Ragot ◽ Nicolas Martin ◽ Sonia Em ◽ Nico Pallamin ◽ Jean-Marc Diverrez

Sensors ◽ 2020 ◽ Vol 21 (1) ◽ pp. 52
Author(s): Tianyi Zhang ◽ Abdallah El Ali ◽ Chen Wang ◽ Alan Hanjalic ◽ Pablo Cesar

Recognizing user emotions while they watch short-form videos anytime and anywhere is essential for facilitating video content customization and personalization. However, most existing work either classifies a single emotion per video stimulus or is restricted to static, desktop environments. To address this, we propose a correlation-based emotion recognition algorithm (CorrNet) that recognizes the valence and arousal (V-A) of each instance (fine-grained segment of signals) using only wearable physiological signals (e.g., electrodermal activity, heart rate). CorrNet takes advantage of features both inside each instance (intra-modality features) and between different instances for the same video stimulus (correlation-based features). We first test our approach on an indoor-desktop affect dataset (CASE), and thereafter on an outdoor-mobile affect dataset (MERCA), which we collected using a smart wristband and a wearable eye tracker. Results show that for subject-independent binary classification (high-low), CorrNet yields promising recognition accuracies: 76.37% and 74.03% for V-A on CASE, and 70.29% and 68.15% for V-A on MERCA. Our findings show that: (1) instance segment lengths between 1–4 s yield the highest recognition accuracies; (2) accuracies obtained with laboratory-grade and wearable sensors are comparable, even at low sampling rates (≤64 Hz); and (3) large amounts of neutral V-A labels, an artifact of continuous affect annotation, result in varied recognition performance.
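
The abstract outlines the general recipe of instance-level recognition: split each physiological signal into short fine-grained segments (instances), describe each instance both on its own (intra-modality features) and through its correlation with the other instances of the same video stimulus (correlation-based features), then feed the resulting feature vectors to a binary high/low V-A classifier. The sketch below is only a rough illustration of that idea, not the authors' CorrNet implementation; the function names, the hand-crafted statistics, and the synthetic EDA-like data are assumptions made for demonstration.

```python
import numpy as np

def segment_into_instances(signal, fs, instance_len_s=2.0):
    """Split a 1-D wearable signal (e.g., EDA sampled at <=64 Hz) into
    non-overlapping instances of instance_len_s seconds."""
    step = int(round(fs * instance_len_s))
    n = (len(signal) // step) * step
    return signal[:n].reshape(-1, step)            # (n_instances, samples_per_instance)

def intra_instance_features(instances):
    """Simple per-instance statistics standing in for intra-modality features."""
    t = np.arange(instances.shape[1])
    slopes = np.polyfit(t, instances.T, deg=1)[0]  # linear trend of each instance
    return np.column_stack([
        instances.mean(axis=1),
        instances.std(axis=1),
        instances.min(axis=1),
        instances.max(axis=1),
        slopes,
    ])

def correlation_features(instances):
    """For each instance, its Pearson correlation with every other instance
    of the same stimulus (the self-correlation diagonal is dropped)."""
    corr = np.corrcoef(instances)                  # (n, n) instance-by-instance matrix
    n = corr.shape[0]
    off_diagonal = ~np.eye(n, dtype=bool)
    return corr[off_diagonal].reshape(n, n - 1)

if __name__ == "__main__":
    fs = 64                                        # wearable-grade sampling rate (Hz)
    rng = np.random.default_rng(0)
    eda = np.cumsum(rng.standard_normal(fs * 60))  # 60 s of synthetic EDA-like drift
    instances = segment_into_instances(eda, fs, instance_len_s=2.0)
    features = np.hstack([intra_instance_features(instances),
                          correlation_features(instances)])
    print(features.shape)                          # one row per instance, ready for a
                                                   # binary high/low V-A classifier
```

The 1–4 s segment lengths discussed in the abstract correspond to the instance_len_s parameter here; each instance-level feature row would then be paired with the binarized (high/low) V-A annotation of its segment.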


2021 ◽ Vol 17 (7) ◽ pp. 444-448
Author(s): Weilun Xie ◽ Wanli Xue
