Spontaneous facial expression database for academic emotion inference in online learning

2019, Vol. 13(3), pp. 329-337
Author(s): Cunling Bian, Ya Zhang, Fei Yang, Wei Bi, Weigang Lu
i-Perception, 10.1068/if694, 2012, Vol. 3(9), pp. 694-694
Author(s): Haenah Lee, Ahyoung Shin, BoRa Kim, Christian Wallraven

2013, Vol. 4(1), pp. 34-46
Author(s): Shangfei Wang, Zhilei Liu, Zhaoyu Wang, Guobing Wu, Peijia Shen, ...

2018
Author(s): Jeffrey M. Girard, Wen-Sheng Chu, László A. Jeni, Jeffrey F. Cohn, Fernando De la Torre, ...

Despite the important role that facial expressions play in interpersonal communication and our knowledge that interpersonal behavior is influenced by social context, no currently available facial expression database includes multiple interacting participants. The Sayette Group Formation Task (GFT) database addresses the need for well-annotated video of multiple participants during unscripted interactions. The database includes 172,800 video frames from 96 participants in 32 three-person groups. To aid in the development of automated facial expression analysis systems, GFT includes expert annotations of FACS occurrence and intensity, facial landmark tracking, and baseline results for linear SVM, deep learning, active patch learning, and personalized classification. Baseline performance is quantified and compared using identical partitioning and a variety of metrics (including means and confidence intervals). The highest performance scores were found for the deep learning and active patch learning methods. Learn more at http://osf.io/7wcyz.
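To make the linear SVM baseline mentioned above concrete, the sketch below trains a per-frame action unit (AU) occurrence classifier on tracked-landmark features with a participant-level split. It is a minimal illustration only: the file name, column names, and the chosen AU are hypothetical placeholders, not the actual GFT release format or the authors' pipeline.

```python
# Minimal sketch of a frame-level linear SVM baseline for AU occurrence,
# assuming landmark features and FACS labels are stored per frame in a CSV.
# File and column names are illustrative, not the GFT release format.
import pandas as pd
from sklearn.svm import LinearSVC
from sklearn.model_selection import GroupShuffleSplit
from sklearn.metrics import f1_score

frames = pd.read_csv("gft_frames.csv")        # hypothetical: one row per video frame
X = frames.filter(like="landmark_").values    # tracked facial landmark coordinates
y = frames["au12_occurrence"].values          # expert FACS label for one action unit
groups = frames["participant_id"].values      # frames grouped by participant

# Partition by participant so the same person never appears in train and test.
splitter = GroupShuffleSplit(n_splits=1, test_size=0.3, random_state=0)
train_idx, test_idx = next(splitter.split(X, y, groups))

clf = LinearSVC(C=1.0, max_iter=10000)
clf.fit(X[train_idx], y[train_idx])
pred = clf.predict(X[test_idx])
print("F1 on held-out participants:", f1_score(y[test_idx], pred))
```

Splitting by participant rather than by frame mirrors the identical-partitioning comparison described in the abstract, since adjacent frames of the same person are highly correlated.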


PSYCHOLOGIA, 2019, Vol. 61(4), pp. 221-240
Author(s): Yoshiyuki UEDA, Masato NUNOI, Sakiko YOSHIKAWA

2019, Vol. 58(1), pp. 63-86
Author(s): Zhaoli Zhang, Zhenhua Li, Hai Liu, Taihe Cao, Sannyuya Liu

Online learning engagement detection is a fundamental problem in educational information technology. Efficient detection of students’ learning states can give teachers the information they need to identify struggling students in real time. To improve the accuracy of learning engagement detection, we collected two kinds of student behavior data: face data (using adaptive weighted Local Gray Code Patterns for facial expression recognition) and mouse interaction data. In this article, we propose a novel learning engagement detection algorithm based on these behavior data, which come from the camera and the mouse in the online learning environment. The camera was used to capture students’ face images, while mouse movement data were captured simultaneously. During image labeling, we built two datasets for classifier training and testing: one used the mouse movement data as a reference, while the other did not. We ran several methods on both datasets and found that the classifier trained on the former dataset performed better, achieving a higher recognition rate than the latter (94.60% vs. 91.51%).
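As a rough illustration of the overall pipeline (not the authors' implementation), the sketch below fuses a per-sample facial-expression descriptor with simple mouse-interaction statistics and trains an engagement classifier on the labeled data. The feature extractor, event format, and classifier choice are assumptions; in particular, the paper's adaptive weighted Local Gray Code Patterns descriptor is replaced by a plain LBP-style placeholder.

```python
# Rough pipeline sketch: fuse facial-expression features with mouse-interaction
# statistics and train an engagement classifier. The descriptor below is a plain
# LBP-style placeholder, NOT the paper's adaptive weighted Local Gray Code Patterns.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def face_descriptor(gray_face, P=8, R=1):
    """Histogram of uniform LBP codes over a grayscale face crop (placeholder)."""
    codes = local_binary_pattern(gray_face, P, R, method="uniform")
    hist, _ = np.histogram(codes, bins=P + 2, range=(0, P + 2), density=True)
    return hist

def mouse_descriptor(events):
    """Simple statistics over (dx, dy, dt) mouse-movement events (assumed format)."""
    events = np.asarray(events, dtype=float)
    speeds = np.hypot(events[:, 0], events[:, 1]) / np.maximum(events[:, 2], 1e-6)
    return np.array([speeds.mean(), speeds.std(), float(len(events))])

def build_features(face_crops, mouse_logs):
    """Concatenate face and mouse descriptors for each labeled sample."""
    return np.array([
        np.concatenate([face_descriptor(f), mouse_descriptor(m)])
        for f, m in zip(face_crops, mouse_logs)
    ])

# Usage (data loading omitted): face_crops is a list of grayscale face images,
# mouse_logs a list of event arrays, y the engagement labels from the labeling step.
# X = build_features(face_crops, mouse_logs)
# clf = RandomForestClassifier(n_estimators=200, random_state=0)
# print(cross_val_score(clf, X, y, cv=5).mean())
```

Comparing this fused-feature classifier against one trained on face features alone would mirror the two-dataset comparison reported above.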

