East Asian Young and Older Adult Perceptions of Emotional Faces From an Age- and Sex-Fair East Asian Facial Expression Database

2018 ◽  
Vol 9 ◽  
Author(s):  
Yu-Zhen Tu ◽  
Dong-Wei Lin ◽  
Atsunobu Suzuki ◽  
Joshua Oon Soo Goh

2019 ◽  
Vol 11 (1) ◽  
pp. 1-8
Author(s):  
Malik Abdul Ghani ◽  
Andre Rusli ◽  
Ni Made Satvika Iswari

Facial expressions are important emotional indicators and prominent objects in our daily lives. Real-time video processing on mobile devices is an active topic with broad applications: photos that use a filter are 21% more likely to be viewed and 45% more likely to be commented on by viewers. In this work, the Fisher-Yates algorithm is used to shuffle the filters assigned to each facial expression emotion. The application is built for iOS in the Swift programming language and uses the Core ML and Vision frameworks. Custom Vision is used to create and train the models; the training data come from the Cohn-Kanade AU-Coded Facial Expression Database and the Karolinska Directed Emotional Faces set. Custom Vision reports training performance, including precision and recall values for the trained data, and a facial expression is matched to the model according to its confidence value. Trials evaluated with the Hedonic Motivation System Adoption Model show that 79.39% of users agree that the application provides joy.
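
The abstract names its building blocks without showing code; the following is a minimal Swift sketch of the two mechanisms it describes: Fisher-Yates shuffling of a per-emotion filter list and a Vision/Core ML classification gated by a confidence value. The `EmotionFilter` type, the function names, and the 0.6 threshold are illustrative assumptions, not taken from the paper.

```swift
import CoreGraphics
import CoreML
import Vision

// Hypothetical filter type; the name and fields are illustrative only.
struct EmotionFilter {
    let name: String
    let emotion: String
}

// Fisher-Yates shuffle: walk from the last index down and swap each
// element with a randomly chosen index at or before it.
func fisherYatesShuffle<T>(_ items: [T]) -> [T] {
    var result = items
    for i in stride(from: result.count - 1, to: 0, by: -1) {
        let j = Int.random(in: 0...i)
        result.swapAt(i, j)
    }
    return result
}

// Classify a face image with a Core ML model via Vision and report the
// top label only if its confidence clears the threshold (value assumed).
func classifyExpression(in image: CGImage,
                        model: VNCoreMLModel,
                        minimumConfidence: Float = 0.6,
                        completion: @escaping (String?) -> Void) {
    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let top = (request.results as? [VNClassificationObservation])?.first,
              top.confidence >= minimumConfidence else {
            completion(nil)
            return
        }
        completion(top.identifier)
    }
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```

Under these assumptions, a caller would filter the `EmotionFilter` list by the recognized emotion, pass it through `fisherYatesShuffle`, and apply the first element, so the filter shown for a given expression varies between sessions.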


i-Perception ◽  
10.1068/if694 ◽  
2012 ◽  
Vol 3 (9) ◽  
pp. 694-694 ◽  
Author(s):  
Haenah Lee ◽  
Ahyoung Shin ◽  
BoRa Kim ◽  
Christian Wallraven

2019 ◽  
Vol 13 (3) ◽  
pp. 329-337 ◽  
Author(s):  
Cunling Bian ◽  
Ya Zhang ◽  
Fei Yang ◽  
Wei Bi ◽  
Weigang Lu

2013 ◽  
Vol 4 (1) ◽  
pp. 34-46 ◽  
Author(s):  
Shangfei Wang ◽  
Zhilei Liu ◽  
Zhaoyu Wang ◽  
Guobing Wu ◽  
Peijia Shen ◽  
...  

2018 ◽  
Author(s):  
Jeffrey M. Girard ◽  
Wen-Sheng Chu ◽  
László A Jeni ◽  
Jeffrey F Cohn ◽  
Fernando De la Torre ◽  
...  

Despite the important role that facial expressions play in interpersonal communication and our knowledge that interpersonal behavior is influenced by social context, no currently available facial expression database includes multiple interacting participants. The Sayette Group Formation Task (GFT) database addresses the need for well-annotated video of multiple participants during unscripted interactions. The database includes 172,800 video frames from 96 participants in 32 three-person groups. To aid in the development of automated facial expression analysis systems, GFT includes expert annotations of FACS occurrence and intensity, facial landmark tracking, and baseline results for linear SVM, deep learning, active patch learning, and personalized classification. Baseline performance is quantified and compared using identical partitioning and a variety of metrics (including means and confidence intervals). The highest performance scores were found for the deep learning and active patch learning methods. Learn more at http://osf.io/7wcyz.
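
The abstract notes that baseline classifiers are compared under identical partitioning using means and confidence intervals; the short sketch below (kept in Swift for consistency with the previous example) illustrates such a summary computation. The per-partition F1 scores and the normal-approximation 95% interval are illustrative assumptions, not values or procedures taken from the GFT paper.

```swift
import Foundation

// Mean and a normal-approximation 95% confidence interval over
// per-partition scores (assumes at least two scores).
func meanAnd95CI(_ scores: [Double]) -> (mean: Double, lower: Double, upper: Double) {
    let n = Double(scores.count)
    let mean = scores.reduce(0, +) / n
    let variance = scores.map { ($0 - mean) * ($0 - mean) }.reduce(0, +) / (n - 1)
    let standardError = sqrt(variance / n)
    let margin = 1.96 * standardError   // normal approximation
    return (mean, mean - margin, mean + margin)
}

// Hypothetical F1 scores of one classifier across partitions.
let f1Scores = [0.71, 0.68, 0.74, 0.70, 0.73]
let summary = meanAnd95CI(f1Scores)
print(String(format: "mean %.3f, 95%% CI [%.3f, %.3f]",
             summary.mean, summary.lower, summary.upper))
```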


PSYCHOLOGIA ◽  
2019 ◽  
Vol 61 (4) ◽  
pp. 221-240 ◽  
Author(s):  
Yoshiyuki UEDA ◽  
Masato NUNOI ◽  
Sakiko YOSHIKAWA

2017 ◽  
Vol 1 (suppl_1) ◽  
pp. 333-333
Author(s):  
H. Cheng ◽  
H. Weng ◽  
T. Lu ◽  
Y. Yang
