2A1-L04 Research and Development of Facial Expression Recognition Social Robot with Image Processing (Communication Robot)
2014, Vol. 2014(0), pp. _2A1-L04_1-_2A1-L04_2
Author(s): Kazuyuki ISHIGAMI, Sigeru KUCHII

2008, Vol. 381-382, pp. 375-378
Author(s): K.T. Song, M.J. Han, F.Y. Chang, S.H. Chang

The capability to recognize human facial expressions plays an important role in advanced human-robot interaction. By recognizing facial expressions, a robot can interact with a user in a more natural and friendly manner. In this paper, we propose a facial expression recognition system based on an embedded image-processing platform that classifies different facial expressions on-line in real time. A low-cost embedded vision system has been designed and realized for robotic applications using a CMOS image sensor and a digital signal processor (DSP). The current design acquires thirty 640x480 image frames per second (30 fps). The proposed emotion recognition algorithm has been successfully implemented on this real-time vision system. Experimental results on a pet robot show that the robot can interact with a person in a responsive manner. The developed image-processing platform accelerates recognition to 25 classifications per second, with an average on-line recognition rate of 74.4% across five facial expressions.
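
Below is a minimal sketch of an on-line facial expression recognition loop of the kind the abstract describes, written against a generic webcam rather than the paper's CMOS/DSP vision system. The five expression labels and the `classify_expression` stub are illustrative assumptions, not the authors' algorithm.

```python
# Sketch: real-time face detection + per-frame expression classification.
# Assumes OpenCV (opencv-python) and a webcam; the classifier is a placeholder.
import cv2
import numpy as np

EXPRESSIONS = ["neutral", "happy", "angry", "sad", "surprised"]  # assumed label set

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def classify_expression(face_gray):
    """Placeholder classifier; a trained model would go here."""
    features = cv2.resize(face_gray, (64, 64)).flatten() / 255.0
    # A real system would feed `features` to a trained classifier; this dummy
    # rule only keeps the loop running end to end.
    return EXPRESSIONS[int(features.mean() * len(EXPRESSIONS)) % len(EXPRESSIONS)]

cap = cv2.VideoCapture(0)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)   # match the paper's 640x480 frame size
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        label = classify_expression(gray[y:y + h, x:x + w])
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, label, (x, y - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    cv2.imshow("expression", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

On an embedded DSP platform like the one described, the per-frame detection and classification stages would be replaced by the paper's optimized fixed-point routines; the loop structure, however, is the same.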


2011, Vol. 16(3), pp. 318-323
Author(s): Yasunari Yoshitomi, Taro Asada, Kyouhei Shimada, Masayoshi Tabuse

Author(s): Anastasios Koutlas, Dimitrios I. Fotiadis

The aim of this chapter is to analyze recent advances in image processing and machine learning techniques for facial expression recognition. A comprehensive review of recently proposed methods is provided, along with an analysis of the advantages and shortcomings of existing systems. Moreover, an example of the automatic identification of basic emotions is presented: Active Shape Models are used to identify prominent facial features; Gabor filters represent facial geometry at selected fiducial points; and Artificial Neural Networks classify the result into the basic emotions (anger, surprise, fear, happiness, sadness, disgust, neutral). Finally, future trends in automatic facial expression recognition are described.
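
The following is a short sketch of the Gabor-filter and neural-network stages outlined above, assuming the fiducial points have already been located (the chapter uses Active Shape Models for that step). The filter parameters, point coordinates, toy data, and network size are illustrative assumptions, not the chapter's exact settings.

```python
# Sketch: Gabor responses sampled at fiducial points, classified by a small MLP.
# Assumes OpenCV and scikit-learn; the training data here is random toy data.
import cv2
import numpy as np
from sklearn.neural_network import MLPClassifier

EMOTIONS = ["anger", "surprise", "fear", "happiness", "sadness", "disgust", "neutral"]

def gabor_bank(ksize=21):
    """Bank of Gabor kernels: 4 orientations x 2 wavelengths (assumed settings)."""
    kernels = []
    for theta in np.arange(0, np.pi, np.pi / 4):
        for lambd in (8.0, 16.0):
            kernels.append(cv2.getGaborKernel((ksize, ksize), 4.0, theta, lambd, 0.5, 0))
    return kernels

def gabor_features(gray, fiducial_points, kernels):
    """Magnitude of each Gabor response sampled at each fiducial point."""
    img = gray.astype(np.float32)
    responses = [cv2.filter2D(img, cv2.CV_32F, k) for k in kernels]
    return np.array([abs(r[y, x]) for r in responses for (x, y) in fiducial_points])

# Toy usage: random images and fixed points stand in for ASM-located face data.
kernels = gabor_bank()
points = [(30, 40), (60, 40), (45, 60), (35, 80), (55, 80)]  # assumed fiducial points
rng = np.random.default_rng(0)
X = np.stack([gabor_features(rng.integers(0, 256, (100, 100), dtype=np.uint8),
                             points, kernels) for _ in range(70)])
y = [EMOTIONS[i % len(EMOTIONS)] for i in range(70)]

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0).fit(X, y)
print(clf.predict(X[:3]))
```

With real face images, the fiducial points would come from the Active Shape Model fit, and the network would be trained on labelled expression data rather than the random samples used here.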

