Head movement and facial expression-based human-machine interface for controlling an intelligent wheelchair

Author(s):  
Ericka Janet Rechy-Ramirez ◽  
Huosheng Hu

A bio-signal-based human-machine interface is proposed for hands-free control of a wheelchair. An Emotiv EPOC sensor is used to detect facial expressions and head movements of users. Nine facial expressions and up-down head movements can be chosen to form five commands: move forward, move backward, turn left, turn right, and stop. Four uni-modal modes, three bi-modal modes, and three fuzzy bi-modal modes were created to control the wheelchair. The fuzzy modes use the strength of the user's head movement or facial expression to adjust the wheelchair speed via a fuzzy logic system. Two subjects tested the ten modes with several command configurations. Mean, minimum, and maximum traveling times achieved by each subject in each mode were collected. Results showed that both subjects achieved the lowest mean, minimum, and maximum traveling times using the fuzzy modes. Statistical tests showed significant differences among the traveling times of the fuzzy modes for subject B, and between the traveling times of the bi-modal modes and those of their respective fuzzy modes for both subjects.
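The abstract states that the fuzzy modes scale wheelchair speed by the strength of the detected head movement or facial expression. As a minimal illustrative sketch of how such a mapping might work, the following uses triangular membership functions and weighted-average defuzzification; the membership breakpoints and output speed levels are assumptions for illustration, not values from the paper.

```python
# Illustrative sketch only: a minimal fuzzy mapping from a normalized
# "expression/head-movement strength" reading (0..1) to a wheelchair speed.
# Breakpoints and speed consequents are assumed, not taken from the paper.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    if x < b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def fuzzy_speed(strength):
    """Map a sensor strength in [0, 1] to a speed in m/s (assumed scale)."""
    # Degrees of membership in three fuzzy sets over the input.
    weak = tri(strength, -0.5, 0.0, 0.5)
    moderate = tri(strength, 0.0, 0.5, 1.0)
    strong = tri(strength, 0.5, 1.0, 1.5)
    # Each set fires one rule with a crisp speed consequent (assumed values).
    speeds = {"slow": 0.3, "medium": 0.6, "fast": 1.0}
    num = weak * speeds["slow"] + moderate * speeds["medium"] + strong * speeds["fast"]
    den = weak + moderate + strong
    # Weighted-average defuzzification.
    return num / den if den > 0 else 0.0

print(round(fuzzy_speed(0.0), 2))  # weakest input -> 0.3 (slowest speed)
print(round(fuzzy_speed(1.0), 2))  # strongest input -> 1.0 (fastest speed)
```

A stronger expression smoothly increases speed between the rule consequents, which matches the abstract's description of adjusting speed by user strength; the actual system would feed Emotiv EPOC signal intensities into the input.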

