An Electric Wheelchair Controlled by Head Movements and Facial Expressions

Author(s):  
Ericka Janet Rechy-Ramirez ◽  
Huosheng Hu

A bio-signal-based human machine interface is proposed for hands-free control of a wheelchair. An Emotiv EPOC sensor is used to detect facial expressions and head movements of users. Nine facial expressions and up-down head movements can be chosen to form five commands: move forward, move backward, turn left, turn right, and stop. Four uni-modal modes, three bi-modal modes, and three fuzzy bi-modal modes are created to control the wheelchair. The fuzzy modes use the strength with which the user performs the head movement and facial expression to adjust the wheelchair speed via a fuzzy logic system. Two subjects tested the ten modes with several command configurations, and the mean, minimum, and maximum traveling times achieved by each subject in each mode were collected. Both subjects achieved their lowest mean, minimum, and maximum traveling times using the fuzzy modes. Statistical tests showed significant differences among the traveling times of subject B's fuzzy modes, and between the traveling times of the bi-modal modes and those of their corresponding fuzzy modes for both subjects.
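
A minimal sketch of how a fuzzy bi-modal mode of this kind might map the measured strengths of a facial expression and a head movement to a wheelchair speed. The membership breakpoints, rule table, and speed values below are illustrative assumptions, not the authors' implementation.

```python
# Illustrative fuzzy bi-modal speed controller: expression strength and head
# movement strength (both normalised to [0, 1]) are fuzzified, a small rule
# base is fired, and a crisp speed is produced by weighted-average
# defuzzification. All parameters are assumptions for illustration.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzify(strength):
    """Membership grades of a normalised signal strength."""
    return {
        "weak":   tri(strength, -0.5, 0.0, 0.5),
        "medium": tri(strength,  0.0, 0.5, 1.0),
        "strong": tri(strength,  0.5, 1.0, 1.5),
    }

# Rule table: (expression strength, head-movement strength) -> speed label.
RULES = {
    ("weak", "weak"): "slow",   ("weak", "medium"): "slow",   ("weak", "strong"): "cruise",
    ("medium", "weak"): "slow", ("medium", "medium"): "cruise", ("medium", "strong"): "fast",
    ("strong", "weak"): "cruise", ("strong", "medium"): "fast", ("strong", "strong"): "fast",
}
SPEED = {"slow": 0.2, "cruise": 0.5, "fast": 0.9}   # crisp outputs in m/s, assumed

def wheelchair_speed(expr_strength, head_strength):
    """Weighted-average (Sugeno-style) defuzzification over all fired rules."""
    e, h = fuzzify(expr_strength), fuzzify(head_strength)
    num = den = 0.0
    for (le, lh), out in RULES.items():
        w = min(e[le], h[lh])            # rule firing strength
        num += w * SPEED[out]
        den += w
    return num / den if den else 0.0

print(wheelchair_speed(0.8, 0.3))        # e.g. strong expression, weak head movement
```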

2014 ◽  
Vol 4 (1) ◽  
pp. 59-76 ◽  
Author(s):  
Ericka Janet Rechy-Ramirez ◽  
Huosheng Hu

This paper presents a bio-signal-based human machine interface (HMI) for hands-free control of an electric powered wheelchair. In this novel HMI, an Emotiv EPOC sensor is deployed to detect facial expressions and head movements of users, which are then recognized and converted into four uni-modal control modes and two bi-modal control modes to operate the wheelchair. Nine facial expressions and up-down head movements have been defined and tested, so that users can select some of these facial expressions and head movements to form the six control commands. The proposed HMI is user-friendly and allows users to select one of the available control modes according to their comfort. Experiments are conducted to show the feasibility and performance of the proposed HMI.
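
A sketch of how recognized bio-signal events might be mapped onto control commands in one bi-modal mode; the event names and the mapping itself are assumptions for illustration, not the configuration used in the paper.

```python
# Illustrative mapping of recognised Emotiv EPOC events to wheelchair commands
# in a bi-modal mode (head movements drive, facial expressions steer). The
# event names and the command assignment are assumed, not the paper's setup.

COMMANDS = {
    ("head", "up"):               "move_forward",
    ("head", "down"):             "move_backward",
    ("expression", "left_wink"):  "turn_left",
    ("expression", "right_wink"): "turn_right",
    ("expression", "neutral"):    "stop",
}

def to_command(event_kind, event_name):
    """Translate one recognised bio-signal event into a wheelchair command."""
    return COMMANDS.get((event_kind, event_name), "stop")   # default to a safe stop

print(to_command("head", "up"))             # -> move_forward
print(to_command("expression", "smile"))    # unmapped event -> stop
```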


Author(s):  
Yongmian Zhang ◽  
Jixu Chen ◽  
Yan Tong ◽  
Qiang Ji

This chapter describes a probabilistic framework for faithful reproduction of spontaneous facial expressions on a synthetic face model in a real-time interactive application. The framework consists of a coupled Bayesian network (BN) that unifies facial expression analysis and synthesis into one coherent structure. At the analysis end, we cast the Facial Action Coding System (FACS) into a dynamic Bayesian network (DBN) to capture relationships between facial expressions and facial motions as well as their uncertainties and dynamics. The observations fed into the DBN facial expression model are measurements of facial action units (AUs) generated by an AU model. Also implemented by a DBN, the AU model captures the rigid head movements and non-rigid facial muscular movements of a spontaneous facial expression. At the synthesizer, a static BN reconstructs the Facial Animation Parameters (FAPs) and their intensity through top-down inference according to the current state of facial expression and pose information output by the analysis end. The two BNs are connected statically through a data stream link. The novelty of the coupled BN brings several benefits. First, a facial expression is inferred through both spatial and temporal inference, so the perceptual quality of the animation is less affected by misdetection of facial features. Second, more realistic-looking facial expressions can be reproduced by modeling the dynamics of human expressions in facial expression analysis. Third, a very low bit rate (9 bytes per frame) can be achieved in data transmission.
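
A minimal sketch of the temporal inference the DBN performs at the analysis end: one forward filtering step that updates a belief over expression states from noisy AU measurements. The state set, transition matrix, and AU likelihoods below are made-up placeholders, not the parameters of the chapter's model.

```python
import numpy as np

# Toy forward-filtering step over facial expression states given AU evidence.
# All probabilities are illustrative placeholders, not the chapter's DBN.

STATES = ["neutral", "happy", "surprised"]

# P(state_t | state_{t-1}): expressions tend to persist between frames.
TRANSITION = np.array([
    [0.8, 0.1, 0.1],
    [0.1, 0.8, 0.1],
    [0.1, 0.1, 0.8],
])

# P(AU active | state) for two example action units (AU12 lip-corner puller,
# AU26 jaw drop); AUs are assumed conditionally independent given the state.
AU_LIKELIHOOD = {
    "AU12": np.array([0.05, 0.9, 0.2]),
    "AU26": np.array([0.10, 0.2, 0.9]),
}

def filter_step(belief, au_observations):
    """One DBN time slice: predict with the transition model, then weight the
    prediction by the likelihood of the observed AU activations."""
    predicted = TRANSITION.T @ belief
    for au, active in au_observations.items():
        p = AU_LIKELIHOOD[au]
        predicted *= p if active else (1.0 - p)
    return predicted / predicted.sum()

belief = np.array([1.0, 0.0, 0.0])                      # start in the neutral state
belief = filter_step(belief, {"AU12": True, "AU26": False})
print(dict(zip(STATES, belief.round(3))))               # belief shifts toward "happy"
```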


Author(s):  
Pola Lydia Lagari ◽  
Antonia Nasiakou ◽  
Miltiadis Alamaniotis

Nuclear Power Plants (NPPs) are nowadays facing a transition from analog to digital control rooms, mainly in the form of new interfaces for displaying sensor information. Earlier approaches to the Human Machine Interface (HMI) were one-sided, taking into consideration either the human or the machine perspective. The approach presented in this article considers the human machine interfaces met in nuclear power plants as a joint system, where the performance of the NPP operator is evaluated according to the cooperation level achieved with the machine. In particular, the purpose of this study is to provide a methodology to evaluate the degree of flexibility of an operator during the transition period from an analog to a digital system. The proposed methodology has been implemented by utilizing fuzzy logic inference and realized with the fuzzy logic toolbox in MATLAB.
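
A sketch of the kind of fuzzy inference such a methodology might use to score an operator's flexibility from, say, relative task completion time and error rate. The input variables, membership functions, and rule base below are assumptions for illustration; the article realizes its fuzzy inference system with the fuzzy toolbox in MATLAB, not this code.

```python
# Illustrative fuzzy evaluation of operator flexibility during the analog-to-
# digital transition. The inputs (digital/analog time ratio, errors per task)
# and the rule base are assumed for illustration only.

def trap(x, a, b, c, d):
    """Trapezoidal membership function with plateau between b and c."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

def grade_time(ratio):           # completion time in digital room / analog room
    return {"fast": trap(ratio, 0.0, 0.0, 0.9, 1.1),
            "slow": trap(ratio, 0.9, 1.1, 3.0, 3.0)}

def grade_errors(rate):          # errors per task
    return {"few":  trap(rate, 0.0, 0.0, 1.0, 2.0),
            "many": trap(rate, 1.0, 2.0, 10.0, 10.0)}

# Rule consequents as crisp flexibility scores in [0, 1] (Sugeno-style).
RULES = {("fast", "few"): 1.0, ("fast", "many"): 0.5,
         ("slow", "few"): 0.5, ("slow", "many"): 0.1}

def flexibility(time_ratio, error_rate):
    """Weighted average of rule consequents, weighted by rule firing strength."""
    t, e = grade_time(time_ratio), grade_errors(error_rate)
    num = den = 0.0
    for (lt, le), score in RULES.items():
        w = min(t[lt], e[le])
        num, den = num + w * score, den + w
    return num / den if den else 0.0

print(round(flexibility(1.0, 0.5), 2))   # operator on par with analog times, few errors
```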


2021 ◽  
Vol 6 (2) ◽  
pp. 10
Author(s):  
Andre Dwi Syahrul Kirom ◽  
Ratna Ika Putri ◽  
Edi Sulistio Budi

The public's growing appetite for ready-to-serve fruit juice offers apple farmers a large opportunity to raise the selling price of their processed products. However, apple juice is still produced manually, by squeezing the apple pulp through a cloth. Previous research on controlling the crusher motor speed in the apple extraction process with a PI method did not yield a stable time response, so this study uses fuzzy logic embedded in a microcontroller as the controller and a rotary encoder as the speed sensor. The fuzzy controller has an Error input with membership functions Minus, Equal, and Plus; a DError input with membership functions -, =, and +; and an output with membership functions slow, medium, and fast. With a set point of 2000 RPM, the crusher motor speed is kept stable under load variations of 1 kg and 2 kg: delay time = 2 s, rise time = 2 s, peak time = 3 s, maximum overshoot = 3.75%, steady-state error = 1.9%, and settling time = 3 s. The system response is displayed in real time on an HMI.
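
A sketch of the Error/DError fuzzy controller described above, written in Python rather than as microcontroller firmware. The membership breakpoints, rule table, and duty-cycle outputs are illustrative assumptions around the stated 2000 RPM set point.

```python
# Illustrative error / delta-error fuzzy speed controller for the crusher motor
# (Python sketch; the actual controller runs on a microcontroller with a rotary
# encoder as feedback). Breakpoints and the rule table are assumptions.

SET_POINT = 2000.0   # RPM

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def grade(x, span):
    """Membership grades for Minus / Equal / Plus over roughly +-span RPM."""
    return {"minus": tri(x, -2 * span, -span, 0.0),
            "equal": tri(x, -span, 0.0, span),
            "plus":  tri(x, 0.0, span, 2 * span)}

# Consequents: change in PWM duty cycle (percentage points), Sugeno-style.
RULES = {("minus", "minus"): -8.0, ("minus", "equal"): -4.0, ("minus", "plus"): 0.0,
         ("equal", "minus"): -4.0, ("equal", "equal"):  0.0, ("equal", "plus"): 4.0,
         ("plus",  "minus"):  0.0, ("plus",  "equal"):  4.0, ("plus",  "plus"): 8.0}

def fuzzy_step(rpm, prev_error):
    """One control step: fuzzify error and delta-error, fire the rules, and
    return the duty-cycle correction plus the error for the next step."""
    error = SET_POINT - rpm
    d_error = error - prev_error
    e, de = grade(error, 200.0), grade(d_error, 50.0)
    num = den = 0.0
    for (le, lde), delta_duty in RULES.items():
        w = min(e[le], de[lde])
        num, den = num + w * delta_duty, den + w
    return (num / den if den else 0.0), error

correction, err = fuzzy_step(rpm=1900.0, prev_error=80.0)   # motor running slow
print(round(correction, 2), err)
```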


2017 ◽  
Vol 2 (2) ◽  
pp. 130-134
Author(s):  
Jarot Dwi Prasetyo ◽  
Zaehol Fatah ◽  
Taufik Saleh

In recent years, interest in the interaction between humans and computers has grown. Facial expressions play a fundamental role in social interaction with other humans: in human-to-human communication, only 7% of the message is conveyed by the linguistic content, 38% by paralanguage, and 55% by facial expressions. Therefore, to make the human machine interface of multimedia products friendlier, facial expression recognition in the interface greatly improves the comfort of interaction. One of the steps that affects facial expression recognition is the accuracy of facial feature extraction. Several approaches to facial expression recognition do not consider the dimensionality of the extracted data used as input features for machine learning. This research proposes a wavelet algorithm to reduce the dimensionality of the feature data. The features are then classified with a multiclass SVM to distinguish the six facial expressions found in the JAFFE database: anger, disgust, fear, happiness, sadness, and surprise. The resulting classification achieved 81.42% accuracy on the 208 data samples.
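
A sketch of the wavelet-then-SVM pipeline outlined above, using PyWavelets and scikit-learn. The wavelet family, decomposition level, image size, and the `faces`/`labels` placeholders are assumptions; the real JAFFE images and their expression labels must be loaded separately.

```python
# Sketch of the wavelet feature-reduction + multiclass SVM pipeline described
# above. Wavelet family, decomposition level, and image size are assumptions;
# `faces` and `labels` stand in for the JAFFE data.
import numpy as np
import pywt
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

def wavelet_features(image, wavelet="haar", level=3):
    """Reduce a face image to its low-frequency approximation coefficients."""
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    approx = coeffs[0]                     # LL sub-band after `level` decompositions
    return approx.ravel()

# Placeholder data: replace with the real JAFFE images (grayscale) and labels.
rng = np.random.default_rng(0)
faces = rng.random((208, 256, 256))
labels = rng.integers(0, 6, size=208)      # anger, disgust, fear, happy, sad, surprise

X = np.array([wavelet_features(f) for f in faces])   # 256x256 -> 32x32 = 1024 features
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.3, random_state=0, stratify=labels)

clf = SVC(kernel="rbf", C=10.0, gamma="scale")        # multiclass SVM (one-vs-one internally)
clf.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```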

