2020 ◽  
Vol 5 (1) ◽  
Author(s):  
Kudirat O Jimoh ◽  
Anuoluwapo O Ajayi ◽  
Ibrahim K Ogundoyin

An Android-based sign language recognition system for selected English vocabularies was developed with the explicit objective of examining the specific characteristics responsible for gesture recognition. A recognition model for the process was also designed, implemented, and evaluated on 230 samples of hand gestures. The collected samples were pre-processed and rescaled from 3024 × 4032 pixels to 245 × 350 pixels. The samples were examined for the specific characteristics using Oriented FAST and Rotated BRIEF (ORB), and Principal Component Analysis (PCA) was used for feature extraction. The model was implemented in Android Studio using the template matching algorithm as its classifier. The performance of the system was evaluated using precision, recall, and accuracy as metrics. The system obtained an average classification rate of 87%, an average precision of 88%, and an average recall of 91% on the test data of hand gestures. The study has therefore successfully classified hand gestures for selected English vocabularies. The developed system will enhance communication between hearing and hearing-impaired people, and also aid their teaching and learning processes. Future work includes exploring state-of-the-art machine learning techniques such as Generative Adversarial Networks (GANs) on larger datasets to improve the accuracy of results.

Keywords— Feature extraction; Gesture recognition; Sign language; Vocabulary; Android device.
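The pipeline the abstract describes — extract descriptors, reduce their dimensionality with PCA, classify by matching against per-class templates, then score with precision, recall, and accuracy — can be sketched in Python. This is a minimal illustrative sketch, not the authors' code: ORB descriptor extraction (in practice done with OpenCV on the rescaled images) is replaced here by synthetic feature vectors, and all names, sizes, and parameters below are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic 64-dim feature vectors standing in for pooled ORB descriptors
# (hypothetical data): two gesture classes, 20 samples each.
class_means = rng.normal(0.0, 5.0, size=(2, 64))
X = np.vstack([m + rng.normal(0.0, 1.0, size=(20, 64)) for m in class_means])
y = np.repeat([0, 1], 20)

def pca_fit(X, k):
    """Return the sample mean and the top-k principal axes (via SVD)."""
    mean = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:k]

def pca_transform(X, mean, axes):
    """Project centred samples onto the retained principal axes."""
    return (X - mean) @ axes.T

mean, axes = pca_fit(X, k=8)
Z = pca_transform(X, mean, axes)

# One template per class: the mean of its projected samples.
templates = np.vstack([Z[y == c].mean(axis=0) for c in (0, 1)])

def template_match(z):
    """Nearest-template classifier using Euclidean distance."""
    return int(np.argmin(np.linalg.norm(templates - z, axis=1)))

pred = np.array([template_match(z) for z in Z])

# Evaluation metrics as used in the paper (class 1 as "positive").
tp = int(((pred == 1) & (y == 1)).sum())
fp = int(((pred == 1) & (y == 0)).sum())
fn = int(((pred == 0) & (y == 1)).sum())
precision = tp / (tp + fp) if tp + fp else 0.0
recall = tp / (tp + fn) if tp + fn else 0.0
accuracy = float((pred == y).mean())
```

On real data, the template set would hold one projected exemplar (or class mean) per vocabulary item, and an unseen gesture would be assigned the label of its nearest template.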

