Bangla sign language interpretation using bag of features and Support Vector Machine

Author(s):  
Jia Uddin ◽  
Fahmid Nasif Arko ◽  
Nujhat Tabassum ◽  
Taposhi Rabeya Trisha ◽  
Fariha Ahmed
2021 ◽  
Vol 11 (2) ◽  
pp. 107-118
Author(s):  
Imantoko Imantoko ◽  
Arief Hermawan ◽  
Donny Avianto ◽  
...  

Communication using sign language is very efficient, since the speed of information delivery is closer to verbal communication (speaking) than to writing or typing. Because of this, sign language is often used by people who are deaf or speech impaired, as well as by hearing people, to communicate. To make sign language translation easier, a system is needed that translates the symbols formed by hand movements (captured as images) into text or sound. This study compares the performance, in terms of accuracy and computation time, of Support Vector Machine (SVM) and K-Nearest Neighbors (KNN) classifiers combined with the Pyramidal Histogram of Gradient (PHOG) for feature extraction, to determine which is better at recognizing sign language. As a result, both combined methods, PHOG-SVM and PHOG-KNN, can recognize images of hand movements that form particular symbols. The system built with the SVM classifier achieves its highest accuracy of 82% at PHOG level 3, while the system built with the KNN classifier achieves its highest accuracy of 78% at PHOG level 2. The fastest total training and testing time for the SVM model is 236.53 seconds at PHOG level 3, while for the KNN model it is 78.27 seconds at PHOG level 3. In terms of accuracy, PHOG-SVM is better, but in terms of computation time, PHOG-KNN takes the lead.
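As an illustration of how such a comparison could be set up, the minimal sketch below builds a spatial-pyramid HOG (PHOG-style) feature vector with scikit-image and feeds it to both an SVM and a KNN classifier from scikit-learn, timing each. The pyramid depth, HOG parameters, train/test split, and the `X_images`/`y` variables are assumptions for illustration, not the authors' implementation.

```python
# Minimal PHOG + SVM/KNN comparison sketch (illustrative; image set,
# hyperparameters, and split are assumptions, not the authors' setup).
import time
import numpy as np
from skimage.feature import hog
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def phog_features(gray, levels=3, orientations=9):
    """Concatenate HOG histograms over a spatial pyramid (levels 0..levels)."""
    feats = []
    h, w = gray.shape
    for level in range(levels + 1):
        cells = 2 ** level                      # 1x1, 2x2, 4x4, ... grid
        ch, cw = h // cells, w // cells
        for i in range(cells):
            for j in range(cells):
                patch = gray[i * ch:(i + 1) * ch, j * cw:(j + 1) * cw]
                # One HOG cell per patch -> one orientation histogram per region.
                feats.append(hog(patch, orientations=orientations,
                                 pixels_per_cell=patch.shape,
                                 cells_per_block=(1, 1)))
    return np.concatenate(feats)

# X_images: list of grayscale gesture images, y: labels (assumed to exist).
X = np.array([phog_features(img, levels=3) for img in X_images])
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

for name, clf in [("PHOG-SVM", SVC(kernel="rbf")),
                  ("PHOG-KNN", KNeighborsClassifier(n_neighbors=5))]:
    start = time.time()
    clf.fit(X_train, y_train)
    acc = accuracy_score(y_test, clf.predict(X_test))
    print(f"{name}: accuracy={acc:.2%}, time={time.time() - start:.2f}s")
```

Varying the `levels` argument mirrors the paper's comparison across PHOG levels, with higher levels producing longer feature vectors and correspondingly longer training times.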


2020 ◽  
Vol 7 (2) ◽  
pp. 164
Author(s):  
Aditiya Anwar ◽  
Achmad Basuki ◽  
Riyanto Sigit

Hand gestures are a means of communication between deaf people and others. Each hand gesture has a different meaning. To communicate better, an automatic translator is needed that can recognize hand movements as a word or sentence when communicating with deaf people. This paper proposes a system to recognize hand gestures based on the Indonesian Sign Language Standard. The system uses the Myo Armband as the hand-gesture sensor. The Myo Armband has 21 sensors that express the hand-gesture data. The recognition process uses a Support Vector Machine (SVM) to classify hand gestures based on a dataset of the Indonesian Sign Language Standard. The SVM yields an accuracy of 86.59% in recognizing hand gestures as sign language.

Keywords: Hand Gesture Recognition, Feature Extraction, Indonesian Sign Language, Myo Armband, Moment Invariant
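The following minimal sketch shows the kind of SVM classification step described above, treating each gesture sample as the 21-value sensor feature vector mentioned in the abstract. The CSV file name, column layout, and SVM hyperparameters are hypothetical, and the authors' actual feature extraction (including the moment-invariant step listed in the keywords) is not reproduced here.

```python
# Sketch: classifying Myo Armband feature vectors with an SVM.
# File name, column layout, and hyperparameters are assumptions.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

# Each row: 21 sensor-derived features + a gesture label (hypothetical layout).
data = np.loadtxt("myo_gestures.csv", delimiter=",", skiprows=1)
X, y = data[:, :21], data[:, 21].astype(int)

# Scale the features, then fit an RBF-kernel SVM; report cross-validated accuracy.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(model, X, y, cv=5)
print(f"Mean cross-validated accuracy: {scores.mean():.2%}")
```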


Author(s):  
Astri Novianty ◽  
Fairuz Azmi

The World Health Organization (WHO) estimates that over five percent of the world's population is hearing-impaired. One of the communication problems that often arises between deaf or speech-impaired people and hearing people is the latter's limited knowledge and understanding of the sign language the deaf or speech impaired use in daily communication. To overcome this problem, we build a sign language recognition system, especially for the Indonesian language. The sign language for Bahasa Indonesia, called Bisindo, is distinct from others. Our work utilizes two image processing algorithms for pre-processing, namely grayscale conversion and histogram equalization. Subsequently, principal component analysis (PCA) is employed for dimensionality reduction and feature extraction. Finally, a support vector machine (SVM) is applied as the classifier. Results indicate that the use of histogram equalization significantly enhances recognition accuracy. Comprehensive experiments applying different random seeds to the test data confirm that our method achieves 76.8% accuracy. Accordingly, a more robust method is still open to further enhance the accuracy of sign language recognition.
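A minimal sketch of the described pipeline (grayscale conversion and histogram equalization for pre-processing, PCA for feature extraction, SVM for classification) might look like the following, using OpenCV and scikit-learn. The image size, number of PCA components, random seed, and the `image_paths`/`labels` variables are assumptions, not the authors' configuration.

```python
# Sketch of the described pipeline: grayscale + histogram equalization,
# then PCA, then SVM. Paths, sizes, and component counts are assumptions.
import cv2
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def preprocess(path, size=(64, 64)):
    img = cv2.imread(path)                          # load BGR image
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)    # grayscale conversion
    eq = cv2.equalizeHist(gray)                     # histogram equalization
    return cv2.resize(eq, size).flatten()           # flatten to a feature row

# image_paths and labels are assumed to be provided elsewhere.
X = np.array([preprocess(p) for p in image_paths])
y = np.array(labels)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    random_state=42)

# PCA reduces the flattened pixels before the SVM classifier.
model = make_pipeline(PCA(n_components=50), SVC(kernel="rbf"))
model.fit(X_train, y_train)
print(f"Accuracy: {accuracy_score(y_test, model.predict(X_test)):.2%}")
```

Changing `random_state` in the split mirrors the paper's use of different random seeds for the test data.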

