American Sign Language Posture Understanding with Deep Neural Networks

Author(s):  
Md Asif Jalal,
Ruilong Chen,
Roger K Moore,
Lyudmila Mihaylova
Author(s):  
Rachaell Nihalaani

Abstract: Sign language is invaluable to hearing- and speech-impaired people and is their primary means of communicating among themselves. However, its reach is limited, since most other people cannot interpret it. Sign language is communicated via hand gestures and visual modes and is therefore used by hearing- and speech-impaired people to intercommunicate. These languages have alphabets and grammars of their own, which cannot be understood by people with no knowledge of the specific symbols and rules. It has thus become essential for everyone to be able to interpret, understand and communicate via sign language in order to alleviate the barriers of speech and communication. This can be tackled with the help of machine learning. The model presented here is a sign language interpreter that uses a dataset of images and interprets sign language alphabets and sentences with 90.9% accuracy. For this paper, we have used the ASL (American Sign Language) alphabet and a CNN (convolutional neural network). The paper ends with a summary of the model's viability and its usefulness for the interpretation of sign language.
Keywords: Sign Language, Machine Learning, Interpretation Model, Convolutional Neural Networks, American Sign Language
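The abstract above rests on standard CNN building blocks. As a minimal illustration (not the paper's actual model), the three core operations a CNN applies to a sign image — convolution, ReLU activation, and max-pooling — can be sketched in plain Python; all shapes and kernel values here are illustrative assumptions:

```python
# Illustrative sketch of the core CNN operations used for image
# classification: valid 2D convolution, ReLU, and 2x2 max-pooling.
# Real interpreters stack many such layers in a deep-learning framework.

def conv2d(image, kernel):
    """Valid 2D convolution (no padding, stride 1) over a 2D list."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(ih - kh + 1):
        row = []
        for j in range(iw - kw + 1):
            acc = 0.0
            for di in range(kh):
                for dj in range(kw):
                    acc += image[i + di][j + dj] * kernel[di][dj]
            row.append(acc)
        out.append(row)
    return out

def relu(fmap):
    """Element-wise rectified linear activation."""
    return [[max(0.0, v) for v in row] for row in fmap]

def maxpool2x2(fmap):
    """Non-overlapping 2x2 max-pooling, halving each dimension."""
    return [[max(fmap[i][j], fmap[i][j + 1],
                 fmap[i + 1][j], fmap[i + 1][j + 1])
             for j in range(0, len(fmap[0]) - 1, 2)]
            for i in range(0, len(fmap) - 1, 2)]
```

A classifier then flattens the pooled feature maps and feeds them to fully connected layers that output one score per alphabet class.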


Author(s):  
Aniket Wattamwar

Abstract: This research work presents a prototype system that helps hearing people recognize the hand gestures of sign language, so that they can communicate more effectively with deaf people. The work focuses on real-time recognition of gestures in the sign language used by the deaf community. The problem is addressed with digital image processing using CNNs (convolutional neural networks), skin detection, and image segmentation techniques. The system recognizes ASL (American Sign Language) gestures, including the alphabet and a subset of its words.
Keywords: gesture recognition, digital image processing, CNN (Convolutional Neural Networks), image segmentation, ASL (American Sign Language), alphabet
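Skin detection, mentioned above as a pre-processing step before segmentation, is commonly done by thresholding pixels in HSV colour space. A minimal sketch follows; the threshold values are illustrative assumptions, not the paper's:

```python
# Hedged sketch: per-pixel skin detection via HSV thresholding, a common
# first step before segmenting the hand region. Threshold values below
# are typical illustrative choices, not taken from the paper.
import colorsys

def skin_mask(pixels):
    """pixels: list of (r, g, b) tuples in 0..255; returns booleans."""
    mask = []
    for r, g, b in pixels:
        h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
        # Skin tones tend to sit in a narrow hue band with moderate
        # saturation and reasonable brightness.
        mask.append(h <= 50 / 360.0 and 0.15 <= s <= 0.7 and v >= 0.35)
    return mask
```

Connected regions of the resulting mask can then be segmented out and passed to the CNN classifier.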


2007, Vol 32 (1), pp. 24-37
Author(s):
Qutaishat Munib,
Moussa Habeeb,
Bayan Takruri,
Hiba Abed Al-Malik

Author(s):  
Sarthak Sharma

Abstract: Sign language is one of the oldest and most natural forms of language for communication. However, since most people do not know sign language and interpreters are difficult to come by, we have developed a real-time method using neural networks for fingerspelling-based American Sign Language. In our method, the hand image is first passed through a filter, and the filtered image is then passed through a classifier that predicts the class of the hand gesture.
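The two-stage pipeline described above — filter first, classify second — can be sketched with toy stand-ins for both stages. Here the filter is simple binary thresholding and the classifier is 1-nearest-neighbour against stored templates; the paper's actual filter and neural-network classifier differ, and all names and values below are hypothetical:

```python
# Hedged sketch of a filter-then-classify pipeline. Both stages are toy
# stand-ins chosen for illustration: binarisation as the "filter" and
# 1-nearest-neighbour template matching as the "classifier".

def threshold_filter(image, cutoff=128):
    """Binarise a grayscale image: 1 where pixel >= cutoff, else 0."""
    return [[1 if p >= cutoff else 0 for p in row] for row in image]

def classify(image, templates):
    """Return the label of the template closest in L1 distance."""
    flat = [p for row in image for p in row]
    best_label, best_dist = None, float("inf")
    for label, tmpl in templates.items():
        tflat = [p for row in tmpl for p in row]
        dist = sum(abs(a - b) for a, b in zip(flat, tflat))
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

def predict(image, templates):
    """Full pipeline: filter the raw image, then classify the result."""
    return classify(threshold_filter(image), templates)
```

In the paper's method the classifier stage is a neural network rather than template matching, but the data flow is the same.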


Author(s):  
Mohd Arifullah ◽  
Fais Khan ◽  
Yash Handa

A real-time sign language translator is a crucial milestone in facilitating communication between the deaf community and the general public. We introduce the development and use of an American Sign Language (ASL) fingerspelling translator based on a convolutional neural network. We use the pre-trained GoogLeNet architecture, trained on the ILSVRC2012 dataset, together with the Surrey University and Massey University ASL datasets, to apply transfer learning to this task. We have developed a robust model that consistently classifies the letters a-e for first-time users, and another that classifies a larger subset of characters correctly in most cases. Given the limitations of the datasets and the encouraging results obtained, we are confident that with further research and more data, a fully generalized translator for all ASL characters can be produced.
Keywords: Sign Language, Image Recognition, American Sign Language, Expression Signals, CNN
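The transfer-learning idea described above — reuse a frozen, pre-trained backbone and train only a new final layer on the ASL data — can be sketched in miniature. The "frozen extractor" below is a stand-in function, not GoogLeNet, and the logistic-regression head and all values are illustrative assumptions:

```python
# Hedged sketch of transfer learning: features come from a fixed,
# pre-trained extractor (stand-in for GoogLeNet); only a small new
# classification head is trained on the task-specific data.
import math

def frozen_features(image):
    """Stand-in for a frozen backbone: normalised mean brightness and
    brightness range of a 2D grayscale image. Never updated in training."""
    flat = [p for row in image for p in row]
    return [sum(flat) / len(flat) / 255.0,
            (max(flat) - min(flat)) / 255.0]

def train_head(samples, lr=0.5, epochs=500):
    """Train only a logistic-regression head on the frozen features."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for image, label in samples:
            x = frozen_features(image)
            z = w[0] * x[0] + w[1] * x[1] + b
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - label  # gradient of the log-loss w.r.t. z
            w = [w[0] - lr * g * x[0], w[1] - lr * g * x[1]]
            b -= lr * g
    return w, b

def predict_label(image, w, b):
    """Classify with the frozen extractor plus the trained head."""
    x = frozen_features(image)
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
```

In practice the same pattern is applied with a deep framework: load GoogLeNet weights, freeze the convolutional layers, and replace and retrain the final fully connected layer for the ASL classes.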

