Indian Sign Language
Recently Published Documents


TOTAL DOCUMENTS: 285 (FIVE YEARS: 134)

H-INDEX: 13 (FIVE YEARS: 2)

Author(s):  
Poonam Yerpude

Abstract: Communication is essential to daily life. Hearing people use spoken language, while people with hearing or speech impairments communicate through sign language, which conveys meaning with hand gestures and other parts of the body instead of speaking and listening. Because not everyone is familiar with sign language, a language barrier exists, and much research has aimed to remove it. There are two main ways to convert sign language into speech or text and close this gap: sensor-based techniques and image processing. This paper focuses on the image-processing technique, using a Convolutional Neural Network (CNN). We have built a sign detector that recognises the sign numbers 1 to 10; it can easily be extended to recognise other hand gestures, including the alphabet (A-Z) and expressions. The model is built for Indian Sign Language (ISL). Keywords: Multi-Layer Perceptron (MLP), Convolutional Neural Network (CNN), Indian Sign Language (ISL), Region of Interest (ROI), Artificial Neural Network (ANN), VGG16 (CNN vision architecture), SGD (Stochastic Gradient Descent).
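To make the image-processing route concrete, the following is a minimal sketch of a CNN digit-sign classifier in Keras. The 64x64 grayscale ROI input, the layer sizes, and the dummy training batch are illustrative assumptions, not the authors' exact architecture; a pretrained VGG16 backbone could stand in for the convolutional stack.

```python
# Minimal sketch (assumptions: 64x64 grayscale ROI crops, 10 classes for the
# digit signs 1-10; layer sizes and the dummy data are illustrative only).
import numpy as np
from tensorflow.keras import layers, models

def build_sign_digit_cnn(input_shape=(64, 64, 1), num_classes=10):
    # Small CNN in the spirit of the paper's image-processing approach.
    model = models.Sequential([
        layers.Conv2D(32, (3, 3), activation="relu", input_shape=input_shape),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(64, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation="softmax"),
    ])
    # SGD optimizer, matching the paper's keyword list.
    model.compile(optimizer="sgd",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    model = build_sign_digit_cnn()
    # Dummy batch standing in for preprocessed ROI frames.
    x = np.random.rand(8, 64, 64, 1).astype("float32")
    y = np.eye(10)[np.random.randint(0, 10, 8)]
    model.fit(x, y, epochs=1, verbose=0)
    print(model.predict(x[:1]).argmax())
```

In practice the ROI would be cropped from the webcam frame, resized, and normalised before being fed to the network; extending to A-Z amounts to increasing `num_classes` and retraining.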


Author(s):  
Utpal Nandi, Anudyuti Ghorai, Moirangthem Marjit Singh, Chiranjit Changdar, Shubhankar Bhakta, ...

2021
Author(s):  
Shyam Krishna, Janmesh Ukey, Dinesh Babu J

Author(s):  
Pratiksha Sancheti, Nayan Sabnis, Keshav Kadam, Yash Rode, Pujashree Vidap

2021
Author(s):  
P. Golda Jeyasheeli, N. Indumathi

About 1 percent of the Indian population is deaf and mute. Deaf and mute people use gestures to interact with each other, but most hearing people cannot interpret these gestures, which makes communication between the two groups difficult. To help ordinary citizens understand the signs, an automated sign language identification system is proposed. A smart wearable hand device is designed by attaching different sensors to a glove worn while performing the gestures. Each gesture produces a unique set of sensor values, which are collected as Excel data. The characteristics of the movements are extracted and categorized with the aid of a convolutional neural network (CNN), and data from the test set is then classified by the trained CNN. The objective of this system is to bridge the interaction gap between people who are deaf or hard of hearing and the rest of society.
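To illustrate the glove pipeline, here is a minimal sketch that loads per-gesture sensor readings from a spreadsheet and classifies them with a small CNN. The file name, column layout, sensor count, and optimizer are hypothetical placeholders, not details taken from the paper.

```python
# Minimal sketch (assumptions: each gesture sample is a fixed-length vector of
# glove sensor readings exported to a spreadsheet with one 'label' column;
# file and column names below are hypothetical).
import numpy as np
import pandas as pd
from tensorflow.keras import layers, models

def load_sensor_data(path="gesture_readings.xlsx"):
    # One row per recorded gesture: sensor columns followed by a 'label' column.
    df = pd.read_excel(path)
    x = df.drop(columns=["label"]).to_numpy(dtype="float32")
    y, class_names = pd.factorize(df["label"])   # integer-encode gesture labels
    return x[..., np.newaxis], y, list(class_names)

def build_glove_cnn(num_sensors, num_classes):
    # 1D CNN over the sensor channels, standing in for the paper's classifier.
    model = models.Sequential([
        layers.Conv1D(32, 3, activation="relu", input_shape=(num_sensors, 1)),
        layers.MaxPooling1D(2),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    x, y, class_names = load_sensor_data()        # expects the assumed spreadsheet
    model = build_glove_cnn(num_sensors=x.shape[1], num_classes=len(class_names))
    model.fit(x, y, epochs=20, validation_split=0.2)
```

A held-out portion of the recorded gestures (the validation split above) plays the role of the test set that the trained CNN classifies.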


2021
Author(s):  
Hemang Monga, Jatin Bhutani, Muskan Ahuja, Nikita Maid, Himangi Pande

Indian Sign Language is one of the most important and widely used forms of communication for people with speech and hearing impairments. Many groups have attempted to build systems that read sign language symbols and convert them to text, but translation from text or audio to sign language is still uncommon. This project focuses on developing a translation system composed of several modules: English audio is converted to English text, which is parsed into a structured grammar representation to which the grammar rules of Indian Sign Language are applied. Stop words are removed from the reordered sentence. Since Indian Sign Language does not use conjugation, stemming and lemmatization reduce each word to its root form. Each word is then looked up in a dictionary that holds a video for every word; if a word is not found, it is replaced by its most suitable synonym. The proposed system is novel because existing systems are limited to direct word-by-word conversion into Indian Sign Language, whereas our system converts whole sentences into Indian Sign Language grammar and displays the result effectively to the user.
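The word-level stage described above (stop-word removal, lemmatization, dictionary lookup with a synonym fallback) can be sketched roughly as follows, assuming the speech has already been transcribed and reordered into ISL word order. SIGN_VIDEO_DICT and the example sentence are toy placeholders, and NLTK's WordNet stands in for whatever synonym resource the authors used.

```python
# Minimal sketch of the word-level lookup stage (assumptions: input is already
# transcribed English text reordered into ISL grammar; SIGN_VIDEO_DICT is a toy
# stand-in for the real word-to-video dictionary).
import nltk
from nltk.corpus import stopwords, wordnet
from nltk.stem import WordNetLemmatizer

nltk.download("stopwords", quiet=True)
nltk.download("wordnet", quiet=True)

SIGN_VIDEO_DICT = {"go": "go.mp4", "school": "school.mp4", "book": "book.mp4"}
STOP_WORDS = set(stopwords.words("english"))
lemmatizer = WordNetLemmatizer()

def words_to_sign_videos(sentence):
    videos = []
    for word in sentence.lower().split():
        if word in STOP_WORDS:                      # ISL drops most function words
            continue
        # Reduce to the root form (ISL signs are not conjugated).
        root = lemmatizer.lemmatize(lemmatizer.lemmatize(word, pos="v"))
        if root in SIGN_VIDEO_DICT:
            videos.append(SIGN_VIDEO_DICT[root])
            continue
        # Otherwise fall back to the first WordNet synonym with a known video.
        synonyms = [name.lower() for syn in wordnet.synsets(root)
                    for name in syn.lemma_names()]
        match = next((w for w in synonyms if w in SIGN_VIDEO_DICT), None)
        if match:
            videos.append(SIGN_VIDEO_DICT[match])
    return videos

# Toy example: stop words drop, "going" reduces to "go", unknown words are skipped.
print(words_to_sign_videos("She is going to school tomorrow"))
# -> ['go.mp4', 'school.mp4'] with the toy dictionary above
```

The returned video list would then be played back in sequence as the ISL rendering of the sentence.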

