DEVELOPMENT OF AN INTELLIGENT SYSTEM FOR SIGN LANGUAGE INTERPRETATION «JS1» WITH TOOLS OF THE COMPUTER VISION LIBRARY OPENCV

2019, Vol 7, pp. 38-41
Author(s):  
Artem Sharapov ◽  
Ruslan Grishin

The possibility of using computer vision libraries for sign language interpretation is considered. The specialized software required for the development of a sign language interpretation system is analyzed. The configuration and testing of publicly available computer vision libraries are carried out.
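As a rough illustration of the kind of configuration and testing described above, the sketch below (not taken from the paper) uses OpenCV to capture webcam frames and segment a hand region by skin-color thresholding; the HSV range and morphological clean-up are illustrative assumptions, not the authors' actual pipeline.

```python
# Illustrative sketch only: capture frames and segment a hand by skin color.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Convert to HSV and threshold a rough skin-tone range (assumed values).
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array([0, 30, 60]), np.array([20, 150, 255]))
    # Remove small speckles before any downstream gesture analysis.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    cv2.imshow("hand mask", mask)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```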

2014, Vol 6 (37), pp. 170
Author(s):  
Yulia Sergeevna Manueva ◽  
Mikhail Gennadyevich Grif ◽  
Andrei Nikolaevich Kozlov

Author(s):  
K M Bilvika ◽  
Sneha B K ◽  
Sahana K M ◽  
Tejaswini S M Patil

In human-computer interaction and sign language interpretation, recognizing hand gestures and detecting faces have become prominent tasks in computer vision research. The primary goal of the proposed system is to identify hand gestures and detect faces in order to convey information for controlling a media player. For people who are deaf and mute, sign language is a common, efficient, and alternative way of communicating; through hand and facial gestures they can be understood easily. Here the hand and face are used directly as input to the device, so gesture identification for effective communication requires no intermediate medium.
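To make the idea concrete, a minimal sketch follows, assuming OpenCV's bundled Haar cascade for face detection and a gesture label supplied by a separate classifier; the gesture names and the mapping to media-player actions are hypothetical illustrations, not taken from the paper.

```python
# Illustrative sketch: gate media-player commands on face presence plus a gesture label.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

# Hypothetical mapping from gesture labels to media-player actions.
GESTURE_TO_ACTION = {"open_palm": "play", "fist": "pause",
                     "swipe_up": "volume_up", "swipe_down": "volume_down"}

def control_media_player(frame_bgr, gesture_label):
    """Return a media action only when a face is visible (user facing the camera)."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    if len(faces) == 0:
        return None  # no face detected, ignore the gesture
    return GESTURE_TO_ACTION.get(gesture_label)

# Example usage with a captured frame and a label from some gesture classifier.
frame = cv2.imread("frame.jpg")  # placeholder input image
if frame is not None:
    print(control_media_player(frame, "open_palm"))  # -> "play"
```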


2019, Vol 1362, pp. 012034
Author(s):  
Golda Jeyasheeli P ◽  
Annapoorani K Miss

Author(s):  
Ayodele Olawale Olabanji ◽  
Akinlolu Adediran Ponnle

Sign language is the primary method of communication adopted by deaf and hearing-impaired individuals. The indigenous sign language of Nigeria is an area of growing interest, with the major challenge being communication between signers and non-signers. Recent advancements in computer vision and deep learning neural networks (DLNN) have led to the exploration of technological concepts for tackling these challenges. One area with extensive impact from the use of DLNN is the interpretation of hand signs. This study presents an interpretation system for the indigenous sign language of Nigeria. The methodology comprises three key phases: dataset creation, computer vision techniques, and deep learning model development. A multi-class Convolutional Neural Network (CNN) is designed to train on and interpret the indigenous signs of Nigeria. The model is evaluated using a custom-built dataset of selected indigenous words comprising 15,000 image samples. The experimental results show excellent performance from the interpretation system, with accuracy reaching 95.67%.
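The abstract does not specify the network architecture, so the following is only a minimal sketch of a multi-class CNN of the general kind described, written with Keras; the 64x64 input size, layer sizes, and assumed number of sign classes are placeholders, not the authors' reported configuration.

```python
# Illustrative multi-class CNN for hand-sign images (all hyperparameters assumed).
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 10          # assumed number of indigenous sign classes
IMG_SHAPE = (64, 64, 3)   # assumed input resolution

model = models.Sequential([
    layers.Conv2D(32, (3, 3), activation="relu", input_shape=IMG_SHAPE),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(128, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```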

