HUMAN-COMPUTER INTERACTION USING DYNAMIC HAND GESTURE RECOGNITION TO CONVENIENTLY CONTROL THE SYSTEM

Author(s):  
Smit Parikh ◽  
Srikar Banka ◽  
Isha Lautrey ◽  
Isha Gupta ◽  
Prof Dhanalekshmi Yedurkar

The use of physical controllers such as a mouse or a keyboard for human-computer interaction hinders the natural interface, since they place a high barrier between the user and the computer. Our aim is to resolve this by creating an application that controls basic features of a computer using hand gestures captured through an integrated webcam. A hand gesture recognition system detects gestures and translates them into specific actions to make our work easier. This is pursued using OpenCV to capture the gestures, interfaced through Django, React.js and Electron. The YOLO algorithm is used to train the detection model, and the gestures are stored in a DBMS. The main expected result is that users will be able to control the basic functions of the system with their hand gestures, providing them the utmost comfort.
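A minimal sketch of the kind of pipeline the abstract describes: OpenCV captures webcam frames, a YOLO-based detector labels the hand gesture, and the label is mapped to a system action. The weights file, gesture class names and the gesture-to-action table below are illustrative assumptions, and pyautogui stands in for whatever action-dispatch layer the Django/React.js/Electron front end would actually expose.

import cv2                     # frame capture from the integrated webcam
import pyautogui               # stand-in for the system-control layer
from ultralytics import YOLO   # YOLO detector, assumed trained on gesture classes

model = YOLO("gestures.pt")    # hypothetical weights file trained on hand gestures
ACTIONS = {                    # hypothetical gesture-label -> system-action table
    "palm_open": lambda: pyautogui.press("playpause"),
    "thumb_up": lambda: pyautogui.press("volumeup"),
    "thumb_down": lambda: pyautogui.press("volumedown"),
}

cap = cv2.VideoCapture(0)      # open the integrated webcam
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    result = model(frame, verbose=False)[0]       # detect gestures in the frame
    for box in result.boxes:
        label = model.names[int(box.cls)]
        if label in ACTIONS:
            ACTIONS[label]()                      # trigger the mapped system action
    cv2.imshow("gesture control", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):         # press 'q' to quit
        break
cap.release()
cv2.destroyAllWindows()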

2013 ◽  
Vol 13 (02) ◽  
pp. 1340001
Author(s):  
Siddharth Swarup Rautaray ◽  
Anupam Agrawal

Traditional human–computer interaction devices such as the keyboard and mouse become ineffective for interacting with virtual environment applications, because 3D applications call for new interaction devices. Efficient human interaction with modern virtual environments requires more natural devices, and among these the "hand gesture" human–computer interaction modality has recently attracted major interest. The main objective of gesture recognition research is to build a system that can recognize human gestures and use them to control an application. One drawback of present gesture recognition systems is that they are application-dependent, which makes it difficult to transfer one gesture control interface to multiple applications. This paper focuses on designing a hand gesture recognition system that is both vocabulary-independent and adaptable to multiple applications. The designed system comprises processing steps such as detection, segmentation, tracking and recognition. Vocabulary independence is achieved through a robust gesture mapping module that allows the user to cognitively map different gestures to the same command and vice versa. For performance analysis of the proposed system, accuracy, recognition rate and command response time have been compared; these parameters were chosen because they have a vital impact on the performance of the proposed vocabulary- and application-independent hand gesture recognition system.
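As a rough illustration of the vocabulary-independent mapping module described above, the sketch below decouples the recognized gesture vocabulary from application commands, so a user can bind different gestures to the same command or reuse one vocabulary across applications. The class, method and label names are assumptions for illustration, not the authors' code.

class GestureMapper:
    """User-editable table mapping recognized gestures to per-application commands."""

    def __init__(self):
        # per-application mapping: app name -> {gesture label: command}
        self._maps = {}

    def bind(self, app, gesture, command):
        """Let the user (re)map a gesture to a command for a given application."""
        self._maps.setdefault(app, {})[gesture] = command

    def resolve(self, app, gesture):
        """Translate a recognized gesture into the command for the active application."""
        return self._maps.get(app, {}).get(gesture)

# Usage: the same gesture vocabulary drives two different applications.
mapper = GestureMapper()
mapper.bind("media_player", "swipe_left", "previous_track")
mapper.bind("slide_show", "swipe_left", "previous_slide")
print(mapper.resolve("slide_show", "swipe_left"))   # -> "previous_slide"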

