Media-Player Controlling by Hand Gestures

Author(s):  
DSS Varshika

In this project we control a media player using hand gestures with the help of OpenCV and Python. Computer applications require interaction between human and computer, and the need for this interaction to be unrestricted has exposed the limits of traditional input devices such as the keyboard, mouse, and pen. Hand gestures are an important component of body language in linguistics, and using the hand itself as an input device makes human-computer interaction natural and engaging. Gesture recognition has therefore gained considerable importance: hand gestures are used to control applications such as Windows Media Player, robot control, and gaming. Gestures make interaction easy and convenient and require no extra device. Vision and audio recognition can be combined, but audio commands may fail in noisy environments.
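The abstract does not specify how detected gestures are mapped to player commands, so the following is a minimal sketch of that mapping stage under stated assumptions: the finger count is presumed to come from an OpenCV contour/convexity-defect step (not shown), and the gesture-to-command table is hypothetical, not the authors' actual mapping.

```python
# Hypothetical gesture-to-command table; in the full pipeline the finger
# count would be produced by OpenCV contour / convexity-defect analysis
# of each webcam frame. The specific mapping below is an assumption.
GESTURE_ACTIONS = {
    1: "play_pause",     # one raised finger toggles playback
    2: "volume_up",
    3: "volume_down",
    4: "seek_forward",
    5: "seek_backward",
}

def gesture_to_action(finger_count: int) -> str:
    """Map a detected finger count to a media-player command."""
    return GESTURE_ACTIONS.get(finger_count, "no_op")
```

Keeping the mapping in a plain dictionary makes it trivial to rebind gestures without touching the vision code.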

2020
Author(s):  
David Huerta ◽  
Eric Crawford ◽  
Scott Brown

Human Computer Interaction (HCI) has been redefined in this era. People want to interact with their devices in a way that has physical significance in the real world; in other words, they want ergonomic input devices. In this paper, we propose a new method of interaction with computing devices equipped with a consumer-grade camera: two colored markers (red and green) worn on the fingertips generate the desired hand gestures, and marker detection and tracking are performed with template matching and a Kalman filter. We have implemented all the usual system commands, i.e., cursor movement, right click, left click, double click, going forward and backward, and zoom in and out, through different hand gestures. Our system can easily recognize these gestures and issue the corresponding system commands. It is suitable both for desktop devices and for settings where a touch screen is not feasible, such as large screens or projected screens.
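The tracking stage described above can be sketched as a constant-velocity Kalman filter smoothing the marker centroid. This is an illustrative implementation, not the paper's code: the measurements would come from template matching on the red/green markers, and the noise parameters `q` and `r` are assumptions.

```python
import numpy as np

def kalman_track(measurements, dt=1.0, q=1e-3, r=1.0):
    """Smooth a sequence of (x, y) marker centroids with a
    constant-velocity Kalman filter. State: [x, y, vx, vy]."""
    F = np.array([[1, 0, dt, 0],      # state transition (constant velocity)
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)
    H = np.array([[1, 0, 0, 0],       # we observe position only
                  [0, 1, 0, 0]], dtype=float)
    Q = q * np.eye(4)                 # process noise (assumed)
    R = r * np.eye(2)                 # measurement noise (assumed)
    x = np.array([measurements[0][0], measurements[0][1], 0.0, 0.0])
    P = np.eye(4)
    smoothed = []
    for z in measurements:
        x = F @ x                     # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + R           # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (np.asarray(z, dtype=float) - H @ x)  # update
        P = (np.eye(4) - K @ H) @ P
        smoothed.append((float(x[0]), float(x[1])))
    return smoothed
```

The velocity components let the filter coast through frames where template matching briefly loses a marker.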


Author(s):  
Smit Parikh ◽  
Srikar Banka ◽  
Isha Lautrey ◽  
Isha Gupta ◽  
Prof Dhanalekshmi Yedurkar

The use of a physical controller such as a mouse or keyboard for human-computer interaction hinders the natural interface, since it places a high barrier between the user and the computer. Our aim is to resolve this by creating an application that controls basic computer functions with hand gestures captured through an integrated webcam. A hand gesture recognition system detects gestures and translates them into specific actions to make our work easier. This is pursued using OpenCV to capture the gestures, interfaced through Django, React.js, and Electron. The YOLO algorithm is used to train the system accordingly, and the gestures are stored in a DBMS. The main expected result is that users will be able to control the basic functions of the system with their hand gestures, providing the utmost comfort.
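One practical detail a webcam-driven system like this must handle is that a detector fires on every frame, so a held gesture would trigger the same action dozens of times. Below is a hedged sketch of a debouncing step; the class name, cooldown value, and gesture labels are assumptions, and the real labels would come from the YOLO model's output classes.

```python
class GestureDebouncer:
    """Fire a command once per held gesture, not once per frame.
    The cooldown value is an assumption, not from the paper."""

    def __init__(self, cooldown: float = 1.0):
        self.cooldown = cooldown
        self.last_fired = {}          # gesture label -> last fire time

    def should_fire(self, gesture: str, now: float) -> bool:
        last = self.last_fired.get(gesture)
        if last is None or now - last >= self.cooldown:
            self.last_fired[gesture] = now
            return True
        return False
```

In the live loop, `now` would simply be `time.monotonic()` at each detection.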


Author(s):  
Pranjali Manmode ◽  
Rupali Saha ◽  
Manisha N. Amnerkar

With the rapid development of computer vision, the demand for interaction between humans and machines is becoming ever more extensive. Since hand gestures can express rich information, hand gesture recognition is widely used in robot control, intelligent furniture, and other areas. The paper realizes the segmentation of hand gestures by establishing a skin color model and a Haar-based AdaBoost classifier, exploiting the particularity of skin color and the deformability of hand gestures, with one video frame cut out for analysis. In this way, the human hand is segmented from a complicated background. The CamShift algorithm then provides real-time hand gesture tracking, and the hand region detected in real time is recognized by a convolutional neural network, achieving recognition of 10 common digits. Experiments show 98.3% accuracy.
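The skin-color segmentation step can be sketched with a fixed threshold in YCrCb space. The paper's exact skin model is not given, so the Cr/Cb bounds below are the commonly cited skin range, stated here as an assumption rather than the authors' parameters.

```python
import numpy as np

def skin_mask(ycrcb: np.ndarray) -> np.ndarray:
    """ycrcb: HxWx3 uint8 image already converted to YCrCb
    (e.g. via cv2.cvtColor). Returns a boolean skin mask.
    The Cr/Cb bounds are a widely used range, assumed here."""
    cr = ycrcb[..., 1]
    cb = ycrcb[..., 2]
    return (cr >= 133) & (cr <= 173) & (cb >= 77) & (cb <= 127)
```

The resulting mask would then seed the CamShift tracker and crop the hand region fed to the CNN.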


Author(s):  
Harsh Ganpatbhai Rana ◽  
Aayushi Sanjay Soni

Nowadays there is a huge inclination of youngsters and adults towards video games, and of course everyone has experienced them in childhood. However, we usually control games using typical input devices such as a mouse, keyboard, or joystick; what if we could control the game using our hand gestures instead? These days there are many game controllers that enhance the gaming experience, but they are quite expensive. Through this project we have designed our own game-controlling glove using Arduino, and in addition we have developed a car game using Unity 3D. This project explores the areas of IoT and hand gesture recognition.
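The abstract does not describe how glove readings become game input, so here is one plausible mapping, purely as a sketch: converting the glove's accelerometer reading into a steering angle for the car. The axis convention, units (g), and clamping range are all assumptions, not details from the project.

```python
import math

def tilt_to_steering(ax: float, az: float, max_deg: float = 45.0) -> float:
    """Convert accelerometer components (in g) from the glove into a
    steering angle in degrees, clamped to the steering range.
    Axis convention and max_deg are assumptions."""
    roll = math.degrees(math.atan2(ax, az))   # hand roll from gravity vector
    return max(-max_deg, min(max_deg, roll))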


Author(s):  
K M Bilvika ◽  
Sneha B K ◽  
Sahana K M ◽  
Tejaswini S M Patil

In human-computer interaction and sign language interpretation, recognizing hand gestures and detecting faces have become predominant topics in computer vision research. The primary goal of this proposed system is to identify hand gestures and facial cues to convey information for controlling a media player. For people who are deaf or mute, sign language is a common and efficient alternative way of communicating, and through hand and facial gestures we can easily understand them. Here the hand and face are used directly as input to the device for effective communication; for gesture identification no intermediate medium is needed.


This proposed work presents a framework of mouth gesture recognition for a Human Computer Interface (HCI). It replaces traditional input devices such as the mouse and keyboard and allows a user to work on a computer using his/her mouth gestures. The work is aimed at helping severely disabled and paralyzed people. The entire pipeline includes mouth detection, region extraction, gesture classification, and interface creation with computer applications. Initially, the face and mouth regions are detected using a Haar cascade classifier. Next, gesture recognition is performed with deep learning through a Convolutional Neural Network (CNN); the mouth gestures are recognized and classified as mouth close, mouth open, tongue left, and tongue right. Finally, an HCI is created by mapping the mouth gestures to VLC player operations such as play, pause, forward jump, and backward jump. The performance of the proposed method is measured and compared with other existing methods, and this work is found to perform better.
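The final classification-to-command stage described above can be sketched as follows: softmax over the CNN's four output scores, with a command issued only when the network is confident. The class order, action mapping, and confidence threshold are assumptions for illustration; only the four gesture labels and the VLC operations come from the abstract.

```python
import numpy as np

# Gesture labels from the abstract; their order in the CNN output layer
# and the VLC mapping below are assumptions.
CLASSES = ["mouth_close", "mouth_open", "tongue_left", "tongue_right"]
VLC_ACTIONS = {"mouth_close": "pause", "mouth_open": "play",
               "tongue_left": "jump_backward", "tongue_right": "jump_forward"}

def gesture_command(logits, threshold=0.6):
    """Softmax over CNN scores; return a VLC action only when confident,
    otherwise None so no command is sent for an ambiguous frame."""
    z = np.asarray(logits, dtype=float)
    p = np.exp(z - z.max())           # numerically stable softmax
    p /= p.sum()
    i = int(p.argmax())
    return VLC_ACTIONS[CLASSES[i]] if p[i] >= threshold else None
```

Rejecting low-confidence frames matters here because a paralyzed user cannot easily undo a spurious seek or pause.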


2018
Vol 14 (7)
pp. 155014771879075
Author(s):  
Kiwon Rhee ◽  
Hyun-Chool Shin

In the recognition of electromyogram-based hand gestures, the recognition accuracy may be degraded during practical application for various reasons, such as electrode positioning bias and differences between subjects. In addition, the change in electromyogram signals due to different arm postures, even for identical hand gestures, is an important issue. We propose an electromyogram-based hand gesture recognition technique that is robust to diverse arm postures. The proposed method uses the accelerometer and electromyogram signals simultaneously to recognize the correct hand gesture across various arm postures. For recognition, the electromyogram signals are statistically modeled with the arm posture taken into account. In the experiments, we compared recognition that accounted for arm posture with recognition that disregarded it. When varied arm postures were disregarded, the recognition accuracy for correct hand gestures was 54.1%, whereas the proposed method achieved an average recognition accuracy of 85.7%, an improvement of 31.6 percentage points. Using the accelerometer and electromyogram signals together compensated for the effect of different arm postures on the electromyogram signals and therefore improved the recognition accuracy of hand gestures.
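The posture-conditioned modeling idea can be illustrated with a toy nearest-mean classifier: the accelerometer first identifies the arm posture, then EMG features are matched against gesture templates learned separately for that posture. The feature vectors and template means below are synthetic assumptions, not the paper's statistical model or data.

```python
import numpy as np

# Synthetic per-posture gesture templates (2-D EMG feature means).
# In the paper each posture would get its own statistical model
# trained from real EMG recordings; these numbers are assumptions.
templates = {
    "arm_down": {"fist": np.array([0.8, 0.2]), "open": np.array([0.2, 0.7])},
    "arm_up":   {"fist": np.array([0.6, 0.4]), "open": np.array([0.1, 0.9])},
}

def classify_emg(posture: str, features: np.ndarray) -> str:
    """Nearest-mean decision within the model for the posture
    that the accelerometer reported."""
    model = templates[posture]
    return min(model, key=lambda g: np.linalg.norm(features - model[g]))
```

The point of the structure is that the same EMG feature vector can decode to different gestures depending on which posture's templates are consulted, which is exactly the confound the paper's accelerometer channel resolves.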


2015
Vol 14 (9)
pp. 6102-6106
Author(s):  
Sangeeta Goyal ◽  
Dr. Bhupesh Kumar

There has been growing interest in developing new techniques and methods for Human-Computer Interaction (HCI), and gesture recognition is one of the important areas of this technology. Gesture recognition means interfacing with a computer using the motion of the human body, typically hand movements. A handicapped person may be unable to move quickly if there is a fire in the house, or may be unable to switch off the Miniature Circuit Breaker (MCB) by hand, but the same task can be done easily with hand gesture recognition. In our proposed system, an electrical MCB is controlled using a hand gesture recognizer: to switch the MCB on or off, the user provides a hand gesture as input to the system.


Author(s):  
Bhavish Sushiel Agarwal ◽  
Jyoti R Desai ◽  
Snehanshu Saha

The use of hand gestures opens a wide range of applications for human-computer interaction. The paper makes use of Haar classifiers and the CamShift algorithm to track the movement of the hand. Parallelism is introduced at every step by segmenting the data from CamShift into an NxN grid. Every block of the grid then represents a lead point, calculated as the mean of all the points belonging to that grid cell, leaving only N² points from which to recognize the curve the user performed. Finally, the fitted curve is compared against pre-defined curve-fit data using the Mahalanobis distance. The parallelism used in reducing the number of points to be fitted makes the recognition faster.
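The grid-reduction step described above can be sketched as follows: trajectory points from CamShift tracking are binned into an NxN grid, and each occupied cell is replaced by the mean of its points, leaving at most N² lead points to fit. The grid size and normalized coordinate range are assumptions; the paper parallelizes this step, while the sketch is sequential for clarity.

```python
import numpy as np

def grid_reduce(points: np.ndarray, n: int = 4) -> np.ndarray:
    """points: Mx2 array of tracked positions normalized to [0, 1).
    Returns one mean 'lead point' per occupied cell of an NxN grid."""
    cells = np.floor(points * n).astype(int)      # (row, col) cell per point
    keys = cells[:, 0] * n + cells[:, 1]          # flatten cell index
    leads = [points[keys == k].mean(axis=0) for k in np.unique(keys)]
    return np.array(leads)
```

Because each cell's mean is independent of the others, the per-cell reductions are exactly the units of work that the paper distributes in parallel.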

