Hand Gesture Recognition With a Novel Particle Filter

2020
Author(s):
David Huerta
Eric Crawford
Scott Brown

Human-Computer Interaction (HCI) has been redefined in this era. People want to interact with their devices in a way that has physical significance in the real world; in other words, they want ergonomic input devices. In this paper, we propose a new method of interaction with computing devices equipped with a consumer-grade camera. The method uses two colored markers (red and green) worn on the fingertips to generate the desired hand gestures, and marker detection and tracking are performed with template matching and a Kalman filter. We have implemented all the usual system commands, i.e., cursor movement, right click, left click, double click, going forward and backward, and zooming in and out, through different hand gestures. Our system can easily recognize these gestures and issue the corresponding system commands. It is suitable both for desktop devices and for settings where a touch screen is not feasible, such as large or projected screens.
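The tracking step above pairs template matching with a Kalman filter. The minimal Python/OpenCV sketch below substitutes a simple HSV color threshold for the marker-detection step and smooths the tracked fingertip with OpenCV's built-in Kalman filter; the HSV bounds, noise covariance, and constant-velocity state model are illustrative assumptions, not the paper's exact parameters.

```python
# Sketch: track a green fingertip marker via HSV thresholding, smoothed by
# a constant-velocity Kalman filter (all thresholds are assumptions).
import cv2
import numpy as np

kf = cv2.KalmanFilter(4, 2)   # state [x, y, dx, dy], measurement [x, y]
kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                [0, 1, 0, 1],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]], np.float32)
kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], np.float32)
kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-3

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (40, 80, 80), (80, 255, 255))  # rough green range
    pred = kf.predict()                      # predicted fingertip position
    m = cv2.moments(mask)
    if m["m00"] > 0:                         # marker visible: correct with centroid
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
        kf.correct(np.array([[cx], [cy]], np.float32))
    x, y = int(pred[0, 0]), int(pred[1, 0])
    cv2.circle(frame, (x, y), 8, (0, 255, 0), 2)
    cv2.imshow("marker", frame)
    if cv2.waitKey(1) == 27:                 # Esc quits
        break
cap.release()
cv2.destroyAllWindows()
```

The smoothed (x, y) could then be fed to a cursor-control library such as pyautogui; the red marker would be segmented the same way with a second HSV range.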

Author(s):  
DSS Varshika

In this project we control a media player using hand gestures with the help of OpenCV and Python. Computer applications require interaction between human and computer, and the desire for unrestricted interaction challenges traditional input devices such as the keyboard, mouse, and pen. Hand gestures are an important component of body language, and using the hand as an input device makes human-computer interaction natural and engaging. Gesture recognition has therefore gained importance: hand gestures are used to control applications such as Windows Media Player, robots, and games. Gestures make interaction easy and convenient and require no extra device. Vision and audio recognition can be combined, but audio commands may not work in noisy environments.
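As a rough illustration of the OpenCV-plus-Python pipeline described above, the sketch below counts convexity defects of the largest skin-colored contour as a crude finger-count gesture and maps it to media keys via pyautogui. The skin-color range, defect-depth threshold, and key bindings are all assumptions; the project's actual gesture set is not specified here.

```python
# Sketch: crude finger counting via convexity defects, mapped to media keys.
import cv2
import numpy as np
import pyautogui

def count_defects(contour):
    """Count deep convexity defects, a rough proxy for raised fingers."""
    hull = cv2.convexHull(contour, returnPoints=False)
    if len(hull) < 4:
        return 0
    defects = cv2.convexityDefects(contour, hull)
    if defects is None:
        return 0
    return int(np.sum(defects[:, 0, 3] > 10000))  # fixed-point depth ~ 39 px

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 30, 60), (20, 150, 255))  # rough skin range
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        hand = max(contours, key=cv2.contourArea)
        if cv2.contourArea(hand) > 5000:   # ignore small blobs
            n = count_defects(hand)
            if n >= 4:                     # open palm -> play/pause
                pyautogui.press("space")
            elif n == 1:                   # two fingers -> seek forward
                pyautogui.press("right")
    cv2.imshow("mask", mask)
    if cv2.waitKey(250) == 27:             # throttled loop; Esc quits
        break
cap.release()
cv2.destroyAllWindows()
```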


Author(s):  
Smit Parikh
Srikar Banka
Isha Lautrey
Isha Gupta
Prof. Dhanalekshmi Yedurkar

The use of a physical controller such as a mouse or keyboard for human-computer interaction hinders the natural interface, since it places a high barrier between user and computer. To resolve this issue, our aim is to create an application that controls basic computer features using hand gestures captured through an integrated webcam. A hand gesture recognition system detects gestures and translates them into specific actions. We use OpenCV to capture the gestures, which are interfaced through Django, React.js, and Electron; the YOLO object-detection algorithm is used to train the system, and the gestures are stored in a DBMS. The expected result is that users will be able to control the basic functions of the system with their hand gestures, providing maximum comfort.
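A minimal sketch of the detection-to-action step, assuming an Ultralytics YOLO model fine-tuned on gesture classes. The weight file gestures.pt, the class names, and the key bindings are hypothetical, and the Django/React.js/Electron interface and the DBMS layer are omitted.

```python
# Sketch: run a (hypothetical) fine-tuned YOLO gesture detector on webcam
# frames and trigger desktop actions per detected class.
from ultralytics import YOLO
import cv2
import pyautogui

ACTIONS = {                 # assumed gesture-class -> action mapping
    "palm_open": lambda: pyautogui.press("volumeup"),
    "fist":      lambda: pyautogui.press("volumedown"),
    "thumbs_up": lambda: pyautogui.hotkey("alt", "tab"),
}

model = YOLO("gestures.pt")              # hypothetical fine-tuned weights
cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    result = model(frame, verbose=False)[0]
    for box in result.boxes:
        name = result.names[int(box.cls)]
        if float(box.conf) > 0.8 and name in ACTIONS:  # confidence gate (assumed)
            ACTIONS[name]()
    cv2.imshow("gestures", frame)
    if cv2.waitKey(500) == 27:           # throttled loop; Esc quits
        break
cap.release()
cv2.destroyAllWindows()
```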


Author(s):  
Harsh Ganpatbhai Rana
Aayushi Sanjay Soni

Nowadays, there is a strong inclination of youngsters and adults towards video games, and most of us experienced them in childhood. However, we usually control games with typical input devices such as a mouse, keyboard, or joystick; what if we could control the game with our hand gestures? Many of today's game controllers enhance the gaming experience, but they are quite expensive. In this project we have designed our own game-controlling glove using Arduino, and in addition we have developed a car game using Unity 3D. The project explores the areas of IoT and hand gesture recognition.
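The glove runs on Arduino and the game in Unity 3D; as a stand-in for the game side, the hedged Python sketch below reads tilt values the glove is assumed to stream as comma-separated text over USB serial and converts them to steering key presses with pyserial and pyautogui. The port name, message format, and tilt thresholds are assumptions.

```python
# Sketch: bridge a serial-streaming Arduino glove to keyboard steering.
import serial       # pyserial
import pyautogui

PORT = "/dev/ttyUSB0"             # assumed serial port for the glove

link = serial.Serial(PORT, 9600, timeout=1)
while True:
    raw = link.readline().decode("ascii", errors="ignore").strip()
    if not raw:
        continue
    try:
        x, _y = (float(v) for v in raw.split(","))   # assumed "x,y" frames
    except ValueError:
        continue                   # skip malformed frames
    if x > 0.3:                    # tilt right -> steer right (assumed)
        pyautogui.keyDown("right"); pyautogui.keyUp("left")
    elif x < -0.3:                 # tilt left -> steer left
        pyautogui.keyDown("left"); pyautogui.keyUp("right")
    else:                          # level -> release both keys
        pyautogui.keyUp("left"); pyautogui.keyUp("right")
```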


2018
Vol 14 (7)
pp. 155014771879075
Author(s):
Kiwon Rhee
Hyun-Chool Shin

In electromyogram-based hand gesture recognition, accuracy may degrade in practical applications for various reasons, such as electrode positioning bias and differences between subjects. In addition, the change in electromyogram signals caused by different arm postures, even for identical hand gestures, is an important issue. We propose an electromyogram-based hand gesture recognition technique that is robust to diverse arm postures. The proposed method uses the accelerometer and electromyogram signals simultaneously to recognize hand gestures correctly across arm postures: the electromyogram signals are statistically modeled conditioned on the arm posture. In our experiments, we compared recognition that accounted for arm posture with recognition that disregarded it. When varied arm postures were disregarded, the recognition accuracy was 54.1%, whereas the proposed method achieved an average accuracy of 85.7%, an improvement of 31.6 percentage points. Using the accelerometer and electromyogram signals together compensates for the effect of arm posture on the electromyogram signals and therefore improves recognition accuracy.
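The paper models the EMG signals statistically, conditioned on arm posture. The sketch below illustrates that idea rather than the authors' exact model: it infers the posture from the accelerometer's gravity vector, then classifies time-domain EMG features with a per-posture LDA classifier. The feature set and the choice of LDA are assumptions.

```python
# Sketch: posture-conditioned EMG gesture classification.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def emg_features(window):
    """Classic time-domain features per channel: mean absolute value, waveform length."""
    mav = np.mean(np.abs(window), axis=0)
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    return np.concatenate([mav, wl])

def nearest_posture(acc, posture_means):
    """Pick the trained posture whose mean gravity vector is closest."""
    return int(np.argmin([np.linalg.norm(acc - m) for m in posture_means]))

def train(windows_by_posture, labels_by_posture):
    """Fit one gesture classifier per arm posture on that posture's EMG windows."""
    models = []
    for windows, labels in zip(windows_by_posture, labels_by_posture):
        feats = np.array([emg_features(w) for w in windows])
        models.append(LinearDiscriminantAnalysis().fit(feats, labels))
    return models

def predict(models, posture_means, emg_window, acc_vector):
    """Route the EMG window to the classifier matching the current posture."""
    p = nearest_posture(acc_vector, posture_means)
    return models[p].predict(emg_features(emg_window)[None, :])[0]
```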


2015
Vol 14 (9)
pp. 6102-6106
Author(s):  
Sangeeta Goyal
Dr. Bhupesh Kumar

There has been growing interest in the development of new techniques and methods for Human-Computer Interaction (HCI), and gesture recognition is one of its important areas. Gesture recognition means interfacing with a computer through the motion of the human body, typically hand movements. A person with limited mobility may not be able to move quickly enough to switch off a Miniature Circuit Breaker (MCB), for example when there is a fire in the house, but the same task can be done easily with hand gesture recognition. In our proposed system, an electrical MCB is controlled by a hand gesture recognizer: to switch the MCB on or off, the user provides a hand gesture as input to the system.
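The paper does not specify the hardware interface, so the following is only a minimal sketch assuming a Raspberry Pi whose GPIO drives a relay in the MCB circuit; the pin number and gesture names are hypothetical.

```python
# Sketch: toggle a relay in the MCB circuit from recognized gestures.
import RPi.GPIO as GPIO

RELAY_PIN = 17                          # assumed BCM pin driving the relay

GPIO.setmode(GPIO.BCM)
GPIO.setup(RELAY_PIN, GPIO.OUT, initial=GPIO.LOW)

def on_gesture(name):
    """Switch the breaker circuit according to the recognized gesture."""
    if name == "fist":                  # assumed 'switch on' gesture
        GPIO.output(RELAY_PIN, GPIO.HIGH)
    elif name == "open_palm":           # assumed 'switch off' gesture
        GPIO.output(RELAY_PIN, GPIO.LOW)
```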


Author(s):  
Bhavish Sushiel Agarwal
Jyoti R Desai
Snehanshu Saha

The use of hand gestures opens a wide range of applications for human-computer interaction. This paper uses Haar classifiers and the CamShift algorithm to track the movement of the hand. Parallelism is introduced at every step by segmenting the data from CamShift into an N×N grid. Every block of the grid is represented by a lead point, computed as the mean of all points belonging to that block, leaving only N² points from which to recognize the curve the user traced. Finally, the resulting fit is compared to pre-defined curve-fit data using the Mahalanobis distance to identify the curve. The parallel reduction in the number of points to be fitted makes the recognition faster.
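A minimal sketch of the reduction and matching steps described above: tracked points are averaged per cell of an N×N grid to obtain the lead points, and the reduced curve is compared to stored templates by Mahalanobis distance. The grid size and template-store format are assumptions, and the sketch assumes the templates were built at the same grid resolution so the flattened vectors align.

```python
# Sketch: grid-based point reduction followed by Mahalanobis curve matching.
import numpy as np
from scipy.spatial.distance import mahalanobis

def grid_lead_points(points, n=4):
    """Average the tracked points falling into each cell of an n x n grid."""
    pts = np.asarray(points, float)
    mins, maxs = pts.min(axis=0), pts.max(axis=0)
    cells = np.floor((pts - mins) / (maxs - mins + 1e-9) * n).astype(int)
    leads = []
    for gx in range(n):
        for gy in range(n):
            in_cell = pts[(cells[:, 0] == gx) & (cells[:, 1] == gy)]
            if len(in_cell):
                leads.append(in_cell.mean(axis=0))
    return np.array(leads)        # at most n*n lead points

def match_curve(lead_points, templates):
    """Return the stored curve closest in Mahalanobis distance.

    templates: {name: (mean_vector, inverse_covariance)}, assumed to be
    built from lead points at the same grid resolution."""
    x = lead_points.flatten()
    best, best_d = None, np.inf
    for name, (mean, cov_inv) in templates.items():
        d = mahalanobis(x, mean, cov_inv)
        if d < best_d:
            best, best_d = name, d
    return best
```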


Author(s):  
Zeenat S. AlKassim
Nader Mohamed

In this chapter, the authors discuss a unique technology known as Sixth Sense Technology, highlighting its future opportunities for integrating the digital world with the real world. Challenges in implementing such technologies are discussed, along with a review of the different possible implementation approaches. The review explores inventions in areas related to Sixth Sense Technology, namely augmented reality (AR), computer vision, image processing, gesture recognition, and artificial intelligence, and then categorizes and compares them. Lastly, recommendations are given for improving this technology, which has the potential to create a new trend in human-computer interaction (HCI) in the coming years.


2017
Vol 10 (27)
pp. 1329-1342
Author(s):  
Javier O. Pinzon Arenas
Robinson Jimenez Moreno
Paula C. Useche Murillo

This paper presents the implementation of a Region-based Convolutional Neural Network focused on the recognition and localization of hand gestures, in this case two types of gestures, open and closed hand, in order to recognize them against dynamic backgrounds. The neural network is trained and validated, achieving 99.4% validation accuracy in gesture recognition and 25% average accuracy in RoI localization. It is then tested in real time, where its operation is verified through recognition times, its behavior on trained and untrained gestures, and its performance against complex backgrounds.
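A region-based CNN like the one described could be approximated with torchvision's Faster R-CNN, fine-tuned for background plus the two gesture classes. The skeleton below is a generic fine-tuning sketch, not the authors' architecture; the detection data loader, yielding (images, targets) pairs, must be supplied by the reader.

```python
# Sketch: fine-tune a Faster R-CNN detector for open/closed hand gestures.
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

def build_gesture_detector(num_classes=3):
    """Faster R-CNN with the box head resized for background + open + closed."""
    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
    in_feats = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_feats, num_classes)
    return model

def train_one_epoch(model, data_loader, optimizer, device="cpu"):
    """One pass over a detection loader yielding (images, targets) pairs."""
    model.train()
    model.to(device)
    for images, targets in data_loader:
        images = [img.to(device) for img in images]
        targets = [{k: v.to(device) for k, v in t.items()} for t in targets]
        loss_dict = model(images, targets)      # classification + box losses
        loss = sum(loss_dict.values())
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```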

