Real-time Hand Gesture Extraction Using Python Programming Language Facilities

2021 ◽  
Vol 39 (6) ◽  
pp. 1031-1040
Author(s):  
Azher A. Fahad ◽  
Hassan J. Hassan ◽  
Salma H. Abdullah

Hand gesture recognition is a form of communication in which bodily behavior is used to transmit messages. This paper aims to detect hand gestures with a mobile device camera and to create a customized dataset for training a deep-learning model to recognize hand gestures. A real-time approach is used for both objectives: the first step is hand-area detection; the second is storing the hand area in dataset form for later model training. A framework for human interaction was put in place by analyzing the pictures recorded by the camera. The RGB color-space image is converted to greyscale, and blurring is applied to remove object noise effectively. A thresholding method is used to highlight the edges and curves of the hand, and background subtraction against a complex background is applied to detect moving objects from a static camera. The results were reliable and favorable, helping deaf and mute people interact with their environment through sign language by extracting hand movements. Python is used as the programming language for detecting hand gestures. The work provides an efficient hand gesture detection process that addresses the problem of extracting frames from real-time video.
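
A minimal sketch in Python/OpenCV of the preprocessing pipeline described above (greyscale conversion, blurring, thresholding, background subtraction); the kernel size, threshold value, and MOG2 subtractor are illustrative assumptions rather than the authors' exact settings:

import cv2

def extract_hand_region(frame, back_subtractor):
    """Convert a BGR camera frame to a binary hand mask."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)            # RGB/BGR -> greyscale
    blurred = cv2.GaussianBlur(gray, (7, 7), 0)               # suppress object/sensor noise
    fg_mask = back_subtractor.apply(blurred)                  # subtract the static background
    _, thresh = cv2.threshold(fg_mask, 127, 255, cv2.THRESH_BINARY)  # highlight edges and curves
    return thresh

cap = cv2.VideoCapture(0)                                     # device camera stream
subtractor = cv2.createBackgroundSubtractorMOG2()
while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = extract_hand_region(frame, subtractor)
    cv2.imshow("hand mask", mask)                             # frames could also be saved as a dataset
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()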

2013 ◽  
Vol 756-759 ◽  
pp. 4138-4142 ◽  
Author(s):  
Lin Song ◽  
Rui Min Hu ◽  
Hua Zhang ◽  
Yu Lian Xiao ◽  
Li Yu Gong

In this paper, we describe a real-time algorithm to detect 3D hand gestures from depth images. First, moving regions are detected by frame differencing; then, the regions are refined by removing small regions and boundary regions; finally, the foremost region is selected and its trajectory is classified using an automatic state machine. Experiments on sequences captured with Microsoft Kinect for Xbox show the effectiveness and efficiency of our system.
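
An illustrative Python sketch of the detection stage on depth frames (frame differencing, small/boundary region removal, foremost-region selection); the thresholds, the connected-components filtering, and the use of median depth are assumptions, not the authors' exact method, and the trajectory state machine is omitted:

import cv2
import numpy as np

def foremost_moving_region(prev_depth, curr_depth, min_area=500):
    """Return a boolean mask of the nearest moving region, or None."""
    diff = cv2.absdiff(curr_depth, prev_depth)                 # frame difference
    motion = (diff > 30).astype(np.uint8) * 255                # motion mask (assumed threshold)
    n, labels, stats, _ = cv2.connectedComponentsWithStats(motion)
    h, w = motion.shape
    best_label, best_depth = None, np.inf
    for i in range(1, n):
        x, y, bw, bh, area = stats[i]
        if area < min_area:                                    # drop small regions
            continue
        if x == 0 or y == 0 or x + bw == w or y + bh == h:     # drop boundary regions
            continue
        region_depth = np.median(curr_depth[labels == i])      # typical distance of the region
        if region_depth < best_depth:                          # keep the foremost (nearest) region
            best_label, best_depth = i, region_depth
    return None if best_label is None else (labels == best_label)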


2017 ◽  
Vol 10 (27) ◽  
pp. 1329-1342 ◽  
Author(s):  
Javier O. Pinzon Arenas ◽  
Robinson Jimenez Moreno ◽  
Paula C. Useche Murillo

This paper presents the implementation of a Region-based Convolutional Neural Network focused on the recognition and localization of hand gestures, in this case two types of gesture, open and closed hand, in order to recognize such gestures against dynamic backgrounds. The neural network is trained and validated, achieving 99.4% validation accuracy in gesture recognition and 25% average accuracy in RoI localization. It is then tested in real time, where its operation is verified through recognition times, its behavior on trained and untrained gestures, and complex backgrounds.
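
A hedged Python sketch of a region-based CNN configured for two hand-gesture classes; it uses torchvision's Faster R-CNN as a stand-in rather than the authors' original network, and the class count includes the background class:

import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

def build_gesture_detector(num_classes=3):  # background + open hand + closed hand
    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)  # replace the head
    return model

model = build_gesture_detector()
model.eval()
with torch.no_grad():
    dummy = [torch.rand(3, 480, 640)]       # one RGB frame, values in [0, 1]
    detections = model(dummy)               # per-image dicts of boxes, labels, scores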


Sensors ◽  
2020 ◽  
Vol 20 (16) ◽  
pp. 4566
Author(s):  
Chanhwi Lee ◽  
Jaehan Kim ◽  
Seoungbae Cho ◽  
Jinwoong Kim ◽  
Jisang Yoo ◽  
...  

The use of human gestures to interact with devices such as computers or smartphones has presented several problems. This form of interaction relies on gesture-interaction technology such as Leap Motion from Leap Motion, Inc., which enables humans to use hand gestures to interact with a computer. The technology has excellent hand-detection performance and even allows simple games to be played using gestures. Another example is the contactless use of a smartphone to take a photograph by simply folding and opening the palm. Research on interaction with other devices via hand gestures is in progress, and studies on creating a hologram display from objects that actually exist are also underway. We propose a hand gesture recognition system that can control a tabletop holographic display based on an actual object. The depth image obtained with the Azure Kinect, a recent Time-of-Flight depth camera, is used to extract information about the hand and hand joints with the deep-learning model CrossInfoNet. Using this information, we developed a real-time system that defines and recognizes gestures indicating basic left, right, up, and down rotation, zoom in, zoom out, and continuous rotation to the left and right.
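
An illustrative Python sketch (not the authors' code) of mapping hand-centre motion, e.g. derived from the joints estimated by CrossInfoNet on Azure Kinect depth frames, to the basic left/right/up/down rotation gestures; the displacement threshold is an assumption:

import numpy as np

def classify_swipe(prev_center, curr_center, min_move=40.0):
    """Return 'left', 'right', 'up', 'down', or None from hand-centre motion (pixels)."""
    dx, dy = np.subtract(curr_center, prev_center)
    if max(abs(dx), abs(dy)) < min_move:          # ignore small jitter between frames
        return None
    if abs(dx) >= abs(dy):                        # dominant horizontal motion
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"             # image y grows downwards

# Example: the hand centre moved 60 px to the right between frames.
print(classify_swipe((300, 240), (360, 245)))     # -> "right"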


Author(s):  
Pranjali Manmode ◽  
Rupali Saha ◽  
Manisha N. Amnerkar

With the rapid development of computer vision, the demand for interaction between humans and machines is becoming more and more extensive. Since hand gestures can express rich information, hand gesture recognition is widely used in robot control, intelligent furniture, and other applications. The paper achieves segmentation of hand gestures by establishing a skin-color model and a Haar-based AdaBoost classifier, exploiting the particularity of skin color for hand gestures and their deformability, with one frame of video cut out for analysis. In this way, the human hand is segmented from a complicated background. The CamShift algorithm also provides real-time hand gesture tracking. Then, the hand-gesture area detected in real time is classified by a convolutional neural network to recognize 10 common digits. Experiments show 98.3% accuracy.
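
A minimal Python/OpenCV sketch of the skin-colour segmentation and CamShift tracking steps; the fixed YCrCb skin range stands in, as an assumption, for the paper's learned skin-colour model and Haar/AdaBoost detector, and the CNN digit classifier is omitted:

import cv2
import numpy as np

SKIN_LO = np.array([0, 133, 77], dtype=np.uint8)      # assumed YCrCb skin-colour bounds
SKIN_HI = np.array([255, 173, 127], dtype=np.uint8)

def skin_mask(frame_bgr):
    """Binary skin mask used as a probability image for CamShift."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    return cv2.inRange(ycrcb, SKIN_LO, SKIN_HI)

def track_hand(frame_bgr, window):
    """One CamShift update; `window` is the current (x, y, w, h) tracking box."""
    mask = skin_mask(frame_bgr)
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
    rotated_box, window = cv2.CamShift(mask, window, criteria)
    return rotated_box, window                         # rotated box can be cropped for the CNN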

