Prototype Program Hand Gesture Recognize Using the Convex Hull Method and Convexity Defect on Android

2020 ◽  
Vol 5 (2) ◽  
pp. 205
Author(s):  
Muhammad Adi Khairul Anshary ◽  
Eka Wahyu Hidayat ◽  
Tiara Amalia

One of the research topics in Human-Computer Interaction is the development of input devices and how users interact with computers. So far, hand gestures have more often been applied to desktop computers. Meanwhile, technological developments have given rise to various forms of computers, one of which is the smartphone, whose user base grows every year. Hand gestures therefore need to be brought to smartphones to ease interaction between user and device. This study implements hand gestures on smartphones running the Android operating system. The algorithms used are the convex hull and convexity defect, which recognize the pattern of the hand used as system input. To verify that the technique works well, testing was carried out with three scenarios varying lighting, background color, and indoor or outdoor conditions. The results indicate that hand gesture recognition using the convex hull and convexity defect algorithms was successfully implemented on smartphones with the Android operating system. Whether the testing environment is indoors or outdoors greatly affects recognition accuracy: outdoors, a green background at a light intensity of 1725 lux produced 76.7% accuracy, while indoors, a red background at 300 lux gave the highest accuracy of 83.3%.
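The convex hull / convexity defect idea can be sketched without any library: compute the hull of the hand contour, then for each hull edge find the deepest contour point between its endpoints — a deep "valley" between two fingertips is a defect, and raised fingers ≈ defects + 1. This is a minimal illustration (the papers use OpenCV's `cv2.convexHull` and `cv2.convexityDefects`; the toy contour and thresholds below are hypothetical):

```python
import math

def convex_hull(points):
    # Andrew's monotone chain; returns hull vertices in counter-clockwise order.
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def point_seg_dist(p, a, b):
    # Distance from point p to the segment a-b.
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def convexity_defects(contour, min_depth):
    # For each hull edge, the deepest contour point lying between its two
    # endpoints is a convexity defect if its depth exceeds min_depth
    # (the valley between two extended fingers).
    hull = convex_hull(contour)
    idx = sorted(contour.index(p) for p in hull)
    n, defects = len(contour), []
    for k in range(len(idx)):
        i, j = idx[k], idx[(k + 1) % len(idx)]
        a, b = contour[i], contour[j]
        t, best = (i + 1) % n, None
        while t != j:
            d = point_seg_dist(contour[t], a, b)
            if best is None or d > best[1]:
                best = (contour[t], d)
            t = (t + 1) % n
        if best and best[1] >= min_depth:
            defects.append(best)
    return defects
```

On a toy "two-finger" contour with fingertips at (0, 10) and (10, 10) and a valley at (5, 2), the single defect of depth 8 yields a two-finger count.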

2018 ◽  
Vol 8 (2) ◽  
pp. 105
Author(s):  
Artha Gilang Saputra ◽  
Ema Utami ◽  
Hanif Al Fatta

Research in Human-Computer Interaction (HCI) and Computer Vision (CV) increasingly focuses on advanced interfaces for interacting with humans and on system models for various purposes, especially on the problem of input devices for interacting with a computer. Humans are accustomed to communicating with one another by voice, accompanied by body pose and hand gestures. The main purpose of this research is to apply the Convex Hull and Convexity Defects methods to a hand gesture recognition system. The system was designed with the OpenCV library; it receives input from the user's hand gestures through the computer's integrated webcam and generates a language output from the recognized gestures. Testing covered several variables that affect recognition success, such as the distance between hand and webcam, finger angle, lighting conditions, and background conditions. As a result, the user's hand gestures were recognized stably and accurately at a distance of 50–70 cm, finger angles of 25°–70°, lighting of 150–460 lux, and a plain background.
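The finger-angle criterion can be made concrete: at each defect's deepest point, measure the angle between the two enclosing hull points (the fingertips) and keep only defects whose angle falls in the finger-like range the paper reports (25°–70°). A small sketch, with hypothetical function names and the paper's thresholds as defaults:

```python
import math

def defect_angle(start, far, end):
    # Angle (degrees) at the defect's deepest point ("far") between the two
    # hull points ("start", "end") that enclose it, e.g. two fingertips.
    v1 = (start[0] - far[0], start[1] - far[1])
    v2 = (end[0] - far[0], end[1] - far[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    return math.degrees(math.acos(dot / norm))

def is_finger_valley(start, far, end, lo=25.0, hi=70.0):
    # Keep only defects whose angle is in the finger-like range; wider
    # angles typically come from the wrist or the base of the palm.
    return lo <= defect_angle(start, far, end) <= hi
```

For fingertips at (10, 10) and (0, 10) with a valley at (5, 2), the angle is about 64°, so the defect passes the filter.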


2018 ◽  
Vol 14 (7) ◽  
pp. 155014771879075 ◽  
Author(s):  
Kiwon Rhee ◽  
Hyun-Chool Shin

In the recognition of electromyogram-based hand gestures, recognition accuracy may degrade in practical applications for various reasons, such as electrode-positioning bias and differences between subjects. Besides these, the change in electromyogram signals caused by different arm postures, even for identical hand gestures, is also an important issue. We propose an electromyogram-based hand gesture recognition technique that is robust to diverse arm postures. The proposed method uses the signals of the accelerometer and electromyogram simultaneously to recognize hand gestures correctly across various arm postures. For the recognition of hand gestures, the electromyogram signals are statistically modeled with the arm postures taken into account. In the experiments, we compared cases that took the arm postures into account with cases that disregarded them. When varied arm postures were disregarded, the recognition accuracy for hand gestures was 54.1%, whereas the proposed method achieved an average recognition accuracy of 85.7%, an improvement of 31.6 percentage points. In this study, accelerometer and electromyogram signals were used simultaneously, which compensated for the effect of different arm postures on the electromyogram signals and therefore improved the recognition accuracy of hand gestures.
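The two-stage idea — first infer arm posture from the accelerometer, then apply the EMG model trained for that posture — can be sketched with a nearest-mean classifier standing in for the paper's statistical models (all labels, prototypes, and feature values below are hypothetical):

```python
def _sqdist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def nearest(x, prototypes):
    # prototypes: dict mapping a label to its mean feature vector;
    # returns the label whose mean is closest to x.
    return min(prototypes, key=lambda k: _sqdist(x, prototypes[k]))

def classify_gesture(acc, emg, posture_means, emg_means_by_posture):
    # 1) infer arm posture from the accelerometer reading,
    # 2) classify the EMG features with the model trained for that posture.
    posture = nearest(acc, posture_means)
    return posture, nearest(emg, emg_means_by_posture[posture])
```

With posture-specific EMG prototypes, the same EMG reading can map to the correct gesture even though its raw amplitudes shift between arm positions.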


2019 ◽  
Vol 2019 ◽  
pp. 1-7 ◽  
Author(s):  
Peng Liu ◽  
Xiangxiang Li ◽  
Haiting Cui ◽  
Shanshan Li ◽  
Yafei Yuan

Hand gesture recognition is an intuitive and effective way for humans to interact with a computer due to its high processing speed and recognition accuracy. This paper proposes a novel approach to identify hand gestures in complex scenes using the Single Shot MultiBox Detector (SSD) deep learning algorithm built on a 19-layer neural network. A benchmark database of gestures is used, and general hand gestures in complex scenes are chosen as the processing objects. A real-time hand gesture recognition system based on the SSD algorithm is constructed and tested. The experimental results show that the algorithm quickly identifies human hands and accurately distinguishes different types of gestures. Furthermore, the maximum accuracy is 99.2%, which is significant for human-computer interaction applications.
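A detector such as SSD emits many overlapping candidate boxes per hand; the standard post-processing step is greedy non-maximum suppression (NMS). A minimal, library-free sketch of that step (the boxes and scores below are illustrative, not from the paper):

```python
def iou(a, b):
    # Boxes as (x1, y1, x2, y2); intersection-over-union of a and b.
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    iw, ih = max(0.0, ix2 - ix1), max(0.0, iy2 - iy1)
    inter = iw * ih
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def nms(boxes, scores, iou_thr=0.5):
    # Greedy non-maximum suppression: keep the highest-scoring box,
    # drop any remaining box overlapping a kept one above iou_thr, repeat.
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    for i in order:
        if all(iou(boxes[i], boxes[j]) < iou_thr for j in keep):
            keep.append(i)
    return keep
```

Two near-duplicate detections of one hand collapse to the higher-scoring box, while a distant second hand is kept.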


2016 ◽  
Vol 11 (1) ◽  
pp. 30-35
Author(s):  
Manoj Acharya ◽  
Dibakar Raj Pant

This paper proposes a method to recognize static hand gestures in an image or video of a person performing Nepali Sign Language (NSL) and to translate them into words and sentences. Classification is carried out using a neural network, with the contour of the hand as the feature. The work was verified successfully for NSL recognition using signer-dependency analysis. Journal of the Institute of Engineering, 2015, 11(1): 30-35
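To feed a contour into a fixed-size neural-network input layer, a common preprocessing step is to resample the contour to a fixed number of points spaced evenly by arc length. A small sketch of that step (the resampling scheme here is a generic choice, not necessarily the paper's exact feature encoding):

```python
import math

def resample_contour(contour, n):
    # Resample a closed contour to n points spaced evenly by arc length,
    # giving a fixed-size feature vector for a neural-network classifier.
    pts = list(contour) + [contour[0]]          # close the loop
    seg = [math.hypot(b[0] - a[0], b[1] - a[1])
           for a, b in zip(pts, pts[1:])]
    perimeter = sum(seg)
    out, k, acc = [], 0, 0.0
    for i in range(n):
        target = perimeter * i / n
        while acc + seg[k] < target:
            acc += seg[k]
            k += 1
        t = (target - acc) / seg[k]
        a, b = pts[k], pts[k + 1]
        out.append((a[0] + t * (b[0] - a[0]), a[1] + t * (b[1] - a[1])))
    return out
```

A 10×10 square contour resampled to 8 points yields one point every 5 units of perimeter, independent of how many vertices the original contour had.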


2020 ◽  
Vol 7 (2) ◽  
pp. 164
Author(s):  
Aditiya Anwar ◽  
Achmad Basuki ◽  
Riyanto Sigit

Hand gestures are a means of communication for deaf people and others, and each hand gesture has a different meaning. To communicate better, an automatic translator is needed that can recognize hand movements as words or sentences when communicating with deaf people. This paper proposes a system to recognize hand gestures based on the Indonesian Sign Language standard. The system uses the Myo Armband as the hand gesture sensor; the Myo Armband provides 21 sensor values describing the hand gesture. The recognition process uses a Support Vector Machine (SVM) to classify the hand gesture against a dataset of the Indonesian Sign Language standard. The SVM achieves an accuracy of 86.59% in recognizing hand gestures as sign language.
Keywords: Hand Gesture Recognition, Feature Extraction, Indonesian Sign Language, Myo Armband, Moment Invariant
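The classification step — a vector of armband sensor values mapped to a gesture label — can be illustrated with a minimal library-free linear classifier. A perceptron stands in for the paper's SVM here (in practice one would use an SVM implementation such as scikit-learn's `SVC`; the 2-D toy features below are hypothetical):

```python
def train_linear(X, y, epochs=100):
    # Perceptron training: a minimal stand-in for the paper's SVM.
    # X: feature vectors (e.g. values from the armband's sensors), y: +1/-1.
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(X, y):
            if yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b) <= 0:
                w = [wj + yi * xj for wj, xj in zip(w, xi)]
                b += yi
                errors += 1
        if errors == 0:      # converged on linearly separable data
            break
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1
```

Multi-class sign recognition would use one such classifier per gesture (one-vs-rest), which is also how linear SVMs are commonly extended to many classes.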


2019 ◽  
Vol 8 (4) ◽  
pp. 1027-1029

Gesture recognition technologies are relatively new in today's world. Hand gesture recognition is most commonly done using glove-based techniques. In this project, a vehicle is controlled by hand gestures: the robot car, formerly driven with buttons, is now controlled by hand signs. The operator only needs to wear a small transmitter on the hand, which contains an accelerometer; the car receives its signals through an RF receiver. The idea builds on earlier projects, with some extra features added: different sensors for a better implementation, a reworked transmission stage to fix the transmission problems seen in previous versions, and an extra antenna to extend the signal range. The microcontroller moves the robot in the same direction as the hand.
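The core control logic — mapping accelerometer tilt on the hand to a drive command for the car — fits in a few lines. A hedged sketch (axis conventions, threshold, and command names are assumptions, not taken from the project):

```python
def tilt_to_command(ax, ay, thr=0.3):
    # Map accelerometer tilt (in g) from the hand-worn transmitter to one
    # of the drive commands sent over the RF link to the car's receiver.
    if abs(ax) < thr and abs(ay) < thr:
        return "STOP"                       # hand held level
    if abs(ax) >= abs(ay):                  # dominant axis wins
        return "FORWARD" if ax > 0 else "BACKWARD"
    return "RIGHT" if ay > 0 else "LEFT"
```

On the receiving side, the microcontroller would decode the same command byte and drive the motors accordingly.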


Author(s):  
Priyanka R. ◽  
Prahanya Sriram ◽  
Jayasree L. N. ◽  
Angelin Gladston

Gesture recognition is the most intuitive form of human-computer interface. Hand gestures provide a natural way for humans to interact with computers across a variety of applications. However, factors such as the complexity of hand gesture structures, differences in hand size, hand posture, and environmental illumination can influence the performance of hand gesture recognition algorithms. Considering these factors, this paper presents a real-time system for hand gesture recognition based on the detection of meaningful shape-based features: orientation, center of mass, the status of the fingers and thumb (raised or folded), and their respective locations in the image. The internet is growing at a very fast pace, and so is the use of web browsers; everyone has at least two or three most frequently visited websites. Thus, this paper experiments with the effectiveness of gesture recognition and its ability to control the browser through recognized hand gestures, and analyzes the results.


Sensors ◽  
2019 ◽  
Vol 19 (16) ◽  
pp. 3548 ◽  
Author(s):  
Piotr Kaczmarek ◽  
Tomasz Mańkowski ◽  
Jakub Tomczyński

In this paper, we present the putEMG dataset, intended for the evaluation of hand gesture recognition methods based on the sEMG signal. The dataset was acquired from 44 able-bodied subjects and includes 8 gestures (3 full-hand gestures, 4 pinches, and idle). It consists of uninterrupted recordings of 24 sEMG channels from the subject's forearm, an RGB video stream, and depth-camera images used for hand motion tracking. Moreover, exemplary processing scripts are also published. The putEMG dataset is available under a Creative Commons Attribution-NonCommercial 4.0 International licence (CC BY-NC 4.0). The dataset was validated with regard to sEMG amplitudes and gesture recognition performance. Classification was performed using state-of-the-art classifiers and feature sets: an accuracy of 90% was achieved for an SVM classifier using the RMS feature and for an LDA classifier using Hudgins' and Du's feature sets. Analysis of per-gesture performance showed that the LDA/Du combination has significantly higher accuracy for full-hand gestures, while SVM/RMS performs better for pinch gestures. The presented dataset can be used as a benchmark for various classification methods, for the evaluation of electrode-localisation concepts, or for the development of classification methods invariant to user-specific features or electrode displacement.
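The RMS feature used with the SVM classifier is simply the root-mean-square amplitude of each sEMG channel over an analysis window, giving one value per channel (24 for putEMG). A minimal sketch (window length and the toy samples are illustrative):

```python
import math

def rms(window):
    # Root-mean-square amplitude of one analysis window of sEMG samples.
    return math.sqrt(sum(x * x for x in window) / len(window))

def rms_features(channels):
    # One RMS value per channel -> a fixed-length feature vector
    # (24-dimensional for the 24 sEMG channels of putEMG).
    return [rms(ch) for ch in channels]
```

These per-window vectors are what the SVM (or, with Hudgins'/Du's richer feature sets, the LDA) is trained on.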


Sensors ◽  
2020 ◽  
Vol 20 (16) ◽  
pp. 4566
Author(s):  
Chanhwi Lee ◽  
Jaehan Kim ◽  
Seoungbae Cho ◽  
Jinwoong Kim ◽  
Jisang Yoo ◽  
...  

The use of human gestures to interact with devices such as computers or smartphones has presented several problems. This form of interaction relies on gesture-interaction technology such as the Leap Motion controller from Leap Motion, Inc., which enables humans to use hand gestures to interact with a computer. The technology has excellent hand-detection performance and even allows simple games to be played using gestures. Another example is the contactless use of a smartphone to take a photograph by simply folding and opening the palm. Research on interaction with other devices via hand gestures is in progress, and studies on creating a hologram display from objects that actually exist are also underway. We propose a hand gesture recognition system that can control a tabletop holographic display based on an actual object. A depth image obtained with the latest Time-of-Flight depth camera, the Azure Kinect, is used to extract information about the hand and hand joints with the deep-learning model CrossInfoNet. Using this information, we developed a real-time system that defines and recognizes gestures for basic left, right, up, and down rotation, zoom in, zoom out, and continuous rotation to the left and right.
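Once hand-joint positions are tracked frame to frame, a directional gesture such as "rotate left/right/up/down" can be decided from the net displacement of a tracked joint. A hedged sketch (the joint choice, coordinate convention with y growing upward, and threshold are assumptions, not the paper's exact rules):

```python
def swipe_gesture(path, min_move=0.05):
    # path: successive (x, y) positions of a tracked hand joint
    # (e.g. a fingertip from the depth-based hand tracker).
    dx = path[-1][0] - path[0][0]
    dy = path[-1][1] - path[0][1]
    if max(abs(dx), abs(dy)) < min_move:
        return "none"                      # too little motion: ignore
    if abs(dx) >= abs(dy):                 # dominant axis decides
        return "right" if dx > 0 else "left"
    return "up" if dy > 0 else "down"
```

Each returned label would then be mapped to the corresponding rotation or zoom command on the tabletop display.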

