Continuous Hand Gesture Spotting and Classification Using 3D Finger Joints Information

2021 ◽  
Vol 14 (1) ◽  
pp. 70-91
Author(s):  
Nguyen Ngoc Hoang ◽  
Guee-Sang Lee ◽  
Soo-Hyung Kim ◽  
Hyung-Jeong Yang

Author(s):  
Ananya Choudhury ◽  
Kandarpa Kumar Sarma

Automatic gesture spotting and segmentation, i.e., determining the meaningful gesture patterns in continuous gesture-based character sequences, is a challenging task. This paper proposes a vision-based automatic method that simultaneously handles hand gesture spotting and the segmentation of gestural characters embedded in a continuous character stream, employing a hybrid geometrical and statistical feature set. Such a framework is an important constituent of gesture-based character recognition (GBCR) systems, which have gained tremendous demand lately as assistive aids for people with physical impairments. The performance of the proposed system is validated on the vowels and numerals of the Assamese vocabulary. Another attribute of the proposed system is an effective hand segmentation module, which enables it to handle complex background settings.
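The abstract does not spell out the exact feature definitions, so the following is only a minimal sketch of what a hybrid geometrical-and-statistical feature set over a 2D hand trajectory might look like. The specific choices here (per-segment orientation angles as the geometrical part, coordinate means and variances as the statistical part) are illustrative assumptions, not the paper's actual features.

```python
import math

def hybrid_features(trajectory):
    """Toy hybrid feature set for a 2D gesture trajectory.

    Geometrical part: orientation angle of each consecutive segment.
    Statistical part: mean and (population) variance of x and y.
    Illustrative only; not the feature set used in the cited paper.
    """
    xs = [p[0] for p in trajectory]
    ys = [p[1] for p in trajectory]

    # Geometrical features: orientation (radians) of each segment
    # between consecutive trajectory points.
    angles = [
        math.atan2(y2 - y1, x2 - x1)
        for (x1, y1), (x2, y2) in zip(trajectory, trajectory[1:])
    ]

    def mean(v):
        return sum(v) / len(v)

    def var(v):
        m = mean(v)
        return sum((x - m) ** 2 for x in v) / len(v)

    return {
        "angles": angles,
        "mean": (mean(xs), mean(ys)),
        "var": (var(xs), var(ys)),
    }
```

A feature dictionary of this shape could then feed any downstream spotter or classifier.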


2015 ◽  
Vol 08 (07) ◽  
pp. 313-323 ◽  
Author(s):  
Fayed F. M. Ghaleb ◽  
Ebrahim A. Youness ◽  
Mahmoud Elmezain ◽  
Fatma Sh. Dewdar

2016 ◽  
Vol 2016 ◽  
pp. 1-8 ◽  
Author(s):  
Clementine Nyirarugira ◽  
Hyo-rim Choi ◽  
TaeYong Kim

We present a gesture recognition method derived from particle swarm movement for free-air hand gesture recognition. Online gesture recognition remains a difficult problem due to the uncertainty of vision-based gesture boundary detection. We suggest an automated process for segmenting meaningful gesture trajectories based on particle swarm movement. A subgesture detection and reasoning method is incorporated in the proposed recognizer to avoid premature gesture spotting. Evaluation of the proposed method shows promising recognition results: 97.6% on pre-isolated gestures, 94.9% on stream gestures with assistive boundary indicators, and 94.2% for blind gesture spotting on a digit gesture vocabulary. The proposed recognizer requires few computational resources, making it a good candidate for real-time applications.
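The subgesture problem mentioned above is that one gesture may be a prefix of a longer one, so committing to a match as soon as it completes can be premature. A minimal pure-Python sketch of that reasoning step, assuming gestures are encoded as strings of direction symbols (an assumption for illustration; the paper's particle-swarm trajectory representation is not reproduced here):

```python
def is_subgesture(candidate, vocabulary):
    """True if `candidate` is a proper prefix of another gesture in
    `vocabulary`, i.e. spotting it now could be premature."""
    return any(g != candidate and g.startswith(candidate) for g in vocabulary)

def spot(stream, vocabulary):
    """Greedy spotter over a symbol stream: emit a vocabulary gesture
    only once it can no longer grow into a longer vocabulary gesture."""
    spotted, buf = [], ""
    for symbol in stream:
        buf += symbol
        if buf in vocabulary and not is_subgesture(buf, vocabulary):
            spotted.append(buf)  # safe to commit: no longer gesture starts this way
            buf = ""
    return spotted
```

For example, with the vocabulary {"UR", "URD"}, the spotter never fires on "UR" alone because it might still extend into "URD"; a real recognizer would add a timeout or motion-pause cue to resolve that ambiguity.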


2021 ◽  
Vol 11 (10) ◽  
pp. 4689
Author(s):  
Ngoc-Hoang Nguyen ◽  
Tran-Dac-Thinh Phan ◽  
Soo-Hyung Kim ◽  
Hyung-Jeong Yang ◽  
Guee-Sang Lee

This paper presents a novel approach to continuous dynamic hand gesture recognition. Our approach contains two main modules: gesture spotting and gesture classification. Firstly, the gesture spotting module pre-segments the video sequence with continuous gestures into isolated gestures. Secondly, the gesture classification module identifies the segmented gestures. In the gesture spotting module, the motion of the hand palm and fingers is fed into a Bidirectional Long Short-Term Memory (Bi-LSTM) network for gesture spotting. In the gesture classification module, three residual 3D Convolutional Neural Networks based on ResNet architectures (3D_ResNet) and one Long Short-Term Memory (LSTM) network are combined to efficiently utilize multiple data channels such as RGB, Optical Flow, Depth, and the 3D positions of key joints. The promising performance of our approach is demonstrated through experiments on three public datasets: the Chalearn LAP ConGD dataset, 20BN-Jester, and the NVIDIA Dynamic Hand Gesture dataset. Our approach outperforms the state-of-the-art methods on the Chalearn LAP ConGD dataset.
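The abstract says the per-channel networks are "combined" but does not give the fusion rule, so the sketch below assumes a simple score-level (late) fusion: each channel network produces a class-probability vector, and the vectors are averaged before taking the argmax. This is a common baseline, not necessarily the paper's actual scheme.

```python
def fuse_scores(channel_scores, weights=None):
    """Late fusion: weighted average of per-channel class-probability
    vectors (e.g. from RGB, optical-flow, depth, and joint networks).
    `channel_scores` maps channel name -> list of class probabilities.
    Uniform weights are assumed when `weights` is None."""
    channels = list(channel_scores)
    if weights is None:
        weights = {c: 1.0 / len(channels) for c in channels}
    n_classes = len(next(iter(channel_scores.values())))
    fused = [0.0] * n_classes
    for c in channels:
        for k, p in enumerate(channel_scores[c]):
            fused[k] += weights[c] * p
    return fused

def predict(channel_scores):
    """Predicted class index = argmax of the fused score vector."""
    fused = fuse_scores(channel_scores)
    return max(range(len(fused)), key=fused.__getitem__)
```

With uniform weights over properly normalized inputs, the fused vector is itself a probability distribution, so the fusion step composes cleanly with any downstream thresholding.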


1999 ◽  
Vol 61 (2) ◽  
pp. 134-137
Author(s):  
Kiho KIYOI ◽  
Tamano MATSUI ◽  
Kiyofumi EGAWA ◽  
Tomomichi ONO
