Automatic Classification of Hand Gesture Contours Based on Human Computer Interaction

IJARCCE ◽  
2017 ◽  
Vol 6 (6) ◽  
pp. 541-543
Author(s):  
Ms. Nayan S. Mane ◽  
Prof. Suresh S. Rode

2015 ◽  
Vol 14 (9) ◽  
pp. 6102-6106
Author(s):  
Sangeeta Goyal ◽  
Dr. Bhupesh Kumar

There has been growing interest in the development of new techniques and methods for Human-Computer Interaction (HCI), and gesture recognition is one of the important areas of this technology. Gesture recognition means interfacing with a computer through the motion of the human body, typically hand movements. A handicapped person may not be able to move quickly enough to switch off a Miniature Circuit Breaker (MCB), for instance when there is a fire in the house, but the same task can be done easily with hand gesture recognition. In our proposed system, an electrical MCB is controlled by a hand gesture recognizer: to switch the MCB on or off, the user provides a hand gesture as input to the system.
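As a rough illustration of the general idea only (not the authors' implementation), the sketch below uses OpenCV contour features to tell an open hand from a fist and toggles a switch state accordingly; the camera index, the solidity threshold, and the set_mcb() actuator hook are all hypothetical.

```python
# A minimal sketch of contour-based gesture switching, assuming OpenCV.
# set_mcb() is a hypothetical hook; a real system would drive a relay
# or the breaker's remote actuator instead of printing.
import cv2

def set_mcb(on: bool) -> None:
    print("MCB", "ON" if on else "OFF")  # placeholder actuator

cap = cv2.VideoCapture(0)  # assumed default camera
mcb_on = False
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blur = cv2.GaussianBlur(gray, (7, 7), 0)
    _, mask = cv2.threshold(blur, 0, 255,
                            cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        hand = max(contours, key=cv2.contourArea)  # largest blob
        hull = cv2.convexHull(hand)
        hull_area = max(cv2.contourArea(hull), 1.0)
        solidity = cv2.contourArea(hand) / hull_area
        # Crude rule: a fist is nearly convex (high solidity), while an
        # open hand with spread fingers leaves large hull gaps.
        gesture_on = solidity < 0.8  # assumed threshold
        if gesture_on != mcb_on:
            mcb_on = gesture_on
            set_mcb(mcb_on)
    cv2.imshow("hand mask", mask)
    if cv2.waitKey(30) & 0xFF == 27:  # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```

A deployed version would also need debouncing (e.g., requiring the gesture to persist for several frames) before actually tripping a breaker.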


Author(s):  
Anna Pereira ◽  
Juan P. Wachs ◽  
Kunwoo Park ◽  
David Rempel

Photonics ◽  
2019 ◽  
Vol 6 (3) ◽  
pp. 90
Author(s):  
Bosworth ◽  
Russell ◽  
Jacob

Over the past decade, the Human–Computer Interaction (HCI) Lab at Tufts University has been developing real-time, implicit Brain–Computer Interfaces (BCIs) using functional near-infrared spectroscopy (fNIRS). This paper reviews the work of the lab; we explore how we have used fNIRS to develop BCIs that are based on a variety of human states, including cognitive workload, multitasking, musical learning applications, and preference detection. Our work indicates that fNIRS is a robust tool for the real-time classification of brain states, which can provide programmers with useful information to develop interfaces that are more intuitive and beneficial for the user than is currently possible with today’s human-input devices (e.g., mouse and keyboard).
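As a toy illustration of this kind of pipeline (not the lab's actual system), the sketch below classifies binary "workload" states from windowed multi-channel signals using per-channel mean and slope features, a common choice for slow hemodynamic signals; the synthetic data, window sizes, and linear SVM are all assumptions.

```python
# A minimal sketch of windowed fNIRS-style state classification.
# The data is synthetic; a real system would stream hemoglobin
# concentration changes from fNIRS hardware into the same shape.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_windows, n_channels, win_len = 200, 8, 50  # assumed sizes

# Synthetic stand-in: class 1 ("high workload") windows get a small
# mean shift, mimicking an elevated hemodynamic response.
X_raw = rng.normal(size=(n_windows, n_channels, win_len))
y = rng.integers(0, 2, size=n_windows)
X_raw[y == 1] += 0.5

# Per-channel features: window mean and least-squares slope.
t = np.arange(win_len)
tc = t - t.mean()
means = X_raw.mean(axis=2)
slopes = (X_raw * tc).sum(axis=2) / (tc ** 2).sum()
X = np.hstack([means, slopes])  # (n_windows, 2 * n_channels)

clf = SVC(kernel="linear")
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```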


Sensors ◽  
2020 ◽  
Vol 20 (10) ◽  
pp. 2972
Author(s):  
Qinghua Gao ◽  
Shuo Jiang ◽  
Peter B. Shull

Hand gesture classification and finger angle estimation are both critical for intuitive human–computer interaction. However, most approaches study them in isolation. We thus propose a dual-output deep learning model to enable simultaneous hand gesture classification and finger angle estimation. Data augmentation and deep learning were used to detect spatial-temporal features via a wristband with ten modified barometric sensors. Ten subjects performed experimental testing by flexing/extending each finger at the metacarpophalangeal joint while the proposed model was used to classify each hand gesture and estimate continuous finger angles simultaneously. A data glove was worn to record ground-truth finger angles. Overall hand gesture classification accuracy was 97.5% and finger angle estimation R² was 0.922, both of which were significantly higher than existing shallow learning approaches used in isolation. The proposed method could be used in applications related to human–computer interaction and in control environments with both discrete and continuous variables.
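A minimal sketch of what a dual-output model of this kind might look like, assuming a shared 1-D convolutional trunk over the ten-sensor wristband signal feeding a classification head and a regression head; the layer sizes, window length, gesture count, and unweighted loss sum are illustrative guesses, not the authors' architecture.

```python
# A minimal dual-head network sketch in PyTorch (assumed architecture).
import torch
import torch.nn as nn
import torch.nn.functional as F

class DualHeadNet(nn.Module):
    def __init__(self, n_sensors=10, n_gestures=10, n_fingers=5):
        super().__init__()
        self.encoder = nn.Sequential(  # shared spatial-temporal trunk
            nn.Conv1d(n_sensors, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
            nn.Flatten(),
        )
        self.cls_head = nn.Linear(64, n_gestures)  # gesture logits
        self.reg_head = nn.Linear(64, n_fingers)   # continuous angles

    def forward(self, x):  # x: (batch, n_sensors, window_length)
        z = self.encoder(x)
        return self.cls_head(z), self.reg_head(z)

model = DualHeadNet()
x = torch.randn(4, 10, 100)  # dummy batch, assumed 100-sample windows
logits, angles = model(x)

# Joint objective: cross-entropy for the discrete gesture, MSE for the
# finger angles; the shared trunk learns features useful for both.
y_cls = torch.randint(0, 10, (4,))
y_ang = torch.randn(4, 5)
loss = F.cross_entropy(logits, y_cls) + F.mse_loss(angles, y_ang)
loss.backward()
```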

