sEMG for Human-Computer Interface Using ANN to Navigate Wheelchair

Author(s):  
V. Rajesh ◽  
P. Rajesh Kumar

This paper presents an approach to identifying hand gestures from muscle activity separated out of the electromyogram (EMG) and classified with back-propagation analysis, with the goal of using hand gestures for human-computer interaction. While a number of previously reported works have used EMG to identify movement, those systems are limited to gross actions in which a single prime-mover muscle is involved. This paper overcomes that difficulty by using independent component analysis (ICA) to separate the activity of different muscles, which is then classified with back-propagation neural networks. Experimental results show that the system identified hand gestures with 95% accuracy using this technique. The advantages of the system are that users are easy to train and that it can readily be implemented in real time.





Author(s):  
Koichi Ishibuchi ◽  
Keisuke Iwasaki ◽  
Haruo Takemura ◽  
Fumio Kishino


2017 ◽  
Vol 10 (27) ◽  
pp. 1329-1342 ◽  
Author(s):  
Javier O. Pinzon Arenas ◽  
Robinson Jimenez Moreno ◽  
Paula C. Useche Murillo

This paper presents the implementation of a region-based convolutional neural network (R-CNN) for the recognition and localization of hand gestures, in this case two gesture types, open and closed hand, with the aim of recognizing such gestures against dynamic backgrounds. The network is trained and validated, achieving 99.4% validation accuracy in gesture recognition and 25% average accuracy in RoI localization. It is then tested in real time, where its operation is verified through recognition times, its behavior on trained and untrained gestures, and complex backgrounds.
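The propose-then-classify loop underlying region-based detection can be illustrated with a deliberately tiny toy: slide candidate windows over the image, score each one with a classifier, and report the highest-scoring window as the RoI. Everything here is an assumption for illustration; a trivial mean-intensity score replaces the trained CNN of an actual R-CNN, and the image is a synthetic blob rather than a hand.

```python
# Toy, numpy-only stand-in for the region-based idea: propose candidate
# windows, "classify" each (here: mean intensity as a stand-in score),
# and keep the highest-scoring window as the region of interest.
import numpy as np

image = np.zeros((64, 64))
image[20:40, 28:48] = 1.0   # synthetic bright "hand" blob


def propose_and_classify(img, win=20, stride=4):
    """Scan win x win windows with the given stride; return the best box."""
    best_score, best_box = -1.0, None
    h, w = img.shape
    for y in range(0, h - win + 1, stride):
        for x in range(0, w - win + 1, stride):
            score = img[y:y + win, x:x + win].mean()  # stand-in classifier
            if score > best_score:
                best_score, best_box = score, (y, x, win, win)
    return best_box, best_score


box, score = propose_and_classify(image)
```

A real R-CNN replaces both pieces: learned region proposals instead of a dense sliding grid, and a convolutional classifier instead of the mean-intensity score, which is what makes recognition robust to the dynamic backgrounds the paper targets.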



2019 ◽  
Vol 6 (3) ◽  
pp. 1 ◽  
Author(s):  
K. Maheswari ◽  
S. Ramkumar ◽  
K. Sathesh Kumar ◽  
P. Packia Amutha Priya ◽  
G. Emayavaramban ◽  
...  


2015 ◽  
Vol 2015 ◽  
pp. 1-8 ◽  
Author(s):  
Hazem Khaled ◽  
Samir G. Sayed ◽  
El Sayed M. Saad ◽  
Hossam Ali

Computers and computerized machines have penetrated virtually every aspect of our lives, which raises the importance of the Human-Computer Interface (HCI). Common HCI techniques still rely on simple devices such as keyboards, mice, and joysticks, which cannot keep pace with the latest technology. Hand gestures have become one of the most attractive alternatives to these traditional HCI techniques. This paper proposes a new hand-gesture detection system for human-computer interaction using real-time video streaming. It removes the background with an average-background algorithm and matches hand templates with the $1 algorithm. Each recognized hand gesture is then translated into commands that control robot movements. Simulation results show that the proposed algorithm achieves a high detection rate and a short recognition time under changes in lighting, scale, rotation, and background.
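The average-background step described above can be sketched in a few lines: the background model is a running mean of past frames, and pixels that deviate from it beyond a threshold are marked as foreground (the hand). This is a minimal numpy-only sketch on synthetic frames; the smoothing factor and threshold are assumed values, and the subsequent $1 template-matching stage is not shown.

```python
# Sketch of the average-background algorithm: keep a running mean of the
# scene, then flag pixels that deviate from it as foreground.
import numpy as np


def update_background(bg, frame, alpha=0.05):
    """Exponential running-average background model."""
    return alpha * frame + (1.0 - alpha) * bg


def foreground_mask(bg, frame, thresh=0.2):
    """Pixels differing from the background beyond thresh are foreground."""
    return np.abs(frame - bg) > thresh


# Synthetic static scene; learn the background over 50 identical frames.
scene = 0.3 * np.ones((32, 32))
bg = scene.copy()
for _ in range(50):
    bg = update_background(bg, scene)

# A bright object (the "hand") enters the next frame.
frame = scene.copy()
frame[10:20, 10:20] = 0.9
mask = foreground_mask(bg, frame)
```

In the paper's pipeline, the extracted foreground region would then be handed to the $1 algorithm for template matching against known gestures before being mapped to a robot command.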


