Hand Gesture Control Using Artificial Intelligence

At present, most robots are controlled by a remote, a mobile phone, or a direct wired connection. Considering the cost and hardware required, these approaches add complexity, especially for low-end applications. The robot designed here is different: it requires no remote control or communication module. It is a self-activated robot that drives itself according to the position of the user standing in front of it, doing what the user intends and replicating the user's movements. The hardware required is minimal, so the system is low-cost and small in size. Recently, there has been a surge of interest in recognizing human hand gestures for robot control. Hand gesture recognition has several uses, for example computer games, gaming machines, mouse replacement, and gesture-controlled machinery (e.g., cranes, surgical machines, robotics, and artificial intelligence).

Hand gesture recognition is a natural means of human-computer interaction and an area of active research in computer vision and machine learning. It is an area with a wide range of possible applications, giving users a simpler and more natural way to communicate with robots and system interfaces without the need for extra devices. Thus, the primary goal of gesture recognition research applied to Human-Computer Interaction (HCI) is to create systems that can identify specific human gestures and use them to convey information or control devices. To this end, vision-based hand gesture interfaces require fast and extremely robust hand detection and real-time gesture recognition. This paper presents a solution, generic enough thanks to deep learning to be applied in a wide range of human-computer interfaces for real-time gesture recognition. Experiments showed that the system was able to achieve an accuracy of 99.4% for hand pose recognition and an average accuracy of 93.72% for dynamic gesture recognition.
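As an illustration of the deep-learning approach described above, the sketch below shows a small convolutional classifier for hand-pose recognition. It assumes cropped, resized grayscale hand images; the layer sizes, class count, and input resolution are illustrative assumptions, not the architecture of the cited work.

```python
# Minimal sketch of a CNN hand-pose classifier (assumed architecture, not the paper's).
import torch
import torch.nn as nn

class HandPoseCNN(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 8 * 8, 128), nn.ReLU(),   # 64x64 input -> 8x8 feature map
            nn.Linear(128, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Example: classify one 64x64 cropped hand image (batch of one).
model = HandPoseCNN(num_classes=10)
logits = model(torch.randn(1, 1, 64, 64))
pose_id = logits.argmax(dim=1).item()
```

In a real pipeline the hand region would first be detected and cropped from each video frame before classification.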


Robotics, 2021, Vol. 10 (1), p. 41
Author(s): Yuji Yamakawa, Yugo Katsuki, Yoshihiro Watanabe, Masatoshi Ishikawa

This paper focuses on the development, evaluation, and demonstration of a high-speed, low-latency telemanipulated robot hand system. The characteristics of the developed system are the following: non-contact, high-speed 3D visual sensing of the human hand; intuitive motion mapping between human hands and robot hands; and low-latency, fast responsiveness to human hand motion. Such a high-speed, low-latency telemanipulated robot hand system can be considered more effective from the viewpoint of usability. The developed system consists of a high-speed vision system, a high-speed robot hand, and a real-time controller. For this system, we propose new methods of 3D sensing, mapping between the human hand and the robot hand, and robot hand control. We evaluated the performance (latency and responsiveness) of the developed system; the resulting latency is so small that humans cannot perceive it. In addition, we conducted experiments on opening/closing motion, object grasping, and moving-object grasping as demonstrations. Finally, we confirmed the validity and effectiveness of the developed system and the proposed methods.
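A minimal sketch of the human-to-robot hand mapping idea is given below, assuming a vision routine that returns the thumb-index fingertip distance in metres and a robot hand that accepts a normalized grasp command. The function names, distance limits, and loop rate are illustrative assumptions, not the cited system's interfaces.

```python
# Minimal sketch of a telemanipulation mapping loop (assumed interfaces and limits).
import time

HUMAN_OPEN_M = 0.10    # fingertip distance when the human hand is fully open (assumed)
HUMAN_CLOSED_M = 0.01  # fingertip distance when fully closed (assumed)

def map_aperture(fingertip_distance_m: float) -> float:
    """Map measured fingertip distance to a 0 (closed) .. 1 (open) grasp command."""
    span = HUMAN_OPEN_M - HUMAN_CLOSED_M
    cmd = (fingertip_distance_m - HUMAN_CLOSED_M) / span
    return min(max(cmd, 0.0), 1.0)

def teleoperation_loop(get_fingertip_distance, send_grasp_command, rate_hz: float = 500.0):
    """Poll the vision system and forward mapped grasp commands at a fixed rate."""
    period = 1.0 / rate_hz
    while True:
        send_grasp_command(map_aperture(get_fingertip_distance()))
        time.sleep(period)
```

Keeping this mapping simple and running it at a high, fixed rate is what keeps the command latency small.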


Sensors, 2021, Vol. 21 (9), p. 3035
Author(s): Néstor J. Jarque-Bou, Joaquín L. Sancho-Bru, Margarita Vergara

The role of the hand is crucial for the performance of activities of daily living, thereby ensuring a full and autonomous life. Its motion is controlled by a complex musculoskeletal system of approximately 38 muscles. Therefore, measuring and interpreting the muscle activation signals that drive hand motion is of great importance in many scientific domains, such as neuroscience, rehabilitation, physiotherapy, robotics, prosthetics, and biomechanics. Electromyography (EMG) can be used to carry out this neuromuscular characterization, but it is cumbersome because of the complexity of the musculoskeletal system of the forearm and hand. This paper reviews the main studies in which EMG has been applied to characterize the muscle activity of the forearm and hand during activities of daily living, with special attention to muscle synergies, which are thought to be used by the nervous system to simplify the control of the numerous muscles by actuating them in task-relevant subgroups. The state of the art is presented, which may help to guide and foster progress in many scientific domains. Furthermore, the most important challenges and open issues are identified in order to achieve a better understanding of human hand behavior, improve rehabilitation protocols, enable more intuitive control of prostheses, and build more realistic biomechanical models.
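Muscle synergies of the kind discussed above are commonly extracted with non-negative matrix factorization. The sketch below applies that generic technique to placeholder EMG envelope data; the channel count, synergy count, and data are illustrative assumptions, not results from the reviewed studies.

```python
# Minimal sketch of muscle-synergy extraction from EMG envelopes via NMF (placeholder data).
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
emg_envelopes = rng.random((1000, 12))  # 1000 time samples x 12 muscles (placeholder)

n_synergies = 4
model = NMF(n_components=n_synergies, init="nndsvda", max_iter=500)
activations = model.fit_transform(emg_envelopes)   # time-varying synergy activations
synergy_weights = model.components_                # muscle weightings per synergy

# Reconstruction quality (variance accounted for) guides the choice of n_synergies.
reconstruction = activations @ synergy_weights
vaf = 1.0 - np.sum((emg_envelopes - reconstruction) ** 2) / np.sum(emg_envelopes ** 2)
print(f"{n_synergies} synergies, VAF = {vaf:.3f}")
```

In practice the envelopes would come from rectified, low-pass-filtered EMG recorded during the daily-living tasks under study.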


2014, Vol. 2014, pp. 1-12
Author(s): Hong-Min Zhu, Chi-Man Pun

We propose an adaptive and robust superpixel-based hand gesture tracking system, in which hand gestures drawn in free air are recognized from their motion trajectories. First, we employ superpixel motion detection and unsupervised image segmentation to detect the moving target hand using the first few frames of the input video sequence. The hand appearance model is then constructed from its surrounding superpixels. By incorporating failure recovery and template matching in the tracking process, the target hand is tracked by an adaptive superpixel-based tracking algorithm, in which hand deformation, view-dependent appearance changes, fast motion, and background confusion are handled well enough to extract the correct hand motion trajectory. Finally, the hand gesture is recognized from the extracted motion trajectory with a trained SVM classifier. Experimental results show that our proposed system achieves better performance than existing state-of-the-art methods, with recognition accuracies of 99.17% for the easy set and 98.57% for the hard set.
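The final classification stage can be illustrated with a minimal sketch: a tracked 2D hand trajectory is resampled to a fixed length, normalized, and fed to an SVM. The feature choice and the placeholder training data below are assumptions for illustration, not the paper's exact pipeline.

```python
# Minimal sketch of trajectory-based gesture classification with an SVM (assumed features).
import numpy as np
from sklearn.svm import SVC

def trajectory_features(points: np.ndarray, n_samples: int = 32) -> np.ndarray:
    """Resample an (N, 2) trajectory to n_samples points and normalize offset and scale."""
    t_old = np.linspace(0.0, 1.0, len(points))
    t_new = np.linspace(0.0, 1.0, n_samples)
    resampled = np.column_stack([np.interp(t_new, t_old, points[:, d]) for d in range(2)])
    resampled -= resampled.mean(axis=0)            # translation invariance
    resampled /= (np.abs(resampled).max() + 1e-8)  # scale invariance
    return resampled.ravel()

# Placeholder training data: random trajectories and gesture labels.
rng = np.random.default_rng(1)
X = np.stack([trajectory_features(rng.random((50, 2))) for _ in range(40)])
y = rng.integers(0, 4, size=40)

clf = SVC(kernel="rbf", C=1.0).fit(X, y)
print(clf.predict(X[:1]))  # predicted gesture class for one trajectory
```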


2000
Author(s): Michael L. Turner, Ryan P. Findley, Weston B. Griffin, Mark R. Cutkosky, Daniel H. Gomez

This paper describes the development of a system for dexterous telemanipulation and presents the results of tests involving simple manipulation tasks. The user wears an instrumented glove augmented with an arm-grounded haptic feedback apparatus. A linkage attached to the user's wrist measures gross motions of the arm. The movements of the user are transferred to a two-fingered dexterous robot hand mounted on the end of a 4-DOF industrial robot arm. Forces measured at the robot fingers can be transmitted back to the user via the haptic feedback apparatus. The results obtained in block-stacking and object-rolling experiments indicate that the addition of force feedback did not improve the speed of task execution. In fact, in some cases the presence of incomplete force information was detrimental to performance speed compared with no force information. There are indications that the presence of force feedback did aid in task learning.
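A minimal sketch of the force-feedback path in such a loop is shown below: fingertip forces measured at the robot hand are scaled and clipped before being displayed by the arm-grounded haptic device. The scaling factor, force limits, and device interfaces are assumptions for illustration, not values from the paper.

```python
# Minimal sketch of the force-feedback path in a telemanipulation loop (assumed values).
def force_to_haptic_command(robot_force_n: float,
                            scale: float = 0.5,
                            max_display_n: float = 5.0) -> float:
    """Scale a measured fingertip force and clip it to the haptic device's display range."""
    cmd = scale * robot_force_n
    return max(-max_display_n, min(max_display_n, cmd))

def feedback_step(read_finger_forces, display_forces):
    """One control-loop iteration: read robot finger forces, display them to the user."""
    forces = read_finger_forces()  # e.g. (thumb_N, index_N), hypothetical sensor interface
    display_forces(tuple(force_to_haptic_command(f) for f in forces))
```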


2012, Vol. 6, pp. 98-107
Author(s): Amit Gupta, Vijay Kumar Sehrawat, Mamta Khosla

Author(s): Shriya A. Hande, Nitin R. Chopde

In today's world, in almost all sectors, most of the work is done by robots or robotic arms with different numbers of degrees of freedom (DOFs) as required. This project deals with the design and implementation of a "Wireless Gesture Controlled Robotic Arm with Vision". The system design is divided into three parts: the accelerometer part, the robotic arm, and the platform. It is fundamentally an accelerometer-based system which controls a robotic arm remotely, using a small, low-cost, 3-axis accelerometer and RF signals. The robotic arm is mounted on a mobile platform which is also controlled remotely by another accelerometer. One accelerometer is mounted on the human hand, capturing its gestures and postures, so that the robotic arm moves accordingly; the other accelerometer is mounted on one of the operator's legs, capturing its gestures and postures, so that the platform moves accordingly. In a nutshell, the robotic arm and platform are synchronized with the gestures and postures of the user's/operator's hand and leg, respectively. The motions performed by the robotic arm are: PICK and PLACE/DROP, RAISING and LOWERING objects. The motions performed by the platform are: FORWARD, BACKWARD, RIGHT, and LEFT.
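A minimal sketch of how accelerometer tilt might be mapped to discrete motion commands of this kind is shown below. The thresholds, axis conventions, and the read_accelerometer()/send_command() interfaces are illustrative assumptions, not the project's implementation.

```python
# Minimal sketch of mapping 3-axis accelerometer tilt to platform commands (assumed thresholds).
TILT_THRESHOLD_G = 0.3  # minimum tilt (in g) before a motion command is issued

def gesture_to_command(ax: float, ay: float) -> str:
    """Translate hand tilt along x/y into a discrete motion command."""
    if ay > TILT_THRESHOLD_G:
        return "FORWARD"
    if ay < -TILT_THRESHOLD_G:
        return "BACKWARD"
    if ax > TILT_THRESHOLD_G:
        return "RIGHT"
    if ax < -TILT_THRESHOLD_G:
        return "LEFT"
    return "STOP"

def control_loop(read_accelerometer, send_command):
    """Continuously convert accelerometer readings into commands sent over the RF link."""
    while True:
        ax, ay, _az = read_accelerometer()  # gravity-referenced tilt in g (hypothetical driver)
        send_command(gesture_to_command(ax, ay))
```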

