Leap Motion
Recently Published Documents

TOTAL DOCUMENTS: 600 (FIVE YEARS: 261)
H-INDEX: 23 (FIVE YEARS: 6)

Author(s): Ángela Aguilera-Rubio, Isabel M. Alguacil-Diego, Ana Mallo-López, Alicia Cuesta-Gómez

Author(s): Fernando C. Jiménez-González, Dulce Esperanza Torres-Ramírez

Subjective feedback from the patient is commonly relied upon during forearm rehabilitation therapy in the absence of real-time data, leading to suboptimal recovery results in some patients. Technological innovations in the field of assisted rehabilitation have enabled the evolution of real-time monitoring systems. This paper presents the development of an interactive assistant that serves as an interface relating kinematic patterns to electromyographic signals during the forearm rehabilitation routine. Leap Motion (LM) and Shimmer3 EMG sensors capture the routine by tracking the movements shown in the software. Real-time targets are programmed to guide the forearm movements that the therapist prescribes for assessing recovery progress. The integration of software and hardware yields a dataset based on interaction variables such as arm velocity, arm position, performance rate, and electrical muscle activity. Test results show that the system works effectively within a range of motion of 9 to 88 degrees of rotation about the axes, and that velocities under 190 mm/s produce a stable representation of movement in the software. Finally, these outcome ranges establish the system as an alternative tool for evaluating patients with forearm injuries.
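As a minimal sketch (not the authors' implementation), the stable operating envelope reported above can be expressed as a per-frame validity check on tracked samples. The Sample record and its fields are hypothetical stand-ins for the data a Leap Motion / Shimmer3 pipeline would produce:

```python
# Hypothetical per-frame validity check against the reported stable envelope:
# rotations of 9-88 degrees about the axes and velocities under 190 mm/s.
from dataclasses import dataclass

ROT_MIN_DEG, ROT_MAX_DEG = 9.0, 88.0   # stable rotation range (degrees)
VEL_MAX_MM_S = 190.0                   # stable velocity ceiling (mm/s)

@dataclass
class Sample:
    rotation_deg: float    # forearm rotation about the tracked axis
    velocity_mm_s: float   # palm velocity magnitude reported by the sensor

def is_stable(sample: Sample) -> bool:
    """True if the sample lies inside the envelope where tracking is reliable."""
    return (ROT_MIN_DEG <= sample.rotation_deg <= ROT_MAX_DEG
            and sample.velocity_mm_s < VEL_MAX_MM_S)

samples = [Sample(45.0, 120.0), Sample(92.0, 80.0), Sample(30.0, 240.0)]
stable_ratio = sum(is_stable(s) for s in samples) / len(samples)
print(f"stable frames: {stable_ratio:.0%}")   # -> stable frames: 33%
```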


2021, Vol 12 (1), pp. 258
Author(s): Marek Čorňák, Michal Tölgyessy, Peter Hubinský

The concept of “Industry 4.0” relies heavily on the utilization of collaborative robotic applications. As a result, the need for an effective, natural, and ergonomic interface arises, as more workers will be required to work with robots. Designing and implementing natural forms of human–robot interaction (HRI) is key to ensuring efficient and productive collaboration between humans and robots. This paper presents a gestural framework for controlling a collaborative robotic manipulator using pointing gestures. The core principle lies in the ability of the user to send the robot’s end effector to the location towards, which he points to by his hand. The main idea is derived from the concept of so-called “linear HRI”. The framework utilizes a collaborative robotic arm UR5e and the state-of-the-art human body tracking sensor Leap Motion. The user is not required to wear any equipment. The paper describes the overview of the framework’s core method and provides the necessary mathematical background. An experimental evaluation of the method is provided, and the main influencing factors are identified. A unique robotic collaborative workspace called Complex Collaborative HRI Workplace (COCOHRIP) was designed around the gestural framework to evaluate the method and provide the basis for the future development of HRI applications.
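The geometric core of such pointing-based control can be sketched as a ray-plane intersection: the target is where the ray from the user's hand, cast along the pointing direction, meets the robot's workspace plane. The frame alignment between the sensor and the robot base, and the plane itself, are illustrative assumptions, not the paper's exact formulation:

```python
# Intersect the pointing ray origin + t*direction (t >= 0) with a workspace
# plane; the intersection point becomes the end-effector goal.
import numpy as np

def pointing_target(origin, direction, plane_point, plane_normal):
    """Return the ray-plane intersection, or None if there is no valid hit."""
    origin, direction = np.asarray(origin, float), np.asarray(direction, float)
    plane_point = np.asarray(plane_point, float)
    plane_normal = np.asarray(plane_normal, float)
    denom = direction @ plane_normal
    if abs(denom) < 1e-9:
        return None                        # ray parallel to the plane
    t = ((plane_point - origin) @ plane_normal) / denom
    return origin + t * direction if t >= 0 else None   # None: pointing away

# Hand 40 cm above the table (plane z = 0), pointing down and forward.
target = pointing_target(origin=[0.1, 0.0, 0.4],
                         direction=[0.5, 0.0, -0.8],
                         plane_point=[0, 0, 0], plane_normal=[0, 0, 1])
print(target)   # [0.35 0.   0.  ] -> candidate end-effector goal
```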


2021
Author(s): Ishika Godage, Ruvan Weerasinghe, Damitha Sandaruwan

There is no doubt that communication plays a vital role in human life. There is, however, a significant population of hearing-impaired people who use non-verbal techniques for communication, which the majority of people cannot understand. Predominant among these techniques is sign language, the main communication protocol among hearing-impaired people. In this research, we propose a method to bridge the communication gap between hearing-impaired people and others by translating signed gestures into text. Most existing solutions, based on technologies such as Kinect, Leap Motion, computer vision, EMG, and IMU, try to recognize and translate the individual signs of hearing-impaired people. The few approaches to sentence-level sign language recognition suffer from being neither user-friendly nor practical owing to the devices they use. The proposed system is designed to give the user full freedom to sign an uninterrupted full sentence at a time. For this purpose, we employ two Myo armbands for gesture capture. Using signal processing and supervised learning based on a vocabulary of 49 words and 346 sentences for training with a single signer, we achieved 75-80% word-level accuracy and 45-50% sentence-level accuracy using gestural (EMG) and spatial (IMU) features in our signer-dependent experiment.
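A generic sketch of this kind of supervised pipeline: windowed EMG features (RMS, mean absolute value, waveform length) fed to a classifier. The window length, feature set, and SVM are illustrative choices, not the authors' exact method; X_raw and y stand in for Myo recordings and word labels:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

def emg_features(window: np.ndarray) -> np.ndarray:
    """Per-channel features for one EMG window of shape (samples, channels)."""
    rms = np.sqrt(np.mean(window**2, axis=0))              # signal energy
    mav = np.mean(np.abs(window), axis=0)                  # mean absolute value
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)   # waveform length
    return np.concatenate([rms, mav, wl])

rng = np.random.default_rng(0)
X_raw = rng.standard_normal((200, 50, 8))   # 200 windows, 50 samples, 8 channels
y = rng.integers(0, 5, 200)                 # 5 hypothetical word labels

X = np.array([emg_features(w) for w in X_raw])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```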


2021, Vol 2021, pp. 1-6
Author(s): Khalid Twarish Alhamazani, Jalawi Alshudukhi, Talal Saad Alharbi, Saud Aljaloud, Zelalem Meraf

In recent years, technological advances have given rise to new paradigms of user interaction. This has motivated industry to create increasingly powerful and accessible natural user interface devices. In particular, depth cameras have achieved high levels of user adoption. These devices include the Microsoft Kinect, the Intel RealSense, and the Leap Motion Controller. This type of device facilitates the acquisition of data for human activity recognition. Hand gestures can be static or dynamic, depending on whether they involve movement across the image sequence. Hand gesture recognition enables human-computer interaction (HCI) system developers to create more immersive, natural, and intuitive experiences and interactions. However, this task is not easy, which is why it has been addressed in academia using machine learning techniques. The experiments carried out show very encouraging results, indicating that this choice of architecture yields excellent parameter efficiency and prediction times. The tests were carried out on a relevant dataset from the field. On this basis, the performance of the proposal is analysed across different scenarios, such as lighting variation, camera movement, different types of gestures, and person-dependent sensitivity or bias, among others. In this article, we look at how infrared camera images can be used to segment, classify, and recognise one-handed gestures under a variety of lighting conditions. The infrared camera was created by modifying a standard webcam and adding an infrared filter to the lens. The scene was illuminated by additional infrared LED structures, allowing the system to be used in various lighting conditions.
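A minimal OpenCV sketch of the segmentation step described above: in an IR-illuminated frame the hand appears bright, so a global (Otsu) threshold plus largest-contour selection isolates it. The file name and threshold strategy are illustrative assumptions, not the paper's exact pipeline:

```python
import cv2

frame = cv2.imread("ir_frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical IR frame
assert frame is not None, "ir_frame.png not found"

blurred = cv2.GaussianBlur(frame, (5, 5), 0)               # suppress sensor noise
_, mask = cv2.threshold(blurred, 0, 255,
                        cv2.THRESH_BINARY + cv2.THRESH_OTSU)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)
if contours:
    hand = max(contours, key=cv2.contourArea)              # assume hand = largest blob
    x, y, w, h = cv2.boundingRect(hand)
    roi = frame[y:y + h, x:x + w]                          # crop for the classifier
    cv2.imwrite("hand_roi.png", roi)
```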


2021, Vol 10 (6), pp. 3834-3836
Author(s): Prasad Dhage

Stroke patients are limited in their everyday tasks. Videogame-based training (VBT) combined with virtual reality helps to improve upper limb function and hand motor function (finger pinch grip) in rehabilitation. The Leap Motion controller can track the fine movements of both the hand and the fingers. This study will demonstrate the impact of the Leap Motion controller on pinch grip in patients with sub-acute and chronic stroke. A total of 40 participants will be enrolled in the study according to the inclusion and exclusion criteria. The duration of the study, including the intervention, will be six months. Leap Motion-based augmented reality training will be provided to patients for half an hour a day, five days a week, for a month. Froment's sign and the System Usability Scale will be the patient outcome measures. The impact of the Leap Motion controller will be evaluated using the System Usability Scale and Froment's sign. The results of the study are expected to provide evidence on the use of the Leap Motion controller for pinch grip rehabilitation in sub-acute and chronic stroke patients.
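The System Usability Scale named above has a standard scoring rule: for the ten 1-5 items, odd items contribute (response - 1), even items contribute (5 - response), and the sum is scaled by 2.5 to give a 0-100 score. A small sketch of that rule; the example responses are made up:

```python
def sus_score(responses: list[int]) -> float:
    """responses: ten answers, 1 (strongly disagree) .. 5 (strongly agree)."""
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    contributions = [(r - 1) if i % 2 == 0 else (5 - r)  # indices 0,2,... = odd items
                     for i, r in enumerate(responses)]
    return 2.5 * sum(contributions)

print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # -> 85.0
```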


Robotics, 2021, Vol 10 (4), pp. 130
Author(s): Marcus R. S. B. de Souza, Rogério S. Gonçalves, Giuseppe Carbone

The Leap Motion controller is a commercial low-cost marker-less optical sensor that can track the motion of a human hand by recording various parameters. Upper limb rehabilitation therapy is the treatment of people with upper limb impairments, whose recovery is achieved through continuous motion exercises. However, the repetitive nature of these exercises can be perceived as boring or discouraging, while patient motivation plays a key role in recovery. Thus, serious games have been widely used in therapies to motivate patients and make the therapeutic process more enjoyable. This paper explores the feasibility, accuracy, and repeatability of a Leap Motion controller (LMC) applied in combination with a serious game for upper limb rehabilitation. Experimental feasibility tests are carried out using an industrial robot that replicates upper limb motions and is tracked by an LMC. The results suggest satisfactory performance in terms of tracking accuracy, although some limitations are identified and discussed in terms of the measurable workspace.
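As a sketch of how the accuracy and repeatability assessed above can be quantified, the robot's commanded positions serve as ground truth for the LMC readings: accuracy as the RMSE of per-sample errors, repeatability as the spread across repeated visits to the same pose. The arrays are hypothetical placeholders:

```python
import numpy as np

robot_xyz = np.array([[100.0, 0.0, 200.0]] * 5)              # commanded pose (mm)
lmc_xyz = robot_xyz + np.random.default_rng(1).normal(0, 1.5, (5, 3))

errors = np.linalg.norm(lmc_xyz - robot_xyz, axis=1)         # per-sample error
accuracy_rmse = np.sqrt(np.mean(errors**2))                  # tracking accuracy
repeatability = np.linalg.norm(lmc_xyz.std(axis=0, ddof=1))  # spread of repeats

print(f"RMSE: {accuracy_rmse:.2f} mm, repeatability: {repeatability:.2f} mm")
```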


2021, Vol 2021, pp. 1-16
Author(s): Po Zhang, Junqiang Lin, Jianhua He, Xiuchan Rong, Chengen Li, ...

Agricultural machinery experiments are restricted by the crop production season: missing the crop growth cycle extends the machine development period. Using virtual reality technology to complete preassembly and preliminary experiments can reduce the losses caused by this problem. To improve the intelligence and stability of virtual assembly, this paper proposes a more stable dynamic gesture recognition framework: the TCP/IP protocol constitutes the network communication terminal, the Leap Motion-based vision system constitutes the gesture data collection terminal, and the CNN-LSTM network constitutes the dynamic gesture recognition and classification terminal. The dynamic gesture recognition framework and the harvester virtual assembly platform form a virtual assembly system that achieves gesture interaction. Experimental analysis shows that the improved CNN-LSTM network has a small model size and can quickly establish a stable and accurate gesture recognition model, with an average accuracy of 98.0% (±0.894). The assembly efficiency of the virtual assembly system using the framework was improved by approximately 15%. The results show that the accuracy and stability of the model meet the requirements, the corresponding assembly parts behave robustly in the virtual simulation environment of the whole machine, and the harvesting behaviour in the virtual reality scene is close to the real scene. The virtual assembly system under this framework provides technical support for unmanned farms and virtual experiments on agricultural machinery.
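A minimal Keras sketch of the general CNN-LSTM shape named above: convolutional feature extraction along each gesture sequence, followed by an LSTM over the temporal dynamics and a softmax over gesture classes. The sequence length, feature count, and layer sizes are illustrative assumptions, not the paper's exact architecture:

```python
import tensorflow as tf
from tensorflow.keras import layers

NUM_CLASSES = 10          # hypothetical number of dynamic gestures
SEQ_LEN, FEATS = 60, 63   # e.g. 60 frames of Leap Motion joint features

model = tf.keras.Sequential([
    layers.Input(shape=(SEQ_LEN, FEATS)),
    layers.Conv1D(64, kernel_size=3, activation="relu"),   # local motion features
    layers.MaxPooling1D(2),
    layers.Conv1D(128, kernel_size=3, activation="relu"),
    layers.LSTM(64),                                       # temporal dynamics
    layers.Dense(NUM_CLASSES, activation="softmax"),       # gesture class scores
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```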

