Investigation of dynamic 3D hand motion reproduction by a robot using a Leap Motion

Author(s):  
Franck Hernoux ◽  
Richard Béarée ◽  
Olivier Gibaru
2020 ◽  
pp. 155335062094720
Author(s):  
Yuanyuan Feng ◽  
Uchenna A. Uchidiuno ◽  
Hamid R. Zahiri ◽  
Ivan George ◽  
Adrian E. Park ◽  
...  

Background. Touchless interaction devices have garnered increasing attention for intraoperative imaging interaction, but there are limited recommendations on which touchless interaction mechanisms should be implemented in the operating room. The objective of this study was to evaluate the efficiency, accuracy, and satisfaction of 2 current touchless interaction mechanisms, hand motion and body motion, for intraoperative image interaction. Methods. We used the TedCas plugin for the ClearCanvas DICOM viewer to display and manipulate CT images. Ten surgeons performed 5 image interaction tasks (step-through, pan, zoom, circle measure, and line measure) on 3 input devices: the Microsoft Kinect, the Leap Motion, and a mouse. Results. The Kinect had accuracy similar to the Leap Motion for most of the tasks, but it had a higher error rate in the step-through task. The Leap Motion led to shorter task completion times than the Kinect and was preferred by the surgeons, especially for the measure tasks. Discussion. Our study suggests that hand tracking devices, such as the Leap Motion, should be used for intraoperative imaging manipulation tasks that require high precision.


Sensors ◽  
2021 ◽  
Vol 21 (11) ◽  
pp. 3735
Author(s):  
Lesong Jia ◽  
Xiaozhou Zhou ◽  
Hao Qin ◽  
Ruidong Bai ◽  
Liuqing Wang ◽  
...  

Continuous movements of the hand contain discrete expressions of meaning, forming a variety of semantic gestures. For example, it is generally considered that the bending of a finger includes three semantic states: bent, half-bent, and straight. However, there has been no research on the number of semantic states that each movement primitive of the hand can convey, especially the interval of each semantic state and its representative movement angle. To clarify these issues, we conducted experiments on perception and expression. Experiments 1 and 2 focused on the perceivable semantic levels and boundaries of different motion primitive units from the perspective of visual semantic perception. Experiment 3 verified and optimized the segmentation results obtained above and further determined the typical motion values of each semantic state. Furthermore, Experiment 4 illustrated the empirical application of this semantic state segmentation, using the Leap Motion as an example. The result is a discrete gesture semantic expression space, in both the real world and the Leap Motion digital world, containing a clearly defined number of semantic states for each hand motion primitive unit, along with the boundaries and typical motion angle values of each state. This quantitative semantic expression will help guide and advance research in the fields of gesture coding, gesture recognition, and gesture design.
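The abstract's idea of discrete semantic states can be sketched as a simple classifier that maps a finger-bending angle onto a state label. A minimal Python sketch follows; the state names mirror the abstract, but the boundary angles here are hypothetical placeholders, not the intervals actually measured in the study.

```python
# Hypothetical semantic-state boundaries (degrees of finger bending).
# These numbers are illustrative placeholders, NOT the study's results.
STATE_BOUNDARIES = [
    ("straight",  0.0,  30.0),
    ("half-bent", 30.0, 70.0),
    ("bent",      70.0, 180.0),
]

def semantic_state(angle_deg: float) -> str:
    """Return the discrete semantic state for a finger-bending angle."""
    for name, lo, hi in STATE_BOUNDARIES:
        if lo <= angle_deg < hi:
            return name
    raise ValueError(f"angle out of range: {angle_deg}")

print(semantic_state(15.0))   # straight
print(semantic_state(50.0))   # half-bent
print(semantic_state(120.0))  # bent
```

A real segmentation would replace the placeholder intervals with the boundaries and typical angles determined in Experiments 1–3.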


2016 ◽  
Vol 2016 ◽  
pp. 1-10 ◽  
Author(s):  
Juliana M. de Oliveira ◽  
Rafael Carneiro G. Fernandes ◽  
Cristtiano S. Pinto ◽  
Plácido R. Pinheiro ◽  
Sidarta Ribeiro ◽  
...  

Cerebral palsy is a severe condition usually caused by decreased brain oxygenation during pregnancy, at birth, or soon after birth. Conventional treatments for cerebral palsy are often tiresome and expensive, leading patients to quit treatment. In this paper, we describe a virtual environment in which patients engage in a playful therapeutic game for neuropsychomotor rehabilitation, based on the experience of the occupational therapy program of the Nucleus for Integrated Medical Assistance (NAMI) at the University of Fortaleza, Brazil. Integration between patient and virtual environment occurs through the hand motion sensor “Leap Motion,” plus the electroencephalographic sensor “MindWave,” which measures attention levels during task execution. To evaluate the virtual environment, eight clinical experts on cerebral palsy completed a questionnaire regarding the potential of the experimental virtual environment to promote cognitive and motor rehabilitation, as well as the potential of the treatment to introduce risks and/or negatively influence the patient’s development. Based on the very positive appraisal of the experts, we propose the experimental virtual environment as a promising alternative tool for the rehabilitation of children with cerebral palsy.


2019 ◽  
Vol 5 (2) ◽  
pp. 121-132
Author(s):  
Galang Ihsan Isnanto ◽  
Samuel Gandang Gunanto ◽  
Agnes Karina Pritha Atmani

Leap Motion (hand motion tracking) is a device that records hand movements for use as digital models; it connects to a computer as an additional input device and can then replace both mouse and keyboard functions. In the 3D game "Everplane", the Leap Motion is used as the game controller and is the main component of the game. Everplane is an endless runner, a game genre in which the player's character moves forward continuously through an endless game world, and its concept is space exploration. Making the 3D game "Everplane" required research data to meet the needs of game production across three stages: preproduction (game design, character design, and layout design), production (modeling, texturing, graphical user interface design, music, Leap Motion setup, assembly, animating, programming, and problem solving), and postproduction (deploying, mastering, and merchandise). Keywords: Leap Motion, Everplane game, manufacturing process


2018 ◽  
Vol 3 (2) ◽  
pp. 146 ◽  
Author(s):  
Frihandhika Permana ◽  
Herman Tolle ◽  
Fitri Utaminingrum ◽  
Rizdania Dermawi

Smartphone development today makes the gadget useful not only as a communication tool but also as an entertainment device, for example for playing games and music. Smartphones also support many technologies, such as Augmented Reality (AR). Some studies have evaluated AR applications combined with the Leap Motion, but those studies used the alpha SDK from Leap Motion Corp., which is no longer accessible to developers. This research is meant to overcome that problem. The method proposed in this study is a technique to connect the Leap Motion to Android for an Augmented Reality application. This paper also evaluates the technique used to connect the AR technology to the Leap Motion so it can serve as a visual instrument simulation, applied to the traditional Gamelan musical instrument. The experiments yielded an application accuracy of 96.43% for right-hand movement and 97.86% for left-hand movement. This high accuracy is a promising result for future research.


Author(s):  
I Gede Aris Dharmayasa ◽  
Surya Sumpeno ◽  
I Ketut Eddy Purnama ◽  
Adri Gabriel Sooai

Sensors ◽  
2020 ◽  
Vol 20 (10) ◽  
pp. 2773 ◽  
Author(s):  
Edwin Daniel Oña ◽  
Alberto Jardón ◽  
Alicia Cuesta-Gómez ◽  
Patricia Sánchez-Herrera-Baeza ◽  
Roberto Cano-de-la-Cuerda ◽  
...  

In recent decades, gaming technology has been accepted as a feasible method for complementing traditional clinical practice, especially in neurorehabilitation; however, the viability of using 3D Virtual Reality (VR) for the assessment of upper limb motor function has not been fully explored. For that purpose, we developed a VR-based version of the Box and Blocks Test (BBT), a clinical test for the assessment of manual dexterity, as an automated alternative to the classical procedure. Our VR-based BBT (VR-BBT) integrates the traditional BBT mechanics into gameplay, using the Leap Motion Controller (LMC) to capture the user’s hand motion and the Oculus Rift headset to provide a fully immersive experience. This paper focuses on evaluating the validity of our VR-BBT for reliably measuring manual dexterity in a sample of patients with Parkinson’s Disease (PD). For this study, a group of twenty individuals in a mild to moderate stage of PD was recruited. Participants were asked to perform the physical BBT (once) and our proposed VR-BBT system (twice), separately. Correlation analysis of the collected data was carried out. Statistical analysis proved that the performance data collected by the VR-BBT correlated significantly with the conventional BBT assessment. The VR-BBT scores showed a significant association with PD severity as measured by the Hoehn and Yahr scale. This suggests that the VR-BBT could be used as a reliable indicator of health improvements in patients with PD. Finally, the VR-BBT system showed high usability and acceptability as rated by clinicians and patients.
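The correlation analysis described in this abstract can be illustrated with a small Pearson correlation helper. This is a generic sketch of the statistic, not the study's analysis pipeline, and the score lists below are placeholder values for demonstration only, not the study's data.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Placeholder scores (blocks moved per minute per participant), for
# illustration only; a value of r near 1 would indicate the strong
# physical-BBT vs. VR-BBT agreement the study reports.
physical_bbt = [42, 35, 50, 28, 44, 39]
vr_bbt       = [38, 30, 47, 25, 41, 36]
print(round(pearson_r(physical_bbt, vr_bbt), 3))
```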


Sensors ◽  
2021 ◽  
Vol 21 (4) ◽  
pp. 1199
Author(s):  
Robin Fonk ◽  
Sean Schneeweiss ◽  
Ulrich Simon ◽  
Lucas Engelhardt

The AnyBody Modeling System™ (AMS) is a musculoskeletal simulation software solution using inverse dynamics analysis. It enables the determination of muscle and joint forces for a given bodily motion. Recording an individual movement and transferring it into the AMS is a complex and protracted process. Research has indicated that the contactless, visual Leap Motion Controller (LMC) provides clinically meaningful motion data for hand tracking. Therefore, the aim of this study was to integrate the LMC hand motion data into the AMS in order to improve the process of recording a hand movement. A Python-based interface between the LMC and the AMS, termed ROSE Motion, was developed. This solution records and saves the movement data as Biovision Hierarchy (BVH) data and AnyScript vector files that are imported into the AMS simulation. Setting simulation parameters, initiating the calculation automatically, and fetching results are implemented using the AnyPyTools library from AnyBody. The proposed tool offers a rapid and easy-to-use recording solution for elbow, hand, and finger movements. Features include animation, cutting/editing, exporting the motion, and remote-controlling the AMS for the analysis and presentation of musculoskeletal simulation results. Comparing the motion tracking results with previous studies shows that coverage (occlusion) problems when using the LMC limit the correctness of the motion data. However, the fast experimental setup and the intuitive, rapid motion data editing strengthen the case for markerless systems such as the one presented here over marker-based motion capture.
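The BVH export step mentioned above can be sketched as a function that serializes recorded joint angles into the Biovision Hierarchy text format. This is a simplified illustration with a single root joint and made-up offsets, not ROSE Motion's actual exporter; a real hand model would nest many JOINT blocks with per-segment channels.

```python
def bvh_text(frames, frame_time=0.01):
    """Build a minimal BVH document from recorded rotation frames.

    `frames` is a list of (z, x, y) Euler angles in degrees for one joint.
    Returns the BVH file text: a HIERARCHY section followed by a MOTION
    section with one line of channel values per frame.
    """
    lines = [
        "HIERARCHY",
        "ROOT Hand",
        "{",
        "  OFFSET 0.0 0.0 0.0",
        "  CHANNELS 3 Zrotation Xrotation Yrotation",
        "  End Site",
        "  {",
        "    OFFSET 0.0 10.0 0.0",
        "  }",
        "}",
        "MOTION",
        f"Frames: {len(frames)}",
        f"Frame Time: {frame_time:.6f}",
    ]
    lines += [" ".join(f"{a:.4f}" for a in frame) for frame in frames]
    return "\n".join(lines) + "\n"

# Writing the file is then a one-liner, e.g.:
#   open("hand_motion.bvh", "w").write(bvh_text(recorded_frames))
```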


Author(s):  
Godwin Ponraj Joseph Vedhagiri ◽  
Hongliang Ren

<span>In our daily life, we human beings use our hands in various ways for most of our day-to-day activities. Tracking the position, orientation, and articulation of human hands has a variety of applications, including gesture recognition, robotics, medicine and health care, design and manufacturing, and art and entertainment, across multiple domains. However, it is an equally complex and challenging task due to several factors, such as the high dimensionality of hand motion data, the high speed of operation, and self-occlusion. This paper puts forth a novel method for tracking the fingertips of a human hand using two distinct sensors and combining their data with a sensor fusion technique.</span>
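One common way to combine two sensors' readings of the same fingertip is inverse-variance weighting, a standard static fusion rule. The sketch below illustrates only the general idea; the paper's actual fusion method may differ, and the positions and variances shown are hypothetical.

```python
def fuse(p1, var1, p2, var2):
    """Inverse-variance weighted fusion of two 3D position estimates.

    p1, p2   -- (x, y, z) fingertip positions from the two sensors
    var1/2   -- scalar measurement variances (lower = more trusted)
    Returns the fused position and the fused (reduced) variance.
    """
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused = tuple((w1 * a + w2 * b) / (w1 + w2) for a, b in zip(p1, p2))
    fused_var = 1.0 / (w1 + w2)
    return fused, fused_var

# Hypothetical example: a Leap Motion reading fused with a second,
# lower-noise depth sensor; the fused estimate lies closer to the
# lower-variance sensor, and the fused variance is smaller than either.
pos, var = fuse((10.0, 20.0, 30.0), 4.0, (12.0, 19.0, 31.0), 1.0)
```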

