Interactive System Based on Leap Motion for 3D Medical Model

Author(s):  
Ruo Xiu Xiao ◽  
Jia Yu Wang ◽  
Tao Zhang ◽  
Ke Meng ◽  
Li Qun Cao ◽  
...  

An interactive visualization of the patient's 3D anatomical model is often a helpful guide for doctors during complex surgery. However, the requirement for a sterile operating environment imposes limitations: traditional human–computer interaction tools (mouse and keyboard) must be disinfected regularly and cannot be used during the procedure. A noncontact gesture control system for medical models based on Leap Motion is proposed in this study. Gestures are recognized and localized by the binocular camera built into the Leap Motion, without using a mouse or keyboard, so the model is controlled directly by hand gestures to perform rotation, zoom, and other operations. A 3D heart model is combined with pseudo-color processing to enhance the observability of its 3D structure, and gesture recognition is then used to rotate and zoom the rendered model. Experimental results show that the system recognizes circle, swipe, and other actions with absolute accuracy. Rotation is therefore proposed as a new motion that can be identified stably, and it plays an essential role in the usability, intuitiveness, and interaction efficiency of future system designs. The system is suitable for sterile operating environments owing to its stable recognition and small space requirements.
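
As a rough illustration of the interaction described above (not the authors' implementation), the sketch below maps per-frame palm displacement from the Leap Motion to rotation and zoom of a rendered model. The gain values, the grab-to-zoom convention, and the Model class are assumptions, and the Leap SDK calls are abstracted behind hypothetical inputs supplied by the caller.

```python
# Minimal sketch of the gesture-to-model mapping described above.
# The Leap Motion SDK is abstracted away: the caller supplies palm positions
# and a grab flag per frame. Gains and the Model class are illustrative
# assumptions, not the authors' implementation.
import numpy as np

ROTATE_GAIN = 0.01   # radians of model rotation per mm of palm travel (assumed)
ZOOM_GAIN = 0.005    # zoom-factor change per mm of vertical palm travel (assumed)

class Model:
    """Toy stand-in for the rendered 3D heart model."""
    def __init__(self):
        self.yaw = 0.0      # rotation about the vertical axis, radians
        self.pitch = 0.0    # rotation about the horizontal axis, radians
        self.scale = 1.0    # uniform zoom factor

    def rotate(self, d_yaw, d_pitch):
        self.yaw += d_yaw
        self.pitch += d_pitch

    def zoom(self, d_scale):
        self.scale = max(0.1, self.scale + d_scale)

def update_model(model, prev_palm, palm, grabbing):
    """Map the palm displacement between two frames to a model update.

    An open hand rotates the model; a closed (grabbing) hand zooms it.
    prev_palm and palm are 3D palm positions in millimetres.
    """
    delta = np.asarray(palm, dtype=float) - np.asarray(prev_palm, dtype=float)
    if grabbing:
        model.zoom(ZOOM_GAIN * delta[1])          # vertical motion -> zoom
    else:
        model.rotate(ROTATE_GAIN * delta[0],      # horizontal motion -> yaw
                     ROTATE_GAIN * delta[2])      # depth motion -> pitch
    return model
```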

2014 ◽  
Vol 21 (6) ◽  
pp. 655-656 ◽  
Author(s):  
Nicola Bizzotto ◽  
Alessandro Costanzo ◽  
Leonardo Bizzotto ◽  
Dario Regis ◽  
Andrea Sandri ◽  
...  

2012 ◽  
Vol 433-440 ◽  
pp. 5436-5442
Author(s):  
Lei Li

Pseudo-color processing for target identification and tracking is highly meaningful, and experimental results show that pseudo-color image fusion is a very effective method. This paper presents a new method for false-color image fusion. The grayscale source images are first fused using the wavelet transform; the fused grayscale image and its differences from the original images are then used, respectively, as the l, α, and β components of a color fusion image, and a final color-space transformation produces the false-color fused image. The results show that the resulting color fusion image has more vivid colors and better matches the characteristics of human vision.
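
A minimal sketch of the fusion pipeline described above, assuming PyWavelets for the wavelet transform, a maximum-magnitude fusion rule for the detail coefficients, and the standard lαβ-to-RGB matrices from Ruderman/Reinhard. None of these details are specified in the abstract, so they are illustrative choices only.

```python
# Sketch of the false-color fusion pipeline: wavelet fusion of two grayscale
# images, then the fused image and its differences from the sources are used
# as the l, alpha, beta channels and converted to RGB for display.
# Wavelet choice, fusion rule, and colour matrices are assumptions.
# Inputs are assumed to be same-sized float images scaled to [0, 1].
import numpy as np
import pywt

def wavelet_fuse(img_a, img_b, wavelet="db2"):
    """Fuse two grayscale images, keeping the larger-magnitude detail coefficient."""
    ca, (ha, va, da) = pywt.dwt2(img_a, wavelet)
    cb, (hb, vb, db) = pywt.dwt2(img_b, wavelet)
    pick = lambda x, y: np.where(np.abs(x) >= np.abs(y), x, y)
    coeffs = ((ca + cb) / 2.0, (pick(ha, hb), pick(va, vb), pick(da, db)))
    return pywt.idwt2(coeffs, wavelet)

def lab_to_rgb(l, a, b):
    """Approximate lαβ -> RGB conversion (Ruderman/Reinhard matrices, assumed)."""
    log_lms = np.stack([
        l / np.sqrt(3) + a / np.sqrt(6) + b / np.sqrt(2),
        l / np.sqrt(3) + a / np.sqrt(6) - b / np.sqrt(2),
        l / np.sqrt(3) - 2.0 * a / np.sqrt(6),
    ], axis=-1)
    lms = np.power(10.0, log_lms)
    m = np.array([[ 4.4679, -3.5873,  0.1193],
                  [-1.2186,  2.3809, -0.1624],
                  [ 0.0497, -0.2439,  1.2045]])
    return np.clip(lms @ m.T, 0.0, 1.0)

def false_color_fuse(img_a, img_b):
    """Fused image becomes l; differences from the sources become alpha and beta."""
    h, w = img_a.shape
    fused = wavelet_fuse(img_a, img_b)[:h, :w]   # idwt2 may pad odd sizes by one pixel
    return lab_to_rgb(fused, fused - img_a, fused - img_b)
```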


2007 ◽  
Vol 7 (1) ◽  
pp. 210-214 ◽  
Author(s):  
Mohammad A.U. Khan ◽  
Rabya Bahadur Kh ◽  
Shahid Bilal ◽  
Asad Jamil ◽  
Mehr Ali Shah

2018 ◽  
Vol 2018 ◽  
pp. 1-10 ◽  
Author(s):  
Gilbert Tang ◽  
Phil Webb

In industrial human–robot collaboration, variability in the operating environment and in the components induces uncertainty and errors that require frequent manual intervention to rectify. Conventional teach pendants can be physically demanding to use and require user training prior to operation, so a more effective control interface is needed. This paper describes the design and evaluation of a contactless gesture control system based on Leap Motion. The design process uses the RULA human factors analysis tool, and a separate exploratory usability test compares three usability aspects of the developed gesture control system with an off-the-shelf conventional touchscreen teach pendant. The paper focuses on the user-centred design methodology of the gesture control system. The novelties of this research are the use of human factors analysis tools in the human-centred development process and a gesture control design that enables users to control an industrial robot's motion by its joints and its tool centre point position. The system has potential as an input device for industrial robot control in human–robot collaboration scenarios. It targets applications in system recovery and error correction in flexible manufacturing environments shared between humans and robots, and allows operators to control an industrial robot without significant training.
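
The sketch below illustrates the kind of mapping such an interface needs, not the authors' system: a tracked palm displacement is turned into a clamped, dead-zoned incremental jog of the tool centre point, gated by a pinch gesture acting as a dead-man switch. The gains, thresholds, and the send_jog interface are assumptions.

```python
# Sketch: mapping a tracked palm displacement to an incremental TCP jog.
# The dead zone, gain, clamp, and the send_jog() transport (e.g. a vendor
# jogging API or a socket message) are all illustrative assumptions.
import numpy as np

DEAD_ZONE_MM = 10.0     # ignore palm motion smaller than this (tremor filter)
GAIN = 0.2              # mm of TCP motion per mm of palm motion beyond the dead zone
MAX_STEP_MM = 5.0       # clamp each incremental command for safety

def palm_to_tcp_step(palm_delta_mm):
    """Convert a palm displacement vector (mm) into a clamped TCP step (mm)."""
    delta = np.asarray(palm_delta_mm, dtype=float)
    dist = np.linalg.norm(delta)
    if dist < DEAD_ZONE_MM:
        return np.zeros(3)                      # inside the dead zone: no motion
    step = GAIN * (dist - DEAD_ZONE_MM) * (delta / dist)
    return np.clip(step, -MAX_STEP_MM, MAX_STEP_MM)

def jog_cycle(get_palm_delta, hand_is_pinching, send_jog):
    """One control cycle: jog only while the operator holds the 'enable' pinch."""
    if hand_is_pinching():                      # pinch acts as a dead-man switch
        step = palm_to_tcp_step(get_palm_delta())
        if np.any(step):
            send_jog(step)                      # forward the increment to the robot
```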


Author(s):  
Yu-hang LI ◽  
Meng-xing HUANG ◽  
Di WU ◽  
Guan-jun WANG ◽  
Ya-zhou DONG ◽  
...  

2020 ◽  
Vol 9 (2) ◽  
pp. 51
Author(s):  
D Sreeharsha

Robots play a vital part in making our lives easier. The scope of this project is to link human and machine through the interaction of a human hand and a robotic arm. The arm has five degrees of freedom (DOF) and an end effector, which allows interaction with the real world. The need for a suitable controller was met by exploring the Leap Motion sensor. Previously, robotic arms were controlled by a keypad or joystick, which required considerable practice and calculation to move the arm to a desired position. Using the Leap Motion allows the hand gesture to be acquired directly and provides a set of tracked points.
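
A minimal sketch of how such a controller might translate Leap Motion palm data into servo commands for a 5-DOF arm; the workspace ranges, joint assignments, and set_servo_angle interface are assumptions rather than the project's actual code.

```python
# Sketch: mapping a Leap-tracked palm position and grab strength to servo
# angles for a small 5-DOF arm. Workspace ranges, joint limits, and the
# set_servo_angle() interface are illustrative assumptions.
def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly map value from [in_lo, in_hi] to [out_lo, out_hi], clamped."""
    value = min(max(value, in_lo), in_hi)
    return out_lo + (value - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

# Assumed interaction box of the sensor (mm) and 0-180 degree servo ranges.
PALM_X = (-200.0, 200.0)   # left-right
PALM_Y = (100.0, 400.0)    # height above the sensor
PALM_Z = (-150.0, 150.0)   # toward/away from the user

def palm_to_servo_angles(palm_xyz, grab_strength):
    """Map palm position and grab strength (0..1) to joint and gripper angles."""
    x, y, z = palm_xyz
    return {
        "base":     scale(x, *PALM_X, 0.0, 180.0),   # pan follows left-right motion
        "shoulder": scale(y, *PALM_Y, 0.0, 180.0),   # lift follows hand height
        "elbow":    scale(z, *PALM_Z, 0.0, 180.0),   # reach follows depth
        "gripper":  scale(grab_strength, 0.0, 1.0, 0.0, 90.0),  # fist closes gripper
    }

def control_step(palm_xyz, grab_strength, set_servo_angle):
    """Send one frame of commands through a user-supplied set_servo_angle(name, deg)."""
    for joint, angle in palm_to_servo_angles(palm_xyz, grab_strength).items():
        set_servo_angle(joint, angle)
```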

