Hand Tracking in 3D Space using MediaPipe and PnP Method for Intuitive Control of Virtual Globe

Author(s):  
Vaishnav Chunduru ◽  
Mrinalkanti Roy ◽  
Dasari Romit N. S ◽  
Rajeevlochana G. Chittawadigi
2014 ◽ Vol 214 ◽ pp. 1-10
Author(s):  
Grzegorz Baron ◽  
Piotr Czekalski ◽  
Daniel Malicki ◽  
Krzysztof Tokarz

Modern sensing technologies create new possibilities for controlling mobile robots without any dedicated manipulators. In this article, the authors present a novel method that enables driving the Mindstorms NXT artificial arm with Microsoft Kinect, using gesture recognition and hand tracking. To make the robotic arm imitate human movement, a human-computer interaction algorithm based on skeleton tracking and gesture control in 3D space is employed. The solution is implemented in Java using free, open-source software.
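The abstract does not detail how a tracked hand position is turned into arm joint commands; a minimal sketch of one plausible mapping, using planar two-link inverse kinematics with hypothetical link lengths (not the authors' Java implementation), could look like this:

```python
import math

def two_link_ik(x, y, l1=0.12, l2=0.10):
    """Map a tracked hand position (x, y) in metres to shoulder and
    elbow angles of a planar two-link arm. Link lengths l1 and l2 are
    hypothetical placeholders, not the NXT arm's real dimensions.
    Returns angles in radians, or None if the point is out of reach."""
    r2 = x * x + y * y
    # Law of cosines gives the elbow angle from the target distance.
    c2 = (r2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        return None  # target lies outside the arm's workspace
    elbow = math.acos(c2)
    # Shoulder angle: direction to target minus the elbow's contribution.
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow
```

Each skeleton-tracking update would feed a fresh hand position into such a solver, and the resulting angles would be sent to the arm's motors.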


Author(s):  
S. Chef ◽  
C. T. Chua ◽  
C. L. Gan

Abstract: Limited spatial resolution and low signal-to-noise ratio are among the main challenges in optical signal observation, especially for photon emission microscopy. As dynamic emission signals are generated in 3D space, using the time dimension in addition to space enables better localization of switching events. It can be used to infer information with a precision beyond the resolution limits of the acquired signals. Taking advantage of this property, we report a post-acquisition processing scheme that generates emission images with better resolution than the initial acquisition.
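The core idea, localizing an event more precisely than the acquisition grid by exploiting the time dimension, can be illustrated with an intensity-weighted centroid over a photon-count stack. This is a generic sub-resolution localization sketch, not the authors' actual processing scheme:

```python
import numpy as np

def localize_event(stack):
    """Estimate the (t, y, x) centre of a switching event with
    sub-bin precision from a (T, H, W) photon-count stack, using an
    intensity-weighted centroid over both time and space. Illustrative
    only; the paper's post-acquisition scheme is more elaborate."""
    stack = np.asarray(stack, dtype=float)
    t, y, x = np.indices(stack.shape)
    total = stack.sum()
    return ((stack * t).sum() / total,
            (stack * y).sum() / total,
            (stack * x).sum() / total)
```

Because the centroid is a weighted average, an event whose photons straddle two frames or two pixels is placed between them, i.e. below the nominal resolution of the acquisition.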


2021 ◽  
Author(s):  
Marius Fechter ◽  
Benjamin Schleich ◽  
Sandro Wartzack

Abstract: Virtual and augmented reality allow the use of natural user interfaces, such as realistic finger interaction, even for purposes that were previously dominated by the WIMP paradigm. This form of interaction is particularly suitable for applications involving manipulation tasks in 3D space, such as CAD assembly modeling. The objective of this paper is to evaluate the suitability of natural interaction for CAD assembly modeling in virtual reality. An advantage of natural interaction over conventional operation by computer mouse would indicate development potential for the user interfaces of current CAD applications. Our approach is based on two main elements. First, a novel natural user interface for realistic finger interaction enables the user to interact with virtual objects much as with physical ones. Second, an algorithm automatically detects constraints between CAD components based solely on their geometry and spatial location. To prove the usability of the natural CAD assembly modeling approach in comparison with the assembly procedure in current WIMP-operated CAD software, we present a comparative user study. Results show that the VR method with natural finger interaction significantly outperforms the desktop-based CAD application in terms of efficiency and ease of use.
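Detecting constraints from geometry and spatial location alone can be illustrated by one common case: recognizing a coaxial mating constraint between two cylindrical features whose axes align within tolerance. This is a hypothetical simplification for illustration, not the paper's actual algorithm:

```python
import numpy as np

def detect_coaxial(axis_a, point_a, axis_b, point_b,
                   ang_tol=0.02, dist_tol=1e-3):
    """Detect a coaxial mating constraint between two cylindrical
    features, each given by an axis direction and a point on that axis.
    Tolerances are hypothetical. A simplified sketch of geometry-based
    constraint detection, not the paper's algorithm."""
    a = np.asarray(axis_a, dtype=float)
    b = np.asarray(axis_b, dtype=float)
    a /= np.linalg.norm(a)
    b /= np.linalg.norm(b)
    # Axes must be parallel or anti-parallel within the angular tolerance.
    if 1.0 - abs(a @ b) > ang_tol:
        return False
    # A point on axis B must lie on axis A: measure its perpendicular
    # distance to the line through point_a along a.
    d = np.asarray(point_b, dtype=float) - np.asarray(point_a, dtype=float)
    perp = d - (d @ a) * a
    return bool(np.linalg.norm(perp) <= dist_tol)
```

When the user brings two components close together in VR, such geometric tests over their feature pairs can propose the mating constraints to apply.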

