A Fingertip Gestural User Interface Without Depth Data for Mixed Reality Applications

Author(s): Srinidhi Hegde, Gaurav Garg, Ramakrishna Perla, Ramya Hebbalaguppe

2015 ◽ Vol 15 (1) ◽ pp. 25-34
Author(s): Daniel Fritz, Annette Mossel, Hannes Kaufmann

In mobile applications, it is crucial to provide intuitive means for 2D and 3D interaction. A large number of techniques exist to support a natural user interface (NUI) by detecting the user's hand posture in RGB+D (color plus depth) data. Depending on the given interaction scenario and its environmental properties, each technique has advantages and disadvantages regarding the accuracy and robustness of posture detection. While the interaction environment in a desktop setup can be constrained to meet certain requirements, a handheld scenario must cope with varying environmental conditions. To evaluate the performance of these techniques on a mobile device, a software framework was developed that processes and fuses RGB and depth data directly on a handheld device. Using this framework, five existing hand posture recognition techniques were integrated and systematically evaluated by comparing their accuracy under varying illumination and backgrounds. Overall, the results reveal the best posture recognition rates for combined RGB+D data, at the expense of update rate. To support users in choosing the appropriate technique for their specific mobile interaction task, we derived guidelines from our study. In a final step, an experimental study was conducted in which the detected hand postures were used to perform the canonical 3D interaction tasks of selection and positioning in a handheld mixed reality setup.
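To make the flavor of such RGB+D fusion concrete, the sketch below is a minimal illustration, not the authors' evaluated framework: it combines a skin-color mask from the RGB stream with a near-range depth mask, then counts extended fingers via convexity defects, a common posture heuristic in OpenCV. All thresholds and the depth range are hypothetical tuning values, and aligned RGB/depth frames are assumed.

```python
# Illustrative sketch of RGB+D hand posture detection (not the paper's method).
# Assumes an aligned BGR frame and a single-channel depth frame in millimetres.
import cv2
import numpy as np

def segment_hand(bgr, depth, near_mm=200, far_mm=700):
    """Fuse a rough skin-colour mask (RGB cue) with a depth band (D cue)."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    skin = cv2.inRange(hsv, (0, 40, 60), (25, 255, 255))  # hypothetical skin range
    band = cv2.inRange(depth, near_mm, far_mm)            # hand-distance band
    mask = cv2.bitwise_and(skin, band)                    # RGB+D fusion
    # Remove speckle noise before contour analysis.
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

def count_fingers(mask):
    """Estimate extended fingers from convexity defects of the hand contour."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0
    hand = max(contours, key=cv2.contourArea)
    hull = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull)
    if defects is None:
        return 0
    # Deep defects correspond to gaps between fingers; fingers = gaps + 1.
    gaps = sum(1 for d in defects[:, 0] if d[3] / 256.0 > 20.0)
    return gaps + 1 if gaps else 0
```

A depth-only variant would drop the skin mask (robust to illumination but noisy at range), while an RGB-only variant would drop the depth band (higher update rate but sensitive to background), mirroring the trade-offs the study reports.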


2019 ◽ Vol 53 ◽ pp. 75-92
Author(s): Adolfo Muñoz, Xavier Mahiques, J. Ernesto Solanes, Ana Martí, Luis Gracia, ...

2009 ◽ Vol 42 (22) ◽ pp. 91-96
Author(s): Markus Sauer, Martin Hess, Klaus Schilling

Author(s): Kaj Helin, Jaakko Karjalainen, Paul Kiernan, Mikael Wolff, David Martinez Oliveira

Author(s): Patrick O'Connor, Casey Meekhof, Chad McBride, Christopher Mei, Cyrus Bamji, ...

2018 ◽ Vol 14 (02) ◽ pp. 38
Author(s): Tomas Komenda, Franz Schauer

Our recent research on remote laboratory management systems (REMLABNET, www.remlabnet.eu) addresses questions such as how to strengthen the user experience and how to help users understand the complex phenomena behind remote experiments and the laws of physics governing them. At the current stage of technological development, we have both sufficiently powerful hardware and software to create an impressive virtual user interface that can support this mission. An extended mixed reality taxonomy for remote physical experiments was proposed to identify the goals of future REMLABNET research and development. The first part of this paper describes the classes of this taxonomy and the reasons why they were set up in this way. The second part describes our chosen research method and our current progress.
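The abstract does not enumerate the taxonomy's classes, so the sketch below is only a hypothetical illustration: it encodes Milgram and Kishino's classic reality-virtuality continuum, which extended mixed reality taxonomies typically build on, as one way a remote-lab system such as REMLABNET might tag how an experiment is presented to the user.

```python
# Hypothetical illustration only: these classes come from Milgram and
# Kishino's reality-virtuality continuum, not from the taxonomy proposed
# in this paper, whose classes are not listed in the abstract.
from dataclasses import dataclass
from enum import Enum, auto

class ContinuumClass(Enum):
    REAL_ENVIRONMENT = auto()      # plain camera view of the physical rig
    AUGMENTED_REALITY = auto()     # live view overlaid with measured data
    AUGMENTED_VIRTUALITY = auto()  # virtual scene driven by real sensor data
    VIRTUAL_ENVIRONMENT = auto()   # pure simulation of the experiment

@dataclass
class RemoteExperiment:
    name: str
    presentation: ContinuumClass

# Example: tag a remote experiment with the way its UI presents it.
pendulum = RemoteExperiment("simple pendulum", ContinuumClass.AUGMENTED_VIRTUALITY)
print(f"{pendulum.name} -> {pendulum.presentation.name}")
```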

