Mobile based augmented reality for flexible human height estimation using touch and motion gesture interaction

Author(s):  
N A Ismail ◽  
C W Tan ◽  
S E Mohamed ◽  
M S Salam ◽  
F A Ghaleb
2018 ◽ Vol 3 (1) ◽ pp. 16-22
Author(s):  
Julius Cézar Alves de LIMA ◽  
Yane Laiza da Silva OLIVEIRA ◽  
Patricia Moreira RABELLO ◽  
Yuri Wanderley CAVALCANTI ◽  
Bianca Marques SANTIAGO

Author(s):  
Shengzhe Li ◽  
Van Huan Nguyen ◽  
Mingjie Ma ◽  
Cheng-Bin Jin ◽  
Trung Dung Do ◽  
...  

2021
Author(s):  
Apurv Varshney ◽  
Justin Nilsen ◽  
Richa Wadaskar ◽  
Misha Sra

2015 ◽ Vol 78 (2-2)
Author(s):  
Cik Suhaimi Yusof ◽  
Huidong Bai ◽  
Mark Billinghurst ◽  
Mohd Shahrizal Sunar

Interaction for Handheld Augmented Reality (HAR) is a challenging research topic because of the small screen display and limited input options. Although 2D touch-screen input is widely used, 3D gesture interaction has been suggested as an alternative input method. Recent 3D gesture interaction research focuses mainly on using RGB-Depth cameras to detect the spatial position and pose of the fingers and on using this data to manipulate virtual objects in the AR scene. In this paper we review previous 3D gesture research on handheld interaction metaphors for HAR, present its novelties as well as its limitations, and discuss future research directions for 3D gesture interaction in HAR. Our results indicate that 3D gesture input is a promising interaction method for assisting users in many HAR tasks, such as education, urban simulation, and 3D games.
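As a rough illustration of the pipeline this line of research describes, the following minimal sketch (not taken from any of the surveyed papers) back-projects a fingertip pixel found in an RGB-Depth frame into a 3D camera-space point and uses it to drag a virtual object. The pinhole intrinsics, the nearest-pixel fingertip heuristic, and the function names are illustrative assumptions.

```python
# Minimal sketch: map a fingertip detected in a depth frame to a 3D point
# and use that point to manipulate a virtual object. Intrinsics and the
# nearest-pixel heuristic are assumptions, not a real hand tracker.
import numpy as np

FX, FY, CX, CY = 525.0, 525.0, 319.5, 239.5  # assumed pinhole intrinsics

def fingertip_3d(depth_m: np.ndarray) -> np.ndarray:
    """Take the closest valid depth pixel as the fingertip and return its
    3D position (metres) in the camera coordinate frame."""
    valid = depth_m > 0
    if not valid.any():
        raise ValueError("no valid depth samples")
    masked = np.where(valid, depth_m, np.inf)
    v, u = np.unravel_index(np.argmin(masked), masked.shape)
    z = depth_m[v, u]
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return np.array([x, y, z])

def drag_object(obj_pos: np.ndarray, tip: np.ndarray, gain: float = 0.5) -> np.ndarray:
    """Move a virtual object a fraction of the way toward the fingertip."""
    return obj_pos + gain * (tip - obj_pos)

# Example with a synthetic depth frame (a real app would read the sensor).
depth = np.full((480, 640), 1.2)
depth[240, 320] = 0.35           # pretend the fingertip is the nearest point
print(drag_object(np.zeros(3), fingertip_3d(depth)))
```

A real HAR system would replace the nearest-pixel heuristic with a full hand-pose estimator, but the depth-to-3D back-projection step is the part that lets 2D touch-style input become spatial manipulation.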


Author(s):  
Rafael Radkowski ◽  
Christian Stritzke

This paper presents a comparison between 2D and 3D interaction techniques for Augmented Reality (AR) applications. The interaction techniques are based on hand gestures and a computer vision-based hand gesture recognition system. We have compared 2D gestures and 3D gestures for interaction in AR applications. The 3D recognition system is based on a video camera that provides an additional depth image for each 2D color image, so spatial interactions become possible. Our major question during this work was: do depth images and 3D interaction techniques improve the interaction with AR applications, and in particular with virtual 3D objects? To answer it, we tested and compared the hand gesture recognition systems. The results show two things. First, the depth images facilitate more robust hand recognition and gesture identification. Second, they are a strong indication that 3D hand gesture interaction techniques are more intuitive than 2D hand gesture interaction techniques. In summary, the results emphasize that depth images improve hand gesture interaction for AR applications.
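To make the robustness argument concrete, the sketch below (an assumption, not the authors' recognition system) contrasts a colour-only hand mask with one that also uses the depth image. The thresholds and the synthetic frame are illustrative; the point is simply that depth supplies a cue the 2D colour image cannot.

```python
# Minimal sketch: colour-only vs. depth-assisted hand segmentation.
# Thresholds are illustrative assumptions.
import numpy as np

def hand_mask_2d(rgb: np.ndarray) -> np.ndarray:
    """Naive skin-colour threshold on an HxWx3 uint8 image."""
    r, g, b = rgb[..., 0].astype(int), rgb[..., 1].astype(int), rgb[..., 2].astype(int)
    return (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b)

def hand_mask_depth(rgb: np.ndarray, depth_m: np.ndarray,
                    near: float = 0.2, far: float = 0.6) -> np.ndarray:
    """Combine the colour cue with the depth band where the interacting
    hand is expected to be (closer to the camera than the background)."""
    return hand_mask_2d(rgb) & (depth_m > near) & (depth_m < far)

# Synthetic frame: skin-coloured background at 1.5 m, hand patch at 0.4 m.
rgb = np.zeros((480, 640, 3), dtype=np.uint8)
rgb[...] = (150, 90, 70)
depth = np.full((480, 640), 1.5)
depth[200:280, 300:380] = 0.4

print(hand_mask_2d(rgb).sum())            # colour alone matches everything
print(hand_mask_depth(rgb, depth).sum())  # depth band isolates the hand patch
```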


2014 ◽ Vol 34 (1) ◽ pp. 77-80
Author(s):  
Mark Billinghurst ◽  
Tham Piumsomboon ◽  
Huidong Bai

2020 ◽ Vol 2020 ◽ pp. 1-15
Author(s):  
Rahmita Wirza ◽  
Shah Nazir ◽  
Habib Ullah Khan ◽  
Iván García-Magariño ◽  
Rohul Amin

The medical system is undergoing transformation with the growing use of medical information systems, electronic records, and smart, wearable, and handheld devices. The function of the central nervous system is to control the activities of the mind and the human body. Rapid progress in medicine and in computational methods for the central nervous system enables practitioners and researchers to extract and visualize insight from these systems. The function of augmented reality is to combine virtual and real objects interactively, running in real time in a real environment, and its role in the central nervous system is a thought-provoking topic. Gesture interaction-based augmented reality in the central nervous system has enormous potential for reducing the cost of care, improving its quality, and reducing waste and error. To make this process smooth, it is useful to present a comprehensive report of the available state-of-the-art work so that doctors and practitioners can easily use it in the decision-making process. This comprehensive study summarises the published material associated with gesture interaction-based augmented reality approaches in the central nervous system. The research follows a systematic literature review protocol, which systematically collects, analyses, and derives facts from the collected papers. The data cover material published over the past 10 years; 78 papers were selected based on predefined inclusion, exclusion, and quality criteria. The study identifies work on augmented reality in the nervous system, its applications and techniques, and gesture interaction approaches in the nervous system. The findings show a steady year-on-year rise in published articles, and numerous studies relate augmented reality and gesture interaction approaches to different systems of the human body, specifically the nervous system. This research organises and summarises the existing published work related to augmented reality. It will help practitioners and researchers survey most of the existing studies on augmented reality-based gesture interaction approaches for the nervous system, which can eventually support complex anatomy learning in the future.
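The screening step of such a protocol can be pictured with the small sketch below. The criteria, fields, and candidate records are hypothetical stand-ins, not the authors' actual screening data; it only illustrates how inclusion, exclusion, and quality rules are applied to candidate papers.

```python
# Minimal sketch of inclusion/exclusion screening in a systematic literature
# review. Criteria and records are hypothetical examples.
from dataclasses import dataclass

@dataclass
class Paper:
    title: str
    year: int
    peer_reviewed: bool

INCLUDE_KEYWORDS = ("augmented reality", "gesture")   # assumed inclusion terms
YEAR_RANGE = range(2011, 2021)                        # an assumed 10-year window

def passes_screening(p: Paper) -> bool:
    """Keep peer-reviewed papers inside the window that match the topic."""
    topical = any(k in p.title.lower() for k in INCLUDE_KEYWORDS)
    return p.peer_reviewed and p.year in YEAR_RANGE and topical

candidates = [
    Paper("Gesture interaction in AR-guided neurosurgery", 2018, True),
    Paper("Augmented reality anatomy atlas", 2009, True),   # outside window
    Paper("Blog post on VR headsets", 2019, False),          # not peer reviewed
]
selected = [p for p in candidates if passes_screening(p)]
print(len(selected), "paper(s) retained")
```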


2019 ◽ Vol 9 (2)
Author(s):  
Muhammad Nur Affendy Nor'a ◽  
Ajune Wanis Ismail

Applications that adopt a collaborative system allow multiple users to interact with one another in the same virtual space, in either Virtual Reality (VR) or Augmented Reality (AR). This paper aims to integrate the VR and AR spaces in a Collaborative User Interface that enables users to cooperate with one another through different types of interfaces in a single shared space. Gesture interaction is proposed as the interaction technique in both virtual spaces, as it provides more natural interaction with virtual objects. The integration of the VR and AR spaces provides cross-discipline shared data interchange through a client-server network protocol.
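The client-server idea behind such a shared space can be sketched as follows. This is an assumption about the general pattern, not the paper's implementation: AR and VR clients send transform updates for a shared virtual object, and the server rebroadcasts them so both interfaces stay in sync. Callbacks stand in for the network sockets a real system would use.

```python
# Minimal sketch of a shared-space server that syncs object transforms
# between AR and VR clients. Message fields are illustrative assumptions.
import json
from typing import Callable, Dict, List

class SharedSpaceServer:
    def __init__(self) -> None:
        self.clients: List[Callable[[str], None]] = []
        self.state: Dict[str, dict] = {}          # object id -> latest transform

    def connect(self, on_message: Callable[[str], None]) -> None:
        self.clients.append(on_message)

    def handle(self, raw: str) -> None:
        """Apply an update from one client and broadcast it to every client."""
        msg = json.loads(raw)
        self.state[msg["object_id"]] = msg["transform"]
        for client in self.clients:
            client(raw)

server = SharedSpaceServer()
server.connect(lambda m: print("VR client got:", m))
server.connect(lambda m: print("AR client got:", m))

# An AR client moves the shared cube; both interfaces receive the update.
server.handle(json.dumps({
    "object_id": "cube-01",
    "transform": {"position": [0.1, 0.0, 0.5], "rotation": [0, 0, 0, 1]},
}))
```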

