Virtual Object Manipulation by Combining Touch and Head Interactions for Mobile Augmented Reality

2019, Vol 9 (14), pp. 2933
Author(s): Ju Young Oh, Ji Hyung Park, Jung-Min Park

This paper proposes an interaction method for conveniently manipulating a virtual object by combining touch interaction and head movements for a head-mounted display (HMD) that provides mobile augmented reality (AR). A user can manipulate a virtual object with touch interaction, recognized from an inertial measurement unit (IMU) attached to the index finger’s nail, and head movements, tracked by the IMU embedded in the HMD. We design two interactions that combine touch and head movements to manipulate a virtual object on a mobile HMD. Each designed interaction method manipulates virtual objects by controlling ray casting and adjusting widgets. To evaluate the usability of the designed interaction methods, a user evaluation is performed in comparison with the hand interaction of the HoloLens. The designed interaction methods received positive feedback, indicating that virtual objects can be manipulated easily in a mobile AR environment.
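As a rough sketch of how such a combination can work, the fragment below pairs a head-pose-driven selection ray with tap events from a finger-worn IMU: a tap grabs the object hit by the head ray, head movement then drags it, and a second tap releases it. This is a generic illustration under assumed interfaces (head pose as a position plus rotation matrix, a boolean tap signal, bounding-sphere picking), not the paper's implementation.

```python
# Minimal sketch: head-gaze ray casting combined with touch taps from a finger IMU.
# All names and conventions here are illustrative assumptions.
import numpy as np

def head_ray(head_position, head_rotation):
    """Cast a ray from the head pose; head_rotation is a 3x3 rotation matrix."""
    forward = head_rotation @ np.array([0.0, 0.0, 1.0])   # assumed HMD forward axis
    return head_position, forward / np.linalg.norm(forward)

def intersect_sphere(origin, direction, center, radius):
    """Ray/bounding-sphere test used to pick the virtual object under the ray."""
    oc = center - origin
    t = np.dot(oc, direction)
    closest = origin + t * direction
    return t > 0 and np.linalg.norm(center - closest) <= radius

class TouchHeadManipulator:
    def __init__(self, objects):
        self.objects = objects          # list of dicts: {"center": ndarray, "radius": float}
        self.selected = None

    def update(self, head_position, head_rotation, tap_detected):
        origin, direction = head_ray(head_position, head_rotation)
        if tap_detected and self.selected is None:
            # A tap from the finger-worn IMU grabs the object the head ray hits.
            for obj in self.objects:
                if intersect_sphere(origin, direction, obj["center"], obj["radius"]):
                    self.selected = obj
                    break
        elif self.selected is not None:
            if tap_detected:
                self.selected = None    # a second tap releases the object
            else:
                # While grabbed, the object follows the head ray at its current distance.
                distance = np.linalg.norm(self.selected["center"] - origin)
                self.selected["center"] = origin + direction * distance
```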

2019, Vol 9 (15), pp. 3078
Author(s): Hyocheol Ro, Jung-Hyun Byun, Yoon Jung Park, Nam Kyu Lee, Tack-Don Han

In this paper, we propose AR Pointer, a new augmented reality (AR) interface that allows users to manipulate three-dimensional (3D) virtual objects in an AR environment. AR Pointer uses the built-in 6-degrees-of-freedom (DoF) inertial measurement unit (IMU) sensor of an off-the-shelf mobile device to cast a virtual ray that is used to accurately select objects. It also uses simple touch gestures, commonly used on smartphones, for 3D object manipulation, so users can easily manipulate 3D virtual objects with AR Pointer without a long training period. To demonstrate the usefulness of AR Pointer, we introduce two use cases, AR furniture layout and AR education. We then conducted two experiments, performance tests and usability tests, to demonstrate the effectiveness of the designed interaction methods using AR Pointer. We found that AR Pointer is more efficient than other interfaces, achieving a 39.4% faster task completion time in object manipulation. In addition, the participants gave the AR Pointer an average of 8.61 points (13.4%) in the usability test conducted with the system usability scale (SUS) questionnaire and 8.51 points (15.1%) in the fatigue test conducted with the NASA task load index (NASA-TLX) questionnaire. Previous AR applications have been implemented in a passive AR environment where users simply pop up and view AR objects that are prepared in advance. However, if AR Pointer is used for AR object manipulation, it is possible to provide an immersive AR environment for users who wish to actively interact with AR objects.
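A minimal sketch of the core idea described above might map the device's IMU orientation to a pointing ray and route familiar smartphone gestures to manipulations of the picked object. The quaternion convention, gesture dictionary, and scaling factors below are illustrative assumptions, not the authors' API.

```python
# Sketch: turn a device orientation (unit quaternion) into a pointing ray and map
# common touch gestures onto 3D manipulations. Names and factors are assumptions.
import numpy as np

def quat_to_forward(q):
    """Rotate the assumed device forward axis (0, 0, -1) by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    # Negated third column of the rotation matrix, i.e. R @ [0, 0, -1].
    fx = -(2 * (x * z + w * y))
    fy = -(2 * (y * z - w * x))
    fz = -(1 - 2 * (x * x + y * y))
    return np.array([fx, fy, fz])

def apply_gesture(obj, gesture):
    """Map common smartphone gestures to 3D manipulations of the picked object."""
    if gesture["type"] == "drag":          # one-finger drag: translate in the view plane
        obj["position"] += np.array([gesture["dx"], gesture["dy"], 0.0]) * 0.001  # assumed px-to-m factor
    elif gesture["type"] == "pinch":       # pinch: uniform scale
        obj["scale"] *= gesture["factor"]
    elif gesture["type"] == "twist":       # two-finger rotate: yaw about the vertical axis
        obj["yaw"] += gesture["angle"]
    return obj
```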


Author(s): Kevin Lesniak, Conrad S. Tucker

The method presented in this work reduces the frequency of virtual objects incorrectly occluding real-world objects in Augmented Reality (AR) applications. Current AR rendering methods cannot properly represent occlusion between real and virtual objects because the objects are not represented in a common coordinate system. These occlusion errors can give users an incorrect perception of the environment around them when using an AR application: a real-world object may go unnoticed because a virtual object incorrectly occludes it, and depth or distance may be misjudged because of incorrect occlusions. The authors of this paper present a method that brings both real-world and virtual objects into a common coordinate system so that distant virtual objects do not obscure nearby real-world objects in an AR application. The method captures and processes RGB-D data in real time, allowing it to be used in a variety of environments and scenarios. A case study shows the effectiveness and usability of the proposed method in correctly occluding real-world and virtual objects and providing a more realistic representation of the combined real and virtual environments in an AR application. The results of the case study show that the proposed method can detect at least 20 real-world objects with the potential to be incorrectly occluded while processing and fixing occlusion errors at least 5 times per second.
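Once real and virtual content share a camera coordinate system, the occlusion decision reduces to a per-pixel depth comparison. The sketch below is a simplified, generic illustration of that step (it assumes an aligned real depth map and a virtual depth buffer), not the authors' pipeline.

```python
# Sketch: composite a virtual rendering over an RGB-D frame with depth-correct occlusion.
import numpy as np

def composite_with_occlusion(real_rgb, real_depth, virtual_rgb, virtual_depth):
    """
    real_rgb:      HxWx3 camera image
    real_depth:    HxW depth map from the RGB-D sensor (meters)
    virtual_rgb:   HxWx3 rendering of the virtual objects
    virtual_depth: HxW depth buffer of that rendering (inf where no object was drawn)
    """
    # A virtual pixel is kept only if something was rendered there AND it lies
    # nearer to the camera than the measured real-world surface.
    virtual_visible = np.isfinite(virtual_depth) & (virtual_depth < real_depth)
    output = real_rgb.copy()
    output[virtual_visible] = virtual_rgb[virtual_visible]
    return output
```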


2015, Vol 1 (2), pp. 306
Author(s): Hoger Mahmud Hussen

This paper presents the outcome of a project that aims to modify and improve one of the most widely used augmented reality tools. Augmented reality (AR) is a fast-growing area of virtual reality research in which the user’s view of the real world is augmented with additional information from a computer model. ARToolKit is one of the most widely used toolkits for AR applications; it tracks optical markers and overlays virtual objects on them. In the current version of the toolkit, the overlaid object is stationary or loops regardless of the optical target’s position, meaning that the object cannot be animated or changed based on the movement of the optical target. To improve the toolkit, a design solution was developed and implemented so that users can manipulate the position of the overlaid virtual object through movements of the optical target. The solution establishes a mathematically based link between the position of the optical target and the overlaid virtual object. Test cases were developed to evaluate the solution, and the results show that it is effective and that the underlying idea can be used to develop applications in sectors such as education and health.
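The principle of a mathematical link between the optical target and the overlaid object can be sketched as recomputing the object's pose each frame from the tracked marker transform, so that moving the target animates the object. The 4x4 matrix convention mirrors ARToolKit's marker transform, but the specific link below is an illustrative assumption, not the project's source code.

```python
# Sketch: derive the overlaid object's pose from the marker's tracked transform each frame.
import numpy as np

def object_pose_from_marker(marker_transform):
    """
    marker_transform: 4x4 camera-space pose of the tracked optical marker.
    Returns a 4x4 pose for the overlaid object that depends on the marker's motion.
    """
    # Example link (assumption): the object's height above the marker grows with the
    # marker's distance from the camera, so sliding the target animates the model.
    marker_position = marker_transform[:3, 3]
    distance = np.linalg.norm(marker_position)
    offset = np.eye(4)
    offset[1, 3] = 0.1 * distance            # lift the object along the marker's Y axis
    return marker_transform @ offset
```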


2019, Vol 9 (9), pp. 1797
Author(s): Chen, Lin

Augmented reality (AR) is an emerging technology that allows users to interact with simulated environments, including those emulating scenes in the real world. Most current AR technologies involve the placement of virtual objects within these scenes. However, difficulties in modeling real-world objects greatly limit the scope of the simulation, and thus the depth of the user experience. In this study, we developed a process by which to realize virtual environments that are based entirely on scenes in the real world. In modeling the real world, the proposed scheme divides scenes into discrete objects, which are then replaced with virtual objects. This enables users to interact in and with virtual environments without limitations. An RGB-D camera is used in conjunction with simultaneous localization and mapping (SLAM) to obtain the movement trajectory of the user and derive information about the real environment. In modeling the environment, graph-based segmentation is used to segment point clouds and perform object segmentation to enable the subsequent replacement of objects with equivalent virtual entities. Superquadrics are used to derive shape parameters and location information from the segmentation results in order to ensure that the scale of the virtual objects matches that of the original objects in the real world. Only after the objects have been replaced with their virtual counterparts is the real environment converted into a virtual scene. Experiments involving the emulation of real-world locations demonstrated the feasibility of the proposed rendering scheme. Finally, a rock-climbing application scenario is presented to illustrate the potential use of the proposed system in AR applications.
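For the shape-recovery step, a superquadric can be fitted to each segmented point cluster by minimizing the deviation of the points from the surface defined by the inside-outside function F = 1; the recovered size parameters then give the scale of the replacement virtual object. The sketch below is a generic least-squares illustration (it assumes the segment has been centered and aligned to its principal axes), not the paper's exact fitting procedure.

```python
# Sketch: fit superquadric size (a1, a2, a3) and shape (e1, e2) parameters to a point segment.
import numpy as np
from scipy.optimize import least_squares

def superquadric_F(points, a1, a2, a3, e1, e2):
    """Inside-outside function; F == 1 on the superquadric surface."""
    x, y, z = np.abs(points).T + 1e-9       # small offset avoids numerical issues at zero
    return ((x / a1) ** (2 / e2) + (y / a2) ** (2 / e2)) ** (e2 / e1) \
        + (z / a3) ** (2 / e1)

def fit_superquadric(points):
    """points: Nx3 array, assumed centered and aligned to the object's principal axes."""
    def residuals(params):
        a1, a2, a3, e1, e2 = params
        return superquadric_F(points, a1, a2, a3, e1, e2) - 1.0
    extent = points.max(axis=0) - points.min(axis=0)
    x0 = [extent[0] / 2, extent[1] / 2, extent[2] / 2, 1.0, 1.0]
    bounds = ([1e-3] * 3 + [0.1, 0.1], [np.inf] * 3 + [2.0, 2.0])
    result = least_squares(residuals, x0, bounds=bounds)
    return result.x    # a1, a2, a3 give the scale for the replacement virtual object
```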


2012, Vol 24 (05), pp. 435-445
Author(s): Ren-Guey Lee, Sheng-Chung Tien, Chun-Chang Chen, Yu-Ying Chen

In this paper, rehabilitation tools are proposed and implemented to assist patients with stroke and body dysfunction through auxiliary physical activity. By integrating the entertainment of games with the needs of rehabilitation, and using the motor assessment scale (MAS) as the building blocks, we propose a game system developed for the assessment of stroke rehabilitation using augmented reality (AR) technology. By applying AR markers and using the related parameters of Wii Remotes, various assessment games have been implemented, and vivid pictures can be presented to users via a head-mounted display through the seamless combination of the real environment and virtual objects. This game system takes various assessment scales into consideration, and each scale is specifically designed and individually integrated to enable the assessment of the relevant motor functions. According to the experimental results, the accuracy rate of the users in successfully following the game steps is 91.2%, and the accuracy rate of the system in assessing the MAS categories is as high as 94.6%, which confirms the feasibility of our proposed and implemented rehabilitation game system.
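Purely as an illustration of the kind of check such a game can perform, the sketch below reduces accelerometer samples from a Wii Remote to a tilt-based range-of-motion estimate and compares it with a per-MAS-item threshold to decide whether a game step was completed. The tilt approximation and threshold interface are assumptions, not the authors' assessment logic.

```python
# Sketch: decide whether a rehabilitation game step was completed from accelerometer data.
import numpy as np

def tilt_angles(accel_samples):
    """Approximate limb tilt (degrees) from the gravity direction in each 3-axis sample."""
    ax, ay, az = np.asarray(accel_samples, dtype=float).T
    return np.degrees(np.arctan2(np.hypot(ax, ay), az))

def step_completed(accel_samples, required_range_deg):
    """A game step counts as completed if the tilt range of motion meets the target (assumed threshold)."""
    angles = tilt_angles(accel_samples)
    return (angles.max() - angles.min()) >= required_range_deg
```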


Author(s): Rafael Radkowski, Helene Waßmann

Sophisticated vehicle ergonomics are a relevant factor in the success of a new vehicle model. To support the evaluation of vehicle ergonomics, we have developed a mobile Augmented Reality (AR) testing platform in cooperation with Volkswagen Commercial Vehicles. The mobile AR-based testing platform consists of a Volkswagen Multivan whose roof, pillars, and dashboard have been removed. The missing parts are replaced by new virtual parts. A head-mounted display presents the virtual parts to the user of the mobile AR testing platform, thereby enabling visibility analyses. The user can also interact with the virtual components of the vehicle to perform reachability analyses. To support these analyses, the user’s hands are recognized by the system. In this paper, we introduce a method for tracking the user’s hands in a mobile AR testing platform. Using image processing, the user’s hands are detected by checking for skin color or the color of a glove worn by the user. From the image processing data, we compute the position of the user’s real hand and detect potential collisions between the hand and virtual vehicle components. We also use the results of the collision detection for interaction with the virtual objects.
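The color-based hand detection step can be sketched with a standard HSV threshold in OpenCV, as below. The threshold values are placeholders that would be calibrated to the user's skin tone or glove color, and the returned centroid stands in for the hand position later used in the collision tests; this is not the authors' exact implementation.

```python
# Sketch: detect the largest skin/glove-colored region in a frame and return its centroid.
import cv2
import numpy as np

def detect_hand_centroid(bgr_frame, lower_hsv=(0, 40, 60), upper_hsv=(25, 180, 255)):
    """Return the (x, y) pixel centroid of the largest matching region, or None."""
    hsv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lower_hsv, dtype=np.uint8), np.array(upper_hsv, dtype=np.uint8))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))  # remove speckle noise
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)
    m = cv2.moments(hand)
    if m["m00"] == 0:
        return None
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
```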


2018
Author(s): Jacob W. Greene

This lesson serves as an introduction to creating mobile augmented reality applications. Augmented reality (AR) can be defined as the overlaying of digital content (images, video, text, sound, etc.) onto physical objects or locations, and it is typically experienced by looking through the camera lens of an electronic device such as a smartphone, tablet, or optical head-mounted display.

