Augmented reality using high fidelity spherical panorama with HDRI

Author(s):  
Zi Siang See ◽  
Mark Billinghurst ◽  
Adrian David Cheok

Author(s):  
Damien Constantine Rompapas ◽  
Daniel Flores Quiros ◽  
Charlton Rodda ◽  
Bryan Christopher Brown ◽  
Noah Benjamin Zerkin ◽  
...  

Author(s):  
Damien Rompapas ◽  
Christian Sandor ◽  
Alexander Plopski ◽  
Daniel Saakes ◽  
Dong Hyeok Yun ◽  
...  

2020 ◽  
Vol 12 (21) ◽  
pp. 9262
Author(s):  
Naai-Jung Shih ◽  
Hui-Xu Chen ◽  
Tzu-Yu Chen ◽  
Yi-Ting Qiu

This research aimed to preserve traditional elements and urban fabric through interaction enabled by augmented reality (AR). Cultural elements and fabrics are mutually influential in Lukang, Taiwan. Evolved routes for tourism and religious activities have created characteristic elements and activity-based fabrics and facilities. The sustainable promotion of digital cultural assets started with photogrammetric modeling of alley spaces and shops. AR enabled situated learning of 68 objects, including decorated façades, jar walls, the Lukang Gate, beggar seats, and other creative cultural elements. The heritage assets were promoted through a new and feasible interactive approach that supported cultural sustainability at a remote site. A smartphone-based mobile interface provided settings that were sufficiently flexible and easy to apply. The study presented an effective and efficient remote, situated learning process that correlated the development and setting of both locations. Correlation was achieved with high fidelity of appearance and a flexible transformation interface. An approach that recreated the background and previously reconstructed objects during AR simulation was used to verify the outcome of the situated study, which yielded conflicting qualitative and quantitative findings.
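
The abstract's "flexible transformation interface" amounts to composing a placement transform for each reconstructed object. Below is a minimal sketch of that idea, assuming a simple scale-rotate-translate model matrix; it is a generic illustration, not the authors' application, and the function name and all numeric values are hypothetical.

```python
# Compose scale, rotation, and translation into a single 4x4 model matrix,
# the kind of transform a mobile AR interface applies when placing a
# photogrammetry-reconstructed object into a scene.
import numpy as np

def model_matrix(scale, yaw_deg, translation):
    """Compose scale -> rotation about the vertical axis -> translation."""
    s = np.diag([scale, scale, scale, 1.0])
    a = np.radians(yaw_deg)
    r = np.array([[np.cos(a), 0.0, np.sin(a), 0.0],
                  [0.0,        1.0, 0.0,       0.0],
                  [-np.sin(a), 0.0, np.cos(a), 0.0],
                  [0.0,        0.0, 0.0,       1.0]])
    t = np.eye(4)
    t[:3, 3] = translation
    return t @ r @ s  # applied right-to-left to a column vector

# Hypothetical placement: half scale, rotated 30 degrees, 2 m in front of the user.
M = model_matrix(scale=0.5, yaw_deg=30.0, translation=[0.0, 0.0, -2.0])
vertex = np.array([1.0, 1.0, 0.0, 1.0])  # homogeneous model-space point
print(M @ vertex)                        # world-space position
```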


2019 ◽  
Vol 9 (1) ◽  
pp. 73-109
Author(s):  
Zi Siang See ◽  
Lizbeth Goodman ◽  
Craig Hight ◽  
Mohd Shahrizal Sunar ◽  
Arindam Dey ◽  
...  

Abstract: This research explores the development of a novel method and apparatus for creating spherical panoramas enhanced with high dynamic range (HDR) imaging for high-fidelity Virtual Reality 360-degree (VR360) user experiences. A VR360 interactive panorama presentation using spherical panoramas can provide virtual interactivity and wider viewing coverage; with three degrees of freedom, users can look around in multiple directions within the VR360 experience, gaining a sense of being in control of their own engagement. This freedom is facilitated by the use of mobile displays or head-mounted devices. However, in terms of image reproduction, capturing the full exposure range of a high-contrast real-world scene can be a major difficulty. Imaging variables caused by difficulties and obstacles can arise during the production of HDR-enhanced spherical panoramas. This may result in inaccurate image reproduction for location-based subjects, which in turn results in a poor VR360 user experience. In this article we describe an HDR spherical panorama reproduction approach (workflow and best practice) which can shorten the production process and reduce imaging variables, technical obstacles, and issues to a minimum. This leads to improved photographic image reproduction with fewer visual abnormalities for VR360 experiences, and it can be adapted to a wide range of interactive design applications. We describe the process in detail and also report on a user study showing that viewers prefer, on the whole, images created with the proposed approach to those created using more complicated HDR methods, or to those created without HDR at all.
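
As context for the HDR step in such a workflow, the sketch below shows a generic exposure-bracket merge and tone-mapping pass with OpenCV, the kind of per-view processing that typically precedes stitching into a spherical (equirectangular) panorama. It is not the authors' specific apparatus or workflow; the file names and exposure times are hypothetical.

```python
# Merge bracketed exposures of one camera direction into an HDR radiance map,
# then tone-map it to a displayable 8-bit image for later panorama stitching.
import cv2
import numpy as np

# Bracketed shots of the same view (hypothetical file names and shutter times).
files = ["view_ev-2.jpg", "view_ev0.jpg", "view_ev+2.jpg"]
exposure_times = np.array([1 / 250.0, 1 / 60.0, 1 / 15.0], dtype=np.float32)
images = [cv2.imread(f) for f in files]

# Recover the camera response curve, then merge into an HDR radiance map.
calibrate = cv2.createCalibrateDebevec()
response = calibrate.process(images, exposure_times)
merge = cv2.createMergeDebevec()
hdr = merge.process(images, exposure_times, response)

# Tone-map to 8-bit; this per-view result would then be fed to a stitcher.
tonemap = cv2.createTonemapReinhard(gamma=2.2)
ldr = tonemap.process(hdr)
cv2.imwrite("view_hdr_tonemapped.jpg",
            np.clip(ldr * 255, 0, 255).astype("uint8"))
```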


MedEdPublish ◽  
2017 ◽  
Vol 6 (4) ◽  
Author(s):  
Oliver David Thompson ◽  
Rhydian Harris ◽  
Tim Godfrey

Author(s):  
Rafael Radkowski

The paper introduces a method for an augmented reality (AR) assembly assistance application that quantifies the alignment of two parts. Point-cloud-based tracking is one method for recognizing and tracking physical parts. However, the correct fitting of two parts cannot be determined with high fidelity from point cloud tracking data, due to occlusion and other challenges. A maximum likelihood estimate (MLE) of an error model is suggested to quantify the probability that two parts are correctly aligned. An initial solution was investigated. The results of an offline simulation with point cloud data are promising and indicate the efficacy of the suggested method.
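
To illustrate the general idea of an MLE-based alignment check, the sketch below models registration residuals as zero-mean Gaussian noise, fits the noise scale by maximum likelihood from example data, and converts a likelihood comparison into a probability of correct alignment. This is a generic illustration under those assumptions, not Radkowski's exact error model; the function names, sigma values, and simulated data are hypothetical.

```python
# Gaussian error model over point cloud registration residuals:
# fit the noise scale by MLE, then compare "aligned" vs. "misaligned" models.
import numpy as np
from scipy.stats import norm

def fit_sigma_mle(residuals):
    """MLE of the scale of a zero-mean Gaussian: the RMS residual."""
    r = np.asarray(residuals, dtype=float)
    return np.sqrt(np.mean(np.square(r)))

def alignment_probability(residuals, sigma_aligned, sigma_misaligned):
    """Posterior probability (equal priors) that the parts are aligned."""
    r = np.asarray(residuals, dtype=float)
    ll_aligned = norm.logpdf(r, loc=0.0, scale=sigma_aligned).sum()
    ll_misaligned = norm.logpdf(r, loc=0.0, scale=sigma_misaligned).sum()
    m = max(ll_aligned, ll_misaligned)
    pa, pm = np.exp(ll_aligned - m), np.exp(ll_misaligned - m)
    return pa / (pa + pm)

# Hypothetical training residuals (mm) from correctly and incorrectly
# assembled examples, used to fit the two error models.
rng = np.random.default_rng(0)
sigma_aligned = fit_sigma_mle(rng.normal(0.0, 0.5, 1000))
sigma_misaligned = fit_sigma_mle(rng.normal(0.0, 4.0, 1000))

# New observation from the tracked assembly step (simulated here).
observed = rng.normal(0.0, 0.6, 300)
print(f"P(aligned) = "
      f"{alignment_probability(observed, sigma_aligned, sigma_misaligned):.3f}")
```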

