Quality Evaluation of 3D Objects in Mixed Reality For Different Lighting Conditions

2020 ◽  
Vol 2020 (11) ◽  
pp. 128-1-128-7
Author(s):  
Jesús Gutiérrez ◽  
Toinon Vigier ◽  
Patrick Le Callet

This paper presents a study on Quality of Experience (QoE) evaluation of 3D objects in Mixed Reality (MR) scenarios. In particular, a subjective test was performed with Microsoft HoloLens, considering different degradations affecting the geometry and texture of the content. Beyond analyzing the perceptual effects of these artifacts, and given the need for recommendations on subjective assessment of immersive media, this study also aimed at: 1) checking the appropriateness of a single-stimulus methodology (ACR-HR) for these scenarios, where observers have fewer references than with traditional media; 2) analyzing the possible impact of environment lighting conditions on the quality evaluation of 3D objects in MR; and 3) benchmarking state-of-the-art objective metrics in this context. The subjective results provide insights toward recommendations for subjective testing in MR/AR, showing that ACR-HR can be used in similar QoE tests and reflecting the interplay among lighting conditions, content characteristics, and type of degradation. The objective results show an acceptable performance of perceptual metrics for geometry quantization artifacts and point out the need for further research on metrics covering both geometry and texture compression degradations.
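The ACR-HR methodology mentioned above scores each processed stimulus relative to its hidden reference, as defined in ITU-T P.910: the differential viewer score is DV = V(PVS) − V(REF) + 5 on a 5-point scale, and the DMOS is the mean DV over viewers. A minimal sketch of that scoring (function names are illustrative, not from the paper):

```python
def acr_hr_dv(rating_pvs, rating_ref, scale_max=5):
    """Differential viewer (DV) score used in ACR-HR (ITU-T P.910):
    DV = V(PVS) - V(REF) + scale_max, so an unimpaired stimulus scores ~scale_max."""
    return rating_pvs - rating_ref + scale_max

def dmos(ratings):
    """DMOS: mean of per-viewer DV scores for one processed stimulus.
    `ratings` is a list of (pvs_rating, ref_rating) pairs, one pair per viewer."""
    dvs = [acr_hr_dv(p, r) for p, r in ratings]
    return sum(dvs) / len(dvs)

# Two viewers rate a degraded 3D object and its hidden reference
print(dmos([(4, 5), (4, 4)]))  # -> 4.5
```

The "+ 5" offset keeps scores on the familiar 1–5 ACR range even when a viewer rates the hidden reference below the maximum.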

2021 ◽  
Vol 1 ◽  
pp. 2107-2116
Author(s):  
Agnese Brunzini ◽  
Alessandra Papetti ◽  
Michele Germani ◽  
Erica Adrario

Abstract. In the medical education field, the use of highly sophisticated simulators and extended reality (XR) simulations allows training complex procedures and acquiring new knowledge and attitudes. XR is considered useful for the enhancement of healthcare education; however, several issues need further research. The main aim of this study is to define a comprehensive method to design and optimize every kind of simulator and simulation, integrating all the relevant elements concerning scenario design and prototype development. A complete framework for the design of any kind of advanced clinical simulation is proposed, and it has been applied to realize a mixed reality (MR) prototype for the simulation of rachicentesis. The purpose of the MR application is to immerse the trainee in a more realistic environment and to put him/her under pressure during the simulation, as in real practice. The application was tested with two different devices: the Vox Gear Plus smartphone headset and the Microsoft HoloLens. Eighteen sixth-year students of the Medicine and Surgery course were enrolled in the study. Results compare the user experience with the two devices and the simulation performance using the HoloLens.


Sensors ◽  
2021 ◽  
Vol 21 (6) ◽  
pp. 1949
Author(s):  
Lukas Sevcik ◽  
Miroslav Voznak

Video quality evaluation needs a combined approach that includes subjective and objective metrics, testing, and monitoring of the network. This paper deals with a novel approach of mapping quality of service (QoS) to quality of experience (QoE), using QoE metrics to determine user satisfaction limits and applying QoS tools to provide the minimum QoE expected by users. Our aim was to connect objective estimations of video quality with subjective ones. A comprehensive tool for estimating the subjective evaluation is proposed. This new idea is based on evaluating and marking video sequences using a sentinel flag derived from spatial information (SI) and temporal information (TI) in individual video frames. The authors created a video database for quality evaluation and derived SI and TI from each video sequence to classify the scenes. Video scenes from the database were evaluated by objective and subjective assessment. Based on the results, a new model for predicting subjective quality is defined and presented. This quality is predicted using an artificial neural network fed with the objective evaluation and the type of video sequence, defined by qualitative parameters such as resolution, compression standard, and bitstream. Furthermore, the authors created an optimum mapping function to define the threshold for the variable bitrate setting, based on the flag in the video that determines the scene type in the proposed model. This function allows a bitrate to be allocated dynamically for a particular segment of the scene while maintaining the desired quality. The proposed model can help video service providers increase the comfort of end users: the variable bitstream ensures consistent video quality and customer satisfaction while network resources are used effectively. The model can also predict the appropriate bitrate based on the required quality of video sequences, defined using either objective or subjective assessment.


2013 ◽  
Vol 411-414 ◽  
pp. 1362-1367 ◽  
Author(s):  
Qing Lan Wei ◽  
Yuan Zhang

This paper presents an application of the saliency map to a video objective quality evaluation system. It computes SMSE and SPSNR values as objective assessment scores weighted by the saliency map, and compares them with conventional objective evaluation methods such as MSE and PSNR. Experimental results demonstrate that this method fits the subjective assessment results well.
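The abstract does not spell out its exact definitions, but saliency-weighted metrics of this kind commonly weight each pixel's squared error by its saliency value and then reuse the PSNR formula. A sketch under that assumption (plain Python, illustrative names):

```python
import math

def smse(ref, dist, saliency):
    """Saliency-weighted MSE: per-pixel squared errors weighted by
    saliency. All three arguments are flat sequences of equal length."""
    num = sum(w * (r - d) ** 2 for r, d, w in zip(ref, dist, saliency))
    return num / sum(saliency)

def spsnr(ref, dist, saliency, peak=255.0):
    """Saliency-weighted PSNR, by direct analogy with PSNR over MSE."""
    e = smse(ref, dist, saliency)
    return float("inf") if e == 0 else 10.0 * math.log10(peak ** 2 / e)
```

With uniform saliency these reduce to ordinary MSE/PSNR; a zero-saliency pixel contributes nothing, so errors in ignored regions do not lower the score.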


2019 ◽  
Vol 2 ◽  
pp. 1-7
Author(s):  
Mathias Jahnke ◽  
Edyta P. Bogucka ◽  
Maria Turchenko

<p><strong>Abstract.</strong> Mixed reality is a relatively new technology that owes its current success to the availability of devices such as the Microsoft HoloLens, which make the technology accessible to users and developers. Visualization specialists such as cartographers have taken notice, particularly because of the interaction possibilities these devices provide and the wide range of opportunities they open up. Their applicability within the cartographic domain, however, needs further investigation.</p><p>The main goal of this contribution is to evaluate the applicability of a mixed reality device to spatio-temporal representations, using the example of a space-time cube showing cultural landscape changes. The hologram of the space-time cube presents the changes of the Royal Castle in Warsaw and its surrounding elements. It incorporates the different buildings of the castle, space-time prisms, and space-time links that connect building elements over the years. The visual variables colour hue, colour value, and transparency are mainly used to make the space-time prisms distinguishable and to show the space-time links. Different colour schemes were developed to suit the characteristics of a mixed reality device. The possible input actions range from gaze/head movement to gesture and voice.</p><p>The usability evaluation of the mixed reality hologram assessed the overall comfort of interactions and the perception of the visual components of the space-time cube, and determined advantageous features and limitations of the technology. Most of the limitations found are connected to current devices, e.g. resolution or field of view. An important finding is that the experience users have with such devices/technology plays an important role in successfully using, and discovering knowledge from, such applications.</p>


2021 ◽  
Vol 82 (4) ◽  
pp. 186
Author(s):  
Kathleen Phillips ◽  
Valerie A. Lynn ◽  
Amie Yenser ◽  
Christina Wissinger

Current teaching practice in undergraduate higher education anatomy and physiology courses incorporates the use of various instructional methodologies to reinforce the anatomical relationships between structures.1,2 These methods can include basic hands-on physical models, human and animal dissection labs, and interactive technology. Technological advances continue to drive the production of innovative anatomy and physiology electronic tools, including: virtual dissection in 3-D (e.g., Virtual Dissection Boards from Anatomage, 3D4Medical, and Anatomy.TV), augmented reality (AR) (e.g., Human Anatomy Atlas), mixed reality (e.g., Microsoft HoloLens Case Western Reserve Medical School and Cleveland Clinic digital anatomy app), and 3-D virtual reality (VR) (e.g., 3D Organon VR Anatomy and YOU by Sharecare apps).


2021 ◽  
Author(s):  
Lohit Petikam

<p>Art direction is crucial for films and games to maintain a cohesive visual style. This involves carefully controlling visual elements like lighting and colour to unify the director's vision of a story. With today's computer graphics (CG) technology 3D animated films and games have become increasingly photorealistic. Unfortunately, art direction using CG tools remains laborious. Since realistic lighting can go against artistic intentions, art direction is almost impossible to preserve in real-time and interactive applications. New live applications like augmented and mixed reality (AR and MR) now demand automatically art-directed compositing in unpredictably changing real-world lighting. </p> <p>This thesis addresses the problem of dynamically art-directed 3D composition into real scenes. Realism is a basic component of art direction, so we begin by optimising scene geometry capture in realistic composites. We find low perceptual thresholds to retain perceived seamlessness with respect to optimised real-scene fidelity. We then propose new techniques for automatically preserving art-directed appearance and shading for virtual 3D characters. Our methods allow artists to specify their intended appearance for different lighting conditions. Unlike with previous work, artists can direct and animate stylistic edits to automatically adapt to changing real-world environments. We achieve this with a new framework for look development and art direction using a novel latent space of varied lighting conditions. For more dynamic stylised lighting, we also propose a new framework for art-directing stylised shadows using novel parametric shadow editing primitives. This is a first approach that preserves art direction and stylisation under varied lighting in AR/MR.</p>

