Modelling and Visualizing Holographic 3D Geographical Scenes with Timely Data Based on the HoloLens

2019 · Vol 8 (12) · pp. 539
Author(s): Wei Wang, Xingxing Wu, An He, Zeqiang Chen

Commonly, a three-dimensional (3D) geographic information system (GIS) is based on a two-dimensional (2D) visualization platform, which hinders the understanding and expression of the real world in 3D space and further limits user cognition of 3D geographic information. Mixed reality (MR) adopts 3D display technology, enabling users to recognize and understand a computer-generated world through 3D glasses rather than being restricted to the perspective of a 2D screen, and it has broad application prospects. However, a gap remains in dynamically modelling and visualizing a holographic 3D geographical scene with GIS data/information under the development mechanism of a mixed reality system (e.g., the Microsoft HoloLens). This paper proposes a design architecture (HoloDym3DGeoSce) to model and visualize holographic 3D geographical scenes with timely data based on mixed reality technology and the Microsoft HoloLens. HoloDym3DGeoSce includes two modules: 3D geographic scene modelling with timely data, and HoloDym3DGeoSce interactive design. The modelling module dynamically creates 3D geographic scenes based on Web services, providing materials and content for the HoloDym3DGeoSce system. The interaction module includes two methods: human–computer physical interaction and human–computer virtual–real interaction. The former provides an interface for users to interact with virtual geographic scenes; the latter maps virtual geographic scenes to physical space to achieve virtual–real fusion. Following the proposed architecture, OpenStreetMap data and the BingMap Server are used as experimental data to apply mixed reality technology to the modelling, rendering, and interaction of 3D geographic scenes, providing users with a stronger, more realistic 3D geographic information experience and more natural human–computer GIS interactions. The experimental results demonstrate the feasibility and practicability of the scheme, which has good prospects for further development.
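The Web-service-driven scene modelling the abstract describes can be illustrated with a minimal sketch that extrudes a 2D building footprint (of the kind found in OpenStreetMap data) into a wall mesh ready for a 3D engine. The `extrude_footprint` helper and its data layout are illustrative assumptions, not the paper's HoloDym3DGeoSce implementation.

```python
def extrude_footprint(footprint, height):
    """Extrude a 2D building footprint (list of (x, y) vertices,
    counter-clockwise, unclosed) into a simple wall mesh.

    Returns (vertices, triangles): vertices are (x, y, z) tuples and
    triangles are index triples into the vertex list. Roof and floor
    capping are omitted to keep the sketch short.
    """
    n = len(footprint)
    # Bottom ring at z = 0, top ring at z = height.
    vertices = [(x, y, 0.0) for x, y in footprint] + \
               [(x, y, height) for x, y in footprint]
    triangles = []
    for i in range(n):
        j = (i + 1) % n
        # Each wall segment becomes a quad: split it into two triangles.
        triangles.append((i, j, n + j))
        triangles.append((i, n + j, n + i))
    return vertices, triangles
```

A square footprint of four vertices yields eight mesh vertices and eight wall triangles; in a real pipeline the footprint would come from an OSM building way and the mesh would be handed to the rendering engine.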

Author(s): Igor Ivkovic, Sage Franch

Abstract – Augmented reality (AR) technology facilitates augmentation of current views with digital artifacts, such as information, three-dimensional objects, audio, and video. Mixed reality (MR) represents an enhanced version of AR, where advanced spatial mapping is used to anchor digital artifacts in physical space. Using MR technology, digital artifacts can be more closely integrated into the natural environment, thereby transcending physical limitations and creating enhanced blended learning environments. In this paper, we propose an approach for the integration of MR technology into engineering education. Specifically, we propose to integrate the Microsoft HoloLens into a first-year course on data structures and algorithms to improve student engagement and learning outcomes. In the pilot study, students were assigned to implement the A* algorithm and then given a chance to visualize their implementations using the Microsoft HoloLens. The feedback provided by students indicated increased engagement and interest in graph-based path-finding algorithms as well as MR technology.
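The path-finding exercise described above can be sketched in Python. The grid encoding, function name, and Manhattan heuristic here are illustrative assumptions, not the course's actual assignment code.

```python
import heapq
import itertools

def a_star(grid, start, goal):
    """A* search over a 2D grid of 0 (free) / 1 (blocked) cells.

    Cells are (row, col) tuples; movement is 4-connected with unit
    cost, so Manhattan distance is an admissible heuristic.
    Returns the path as a list of cells, or None if unreachable.
    """
    def h(cell):
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    tie = itertools.count()  # tiebreaker so the heap never compares parents
    open_heap = [(h(start), next(tie), start, None)]
    came_from = {}           # cell -> parent, also marks cells as settled
    g_score = {start: 0}
    while open_heap:
        _, _, current, parent = heapq.heappop(open_heap)
        if current in came_from:
            continue         # stale entry; cell already settled
        came_from[current] = parent
        if current == goal:
            # Reconstruct the path by walking parents back to start.
            path = []
            while current is not None:
                path.append(current)
                current = came_from[current]
            return path[::-1]
        r, c = current
        for nb in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nb
            if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) and grid[nr][nc] == 0:
                ng = g_score[current] + 1
                if ng < g_score.get(nb, float("inf")):
                    g_score[nb] = ng
                    heapq.heappush(open_heap, (ng + h(nb), next(tie), nb, current))
    return None
```

In the pilot study the interesting part would be the visualization layer on the HoloLens; this sketch covers only the algorithmic core the students implemented.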


2021 · Vol 2
Author(s): Richard Skarbez, Missie Smith, Mary C. Whitton

Since its introduction in 1994, Milgram and Kishino's reality-virtuality (RV) continuum has been used to frame virtual and augmented reality research and development. While the RV continuum and the three dimensions of the supporting taxonomy (extent of world knowledge, reproduction fidelity, and extent of presence metaphor) were originally intended to characterize the capabilities of visual display technology, researchers have embraced the RV continuum while largely ignoring the taxonomy. Considering the leaps in technology made over the last 25 years, revisiting the RV continuum and taxonomy is timely. In reexamining Milgram and Kishino's ideas, we realized, first, that the RV continuum is actually discontinuous: perfect virtual reality cannot be reached. Second, mixed reality is broader than previously believed and, in fact, encompasses conventional virtual reality experiences. Finally, our revised taxonomy adds coherence, accounting for the role of users, which is critical to assessing modern mixed reality experiences. The 3D space created by our taxonomy incorporates familiar constructs such as presence and immersion, and also proposes new constructs that may become important as mixed reality technology matures.


2021 · Vol 42 (Supplement_1)
Author(s): A Malaweera, R Jogi, M Wright, M O'Neill, S Williams

Abstract Introduction Three-dimensional (3D) electroanatomical maps (EAMs) created during electrophysiology procedures are traditionally displayed on 2D monitors connected to mapping systems. This has limitations, such as the lack of interaction with EAMs, the need for another user to control them, and the size of the displayed EAM, which is limited by the resolution of these monitors. To overcome these, we created a novel technology to display EAMs on a mixed reality (MR) platform. Methods We used the Microsoft HoloLens to create this MR platform. Studies from patients who had already undergone catheter ablation for atrial fibrillation, in which EAMs of the left atria had been generated using different mapping systems (CARTO®, Rhythmia™ and EnSite Precision™), were utilised. These EAMs, consisting of 3D coordinates and annotations (e.g., voltage and activation times), were exported from the mapping systems, then compiled and transferred to the HoloLens using custom-developed functions in Unity, Microsoft C#, and Visual Studio. Subsequently, feedback on this technology was obtained from three independent electrophysiologists. Results We successfully exported the EAMs generated on the CARTO®, Rhythmia™ and EnSite Precision™ mapping systems as holograms on to the HoloLens (Figure). Positive feedback included themes such as: 1) the ability to use hand gestures and voice commands to interact with EAMs independently of another user, unlike traditional cardiac mapping systems; 2) an interactive 3D holographic experience that preserves the operator's physical interaction in the cardiac catheter lab; and 3) the capacity to better appreciate the 3D geometry of EAMs in comparison to 2D monitors. The challenge of wearing a headset during long procedures was perceived as a disadvantage. Conclusion This technology, which can be used with any mapping system, is currently optimised for offline display. Our software will be made available as an open-source teaching and simulation tool. Users will be able to explore EAMs for research, planning complex cases, and immersive learning. Future directions include extending this toolkit to real-time cardiac mapping with catheter localisation, and it could potentially be translated to other cardiac imaging modalities. Funding Acknowledgement Type of funding sources: Public hospital(s). Main funding source(s): Cardiovascular diseases charitable fund (CDCF) at Guy's and St Thomas' NHS Foundation Trust. Figure: process of creating holograms of EAMs; voltage map of the left atrium as a hologram.
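The export step above can be sketched as a small parser that reads point coordinates plus a bipolar voltage and tags each point for colouring. The CSV layout, field names, and the 0.5/1.5 mV thresholds below are illustrative assumptions, not the export format of CARTO, Rhythmia, or EnSite Precision.

```python
import csv
import io

# A common clinical convention (assumed here, not vendor-specified):
# bipolar voltage < 0.5 mV is often treated as scar, 0.5–1.5 mV as
# border zone, and > 1.5 mV as healthy tissue.
SCAR_MV, BORDER_MV = 0.5, 1.5

def load_eam_points(csv_text):
    """Parse an exported EAM point list (x, y, z coordinates plus a
    bipolar voltage in mV) and tag each point with a colour label."""
    points = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        v = float(row["voltage_mv"])
        if v < SCAR_MV:
            label = "scar"
        elif v < BORDER_MV:
            label = "border"
        else:
            label = "healthy"
        points.append({
            "xyz": (float(row["x"]), float(row["y"]), float(row["z"])),
            "voltage_mv": v,
            "label": label,
        })
    return points
```

In a pipeline like the one described, the labelled points would then be built into a coloured mesh and serialised for the HoloLens application.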


Author(s): Heather Raikes

Corpus Corvus is a mixed reality performance artwork that utilizes stereoscopic projection, motion capture animation, an integrated physical/media choreographic vocabulary, and electroacoustic composition to explore the Pacific Northwest Native American myth of the raven as god and thief who steals the sun and creates the universe. Formally, the work explores the relationship between movement of a physical body and stereoscopic animation in a physical/digital three-dimensional image field. The animation is generated from motion capture data and kinesthetic media composition processes based on physical choreography. Through precise temporal alignment and stereoscopic theatrical effect, the projected animation is perceived to surround the performing body in physical space. The art/research process contextualizing Corpus Corvus is a practice-based exploration and discovery of an emerging poetics that extends the human sensory system into immersive media perceptual hyperspaces. This paper illuminates the process of research, manifestation, and discovery that informs the artwork and its poetics.


Micromachines · 2021 · Vol 12 (4) · pp. 444
Author(s): Guoning Si, Liangying Sun, Zhuo Zhang, Xuping Zhang

This paper presents the design, fabrication, and testing of a novel three-dimensional (3D) three-fingered electrothermal microgripper with multiple degrees of freedom (multi-DOFs). Each finger of the microgripper is composed of a V-shaped electrothermal actuator providing one DOF, and a 3D U-shaped electrothermal actuator offering two DOFs in the plane perpendicular to the movement of the V-shaped actuator. As a result, each finger possesses 3D mobility with three DOFs. Each beam of the actuators is heated externally with a polyimide film. The durability of the polyimide film is tested under different voltages, and the static and dynamic properties of the fingers are also tested. Experiments show that the microgripper can not only pick and place micro-objects, such as micro balls and even highly deformable zebrafish embryos, but also rotate them in 3D space.


2021 · Vol 11 (1)
Author(s): Hossein Eskandari, Juan Luis Albadalejo-Lijarcio, Oskar Zetterstrom, Tomáš Tyc, Oscar Quevedo-Teruel

Abstract Conformal transformation optics is employed to enhance an H-plane horn’s directivity by designing a graded-index all-dielectric lens. The transformation is applied so that the phase error at the aperture is gradually eliminated inside the lens, leading to a low-profile high-gain lens antenna. The physical space shape is modified such that singular index values are avoided, and the optical path inside the lens is rescaled to eliminate superluminal regions. A prototype of the lens is fabricated using three-dimensional printing. The measurement results show that the realized gain of an H-plane horn antenna can be improved by 1.5–2.4 dB compared to a reference H-plane horn.


2011 · Vol 332-334 · pp. 539-544
Author(s): Xiao Dong Liu, Xin Qun Feng, Dong Yang

As room space extends from a simple three-dimensional physical space to a four-dimensional spiritual space, and as people raise their aesthetic aspirations and increasingly emphasize harmony with the environment, textile works of art have come to play one of the most important evolutionary roles. Hanging textiles, being multi-functional, have made themselves irreplaceable elements of indoor space. From the perspective of the application and development of hanging textiles, this article focuses on their decorative function and application strategies, looking toward the continued improvement of the application and design of hanging textiles in indoor space.


2021 · Vol 1 · pp. 2107-2116
Author(s): Agnese Brunzini, Alessandra Papetti, Michele Germani, Erica Adrario

Abstract In the medical education field, the use of highly sophisticated simulators and extended reality (XR) simulations allows trainees to practice complex procedures and to acquire new knowledge and attitudes. XR is considered useful for the enhancement of healthcare education; however, several issues need further research. The main aim of this study is to define a comprehensive method to design and optimize every kind of simulator and simulation, integrating all the relevant elements concerning scenario design and prototype development. A complete framework for the design of any kind of advanced clinical simulation is proposed, and it has been applied to realize a mixed reality (MR) prototype for the simulation of rachicentesis (lumbar puncture). The purpose of the MR application is to immerse the trainee in a more realistic environment and to put him/her under pressure during the simulation, as in real practice. The application was tested with two different devices: the Vox Gear Plus smartphone headset and the Microsoft HoloLens. Eighteen sixth-year students of the Medicine and Surgery Course were enrolled in the study. Results compare the user experience across the two devices and report simulation performance using the HoloLens.


Sensor Review · 2017 · Vol 37 (3) · pp. 312-321
Author(s): Yixiang Bian, Can He, Kaixuan Sun, Longchao Dai, Hui Shen, ...

Purpose The purpose of this paper is to design and fabricate a three-dimensional (3D) bionic airflow sensing array made of two multi-electrode piezoelectric metal-core fibers (MPMFs), inspired by the structure of a cricket’s highly sensitive airflow receptor (consisting of two cerci). Design/methodology/approach A metal core was positioned at the center of an MPMF and surrounded by a hollow piezoceramic cylinder. Four thin metal films were spray-coated symmetrically on the surface of the fiber to serve as two pairs of sensor electrodes. Findings In 3D space, the four output signals of the two MPMF arrays form three “8”-shaped (figure-eight) directivity spheres, and the sensing signals for the same airflow lie on a spherical surface. Originality/value Two MPMF arrays are sufficient to detect the speed and direction of airflow in all three dimensions.
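The figure-eight directivity described in the Findings implies that each electrode pair responds, to first order, in proportion to the airflow component along its sensing axis. As an idealised illustration (an assumption for this sketch, not the authors' calibration procedure), flow speed and direction can be recovered from three orthogonal cosine-law signals:

```python
import math

def flow_from_components(sx, sy, sz, sensitivity):
    """Recover airflow speed and direction from three signals that each
    follow an idealised figure-eight (cosine) response:
    s_i = sensitivity * speed * cos(theta_i), i.e. each signal is
    proportional to the projection of the flow vector on its axis.

    Returns (speed, azimuth_deg, elevation_deg)."""
    vx, vy, vz = sx / sensitivity, sy / sensitivity, sz / sensitivity
    speed = math.sqrt(vx * vx + vy * vy + vz * vz)
    azimuth = math.degrees(math.atan2(vy, vx))
    elevation = math.degrees(math.asin(vz / speed)) if speed else 0.0
    return speed, azimuth, elevation
```

In the actual sensor, the two MPMF arrays provide four signals rather than three ideal orthogonal components, so a real reconstruction would first combine the electrode-pair outputs into axis projections via calibration.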

