Session details: Sharing live user experience: how new mixed reality technologies and networks support real-time interactions

Author(s):  
Marc Pallot ◽  
Petros Daras


Impact ◽
2020 ◽  
Vol 2020 (2) ◽  
pp. 9-11
Author(s):  
Tomohiro Fukuda

Mixed reality (MR) is rapidly becoming a vital tool, not just in gaming, but also in education, medicine, construction and environmental management. The term refers to systems in which computer-generated content is superimposed over objects in a real-world environment across one or more sensory modalities. Although most of us have heard of the use of MR in computer games, it also has applications in military and aviation training, as well as tourism, healthcare and more. It also holds promise for architecture and design, where proposed buildings can be superimposed on existing locations to visualise 3D renderings of plans in place. However, one major challenge that remains in MR development is real-time occlusion: correctly hiding 3D virtual objects behind the real objects that stand in front of them. Dr Tomohiro Fukuda, based at the Division of Sustainable Energy and Environmental Engineering, Graduate School of Engineering at Osaka University in Japan, is an expert in this field. The researchers he leads are tackling the occlusion problem by developing an MR system that achieves real-time occlusion with deep learning, using a semantic segmentation technique to drive an outdoor landscape design simulation. This methodology can be used to automatically estimate the visual environment before and after construction projects.
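To illustrate how semantic segmentation can drive occlusion in such a pipeline, the following Python sketch uses an off-the-shelf segmentation network to build a per-pixel occlusion mask and then composites a pre-rendered virtual layer behind the detected real foreground. This is a minimal sketch under stated assumptions, not Fukuda's published system: the choice of network, the foreground class indices and the function names are illustrative.

import numpy as np
import torch
from torchvision.models.segmentation import deeplabv3_resnet50

# Hypothetical set of label indices treated as real foreground that should hide
# virtual content; 15 is "person" in the Pascal VOC label set used by this
# pretrained model. A landscape system would tune this list per scene.
FOREGROUND_CLASSES = [15]

model = deeplabv3_resnet50(weights="DEFAULT").eval()

def composite_frame(camera_rgb, virtual_rgba):
    """Overlay a pre-rendered virtual layer (H x W x 4, uint8) on a camera frame
    (H x W x 3, uint8), suppressing virtual pixels wherever the segmentation
    labels a real foreground object."""
    x = torch.from_numpy(camera_rgb).permute(2, 0, 1).float().unsqueeze(0) / 255.0
    # NOTE: ImageNet mean/std normalization is omitted here for brevity.
    with torch.no_grad():
        labels = model(x)["out"].argmax(dim=1)[0].numpy()      # H x W class map
    occluder = np.isin(labels, FOREGROUND_CLASSES)             # True where a real object is in front
    alpha = virtual_rgba[..., 3:4].astype(np.float64) / 255.0
    alpha[occluder] = 0.0                                      # hide virtual pixels behind occluders
    out = camera_rgb * (1.0 - alpha) + virtual_rgba[..., :3] * alpha
    return out.astype(np.uint8)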


2021 ◽  
Vol 20 (3) ◽  
pp. 1-22
Author(s):  
David Langerman ◽  
Alan George

High-resolution, low-latency computer-vision applications are ubiquitous in today's world of mixed-reality devices. These devices provide a platform that can leverage improving depth sensors and embedded accelerators to enable higher-resolution, lower-latency processing of 3D scenes using depth-upsampling algorithms. This research demonstrates that filter-based upsampling algorithms are feasible for mixed-reality applications on low-power hardware accelerators. The authors parallelized and evaluated a depth-upsampling algorithm on two different devices: a reconfigurable-logic FPGA embedded within a low-power SoC, and a fixed-logic embedded graphics processing unit. They demonstrate that both accelerators can meet the real-time latency requirement of 11 ms for mixed-reality applications.
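As a minimal, unoptimized reference for the general family of filter-based depth upsampling the paper evaluates (not the authors' parallelized implementation; parameter names are illustrative), the NumPy sketch below performs joint bilateral upsampling of a low-resolution depth map guided by a high-resolution grayscale image.

import numpy as np

def joint_bilateral_upsample(depth_lo, guide_hi, radius=3, sigma_s=2.0, sigma_r=0.1):
    """Upsample depth_lo (h x w) to the resolution of guide_hi (H x W, grayscale
    in [0, 1]). Each output pixel is a weighted average of nearby low-res depth
    samples; the weights combine spatial distance with photometric similarity in
    the high-res guide, so depth edges snap to intensity edges."""
    H, W = guide_hi.shape
    h, w = depth_lo.shape
    sy, sx = H / h, W / w                              # upsampling factors
    out = np.zeros((H, W), dtype=np.float64)
    for y in range(H):
        for x in range(W):
            cy, cx = y / sy, x / sx                    # position in low-res coordinates
            y0 = min(int(round(cy)), h - 1)
            x0 = min(int(round(cx)), w - 1)
            num = den = 0.0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    yy, xx = y0 + dy, x0 + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        # spatial weight in the low-res grid
                        ws = np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * sigma_s ** 2))
                        # range weight from the high-res guide image
                        gy = min(int(yy * sy), H - 1)
                        gx = min(int(xx * sx), W - 1)
                        wr = np.exp(-((guide_hi[y, x] - guide_hi[gy, gx]) ** 2) / (2 * sigma_r ** 2))
                        num += ws * wr * depth_lo[yy, xx]
                        den += ws * wr
            out[y, x] = num / den if den > 0 else depth_lo[y0, x0]
    return out

This brute-force loop is far too slow for an 11 ms frame budget; the per-pixel weights are independent, however, which is what makes such filters amenable to FPGA pipelines and GPU threads in the accelerated implementations the paper evaluates.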


Leonardo ◽  
2014 ◽  
Vol 47 (5) ◽  
pp. 500-501 ◽  
Author(s):  
Mónica Mendes ◽  
Pedro Ângelo ◽  
Nuno Correia

Hug@ree is an interactive installation that provides a bond between urban dwellers and the forest. It is an ARTiVIS (Arts, Real-Time Video and Interactivity for Sustainability) experience that enables interaction with trees and with real-time video of trees, raising awareness of the natural environment and of how individual actions can add up to collective impact. In this paper, the authors present an overview of the Hug@ree concept, related work, implementation, user experience evaluation and future work.


Author(s):  
Aleshia T. Hayes ◽  
Carrie L. Straub ◽  
Lisa A. Dieker ◽  
Charlie E. Hughes ◽  
Michael C. Hynes

New and emerging technology in the field of virtual environments has permitted a certain malleability of learning milieus. These emerging environments allow learning and transfer through interactions that have been intentionally designed to be pleasurable experiences. TLE TeachLivE™ is just such an emerging environment that engages teachers in practising the pedagogical and content aspects of teaching in a simulator. The sense of presence, engagement, and ludus of TLE TeachLivE™ are derived from a compelling Mixed Reality that combines off-the-shelf and emerging technologies. Features identified as relevant to the ludic nature of TeachLivE include flow, fidelity, unpredictability, suspension of disbelief, social presence, and game-like elements. This article explores TLE TeachLivE™ in terms of its ludology, the paideic user experience, the source of the ludus, and the outcomes of the ludic nature of the experience.


Author(s):  
Panagiotis Antoniou ◽  
George Arfaras ◽  
Niki Pandria ◽  
George Ntakakis ◽  
Emmanuil Bambatsikos ◽  
...  

2017 ◽  
Vol 2 (3) ◽  
pp. 103
Author(s):  
Uwe Rieger

With the current exponential growth in spatial data technology and mixed reality display devices, we experience an increasing overlap of the physical and digital worlds. Beyond making data spatially visible, the attempt is to connect digital information with physical properties. Over the past years a number of research institutions have laid the ground for these developments. In contemporary architectural design, the dominant applications of data technology are graphical presentation, form finding and digital fabrication.
The arc/sec Lab for Digital Spatial Operations at the University of Auckland takes a further step. The Lab explores concepts for a new condition of buildings and urban patterns in which digital information is connected with spatial appearance and linked to material properties. The approach focuses on the step beyond digital re-presentation and digital fabrication, where data is re-connected to multi-sensory human perception and physical skills. The work at the Lab is conducted in a cross-disciplinary design environment and is based on experiential investigations. The arc/sec Lab utilizes large-scale interactive installations as the driving vehicle for exploring and communicating new dimensions of architectural space. The experiments aim to make data "touchable" and to demonstrate real-time responsive environments. In parallel, they are the starting point both for the development of practice-oriented applications and for speculation on how our cities and buildings might change in the future.
The article gives an overview of the current experiments being undertaken at the arc/sec Lab. It discusses how digital technologies allow for innovation between the disciplines by introducing real-time adaptive behaviours to our built environment, and it speculates on the type of spaces we can construct when digital matter is used as a new dynamic building material.


2019 ◽  
Vol 5 ◽  
Author(s):  
Konstantinos Kotis

ARTIST is a research approach introducing novel methods for real-time multi-entity interaction between human and non-human entities, to create reusable and optimized Mixed Reality (MR) experiences with low effort, working towards a Shared MR Experiences Ecosystem (SMRE2). As a result, ARTIST delivers high-quality MR experiences, facilitating interaction between a variety of entities that interact virtually and symbiotically within a mega, fully experiential virtual world. Specifically, ARTIST aims to develop novel methods for low-effort (code-free) implementation and deployment of open and reusable MR content, applications and tools, introducing the novel concept of an Experience as a Trajectory (EaaT). In addition, ARTIST will provide tools for tracking, monitoring and analysing user behaviour and users' interaction with the environment and with one another, towards optimizing MR experiences by recommending their reconfiguration, dynamically (at run-time) or statically (at development time). Finally, it will provide tools for synthesizing experiences into new mega, yet still reconfigurable, EaaTs, enhancing them at the same time with semantically integrated related data and information available in disparate and heterogeneous resources.
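The abstract introduces the EaaT concept without specifying a data model. Purely as an illustrative sketch (the classes and fields below are hypothetical assumptions, not ARTIST's schema), one way to picture an Experience as a Trajectory is as a composable sequence of timestamped multi-entity interaction steps.

from dataclasses import dataclass, field
from typing import Any, Dict, List

@dataclass
class InteractionStep:
    timestamp: float                    # seconds from experience start
    entity_id: str                      # human or non-human entity involved
    action: str                         # e.g. "gaze", "grasp", "spawn_object"
    context: Dict[str, Any] = field(default_factory=dict)  # scene state, pose, etc.

@dataclass
class EaaT:
    """An MR experience expressed as a trajectory of interaction steps."""
    name: str
    steps: List[InteractionStep] = field(default_factory=list)

    def compose(self, other: "EaaT") -> "EaaT":
        """Synthesize two trajectories into a larger, still reconfigurable one
        by concatenating steps and offsetting the second trajectory in time."""
        offset = self.steps[-1].timestamp if self.steps else 0.0
        shifted = [
            InteractionStep(s.timestamp + offset, s.entity_id, s.action, dict(s.context))
            for s in other.steps
        ]
        return EaaT(name=f"{self.name}+{other.name}", steps=self.steps + shifted)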

