Real-time Upper Body Reconstruction and Streaming for Mixed Reality Applications

Author(s):  
Dimitrios Laskos ◽  
Konstantinos Moustakas
Impact ◽  
2020 ◽  
Vol 2020 (2) ◽  
pp. 9-11
Author(s):  
Tomohiro Fukuda

Mixed reality (MR) is rapidly becoming a vital tool, not just in gaming, but also in education, medicine, construction and environmental management. The term refers to systems in which computer-generated content is superimposed on objects in a real-world environment across one or more sensory modalities. Although most of us have heard of the use of MR in computer games, it also has applications in military and aviation training, as well as tourism, healthcare and more. It also holds promise for architecture and design, where proposed buildings can be superimposed on existing sites to render 3D visualisations of plans. However, one major challenge that remains in MR development is real-time occlusion: correctly hiding virtual 3D objects behind real ones. Dr Tomohiro Fukuda, based at the Division of Sustainable Energy and Environmental Engineering, Graduate School of Engineering at Osaka University in Japan, is an expert in this field. Researchers led by Dr Fukuda are tackling the occlusion problem by developing an MR system that achieves real-time occlusion using deep learning, applying a semantic segmentation technique to outdoor landscape design simulation. The methodology can be used to automatically estimate the visual environment before and after construction projects.
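The compositing step behind segmentation-based occlusion can be illustrated with a minimal NumPy sketch. It assumes a segmentation network has already produced a per-pixel foreground mask for the camera frame; the function name, array shapes and the simple alpha model here are illustrative, not the authors' implementation:

```python
import numpy as np

def composite_with_occlusion(real, virtual, alpha, fg_mask):
    """Composite a rendered virtual layer over a camera frame.

    real:    (H, W, 3) camera frame
    virtual: (H, W, 3) rendered virtual content (e.g. a planned building)
    alpha:   (H, W) opacity of the virtual layer in [0, 1]
    fg_mask: (H, W) bool, True where segmentation labelled a real
             foreground object (tree, person, ...) that should occlude
    """
    # The virtual layer is visible only where it is opaque AND the real
    # pixel is background; segmented foreground pixels stay on top.
    visible = (alpha * ~fg_mask)[..., None]
    return visible * virtual + (1.0 - visible) * real
```

Running the segmentation per frame is what makes the occlusion "real-time": the mask is re-estimated as people or vegetation move in front of the virtual building.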


2021 ◽  
Vol 20 (3) ◽  
pp. 1-22
Author(s):  
David Langerman ◽  
Alan George

High-resolution, low-latency applications in computer vision are ubiquitous in today’s world of mixed-reality devices. These innovations provide a platform that can leverage improving depth sensors and embedded accelerators to enable higher-resolution, lower-latency processing of 3D scenes using depth-upsampling algorithms. This research demonstrates that filter-based upsampling algorithms are feasible for mixed-reality applications on low-power hardware accelerators. The authors parallelized and evaluated a depth-upsampling algorithm on two different devices: a reconfigurable-logic FPGA embedded within a low-power SoC, and a fixed-logic embedded graphics processing unit. They demonstrate that both accelerators can meet the real-time requirement of 11 ms latency for mixed-reality applications.
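"Filter-based" depth upsampling commonly means something like joint bilateral upsampling, where a high-resolution guide image steers the interpolation of a coarse depth map so depth edges stay aligned with image edges. A minimal, unoptimised CPU sketch of that idea follows; the function name and parameters are illustrative, and the accelerated FPGA/GPU versions in the paper would parallelise the per-pixel loop rather than run it serially:

```python
import numpy as np

def joint_bilateral_upsample(depth_lr, guide, radius=2, sigma_s=1.5, sigma_r=0.1):
    """Upsample a low-res depth map to the guide image's resolution.

    depth_lr: (h, w) low-resolution depth
    guide:    (H, W) high-resolution single-channel intensity image,
              with H, W integer multiples of h, w
    """
    scale = guide.shape[0] // depth_lr.shape[0]
    # Nearest-neighbour upsample as the starting point.
    depth = np.repeat(np.repeat(depth_lr, scale, axis=0), scale, axis=1)
    h, w = guide.shape
    out = np.empty((h, w), dtype=float)
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            yy, xx = np.mgrid[y0:y1, x0:x1]
            # Spatial weight: pixel distance in the high-res grid.
            spatial = np.exp(-((yy - y) ** 2 + (xx - x) ** 2) / (2 * sigma_s ** 2))
            # Range weight: similarity in the high-res guide image,
            # which is what preserves depth discontinuities at edges.
            rng = np.exp(-((guide[y0:y1, x0:x1] - guide[y, x]) ** 2) / (2 * sigma_r ** 2))
            weights = spatial * rng
            out[y, x] = (weights * depth[y0:y1, x0:x1]).sum() / weights.sum()
    return out
```

Each output pixel is independent of the others, which is why this family of filters maps well onto both FPGA pipelines and embedded GPUs.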


Author(s):  
Panagiotis Antoniou ◽  
George Arfaras ◽  
Niki Pandria ◽  
George Ntakakis ◽  
Emmanuil Bambatsikos ◽  
...  

2017 ◽  
Vol 2 (3) ◽  
pp. 103
Author(s):  
Uwe Rieger

<p>With the current exponential growth in the sector of Spatial Data Technology and Mixed Reality display devices we experience an increasing overlap of the physical and digital world. Beyond making data spatially visible, the attempt is to connect digital information with physical properties. Over the past years a number of research institutions have been laying the groundwork for these developments. In contemporary architectural design the dominant application of data technology is connected to graphical presentation, form finding and digital fabrication.<br />The <em>arc/sec Lab for Digital Spatial Operations </em>at the University of Auckland takes a further step. The Lab explores concepts for a new condition of buildings and urban patterns in which digital information is connected with spatial appearance and linked to material properties. The approach focuses on the step beyond digital re-presentation and digital fabrication, where data is re-connected to multi-sensory human perception and physical skills. The work at the Lab is conducted in a cross-disciplinary design environment and based on experiential investigations. The arc/sec Lab utilizes large-scale interactive installations as the driving vehicle for the exploration and communication of new dimensions in architectural space. The experiments aim to make data “touchable” and to demonstrate real-time responsive environments. In parallel they are the starting point for both the development of practice-oriented applications and speculation on how our cities and buildings might change in the future.<br />The article gives an overview of the current experiments being undertaken at the arc/sec Lab. It discusses how digital technologies allow for innovation between the disciplines by introducing real-time adaptive behaviours to our built environment, and it speculates on the type of spaces we can construct when <em>digital matter </em>is used as a new dynamic building material.</p>


2019 ◽  
Vol 5 ◽  
Author(s):  
Konstantinos Kotis

ARTIST is a research approach introducing novel methods for real-time multi-entity interaction between human and non-human entities, to create reusable and optimized Mixed Reality (MR) experiences with low effort, towards a Shared MR Experiences Ecosystem (SMRE2). As a result, ARTIST delivers high-quality MR experiences, facilitating interaction between a variety of entities that interact in a virtual and symbiotic way within a mega, virtual and fully experiential world. Specifically, ARTIST aims to develop novel methods for low-effort (code-free) implementation and deployment of open and reusable MR content, applications and tools, introducing the novel concept of an Experience as a Trajectory (EaaT). In addition, ARTIST will provide tools for tracking, monitoring and analysing users’ behaviour and their interaction with the environment and with other users, towards optimizing MR experiences by recommending their reconfiguration, dynamically (at run time) or statically (at development time). Finally, it will provide tools for synthesizing experiences into new mega, yet still reconfigurable, EaaTs, enhancing them at the same time with semantically integrated related data and information available in disparate and heterogeneous resources.


2021 ◽  
Author(s):  
Vitalii Nezhelskii ◽  
Iuliia Golovchanskaia ◽  
Andrey Zhdanov ◽  
Dmitry Zhdanov

Author(s):  
Fil J. Arenas ◽  
Andrew S. Clayton ◽  
Kate D. Simmons

Several schools and colleges under Air University have found utility in using a mixed-reality approach to developing leadership acumen in this unique risk-free environment. This chapter will describe the power of collaboration between Air University and Auburn University at Montgomery while demonstrating the impact of this mixed-reality approach on developing military leaders. These mixed-reality exercises (MRXs) leverage an environment that establishes a practical laboratory in which developing leaders interact with avatars through a combination of virtual immersion and human intelligence (live simulation engagement). This innovative approach to “real play” allows learning to take place in real time through virtual immersion.

