stereoscopic rendering
Recently Published Documents

TOTAL DOCUMENTS: 25 (FIVE YEARS: 4)
H-INDEX: 6 (FIVE YEARS: 1)

2021 ◽
Author(s):
Byron Mallett

<p>This thesis presents a design for controlling music software in live performance using virtual reality (VR) technologies. Analysis of the performance methods of artists who use physical or gestural control shows that the physical limitations of musical input devices can hamper the creative process of authoring a performance interface. This thesis proposes VR technologies as the foundation for a unique workspace in which a performance interface can be both constructed and performed with. Through a series of design experiments using a variety of gestural input technologies, the relationship between a musical performer, interface, and audience was analysed. The final proposed design of a VR interface for musical performance focuses on providing the performer with objects that can be directly manipulated through physical gestures, performed by touching virtual controls. By exploiting the strengths of VR, a performer can learn to operate their performance environment effectively through the spatial awareness afforded by stereoscopic rendering and hand tracking, while also constructing unique interfaces that are not limited by physical hardware constraints. The thesis also presents a software framework for connecting multiple musical devices within a single performance ecosystem, all directly controllable from a single VR space. The final outcome of this research is a shared musical environment designed to draw audience, performer, and performance interface together into a coherent and appealing experience for all.</p>
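The framework itself is not detailed in the abstract. As a minimal sketch of the idea it describes, assuming a simple registry of devices whose parameters are set from a single control surface (all names here are hypothetical, not the thesis's actual API):

```python
class Device:
    """A musical device exposing named, normalised (0.0-1.0) parameters."""
    def __init__(self, name):
        self.name = name
        self.params = {}

    def set_param(self, param, value):
        # Clamp to the normalised range a VR fader or knob would emit.
        self.params[param] = max(0.0, min(1.0, value))


class PerformanceHub:
    """Routes control gestures from a single VR space to many devices."""
    def __init__(self):
        self.devices = {}

    def register(self, device):
        self.devices[device.name] = device

    def route(self, device_name, param, value):
        self.devices[device_name].set_param(param, value)


hub = PerformanceHub()
hub.register(Device("synth"))
hub.register(Device("sampler"))
hub.route("synth", "cutoff", 0.75)
hub.route("sampler", "volume", 1.3)   # out-of-range gesture, clamped to 1.0
```

In a real system the `route` call would translate to a protocol such as MIDI or OSC; the point of the single hub is that every device remains reachable from one shared VR space.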




Author(s):  
Caroline Garcia Forlim ◽  
Lukas Bittner ◽  
Fariba Mostajeran ◽  
Frank Steinicke ◽  
Jürgen Gallinat ◽  
...  

2018 ◽  
Vol 15 (2) ◽  
Author(s):  
Christoph Müller ◽  
Michael Krone ◽  
Markus Huber ◽  
Verena Biener ◽  
Dominik Herr ◽  
...  

Abstract
Immersive technologies such as stereo rendering, virtual reality, and augmented reality (AR) are often used in the field of molecular visualisation. Modern, comparatively lightweight and affordable AR headsets such as Microsoft’s HoloLens open up new possibilities for immersive analytics in molecular visualisation. A crucial factor for a comprehensive analysis of molecular data in AR is rendering speed. The HoloLens, however, has limited hardware capabilities owing to requirements such as battery life, fanless cooling, and weight; consequently, insights from best practices on powerful desktop hardware may not transfer directly. We therefore evaluate the capabilities of the HoloLens hardware for modern, GPU-enabled, high-quality rendering methods for the space-filling model commonly used in molecular visualisation, and assess their scalability to large molecular data sets. Based on the results, we discuss ideas and possibilities for immersive molecular analytics. Besides more obvious benefits, such as the stereoscopic rendering the device offers, this specifically includes natural user interfaces that use physical navigation instead of the traditional virtual one. Furthermore, we consider different scenarios for such an immersive system, ranging from educational use to collaborative settings.
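The abstract does not spell out the rendering methods evaluated, but high-quality GPU rendering of the space-filling model typically ray casts sphere impostors, solving a ray–sphere intersection per fragment instead of tessellating each atom. A minimal CPU-side sketch of that core test (illustrative only, not the paper's shader code):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Analytic ray-sphere intersection, the per-fragment core of
    sphere-impostor ray casting for space-filling models.
    Returns the nearest positive hit distance, or None on a miss."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction                      # assumed unit length
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None                             # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0            # nearest intersection
    return t if t > 0.0 else None

# Camera at the origin looking down -z at an atom of radius 1 at z = -5:
assert ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0) == 4.0
```

Because each atom is a single camera-facing quad rather than a triangle mesh, per-frame geometry cost stays low, which is precisely what matters on hardware as constrained as the HoloLens.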


2017 ◽  
Vol 23 (4) ◽  
pp. 1332-1341 ◽  
Author(s):  
Andre Schollmeyer ◽  
Simon Schneegans ◽  
Stephan Beck ◽  
Anthony Steed ◽  
Bernd Froehlich

Author(s):  
Sunho Ki ◽  
Jinhong Park ◽  
Jeong-Ho Woo ◽  
Yeongkyu Lim ◽  
Chulho Shin

Author(s):  
Kurt Satter ◽  
Alley Butler

A competitive usability study measured user performance and user preference for immersive virtual environments (VEs) with multimodal gestural interfaces, compared directly with nonstereoscopic traditional CAD interfaces using keyboard and mouse. The immersive interfaces comprised a wand and a data glove with a voice interface, with stereoscopic rendering on an 86 in. screen; the traditional CAD interfaces comprised a 19 in. workstation display with keyboard and mouse and an 86 in. nonstereoscopic display with keyboard and mouse. The context for the study was a set of “real world” engineering design scenarios: benchmark 1 (navigation), benchmark 2 (error finding and repair), and benchmark 3 (spatial awareness). Two populations of users were employed, novice (n = 15) and experienced (n = 15). All users completed three successive trials to quantify the effects of limited learning. Comparisons were made using both parametric and nonparametric statistical methods. The study concluded that the improvements in capability and user preference for immersive VEs and their interfaces were statistically significant for navigation and error finding/repair.
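The study's raw data and exact test procedures are not given in this abstract. As a hypothetical illustration of pairing a parametric statistic (Welch's t) with a nonparametric one (Mann–Whitney U) on made-up task-completion times:

```python
from math import sqrt
from statistics import mean, variance

def welch_t(a, b):
    """Parametric comparison: Welch's t statistic for two samples."""
    return (mean(a) - mean(b)) / sqrt(variance(a) / len(a) + variance(b) / len(b))

def mann_whitney_u(a, b):
    """Nonparametric comparison: Mann-Whitney U statistic for sample a,
    counting pairs where a's value exceeds b's (ties count half)."""
    return sum(1.0 if x > y else 0.5 if x == y else 0.0
               for x in a for y in b)

# Illustrative task-completion times in seconds -- NOT the study's data.
immersive   = [41, 38, 45, 40, 37]
traditional = [52, 49, 58, 50, 47]

u = mann_whitney_u(traditional, immersive)  # longer times rank higher
t = welch_t(traditional, immersive)         # positive: traditional is slower
```

Running both families of tests, as the study did, guards against the parametric test's normality assumption failing on small samples (n = 15 per group here).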

