Virtual Reality User Interface for Autonomous Production

2000 ◽  
pp. 279-286 ◽  
Author(s):  
Christopher Schlick ◽  
Ralph Reuth ◽  
Holger Luczak
Author(s):  
Randall Spain ◽  
Jason Saville ◽  
Barry Lui ◽  
Donia Slack ◽  
Edward Hill ◽  
...  

Because advances in broadband capabilities will soon allow first responders to access and use many forms of data when responding to emergencies, it is becoming critically important to design heads-up displays that present first responders with information in a manner that does not induce extraneous mental workload or cause undue interaction errors. Virtual reality offers a unique medium for envisioning and testing user interface concepts in a realistic and controlled environment. In this paper, we describe a virtual reality-based emergency response scenario that was designed to support user experience research for evaluating the efficacy of intelligent user interfaces for firefighters. We describe the results of a usability test that captured firefighters’ feedback and reactions to the VR scenario and the prototype intelligent user interface that presented them with task-critical information through the VR headset. The paper concludes with lessons learned from our development process and a discussion of plans for future research.


PLoS ONE ◽  
2021 ◽  
Vol 16 (10) ◽  
pp. e0258103
Author(s):  
Andreas Bueckle ◽  
Kilian Buehling ◽  
Patrick C. Shih ◽  
Katy Börner

Working with organs and extracted tissue blocks is an essential task in many medical surgery and anatomy environments. In order to prepare specimens from human donors for further analysis, wet-bench workers must properly dissect human tissue and collect metadata for downstream analysis, including information about the spatial origin of tissue. The Registration User Interface (RUI) was developed to allow stakeholders in the Human Biomolecular Atlas Program (HuBMAP) to register tissue blocks—i.e., to record the size, position, and orientation of human tissue data with regard to reference organs. The RUI has been used by tissue mapping centers across the HuBMAP consortium to register a total of 45 kidney, spleen, and colon tissue blocks, with planned support for 17 organs in the near future. In this paper, we compare three setups for registering one 3D tissue block object to another 3D reference organ (target) object. The first setup is a 2D Desktop implementation featuring a traditional screen, mouse, and keyboard interface. The remaining setups are both virtual reality (VR) versions of the RUI: VR Tabletop, where users sit at a physical desk that is replicated in virtual space, and VR Standup, where users stand upright while performing their tasks. All three setups were implemented using the Unity game engine. We then ran a user study for these three setups in which 42 human subjects completed 14 increasingly difficult tasks followed by 30 identical tasks in sequence, and we measured position accuracy, rotation accuracy, completion time, and satisfaction. All study materials were made available in support of future study replication, alongside videos documenting our setups.
We found that while VR Tabletop and VR Standup users are about three times as fast and about a third more accurate in terms of rotation than 2D Desktop users (for the sequence of 30 identical tasks), there are no significant differences between the three setups for position accuracy when normalized by the height of the virtual kidney across setups. When extrapolating from the 2D Desktop setup with a 113-mm-tall kidney, the absolute performance values for the 2D Desktop version (22.6 seconds per task, 5.88 degrees rotation, and 1.32 mm position accuracy after 8.3 tasks in the series of 30 identical tasks) confirm that the 2D Desktop interface is well-suited for allowing users in HuBMAP to register tissue blocks at a speed and accuracy that meets the needs of experts performing tissue dissection. In addition, the 2D Desktop setup is cheaper, easier to learn, and more practical for wet-bench environments than the VR setups.
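The height normalization the study applies to position accuracy can be sketched as follows. Only the published 2D Desktop figures (1.32 mm position error against a 113-mm-tall kidney) come from the abstract; the function name is illustrative:

```python
def normalized_position_error(error_mm: float, organ_height_mm: float) -> float:
    """Express a position error as a fraction of the reference organ's height,
    so setups rendering the organ at different sizes can be compared."""
    return error_mm / organ_height_mm

# Published 2D Desktop figures: 1.32 mm error, 113-mm-tall virtual kidney.
desktop = normalized_position_error(1.32, 113.0)
print(f"2D Desktop: {desktop:.4f} of kidney height")  # ≈ 0.0117
```

Dividing by organ height makes the metric dimensionless, which is what allows the paper's claim of "no significant differences for position accuracy" to hold across setups that display the kidney at different virtual scales.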


2017 ◽  
Vol 23 (3) ◽  
pp. 39-46
Author(s):  
Kim Min Gyu ◽  
Jeon Chang Yu ◽  
김진모 ◽  
Ji Won Lee

2018 ◽  
Vol 2018 ◽  
pp. 1-10
Author(s):  
Sang Hun Nam ◽  
Ji Yong Lee ◽  
Jung Yoon Kim

Biosignal interfaces provide important data that reveal the physical status of a user, and they are used in the medical field for patient health monitoring, medical automation, and rehabilitation services. Biosignals can be used to develop new content in conjunction with virtual reality, and they are important factors for extracting user emotion and measuring user experience. A biosignal-based user-interface system was designed, composed of sensor devices, a user-interface system, and an application, that extracts biosignal data from multiple biosignal devices and makes the data available to content developers. A network-based protocol was used to allow unconstrained use of the devices, so that biosignals can be freely received via USB, Bluetooth, WiFi, or an internal system module. The system can extract data from multiple biosignal devices while simultaneously extracting and analyzing data from a virtual-reality-specific eye-tracking device, so that developers of healthcare content based on virtual-reality technology can easily use the biosignals.
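A minimal sketch of the transport-agnostic device layer the abstract describes: each device delivers samples over some transport (USB, Bluetooth, WiFi, or an internal module), and content code consumes them through one interface. All class and method names here are illustrative assumptions, not the authors' implementation:

```python
from abc import ABC, abstractmethod
from typing import Iterator

class BiosignalSource(ABC):
    """One biosignal device, regardless of the transport it arrives over."""

    @abstractmethod
    def read_sample(self) -> dict:
        """Return one sample, e.g. {"channel": "ecg", "value": 0.42}."""

class NetworkSource(BiosignalSource):
    """A device reached over a network protocol (stands in for WiFi/Bluetooth);
    here it replays a fixed list of samples for illustration."""

    def __init__(self, samples: list[dict]):
        self._samples = iter(samples)

    def read_sample(self) -> dict:
        return next(self._samples)

def merge_sources(sources: list[BiosignalSource], n: int) -> Iterator[dict]:
    """Round-robin n samples from several devices into one stream, the way a
    content application might combine biosignal and eye-tracking data."""
    for i in range(n):
        yield sources[i % len(sources)].read_sample()

ecg = NetworkSource([{"channel": "ecg", "value": v} for v in (0.1, 0.2)])
eye = NetworkSource([{"channel": "gaze", "value": v} for v in (0.5, 0.6)])
for sample in merge_sources([ecg, eye], 4):
    print(sample["channel"], sample["value"])
```

Keeping the transport behind an abstract class is one plausible reading of the paper's "unconstrained use of the device": adding a USB or internal-module source would mean adding a subclass, not changing consumer code.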


2018 ◽  
pp. 1377-1392
Author(s):  
Yogendra Patil ◽  
Guilherme Galdino Siqueira ◽  
Iara Brandao ◽  
Fei Hu

Stroke rehabilitation techniques have attracted immense attention due to the addition of virtual reality environments for rehabilitation purposes. Current techniques involve ideas such as imitating various stroke rehabilitation exercises in a virtual world. This makes the rehabilitation process more attractive than conventional methods and motivates the patient to continue the therapy. However, most virtual-reality-based stroke rehabilitation studies focus on patients performing sedentary rehabilitation exercises. In this chapter, we introduce our virtual-reality-based post-stroke rehabilitation system, which allows a post-stroke patient to perform dynamic exercises. With the introduction of our system, we hope to increase post-stroke patients' ability to perform their daily routine exercises independently. Our discussion in this chapter centers mainly on the integration of the rehabilitation system with virtual reality software. We also detail the design process of our modern user interface for collecting useful data during rehabilitation. A simple experiment is carried out to validate the viability of our system.


2018 ◽  
pp. 119-137
Author(s):  
Alan Radley

A new philosophy of user interface design is described. Named the “Lookable User Interface,” or LUI, the approach is based on the concept of a Personal Reality (PR) system. Here the computer adapts to the user's worldview in a personalized way, and according to the specific requirements, behaviors, and perceptive skills of the individual. Typically, a PR system creates and adjusts (in real-time) 3D perspective view(s) of a data-set, including (potentially) the field of view of a scene and the apparent distance and scale of objects, whilst also creating an aesthetic “eye-friendly” context for computing operations. A Lookable User Interface (LUI) affords the maximum degree of visual accessibility to digital content. The authors examine the results of testing a Lookable User Interface. Spectasia is one example of a Personal Virtual Reality (PVR) that can be used to visualize links between universals and particulars within digital worlds.
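The chapter gives no formulas, but the real-time adjustment of "apparent distance and scale" it attributes to a Personal Reality system follows standard perspective projection, where an object's on-screen size falls off with its distance from the viewer. A sketch under that assumption (function name and parameter values are illustrative):

```python
import math

def apparent_size(object_size: float, distance: float, fov_deg: float,
                  screen_px: int) -> float:
    """On-screen size, in pixels, of an object under a pinhole projection
    with the given vertical field of view and screen height."""
    focal_px = (screen_px / 2) / math.tan(math.radians(fov_deg) / 2)
    return object_size * focal_px / distance

# Halving the apparent distance doubles the apparent size.
near = apparent_size(1.0, 2.0, fov_deg=60.0, screen_px=1080)
far = apparent_size(1.0, 4.0, fov_deg=60.0, screen_px=1080)
print(near / far)  # 2.0
```

A PR system as described would tune `distance` and `fov_deg` per user rather than leaving them fixed, which is what makes the view "eye-friendly" for a particular individual.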


2008 ◽  
pp. 897-921 ◽  
Author(s):  
Claudio Kirner ◽  
Tereza G. Kirner

This chapter introduces virtual reality and augmented reality as a basis for simulation visualization. It shows how these technologies can support simulation visualization and offers important considerations on the use of simulation in virtual and augmented reality environments. Hardware and software features, as well as user interfaces and examples related to simulation, using and supporting virtual reality and augmented reality, are discussed, stressing their benefits and disadvantages. The chapter discusses virtual and augmented reality in the context of simulation, emphasizing the visualization of data and the behavior of systems. The importance of simulation in giving dynamic and realistic behaviors to virtual and augmented reality is also pointed out. The work indicates that understanding the integrated use of virtual reality and simulation should create better conditions for developing innovative simulation environments as well as for improving virtual and augmented reality environments.


2021 ◽  
pp. 435-446
Author(s):  
Marcela Saavedra ◽  
Morelva Saeteros ◽  
Adriana Riofrio ◽  
Gustavo Caiza


