Development of a 3D User Interface for Programming a Robotic Arm Using Virtual Reality

2021 ◽  
pp. 435-446
Author(s):  
Marcela Saavedra ◽  
Morelva Saeteros ◽  
Adriana Riofrio ◽  
Gustavo Caiza
1999 ◽  
Author(s):  
Dirk Rantzau ◽  
Ralf Breining ◽  
Oliver Riedel ◽  
Ulrich Haefner ◽  
Holger Scharm ◽  
...  

Abstract Virtual Reality (VR) in conjunction with Immersive Projection Technology (IPT) is well established and widely accepted in the international research community. But does the (manufacturing) industry share this excitement about recent stereo projection technology? What are the key factors in successfully creating a VR solution of production quality? How can fields of application be identified, and how can a return on investment be achieved? These questions will be addressed especially for the automotive industry. While some of the considerations hold for other manufacturing industries as well, some do not. Problems and solutions will be discussed using sample installations and applications for the automotive industry. These VR systems help engineers deal with different problems:
• evaluation of exterior car design,
• analysis of thermal comfort in a car cabin,
• visualization of a production line,
• acceptance of tools for deep drawing.
We will outline the different projection and software technologies used for these engineering tasks as well as the user interface aspects, because there is a close correlation between the projection technology used and useful user interaction. The careful design of a special-purpose 3D user interface is one of the keys to user productivity and user acceptance.


2019 ◽  
Vol 9 (22) ◽  
pp. 4861 ◽  
Author(s):  
Hind Kharoub ◽  
Mohammed Lataifeh ◽  
Naveed Ahmed

This work presents a novel design of a new 3D user interface for an immersive virtual reality desktop and a new empirical analysis of the proposed interface using three interaction modes. The proposed dual-layer 3D user interface allows for user interactions with multiple screens portrayed within a curved 360-degree effective field of view available to the user. A downward gaze allows the user to raise the interaction layer that facilitates several traditional desktop tasks. The 3D user interface is analyzed using three different interaction modes: point-and-click, controller-based direct manipulation, and a gesture-based user interface. A comprehensive user study is performed within a mixed-methods approach for the usability and user experience analysis of all three user interaction modes. Each interaction mode is quantitatively and qualitatively analyzed for simple and compound tasks in both standing and seated positions. The mixed approach crafted for this study makes it possible to collect, evaluate, and validate the viability of the new 3D user interface. The results are used to draw conclusions about the suitability of the interaction modes for a variety of tasks in an immersive virtual reality 3D desktop environment.


2019 ◽  
Vol 11 (1) ◽  
Author(s):  
Agatha Maisie Tjandra

SIMIGAPI (Simulasi Mitigasi Gunung Berapi, a volcanic-eruption mitigation simulator) is a serious-game application with a story line that uses virtual reality through a head-mounted display. SIMIGAPI consists of three parts, following the stages of the mitigation process; the main focus of this paper is the evacuation part. In this part, users are given a mission to escape from volcanic ash by walking through the virtual world and passing pin points. Briefings are given through text and graphic elements using a 3D graphical user interface. A poorly designed user interface, however, can reduce immersion and quickly bore children, the intended users, which in turn can cause the transfer of evacuation-mitigation information to fail. This paper aims to explain the creation of the 3D user interface and the observation of user experience for educational purposes in the evacuation part of SIMIGAPI. The project uses a production method and a quantitative questionnaire to assess users' perspectives on the information SIMIGAPI conveys through its GUI.


2020 ◽  
Vol 6 (3) ◽  
pp. 127-130
Author(s):  
Max B. Schäfer ◽  
Kent W. Stewart ◽  
Nico Lösch ◽  
Peter P. Pott

Abstract Access to systems for robot-assisted surgery is limited due to high costs. To enable widespread use, numerous issues have to be addressed to improve and/or simplify their components. Current systems commonly use universal linkage-based input devices, and only a few application-oriented and specialized designs are used. A versatile virtual reality controller is proposed as an alternative input device for the control of a seven-degree-of-freedom articulated robotic arm. The real-time capabilities of the setup, replicating a system for robot-assisted teleoperated surgery, are investigated to assess suitability. Image-based assessment showed a considerable system latency of 81.7 ± 27.7 ms. However, due to its versatility, the virtual reality controller is a promising alternative to current input devices for research around medical telemanipulation systems.
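Latency figures of the form "81.7 ± 27.7 ms" are conventionally a mean and standard deviation over repeated per-trial measurements. As a minimal sketch only (the latency values below are hypothetical, not the authors' data, and this is not their measurement pipeline):

```python
import statistics

# Hypothetical per-trial latencies (ms) between a controller motion event
# and the corresponding robot-arm motion observed in camera images.
latencies_ms = [55.0, 70.0, 81.0, 95.0, 120.0, 68.0, 83.0]

mean_ms = statistics.mean(latencies_ms)
stdev_ms = statistics.stdev(latencies_ms)  # sample standard deviation

print(f"system latency: {mean_ms:.1f} \u00b1 {stdev_ms:.1f} ms")
```

Image-based assessment would additionally require extracting the two timestamps per trial from video frames; only the aggregation step is shown here.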


Author(s):  
Randall Spain ◽  
Jason Saville ◽  
Barry Lui ◽  
Donia Slack ◽  
Edward Hill ◽  
...  

Because advances in broadband capabilities will soon allow first responders to access and use many forms of data when responding to emergencies, it is becoming critically important to design heads-up displays that present first responders with information in a manner that does not induce extraneous mental workload or cause undue interaction errors. Virtual reality offers a unique medium for envisioning and testing user interface concepts in a realistic and controlled environment. In this paper, we describe a virtual reality-based emergency response scenario that was designed to support user experience research for evaluating the efficacy of intelligent user interfaces for firefighters. We describe the results of a usability test that captured firefighters’ feedback and reactions to the VR scenario and the prototype intelligent user interface that presented them with task-critical information through the VR headset. The paper concludes with lessons learned from our development process and a discussion of plans for future research.


PLoS ONE ◽  
2021 ◽  
Vol 16 (10) ◽  
pp. e0258103
Author(s):  
Andreas Bueckle ◽  
Kilian Buehling ◽  
Patrick C. Shih ◽  
Katy Börner

Working with organs and extracted tissue blocks is an essential task in many medical surgery and anatomy environments. In order to prepare specimens from human donors for further analysis, wet-bench workers must properly dissect human tissue and collect metadata for downstream analysis, including information about the spatial origin of tissue. The Registration User Interface (RUI) was developed to allow stakeholders in the Human Biomolecular Atlas Program (HuBMAP) to register tissue blocks—i.e., to record the size, position, and orientation of human tissue data with regard to reference organs. The RUI has been used by tissue mapping centers across the HuBMAP consortium to register a total of 45 kidney, spleen, and colon tissue blocks, with planned support for 17 organs in the near future. In this paper, we compare three setups for registering one 3D tissue block object to another 3D reference organ (target) object. The first setup is a 2D Desktop implementation featuring a traditional screen, mouse, and keyboard interface. The remaining setups are both virtual reality (VR) versions of the RUI: VR Tabletop, where users sit at a physical desk which is replicated in virtual space; VR Standup, where users stand upright while performing their tasks. All three setups were implemented using the Unity game engine. We then ran a user study for these three setups involving 42 human subjects completing 14 increasingly difficult and then 30 identical tasks in sequence and reporting position accuracy, rotation accuracy, completion time, and satisfaction. All study materials were made available in support of future study replication, alongside videos documenting our setups. 
We found that while VR Tabletop and VR Standup users are about three times as fast and about a third more accurate in terms of rotation than 2D Desktop users (for the sequence of 30 identical tasks), there are no significant differences between the three setups for position accuracy when normalized by the height of the virtual kidney across setups. When extrapolating from the 2D Desktop setup with a 113-mm-tall kidney, the absolute performance values for the 2D Desktop version (22.6 seconds per task, 5.88 degrees rotation, and 1.32 mm position accuracy after 8.3 tasks in the series of 30 identical tasks) confirm that the 2D Desktop interface is well-suited for allowing users in HuBMAP to register tissue blocks at a speed and accuracy that meets the needs of experts performing tissue dissection. In addition, the 2D Desktop setup is cheaper, easier to learn, and more practical for wet-bench environments than the VR setups.
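The height normalization described above amounts to expressing absolute position error as a fraction of the reference organ's height, so accuracies remain comparable when the virtual organ is rendered at different scales. A small illustrative sketch (the 113 mm kidney height and 1.32 mm accuracy are taken from the abstract; the function name is my own):

```python
# Height of the virtual kidney in the 2D Desktop setup, per the abstract.
KIDNEY_HEIGHT_MM = 113.0

def normalized_position_error(error_mm: float, object_height_mm: float) -> float:
    """Express a position error as a fraction of the reference-object height."""
    return error_mm / object_height_mm

# 1.32 mm absolute position accuracy reported for the 2D Desktop setup.
frac = normalized_position_error(1.32, KIDNEY_HEIGHT_MM)
print(f"normalized error: {frac:.4f} ({frac * 100:.2f}% of organ height)")
```

Comparing setups on this dimensionless quantity rather than raw millimetres is what allows the abstract's claim that position accuracy does not differ significantly across the three setups.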


2017 ◽  
Vol 23 (3) ◽  
pp. 39-46
Author(s):  
Kim Min Gyu ◽  
Jeon Chang Yu ◽  
Kim Jin-mo ◽  
Ji Won Lee
