Computer-Generated Virtual/Physical Reality: Blurring the Lines

Author(s):  
Harry C. Petersen ◽  
Andrzej Markowski ◽  
Paul Sullivan ◽  
Robert Petersen

Abstract As computers grow in their ability to access and process ever-larger blocks of data in real time, their capacity to generate virtual reality responses has multiplied exponentially. Simultaneously, their capability to use huge data files to control manufacturing processes, create rapid prototypes, augment human senses, and control vehicles and machines has given them the ability to control and even create physical reality. Computers can now blur the lines between virtual and physical realities in areas that include video manipulation, virtual reality with tactile feedback, and physical training devices such as flight simulators. This paper investigates types of computer-generated virtual/physical realities and their uses and implications for industry and consumers alike. Examples of the authors' research in video manipulation and training, solid modeling, animated simulation, manufacturing, rapid prototyping, and reverse engineering will be presented, along with database corruption and data-manipulation methods and problems. Finally, applications and future implications of this technology will be presented.

Author(s):  
S Leinster-Evans ◽  
J Newell ◽  
S Luck

This paper expands on the INEC 2016 paper ‘The future role of virtual reality within warship support solutions for the Queen Elizabeth Class aircraft carriers’, presented by Ross Basketter, Craig Birchmore and Abbi Fisher of BAE Systems in May 2016, and the EAAW VII paper ‘Testing the boundaries of virtual reality within ship support’, presented by John Newell of BAE Systems and Simon Luck of BMT DSL in June 2017. BAE Systems and BMT have developed a 3D walkthrough training system that supports the teams working closely with the QEC aircraft carriers in Portsmouth; this work was presented at EAAW VII and has since been extended to demonstrate the art of the possible on Type 26. This latter piece of work explores the role of 3D immersive environments in the development and fielding of support and training solutions across the range of support disciplines. The combined team is examining how this digital thread leads from the design of platforms, both surface and subsurface, through build and into in-service support and training. The paper proposes ways in which this rich data could be used across the whole lifecycle of the ship, from design and development (for spatial acceptance, HazID, etc.) through to operational support and maintenance (in conjunction with big data coming off the ship, coupled with digital technical documentation for maintenance procedures), using constantly developing technologies such as 3D visualisation, Virtual Reality, Augmented Reality and Mixed Reality. The drive towards gamification in the training environment, to keep younger recruits engaged and to shorten course lengths, is also explored. The paper develops the options and considers how this technology can be used and where the value proposition lies.


Sensors ◽  
2021 ◽  
Vol 21 (4) ◽  
pp. 1537
Author(s):  
Florin Covaciu ◽  
Adrian Pisla ◽  
Anca-Elena Iordan

Traditional systems used in physiotherapy rehabilitation are evolving towards more advanced systems that use virtual reality (VR) environments, so that the patient can perform various exercises interactively, improving motivation and reducing the therapist’s workload. This paper presents a VR simulator for an intelligent robotic system for physiotherapeutic rehabilitation of the ankle of a person who has had a stroke. The simulator interacts with a real human subject through an attached sensor containing a gyroscope and an accelerometer, which identifies the position and acceleration of foot movement on three axes. An electromyography (EMG) sensor is also attached to the patient’s leg muscles to measure muscle activity, since a patient in a worse condition exhibits weaker muscle activity. The data collected from the sensors are processed by an intelligent module that uses machine learning to create new exercise levels and to control the robotic rehabilitation structure in the virtual environment. As a result, the VR simulator has a low dependence on the therapist, which is its main improvement over other simulators created for this purpose.
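
To make the sensor-to-exercise pipeline concrete, the sketch below shows one plausible way an intelligent module could map IMU and EMG readings to a difficulty level. The abstract does not specify the model, feature set, or sensor interfaces, so every name, value, and the choice of classifier here is a hypothetical assumption, not the authors' implementation.

# Hypothetical sketch of the sensor-to-exercise pipeline described above.
# Feature layout, example values, and the classifier are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Each sample: [roll, pitch, yaw, ax, ay, az, emg_rms]
# Label: exercise difficulty level (0 = easiest ... 2 = hardest)
X_train = np.array([
    [2.0, 1.5, 0.5, 0.1, 0.1, 0.05, 0.02],   # weak foot movement, low EMG
    [10.0, 8.0, 3.0, 0.5, 0.4, 0.2, 0.15],   # moderate movement and EMG
    [25.0, 20.0, 8.0, 1.2, 1.0, 0.6, 0.40],  # strong movement, high EMG
])
y_train = np.array([0, 1, 2])

model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(X_train, y_train)

def next_exercise_level(imu_sample, emg_rms):
    """Map one 3-axis IMU reading (angles + accelerations) and the EMG RMS
    amplitude to a difficulty level for the VR ankle exercise."""
    features = np.array([[*imu_sample, emg_rms]])
    return int(model.predict(features)[0])

# Example: moderate foot mobility and moderate muscle activity
level = next_exercise_level([9.0, 7.5, 2.5, 0.45, 0.35, 0.18], 0.12)
print(f"Suggested exercise level: {level}")

In practice such a module would be trained on many labelled patient sessions and would also drive the robotic structure in the VR scene; the sketch only illustrates the classification step from sensor features to an exercise level.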


2007 ◽  
Vol 61 ◽  
pp. 379-391 ◽  
Author(s):  
Ralf A. Kockro ◽  
Axel Stadie ◽  
Eike Schwandt ◽  
Robert Reisch ◽  
Cleopatra Charalampaki ◽  
...  
