RaViS: Real-time accelerated View Synthesizer for immersive video 6DoF VR

2020 ◽  
Vol 2020 (13) ◽  
pp. 382-1-382-9
Author(s):  
Daniele Bonatto ◽  
Sarah Fachada ◽  
Gauthier Lafruit

MPEG-I, the upcoming standard for immersive video, has steadily explored immersive video technology for free-navigation applications, where any virtual viewpoint of the scene is created using Depth Image-Based Rendering (DIBR) from any number of stationary cameras positioned around the scene. This exploration has recently evolved towards a rendering pipeline using camera feeds, as well as a standard file format containing all the information needed to synthesize a virtual viewpoint of a scene. We present an acceleration of our Reference View Synthesis software (RVS) that enables real-time rendering of novel views in a head-mounted display, hence supporting virtual reality (VR) with 6 Degrees of Freedom (6DoF), including motion parallax within a restricted viewing volume. In this paper, we explain its main engineering challenges.
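The core DIBR warping step the abstract refers to can be sketched as follows. This is a minimal, illustrative reprojection of a single pixel between two calibrated views, not the authors' RVS implementation; the function name and parameters are assumptions.

```python
import numpy as np

def warp_pixel(u, v, depth, K_src, K_dst, R, t):
    """Reproject one source-view pixel into a target view using its
    depth: back-project, transform, re-project (the basic DIBR warp)."""
    # Back-project the pixel to a 3D point in the source camera frame.
    p = depth * (np.linalg.inv(K_src) @ np.array([u, v, 1.0]))
    # Transform the point into the target camera frame.
    q = R @ p + t
    # Project into the target image plane and dehomogenize.
    uv = K_dst @ q
    return uv[:2] / uv[2]
```

With an identity relative pose the pixel maps to itself, which is a convenient sanity check; a real-time synthesizer applies this warp (plus blending and inpainting) densely over whole depth maps, typically on the GPU.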

Author(s):  
Monica Bordegoni ◽  
Mario Covarrubias ◽  
Giandomenico Caruso ◽  
Umberto Cugini

This paper presents a novel system that allows product designers to design, experience, and modify new shapes of objects, starting from existing ones. The system allows designers to acquire and reconstruct the 3D model of a real object and to visualize and physically interact with this model. In addition, the system allows designers to modify the shape through physical manipulation of the 3D model and to eventually print it using a 3D printing technology. The system is developed by integrating state-of-the-art technologies in the sectors of reverse engineering, virtual reality, and haptic technology. The 3D model of an object is reconstructed by scanning its shape by means of a 3D scanning device. Then, the 3D model is imported into the virtual reality environment, which is used to render the 3D model of the object through an immersive head-mounted display (HMD). The user can physically interact with the 3D model by using the desktop haptic strip for shape design (DHSSD), a servo-actuated developable metallic strip with 6 degrees of freedom, which reproduces cross-sectional curves of 3D virtual objects. The DHSSD device is controlled by means of hand gestures recognized by a Leap Motion sensor.
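Extracting a cross-sectional curve of a virtual object, as the DHSSD renders, amounts to intersecting the mesh with a cutting plane. The sketch below slices a triangle mesh with the plane z = z0; it is an illustrative minimal version, not the paper's actual pipeline, and all names are assumptions.

```python
import numpy as np

def cross_section(vertices, triangles, z0):
    """Return the line segments where a triangle mesh crosses the
    plane z = z0 (one segment per triangle that straddles the plane)."""
    segments = []
    for tri in triangles:
        pts = []
        for i in range(3):
            a = vertices[tri[i]]
            b = vertices[tri[(i + 1) % 3]]
            da, db = a[2] - z0, b[2] - z0
            if da * db < 0:  # this edge crosses the plane
                s = da / (da - db)          # interpolation parameter
                pts.append(a + s * (b - a))  # intersection point
        if len(pts) == 2:
            segments.append((pts[0], pts[1]))
    return segments
```

In a full system these segments would be chained into an ordered polyline and sent as a target curve to the strip's servo actuators.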


2001 ◽  
Author(s):  
Taichi Shiiba ◽  
Yoshihiro Suda

Abstract In this paper, the authors propose to apply a full multibody-dynamics vehicle model to a driving simulator with a 6-degrees-of-freedom motion system. With this proposal, the characteristics of the driving simulator become very similar to those of actual automobiles, making it possible to predict vehicle dynamics performance and riding comfort through subjective driving tests without prototyping an automobile. To achieve the real-time calculation that a driving simulator requires, the authors propose an approximated real-time analysis method. With this method, real-time vehicle analysis with a 2 ms numerical-integration time step is achieved for a vehicle model with 91 degrees of freedom.
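The fixed-step real-time integration the abstract describes can be illustrated on a much smaller model. The sketch below advances a 2-DoF quarter-car model with semi-implicit Euler at the paper's 2 ms step; it is not the authors' 91-DoF multibody formulation, and all parameter values are illustrative.

```python
import numpy as np

DT = 0.002  # 2 ms integration step, as in the paper

def quarter_car_step(state, m_s=300.0, m_u=40.0, k_s=20000.0,
                     c_s=1500.0, k_t=180000.0, road=0.0):
    """One semi-implicit Euler step of a quarter-car model.
    state = [z_s, v_s, z_u, v_u]: sprung/unsprung displacement and
    velocity; road is the road-profile height input."""
    z_s, v_s, z_u, v_u = state
    f_susp = k_s * (z_u - z_s) + c_s * (v_u - v_s)  # suspension force
    f_tire = k_t * (road - z_u)                      # tire force
    a_s = f_susp / m_s                # sprung-mass acceleration
    a_u = (f_tire - f_susp) / m_u    # unsprung-mass acceleration
    v_s += DT * a_s                  # update velocities first
    v_u += DT * a_u
    z_s += DT * v_s                  # then positions (semi-implicit)
    z_u += DT * v_u
    return np.array([z_s, v_s, z_u, v_u])
```

At equilibrium with a flat road the state stays at rest, which verifies the force balance; a real-time simulator must guarantee that each such step completes well within the 2 ms budget.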


2020 ◽  
Vol 6 (1) ◽  
Author(s):  
Kevin Yu ◽  
Thomas Wegele ◽  
Daniel Ostler ◽  
Dirk Wilhelm ◽  
Hubertus Feußner

Abstract Telemedicine has become a valuable asset in emergency responses for assisting paramedics in decision making and first-contact treatment. Paramedics in unfamiliar environments or time-critical situations often encounter complications for which they require external advice. Modern ambulance vehicles are equipped with microphones, cameras, and vital sensors, which allow experts to remotely join the local team. However, the visual channels are rarely used, since the statically installed cameras only allow broad views of the patient. They allow neither a close-up view nor a dynamic viewpoint controlled by the remote expert. In this paper, we present EyeRobot, a concept that enables dynamic viewpoints for telepresence through the intuitive control of the user's head motion. In particular, EyeRobot utilizes the 6-degrees-of-freedom pose estimation capabilities of modern head-mounted displays and applies them in real time to the pose of a robot arm. A stereo camera, installed on the end-effector of the robot arm, serves as the eyes of the remote expert at the local site. We put forward an implementation of EyeRobot and present the results of our pilot study, which indicate that its control is intuitive.
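Mapping the HMD's 6-DoF pose onto the robot arm, as the abstract describes, can be sketched as a frame transform plus a workspace scale. This is a hedged illustration of the general idea, not the EyeRobot implementation; the offset transform and scale factor are assumptions.

```python
import numpy as np

def hmd_to_robot(T_hmd, T_offset, scale=1.0):
    """Map a 6-DoF HMD pose (4x4 homogeneous matrix) to a target pose
    for the robot end-effector. T_offset aligns the HMD tracking frame
    with the robot base frame; scale shrinks head translations to fit
    the robot workspace while preserving orientation."""
    T = T_offset @ T_hmd        # express the head pose in the robot frame
    T_cmd = T.copy()
    T_cmd[:3, 3] *= scale       # scale translation only
    return T_cmd
```

In a real-time loop this target pose would be fed to the arm's inverse-kinematics controller every tracking frame, with limits and filtering to keep the motion safe and smooth.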


2013 ◽  
Vol 3 (1) ◽  
pp. 71-78 ◽  
Author(s):  
Md. Mahbubar Rahman ◽  
Hiroshi Miki ◽  
Shinpei Sugimori ◽  
Yugo Sanada ◽  
Yasuyuki Toda
