Freehand Gesture and Tactile Interaction for Shape Design

Author(s):  
Monica Bordegoni ◽  
Mario Covarrubias ◽  
Giandomenico Caruso ◽  
Umberto Cugini

This paper presents a novel system that allows product designers to design, experience, and modify new shapes of objects, starting from existing ones. The system allows designers to acquire and reconstruct the 3D model of a real object and to visualize and physically interact with this model. In addition, the system allows designers to modify the shape through physical manipulation of the 3D model and eventually to print it using 3D printing technology. The system is developed by integrating state-of-the-art technologies from the sectors of reverse engineering, virtual reality, and haptics. The 3D model of an object is reconstructed by scanning its shape with a 3D scanning device. The 3D model is then imported into the virtual reality environment, which renders it through an immersive head-mounted display (HMD). The user can physically interact with the 3D model by using the desktop haptic strip for shape design (DHSSD), a 6-degrees-of-freedom servo-actuated developable metallic strip that reproduces cross-sectional curves of 3D virtual objects. The DHSSD device is controlled by means of hand gestures recognized by a Leap Motion sensor.
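The abstract does not detail the gesture-to-strip control algorithm, so the following is only a minimal, hypothetical sketch of the idea: a hand height tracked by a Leap-Motion-style sensor selects one cross-section of the scanned mesh, which is then resampled into a handful of control points for a strip-like display. All function names, tolerances, and the number of control points are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch, not the authors' implementation: a tracked hand height
# selects a cross-section of the scanned mesh and resamples it into control
# points for an actuated strip. Names and constants are illustrative.
import numpy as np

def select_cross_section(vertices: np.ndarray, hand_y: float, tol: float = 0.005) -> np.ndarray:
    """Collect mesh vertices lying in a thin slab around the plane y = hand_y."""
    section = vertices[np.abs(vertices[:, 1] - hand_y) < tol]
    # Order the points by angle around their centroid so they form a closed curve.
    centre = section[:, [0, 2]].mean(axis=0)
    angles = np.arctan2(section[:, 2] - centre[1], section[:, 0] - centre[0])
    return section[np.argsort(angles)]

def strip_control_points(curve: np.ndarray, n_points: int = 6) -> np.ndarray:
    """Resample the closed curve to one (x, z) target per strip control point."""
    idx = np.linspace(0, len(curve) - 1, n_points).astype(int)
    return curve[idx][:, [0, 2]]
```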

2021 ◽  
Vol 10 (5) ◽  
pp. 3546-3551
Author(s):  
Tamanna Nurai

Cybersickness remains a negative consequence that degrades the experience of users of virtual worlds created for Virtual Reality (VR). Various abnormalities can cause quantifiable changes in body awareness when donning a Head Mounted Display (HMD) in a Virtual Environment (VE). VR headsets provide a VE that matches the actual world and allows users to have a range of experiences. Motion sickness and simulator sickness measures give self-report assessments of cybersickness in VEs. In this study, a simulator sickness questionnaire is used to measure the after-effects of the virtual environment. This research aims to answer whether immersive VR induces cybersickness and impacts equilibrium coordination. The present research is designed as a cross-sectional observational analysis. According to the selection criteria, a total of 40 subjects would be recruited from AVBRH, Sawangi Meghe for the research. With the intervention included, the experiment lasted 6 months. The simulator sickness questionnaire is used to evaluate the after-effects of the virtual environment; it is administered at a single time point to measure motion sickness, while the equilibrium tests are evaluated twice, at exit and after 10 minutes. Virtual reality in video games is still in its infancy, and integrating gameplay action into the VR experience will require a significant amount of research and development. The study evaluated whether immersive VR induces cybersickness and impacts equilibrium coordination. Numerous scales have been developed to measure cybersickness, and the nature of cybersickness has been revealed through work on motion sickness in simulated systems.
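As a concrete illustration of how simulator sickness questionnaire responses are usually turned into scores, here is a minimal sketch that applies the conventional Kennedy et al. subscale weights to raw (unweighted) symptom sums; it is an assumption about standard SSQ scoring, not material taken from this study's protocol.

```python
# Illustrative SSQ scoring sketch using the conventional Kennedy et al. weights.
# Raw subscale sums are the unweighted sums of the symptom ratings (each 0-3)
# assigned to that subscale; the example participant data are placeholders.
def ssq_scores(nausea_raw: float, oculomotor_raw: float, disorientation_raw: float) -> dict:
    return {
        "nausea": nausea_raw * 9.54,
        "oculomotor": oculomotor_raw * 7.58,
        "disorientation": disorientation_raw * 13.92,
        "total": (nausea_raw + oculomotor_raw + disorientation_raw) * 3.74,
    }

# Example: a participant with raw subscale sums of 3, 4, and 2.
print(ssq_scores(3, 4, 2))  # nausea 28.62, oculomotor 30.32, disorientation 27.84, total 33.66
```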


2021 ◽  
Author(s):  
Manpreet Kaur Bhamra ◽  
Waqar M. Naqvi ◽  
Sakshi P. Arora

Abstract Introduction: Anxiety disorders affect a large proportion of the population all over the world, prohibiting them from undertaking everyday tasks such as driving, staying in crowded places, or dealing with strangers. The Hamilton Anxiety (HAM-A) scale was one of the first rating questionnaires developed to assess the severity of anxiety symptoms. HAM-A is a 14-item, clinician-administered questionnaire that has also been used as a self-scored survey covering both physical and psychological symptoms. Questionnaires for analysing depressive or anxious symptoms have been developed and tested in medical practice with great success. Virtual Reality (VR) is a computer-simulated world that allows users to feel as if they are physically present in it. The Oculus Rift is a ski-mask-shaped VR goggle; gaining a better and deeper understanding of its range and user experience will help to guide future efforts. Method: The cross-sectional observational study will include 70 participants aged 18 to 32 from Ravi Nair College of Physiotherapy, India. With the intervention included, the duration of the study will be 6 months. The HAM-A scale is used to evaluate participants' anxiety symptoms before they put on the Oculus Rift. Discussion: The study will evaluate the severity of anxiety before entering the VR surroundings. Virtual reality devices are increasingly popular, and many studies have addressed the construction and validation of interfaces, but research on anxiety before entering a virtual reality environment has been limited; in particular, there are only a few techniques that may be used to measure anxiety in a virtual reality setting. The Institutional Ethical Clearance reference number for this study is RNPC/IEC/2020-21/0012.
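For context on how the HAM-A total is typically computed, here is a minimal sketch: the 14 items are each rated 0 to 4, giving a total of 0 to 56. The severity bands used below are commonly cited cut-offs, not values taken from this protocol.

```python
# Minimal HAM-A scoring sketch. Items are rated 0-4; severity bands below are
# commonly cited cut-offs (assumption), not taken from the study protocol.
def hamilton_anxiety_score(item_ratings: list[int]) -> tuple[int, str]:
    assert len(item_ratings) == 14 and all(0 <= r <= 4 for r in item_ratings)
    total = sum(item_ratings)
    if total <= 17:
        severity = "mild"
    elif total <= 24:
        severity = "mild to moderate"
    elif total <= 30:
        severity = "moderate to severe"
    else:
        severity = "severe"
    return total, severity

print(hamilton_anxiety_score([2, 1, 3, 0, 2, 1, 2, 1, 1, 0, 2, 1, 2, 1]))  # (19, 'mild to moderate')
```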


2018 ◽  
Author(s):  
Yoshihito Masuoka ◽  
Hiroyuki Morikawa ◽  
Takashi Kawai ◽  
Toshio Nakagohri

BACKGROUND Virtual reality (VR) technology has started to gain attention as a form of surgical support in medical settings. Likewise, the widespread use of smartphones has resulted in the development of various medical applications; for example, Google Cardboard, which can be used to build simple head-mounted displays (HMDs). However, because of the absence of observed and reported outcomes of the use of three-dimensional (3D) organ models in relevant environments, we have yet to determine the effects of or issues with the use of such VR technology. OBJECTIVE The aim of this paper was to study the issues that arise while observing a 3D model of an organ that is created based on an actual surgical case through the use of a smartphone-based simple HMD. Upon completion, we evaluated and gathered feedback on the performance and usability of the simple observation environment we had created. METHODS We downloaded our data to a smartphone (Galaxy S6; Samsung, Seoul, Korea) and created a simple HMD system using Google Cardboard (Google). A total of 17 medical students performed 2 experiments: an observation conducted by a single observer and another one carried out by multiple observers using a simple HMD. Afterward, they assessed the results by responding to a questionnaire survey. RESULTS We received a largely favorable response in the evaluation of the dissection model, but also a low score because of visually induced motion sickness and eye fatigue. In an introspective report on simultaneous observations made by multiple observers, positive opinions indicated clear image quality and shared understanding, but displeasure caused by visually induced motion sickness, eye fatigue, and hardware problems was also expressed. CONCLUSIONS We established a simple system that enables multiple persons to observe a 3D model. Although the observation conducted by multiple observers was successful, problems likely arose because of poor smartphone performance. Therefore, smartphone performance improvement may be a key factor in establishing a low-cost and user-friendly 3D observation environment.


2021 ◽  
pp. 1-17
Author(s):  
Iqra Arshad ◽  
Paulo De Mello ◽  
Martin Ender ◽  
Jason D. McEwen ◽  
Elisa R. Ferré

Abstract Despite the technological advancements in Virtual Reality (VR), users are constantly combating feelings of nausea and disorientation, the so-called cybersickness. Cybersickness symptoms cause severe discomfort and hinder the immersive VR experience. Here we investigated cybersickness in 360-degree head-mounted display VR. In traditional 360-degree VR experiences, translational movement in the real world is not reflected in the virtual world, and therefore self-motion information is not corroborated by matching visual and vestibular cues, which may trigger symptoms of cybersickness. We evaluated whether a new Artificial Intelligence (AI) software designed to supplement the 360-degree VR experience with artificial six-degrees-of-freedom motion may reduce cybersickness. Explicit (simulator sickness questionnaire and Fast Motion Sickness (FMS) rating) and implicit (heart rate) measurements were used to evaluate cybersickness symptoms during and after 360-degree VR exposure. Simulator sickness scores showed a significant reduction in feelings of nausea during the AI-supplemented six-degrees-of-freedom motion VR compared to traditional 360-degree VR. However, six-degrees-of-freedom motion VR did not reduce oculomotor or disorientation measures of sickness. No changes were observed in FMS and heart rate measures. Improving the congruency between visual and vestibular cues in 360-degree VR, as provided by the AI-supplemented six-degrees-of-freedom motion system considered, is essential for a more engaging, immersive and safe VR experience, which is critical for educational, cultural and entertainment applications.
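The within-subject comparison described above (nausea sub-scores under traditional 360-degree VR versus the AI-supplemented six-degrees-of-freedom condition) can be illustrated with a short sketch; the arrays below are placeholder values, not the study's data, and the paired t-test is only one reasonable choice of analysis.

```python
# Illustrative within-subjects comparison of SSQ nausea sub-scores between the
# traditional 360-degree condition and the 6-DoF condition. Placeholder data.
import numpy as np
from scipy import stats

nausea_360 = np.array([28.6, 38.2, 19.1, 47.7, 28.6])    # placeholder scores
nausea_6dof = np.array([19.1, 28.6, 9.5, 38.2, 19.1])     # placeholder scores

t_stat, p_value = stats.ttest_rel(nausea_360, nausea_6dof)
print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}")
```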


2020 ◽  
Vol 6 (1) ◽  
Author(s):  
Kevin Yu ◽  
Thomas Wegele ◽  
Daniel Ostler ◽  
Dirk Wilhelm ◽  
Hubertus Feußner

Abstract Telemedicine has become a valuable asset in emergency response, assisting paramedics in decision making and first-contact treatment. Paramedics in unfamiliar environments or time-critical situations often encounter complications for which they require external advice. Modern ambulance vehicles are equipped with microphones, cameras, and vital-sign sensors, which allow experts to join the local team remotely. However, the visual channels are rarely used, since the statically installed cameras only allow broad views of the patient; they provide neither a close-up view nor a dynamic viewpoint controlled by the remote expert. In this paper, we present EyeRobot, a concept that enables dynamic viewpoints for telepresence through the intuitive control of the user's head motion. In particular, EyeRobot utilizes the 6-degrees-of-freedom pose estimation capabilities of modern head-mounted displays and applies the estimated pose in real time to a robot arm. A stereo camera, installed on the end-effector of the robot arm, serves as the eyes of the remote expert at the local site. We put forward an implementation of EyeRobot and present the results of our pilot study, which indicate that its control is intuitive.
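The core loop described in the abstract (HMD pose streamed to the robot arm carrying the stereo camera) can be sketched as follows. The HMD and robot interfaces (read_hmd_pose, send_target_pose) and the workspace limits are hypothetical placeholders, not a real SDK or the authors' code.

```python
# Conceptual sketch of the EyeRobot idea: mirror the remote expert's head motion
# onto the robot end-effector. read_hmd_pose and send_target_pose are hypothetical
# placeholders; workspace limits are assumptions.
import time
import numpy as np

WORKSPACE_MIN = np.array([-0.3, -0.3, 0.2])   # assumed reachable volume (metres)
WORKSPACE_MAX = np.array([0.3, 0.3, 0.8])

def control_loop(read_hmd_pose, send_target_pose, rate_hz: float = 60.0):
    """Stream the HMD's 6-DoF pose to the robot arm at a fixed rate."""
    period = 1.0 / rate_hz
    while True:
        position, orientation_quat = read_hmd_pose()                  # pose from the HMD
        position = np.clip(position, WORKSPACE_MIN, WORKSPACE_MAX)    # stay inside workspace
        send_target_pose(position, orientation_quat)                  # robot follows the head
        time.sleep(period)
```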


Author(s):  
Bosede Iyiade Edwards ◽  
Kevin S. Bielawski ◽  
Rui F. Prada ◽  
Adrian David Cheok

Human-Computer Interaction, including technology-aided instruction, is beginning to focus on virtual reality (VR) technology due to its ability to support immersive learning, teaching through simulation, and gamification of learning. These systems can deliver high-level multisensory learning experiences that are important in the teaching of many subjects, especially those involving abstract concepts or requiring spatial skills, such as organic chemistry. Haptic experiences with VR, however, remain a challenge. In addition, development has focused on general entertainment/gaming; VR systems in chemistry implement simulations of the chemistry laboratory and other advanced systems, whereas those that support safe, game-like, immersive, and multisensory learning of organic chemistry with haptics at pre-university education levels are scarce. We developed the VR Multisensory Classroom (VRMC) as an immersive learning environment within a VR head-mounted display, where learners employ hand movements to build hydrocarbon molecules and experience haptic feedback through gloves with built-in sensors and hand-tracking with the Leap Motion system. We report here the evaluation of the first prototype by learners from diverse backgrounds, who reported on the ability of the VRMC to support high engagement, motivation, interest, and organic chemistry learning, as well as diverse learning styles. The VRMC is a novel VR classroom that supports immersive learning in molecular organic chemistry with haptics for multisensory learning.
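A molecule-building exercise of the kind described needs some rule checking behind the scenes; the sketch below (not the authors' code) shows one way to verify that a hydrocarbon assembled by hand gestures respects carbon and hydrogen valences. Data structures and names are illustrative assumptions.

```python
# Minimal valence check for gesture-built hydrocarbons (illustrative, not the
# VRMC implementation): carbon forms at most 4 bonds, hydrogen at most 1.
MAX_BONDS = {"C": 4, "H": 1}

def valences_ok(atoms: dict, bonds: list) -> bool:
    """atoms: {atom_id: element}; bonds: list of (atom_id, atom_id) pairs."""
    counts = {atom_id: 0 for atom_id in atoms}
    for a, b in bonds:
        counts[a] += 1
        counts[b] += 1
    return all(counts[i] <= MAX_BONDS[atoms[i]] for i in atoms)

# Methane: one carbon bonded to four hydrogens.
atoms = {0: "C", 1: "H", 2: "H", 3: "H", 4: "H"}
bonds = [(0, 1), (0, 2), (0, 3), (0, 4)]
print(valences_ok(atoms, bonds))  # True
```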


2019 ◽  
Vol 47 (4) ◽  
pp. 513-522 ◽  
Author(s):  
Tilanka Chandrasekera ◽  
Kinkini Fernando ◽  
Luis Puig

The purpose of this research was to explore the use of virtual reality (VR) in early design studios. In this research project, two different types of Head-Mounted Display (HMD) systems were used: one HMD provided six degrees of freedom and the other provided three degrees of freedom. The research findings provide a comparison of the functionality of the different types of HMDs and of the sense of presence in VR environments. Sense of presence is defined as the sense of "being there" in a computer-simulated environment. The outcomes of this research are (a) development of a new presence questionnaire that focuses on newer VR systems and (b) understanding of student perceptions of using VR in design projects.


2020 ◽  
Vol 2020 (13) ◽  
pp. 382-1-382-9
Author(s):  
Daniele Bonatto ◽  
Sarah Fachada ◽  
Gauthier Lafruit

MPEG-I, the upcoming standard for immersive video, has steadily explored immersive video technology for free-navigation applications, where any virtual viewpoint of the scene is created using Depth Image-Based Rendering (DIBR) from any number of stationary cameras positioned around the scene. This exploration has recently evolved towards a rendering pipeline using camera feeds, as well as a standard file format containing all the information needed to synthesize a virtual viewpoint of a scene. We present an acceleration of our Reference View Synthesis software (RVS) that enables real-time rendering of novel views in a head-mounted display, hence supporting virtual reality (VR) with 6 Degrees of Freedom (6DoF), including motion parallax within a restricted viewing volume. In this paper, we explain its main engineering challenges.
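For readers unfamiliar with DIBR, the geometric core of the technique the abstract refers to can be sketched in a few lines: a reference-view pixel is lifted into 3D using its depth and the camera intrinsics, then reprojected into the virtual viewpoint. Real RVS adds blending, inpainting, and GPU acceleration; the sketch below shows only the per-pixel warping idea, with all symbols supplied by the caller.

```python
# Schematic DIBR warping step: back-project a reference pixel with its depth,
# transform it into the virtual camera frame, and project it again.
import numpy as np

def reproject_pixel(u, v, depth, K_ref, pose_ref_to_virt, K_virt):
    """Map pixel (u, v) with metric depth from the reference camera to the virtual camera."""
    # Back-project to a 3D point in the reference camera frame.
    point_ref = depth * np.linalg.inv(K_ref) @ np.array([u, v, 1.0])
    # Transform into the virtual camera frame (4x4 rigid transform).
    point_virt = (pose_ref_to_virt @ np.append(point_ref, 1.0))[:3]
    # Project with the virtual camera intrinsics.
    uvw = K_virt @ point_virt
    return uvw[:2] / uvw[2]
```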


2021 ◽  
Vol 2 ◽  
Author(s):  
Julie Madelen Madshaven ◽  
Tonje Fjeldstad Markseth ◽  
David Bye Jomås ◽  
Ghislain Maurice Norbert Isabwe ◽  
Morten Ottestad ◽  
...  

Virtual reality (VR) technology is a promising tool in physical rehabilitation. Research indicates that VR-supported rehabilitation is beneficial for task-specific training, multi-sensory feedback, diversified rehabilitation tasks, and patient motivation. Our first goal was to create a biomechatronics laboratory with a VR setup for increasing immersion and a motion platform to provide realistic feedback to patients. The second goal was to investigate the possibility of replicating features of the biomechatronics laboratory in a home-based training system using commercially available components. The laboratory comprises a 6-degrees-of-freedom motion platform (Rexroth eMotion) fitted with a load-cell-integrated treadmill, and an Oculus Quest virtual reality headset. The load cells provide input for data collection as well as for VR motion control. The home-based rehabilitation system consists of a Nintendo Wii Balance Board and an Oculus Rift virtual reality headset. User studies in the laboratory and home environments used direct observation techniques and self-reported attitudinal research methods to assess the solution's usability and user experience. The findings indicate that the proposed VR solution is feasible. Participants using the home-based system experienced more cybersickness and imbalance than those using the biomechatronics laboratory solution. Future studies will look at a setup that is safe for first patient studies, and at exercises to improve the diagnosis of patients and their progress during rehabilitation.
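One common way that load-cell readings (from a treadmill plate or a balance board) are turned into VR motion control is via a centre-of-pressure estimate, as in the hedged sketch below. The sensor layout and plate dimensions are assumptions, not the laboratory's actual values.

```python
# Illustrative centre-of-pressure estimate from four load cells; leaning shifts
# the CoP, which can drive avatar motion in VR. Dimensions are assumptions.
def centre_of_pressure(front_left, front_right, back_left, back_right,
                       width=0.5, depth=0.3):
    """Return (x, y) CoP in metres relative to the plate centre."""
    total = front_left + front_right + back_left + back_right
    if total <= 0:
        return 0.0, 0.0
    x = ((front_right + back_right) - (front_left + back_left)) / total * (width / 2)
    y = ((front_left + front_right) - (back_left + back_right)) / total * (depth / 2)
    return x, y

# Leaning forward and slightly right shifts the CoP forward and to the right.
print(centre_of_pressure(20.0, 30.0, 10.0, 15.0))
```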


2019 ◽  
Vol 26 (3) ◽  
pp. 359-370 ◽  
Author(s):  
Maurizio Vertemati ◽  
Simone Cassin ◽  
Francesco Rizzetto ◽  
Angelo Vanzulli ◽  
Marco Elli ◽  
...  

Introduction. With the availability of low-cost head-mounted displays (HMDs), virtual reality environments (VREs) are increasingly being used in medicine for teaching and clinical purposes. Our aim was to develop an interactive, user-friendly VRE for three-dimensional visualization of patient-specific organs, establishing a workflow to transfer 3-dimensional (3D) models from imaging datasets to our immersive VRE. Materials and Methods. This original VRE model was built using open-source software and a mobile HMD, the Samsung Gear VR. For its validation, we enrolled 33 volunteers: morphologists (n = 11), trainee surgeons (n = 15), and expert surgeons (n = 7). They tried our VRE and then filled in an original 6-item, 5-point Likert-type questionnaire covering the following parameters: ease of use, anatomy comprehension compared with 2D radiological imaging, explanation of anatomical variations, explanation of surgical procedures, preoperative planning, and experience of gastrointestinal/neurological disorders. Results in the 3 groups were statistically compared using analysis of variance. Results. Using cross-sectional medical imaging, the developed VRE allowed a 3D patient-specific abdominal scene to be visualized within 1 hour. Overall, the 6 items were evaluated positively by all groups; only anatomy comprehension differed significantly among the 3 groups. Conclusions. Our approach, based on open-source software and mobile hardware, proved to be a valid and well-appreciated system for visualizing 3D patient-specific models, paving the way for a potential new tool for teaching and preoperative planning.
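A typical imaging-to-VR workflow of the kind the abstract describes extracts a surface mesh from a segmented volume and exports it in a format a mobile viewer can load. The sketch below uses standard numpy and scikit-image calls; the threshold, spacing, and file name are illustrative, and the actual pipeline used by the authors is not specified in the abstract.

```python
# Hedged sketch of an imaging-to-VR mesh export: marching cubes on a segmented
# volume, then a plain OBJ file. Threshold and spacing values are illustrative.
import numpy as np
from skimage import measure

def volume_to_obj(volume: np.ndarray, level: float, voxel_spacing, out_path: str):
    """Extract an iso-surface from a segmented volume and save it as an OBJ mesh."""
    verts, faces, _normals, _values = measure.marching_cubes(
        volume, level=level, spacing=voxel_spacing)
    with open(out_path, "w") as f:
        for vx, vy, vz in verts:
            f.write(f"v {vx} {vy} {vz}\n")
        for a, b, c in faces + 1:  # OBJ indices are 1-based
            f.write(f"f {a} {b} {c}\n")

# Example call with an already-segmented organ mask (illustrative names):
# volume_to_obj(organ_mask.astype(float), level=0.5,
#               voxel_spacing=(1.0, 1.0, 1.0), out_path="organ_model.obj")
```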

