MoveVR: Enabling Multiform Force Feedback in Virtual Reality using Household Cleaning Robot

Author(s):  
Yuntao Wang ◽  
Zichao (Tyson) Chen ◽  
Hanchuan Li ◽  
Zhengyi Cao ◽  
Huiyi Luo ◽  
...  
2018 ◽  
Vol 35 (2) ◽  
pp. 149-160 ◽  
Author(s):  
Mustufa H. Abidi ◽  
Abdulrahman M. Al-Ahmari ◽  
Ali Ahmad ◽  
Saber Darmoul ◽  
Wadea Ameen

Abstract: The design and verification of assembly operations is essential for planning product production operations. Recently, virtual prototyping has witnessed tremendous progress and has reached a stage where current environments enable rich, multi-modal interaction between designers and models through stereoscopic visuals, surround sound, and haptic feedback. This paper discusses the benefits of building and using Virtual Reality (VR) models for assembly process verification and presents the virtual assembly (VA) of an aircraft turbine engine. The assembly parts and sequences are explained using a virtual reality design system. The system provides stereoscopic visuals, surround sound, and rich, intuitive interaction with the developed models. A dedicated software architecture is proposed to describe the assembly parts and assembly sequence in VR. A collision detection mechanism provides visual feedback to check for interference between components. The system is tested on the virtual prototyping and assembly sequencing of a turbine engine. We show that the developed system is comprehensive in terms of VR feedback mechanisms, which include visual, auditory, tactile, and force feedback. The system is shown to be effective and efficient for validating assembly design, part design, and operations planning.
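The interference check described above can be illustrated with a minimal sketch. The abstract does not specify the algorithm used, so the example below assumes simple axis-aligned bounding-box (AABB) overlap tests between assembly parts; the Part class and check_interference function are hypothetical names introduced for illustration only.

```python
# Minimal sketch of AABB-based interference checking between assembly parts.
# Assumption: each part is approximated by an axis-aligned bounding box;
# the actual system may use a finer-grained collision detection scheme.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Part:
    name: str
    min_corner: Tuple[float, float, float]  # (x, y, z) lower bounds
    max_corner: Tuple[float, float, float]  # (x, y, z) upper bounds

def boxes_overlap(a: Part, b: Part) -> bool:
    """Return True if the bounding boxes of two parts interpenetrate."""
    return all(
        a.min_corner[i] < b.max_corner[i] and b.min_corner[i] < a.max_corner[i]
        for i in range(3)
    )

def check_interference(parts: List[Part]) -> List[Tuple[str, str]]:
    """List all pairs of parts whose bounding boxes collide."""
    collisions = []
    for i in range(len(parts)):
        for j in range(i + 1, len(parts)):
            if boxes_overlap(parts[i], parts[j]):
                collisions.append((parts[i].name, parts[j].name))
    return collisions

# Example: flag interference while positioning a blade next to the hub,
# which would trigger the visual feedback described in the abstract.
parts = [
    Part("hub", (0, 0, 0), (2, 2, 2)),
    Part("blade", (1.5, 0.5, 0.5), (3.5, 1.5, 1.5)),
]
print(check_interference(parts))  # [('hub', 'blade')]
```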


2004 ◽  
Vol 43 (1) ◽  
pp. 85-98 ◽  
Author(s):  
Jonathan D. French ◽  
James H. Mutti ◽  
Satish S. Nair ◽  
Michael Prewitt

Author(s):  
Michaela Huber ◽  
Uwe Katzky ◽  
Karolina Müller ◽  
Markus Blätzinger ◽  
Wolfgang Goetz ◽  
...  

2021 ◽  
Author(s):  
Yongsheng Zhou ◽  
Yaning Li ◽  
Hongqiang Ye ◽  
Siyu Wu ◽  
Xiaohan Zhao ◽  
...  

BACKGROUND Dental simulators are used in preclinical skills training, and virtual reality is their main underlying technology. With the development of XR technology, mixed reality (MR) has emerged and offers significant advantages over virtual reality. OBJECTIVE This study aimed to develop an MR- and haptics-based dental simulator for tooth preparation and to preliminarily evaluate its face validity. METHODS A prototype MR dental simulator for tooth preparation was developed by integrating a head-mounted display (HMD), dedicated force-feedback handles, a foot pedal, computer hardware, and a software program. Thirty-four participants were recruited and divided into a Novice group (N=17) and a Skilled group (N=17) based on their clinical experience. All participants prepared a maxillary right central incisor for an all-ceramic crown in the simulator and then completed a questionnaire assessing their experience with, and evaluation of, the simulator's hardware and software. RESULTS A prototype MR dental simulator for tooth preparation (Unidental MR Simulator) was developed. Of the participants, 73.53% were satisfied with the overall experience of using it. Over 90% agreed that the simulator stimulated their interest in learning, and over 80% were willing to use it for skills training in the future. Differences between the Novice and Skilled groups in their experience of the HMD, simulation of the dental instruments, realism of the force feedback of teeth, simulation of the tooth preparation process, overall experience of the simulator, and attitudes toward the simulator were not statistically significant (P>0.05). The Novice group was more satisfied with the simulator's ease of use (P<0.05). Ratings of the HMD resolution and of the simulation of the preparation process showed significant positive correlations with the overall user experience (P<0.05). CONCLUSIONS The newly developed tooth-preparation simulator, Unidental MR Simulator, shows good face validity. By allowing the patient's position to be adjusted, it achieves a closer match to the real clinical treatment environment, giving users a better dental skill training experience.
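The group comparisons and correlations reported in the RESULTS can be sketched as follows. The abstract does not name the statistical tests used, so this example assumes nonparametric tests suitable for Likert-type questionnaire ratings (a Mann-Whitney U test for the Novice vs. Skilled comparison and Spearman's rho for the correlation with overall experience); the variable names and data below are hypothetical.

```python
# Hypothetical sketch: comparing Novice vs. Skilled questionnaire ratings and
# correlating HMD-resolution ratings with overall-experience ratings.
# Assumes nonparametric tests for ordinal (Likert-type) data; the study's
# actual analysis pipeline is not described in the abstract.
import numpy as np
from scipy.stats import mannwhitneyu, spearmanr

rng = np.random.default_rng(0)
novice_ease = rng.integers(3, 6, size=17)    # ease-of-use ratings, 1-5 scale
skilled_ease = rng.integers(2, 5, size=17)

# Between-group comparison (two-sided Mann-Whitney U test).
u_stat, p_group = mannwhitneyu(novice_ease, skilled_ease, alternative="two-sided")
print(f"Ease of use, Novice vs. Skilled: U={u_stat:.1f}, P={p_group:.3f}")

# Correlation between HMD-resolution rating and overall-experience rating.
hmd_resolution = rng.integers(1, 6, size=34)
overall_exp = np.clip(hmd_resolution + rng.integers(-1, 2, size=34), 1, 5)
rho, p_corr = spearmanr(hmd_resolution, overall_exp)
print(f"HMD resolution vs. overall experience: rho={rho:.2f}, P={p_corr:.3f}")
```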


2019 ◽  
Vol 121 (4) ◽  
pp. 1398-1409 ◽  
Author(s):  
Vonne van Polanen ◽  
Robert Tibold ◽  
Atsuo Nuruki ◽  
Marco Davare

Lifting an object requires precise scaling of fingertip forces based on a prediction of object weight. At object contact, a series of tactile and visual events arise that need to be rapidly processed online to fine-tune the planned motor commands for lifting the object. The brain mechanisms underlying multisensory integration serially at transient sensorimotor events, a general feature of actions requiring hand-object interactions, are not yet understood. In this study we tested the relative weighting between haptic and visual signals when they are integrated online into the motor command. We used a new virtual reality setup to desynchronize visual feedback from haptics, which allowed us to probe the relative contribution of haptics and vision in driving participants’ movements when they grasped virtual objects simulated by two force-feedback robots. We found that visual delay changed the profile of fingertip force generation and led participants to perceive objects as heavier than when lifts were performed without visual delay. We further modeled the effect of vision on motor output by manipulating the extent to which delayed visual events could bias the force profile, which allowed us to determine the specific weighting the brain assigns to haptics and vision. Our results show for the first time how visuo-haptic integration is processed at discrete sensorimotor events for controlling object-lifting dynamics and further highlight the organization of multisensory signals online for controlling action and perception. NEW & NOTEWORTHY Dexterous hand movements require rapid integration of information from different senses, in particular touch and vision, at different key time points as movement unfolds. The relative weighting between vision and haptics for object manipulation is unknown. We used object lifting in virtual reality to desynchronize visual and haptic feedback and find out their relative weightings. Our findings shed light on how rapid multisensory integration is processed over a series of discrete sensorimotor control points.
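The haptic-visual weighting described above can be illustrated with a minimal reliability-weighted cue-combination sketch. This is the generic maximum-likelihood integration model, not necessarily the authors' exact force-profile model; the weights, variances, and function names below are illustrative assumptions.

```python
# Minimal sketch of reliability-weighted integration of haptic and visual cues.
# Assumption: the combined estimate is a weighted average with weights inversely
# proportional to each cue's variance (standard MLE cue-combination model);
# the paper's actual model of force-profile biasing may differ.
def combine_cues(haptic_est, haptic_var, visual_est, visual_var):
    """Return the integrated estimate and the weight assigned to vision."""
    w_vision = (1.0 / visual_var) / (1.0 / visual_var + 1.0 / haptic_var)
    combined = w_vision * visual_est + (1.0 - w_vision) * haptic_est
    return combined, w_vision

# Example: a delayed visual lift-off event suggests a heavier object than the
# haptic signal does; the integrated estimate is biased toward vision in
# proportion to its relative reliability.
haptic_weight_estimate = 0.40   # kg, inferred from fingertip force rates
visual_weight_estimate = 0.55   # kg, biased by the delayed visual event
combined, w_v = combine_cues(haptic_weight_estimate, 0.01,
                             visual_weight_estimate, 0.03)
print(f"combined weight estimate: {combined:.2f} kg (vision weight {w_v:.2f})")
```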


2006 ◽  
Vol 84 (1) ◽  
pp. 11-18 ◽  
Author(s):  
P. Wang ◽  
A.A. Becker ◽  
I.A. Jones ◽  
A.T. Glover ◽  
S.D. Benford ◽  
...  
