Identification Accuracy and Efficiency of Haptic Virtual Objects Using Force-feedback

Author(s): Maik Stamm, M. Ercan Altinsoy, Sebastian Merchel

Author(s): Rasul Fesharakifard, Maryam Khalili, Laure Leroy, Alexis Paljic, Philippe Fuchs

A grasp exoskeleton actuated by a string-based platform is proposed to provide force feedback to a user's hand in human-scale virtual environments. The interface gives the user seven active degrees of freedom for interaction with virtual objects: three of translation, three of rotation, and one of grasping. The exoskeleton has a light, ergonomic structure and supports a five-finger grasp gesture. It is actuated by eight strings, which form the parallel arms of the platform. Each string is connected to a block comprising a motor, a rotary encoder, and a force sensor, with a novel design that provides the force and precision the interface requires. A hybrid control method based on the string tension measured by the force sensor is developed to address the common problems of string-based interfaces. The blocks can be moved along a cubic frame surrounding the virtual environment. Finally, results of preliminary experiments with the interface are presented to show its practical characteristics, and the interface is mounted on an automotive model to demonstrate its industrial applicability.
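
A minimal Python sketch of what such a tension-based hybrid control loop for a single string might look like; the gains, tension limits, and update rate below are illustrative assumptions, not values reported for the interface.

    # Tension-based control of one string: track a desired tension while keeping
    # the string taut and below its rated load. All numbers are assumptions.
    K_P = 40.0                 # proportional gain on tension error (assumed)
    K_I = 5.0                  # integral gain on tension error (assumed)
    T_MIN, T_MAX = 2.0, 60.0   # allowed tension range in newtons (assumed)
    DT = 0.001                 # 1 kHz control period (assumed)

    integral_error = 0.0

    def tension_control_step(desired_tension, measured_tension):
        """Return a motor torque command that drives the string toward the target tension."""
        global integral_error
        target = min(max(desired_tension, T_MIN), T_MAX)   # never slack, never overloaded
        error = target - measured_tension                   # measurement comes from the force sensor
        integral_error += error * DT
        return K_P * error + K_I * integral_error           # torque command to the motor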


1993, Vol 5 (1), pp. 79-84
Author(s): Haruhisa Kawasaki, Takahiro Hayashi

This paper presents a new force-feedback glove for the manipulation of virtual objects. The glove comprises a wire, link, servo motor, force sensor, and finger joint-angle sensor, all mounted on the back of the glove. The sense of grasping an object is generated by force-feedback control of the servo motor. We present the force-transmission characteristics of the glove and experimental results on recognizing differences in object rigidity.
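
The rigidity-recognition experiment suggests the familiar spring-model style of rendering, in which the commanded fingertip force grows with penetration into the virtual object. A minimal Python sketch under that assumption (the stiffness values are illustrative, not the paper's):

    # Spring-model rendering of object rigidity for a force-feedback glove.
    # Stiffness values are illustrative assumptions.
    STIFFNESS = {"soft": 200.0, "medium": 800.0, "rigid": 3000.0}  # N/m (assumed)

    def grasp_force(object_kind, penetration_m):
        """Force in newtons to command when a finger penetrates the object surface."""
        if penetration_m <= 0.0:
            return 0.0                              # no contact, no feedback force
        return STIFFNESS[object_kind] * penetration_m

    # Example: a 5 mm penetration into the "rigid" object commands 15 N.
    print(grasp_force("rigid", 0.005))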


2018
Author(s): Wenyan Bi, Jonathan Newport, Bei Xiao

ABSTRACT: We use a force-feedback device and a game engine to measure the effects of material appearance on the perceived mass of virtual objects. We find that perceived mass is mainly determined by the ground-truth mass output by the force-feedback device. Unlike the classic Material Weight Illusion (MWI), however, heavy-looking objects (e.g., steel) are consistently rated heavier than light-looking ones (e.g., fabric) of the same ground-truth mass. Analysis of the initial acceleration of the virtual probe's movement trajectories shows greater acceleration for materials rated as heavier. This effect diminishes when participants lift the object a second time, indicating that the influence of visual appearance on the movement trajectories disappears once it is calibrated by the force feedback. We also show how material categorization is affected by both the visual appearance and the weight of the object. We conclude that visual appearance has a significant interaction with haptic force feedback in the perception of mass and also affects the kinematics of how participants manipulate the object.

CCS CONCEPTS: Human-centered computing → Empirical studies in HCI; Empirical studies in interaction design; Empirical studies in visualization.

ACM Reference Format: Wenyan Bi, Jonathan Newport, and Bei Xiao. 2018. Interaction between static visual cues and force-feedback on the perception of mass of virtual objects. In Proceedings of. ACM, New York, NY, USA, 5 pages.
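
A minimal Python sketch of how the initial acceleration of a logged probe trajectory could be estimated; the sampling rate and analysis window here are assumptions, not parameters from the study.

    import numpy as np

    def initial_acceleration(positions, sample_rate=1000.0, window_s=0.1):
        """Mean acceleration magnitude over the first window_s seconds of a lift."""
        pos = np.asarray(positions, dtype=float)   # shape (N, 3), probe positions in metres
        dt = 1.0 / sample_rate
        vel = np.gradient(pos, dt, axis=0)         # numerical differentiation: velocity
        acc = np.gradient(vel, dt, axis=0)         # numerical differentiation: acceleration
        n = max(1, int(window_s * sample_rate))    # samples in the initial window
        return float(np.linalg.norm(acc[:n], axis=1).mean())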


1999, Vol 4 (1), pp. 8-17
Author(s): G. Jansson, H. Petrie, C. Colwell, D. Kornbrot, J. Fänger, ...

This paper is a fusion of two independent studies investigating related problems concerning the use of haptic virtual environments by blind people: a study in Sweden using a PHANToM 1.5 A and one in the U.K. using an Impulse Engine 3000. In general, such devices are a most interesting option for providing blind people with information about representations of the 3D world, but the restriction to only one point of contact between observer and virtual object at any moment might reduce their effectiveness. The studies investigated the perception of virtual textures, the identification of virtual objects, and the perception of their size and angles. Both sighted (blindfolded in one study) and blind people served as participants. It was found (1) that the PHANToM can effectively render textures in the form of sandpapers and simple 3D geometric forms, and (2) that the Impulse Engine can effectively render textures consisting of grooved surfaces, as well as 3D objects, although some properties of the latter were judged with over- or underestimation. When blind and sighted participants' performance was compared, differences were found that deserve further attention. In general, the haptic devices studied demonstrate the great potential of force-feedback devices for rendering relatively simple environments, in spite of the restricted ways in which they allow the virtual world to be explored. The results strongly motivate further studies of their effectiveness, especially in more complex contexts.


Author(s): Hugo I. Medellín-Castillo, Germánico González-Badillo, Eder Govea, Raquel Espinosa-Castañeda, Enrique Gallegos

Technological growth in recent years has led to the development of virtual reality (VR) systems able to immerse the user in a three-dimensional (3D) virtual environment where the user can interact in real time with virtual objects. This interaction has mainly been based on visualizing the virtual environment and objects. With the recent advent of haptic systems, however, interaction with the virtual world has been extended to feeling, touching, and manipulating virtual objects. Virtual reality has been successfully used to develop applications in areas ranging from basic and social sciences to education and entertainment. Likewise, the use of haptics has grown over the last decade in domains from science and engineering to art and entertainment. Despite many developments, there is still relatively little knowledge about the confluence of software, enabling hardware, and visual and haptic representations needed to create the conditions that best provide an immersive sensory environment for conveying information about a particular subject domain. In this paper, the state of the art of the research on virtual reality and haptic technologies carried out by the authors in recent years is presented. The aim is to demonstrate the potential of these technologies for developing usable systems for analysis and simulation in different areas of knowledge. The development of three systems, in engineering, medicine, and art, is presented. In engineering, a system for the planning, evaluation, and training of assembly and manufacturing tasks has been developed. The system, named HAMS (Haptic Assembly and Manufacturing System), can simulate assembly tasks with complex components, with force feedback provided by the haptic device. In medicine, a surgical simulator for planning and training orthognathic surgeries has been developed. The system, named VOSS (Virtual Osteotomy Simulator System), allows virtual osteotomies to be performed with force feedback. Finally, in art, an interactive cinema system for blind people has been developed. The system plays a 3D virtual movie that the blind user can listen to and touch by means of the haptic device. The development of these applications and the results obtained from them are presented and discussed in this paper.


Author(s): Conrad Bullion, Goktug A. Dazkir, Hakan Gurocak

In this paper we present details of a finger mechanism designed as part of ongoing research on a force-feedback glove. The glove will be used in virtual reality applications, where it will provide force feedback to the user as he or she grasps virtual objects. Haptic (touch and force) feedback is an essential component in making the simulated environment feel more realistic to the user. The design employs an innovative mechanism that wraps around each finger, and each mechanism is controlled by a single cable. By controlling the tension on the cable and its displacement, we can control the force applied to the user's finger at any given position of the mechanism. The glove can provide distributed forces on the bottom surface of each finger while reducing the number of actuators and sensors. First, kinematic and force analyses of the mechanism are presented, along with experimental verification. Following a description of an experiment to determine grasping forces, we conclude with an overview of the next steps in this research.
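
A minimal Python sketch of the tension-to-force relationship such a single-cable mechanism implies, assuming a simple moment-arm model; the geometry and limits below are illustrative assumptions, not the glove's actual parameters.

    import math

    PULLEY_RADIUS = 0.008   # m, effective radius at which the cable acts (assumed)
    LINK_LENGTH = 0.04      # m, joint-to-contact-point distance (assumed)
    MAX_TENSION = 50.0      # N, cable load limit (assumed)

    def cable_tension_for_force(fingertip_force_n, joint_angle_rad):
        """Cable tension needed to press on the finger with the requested force."""
        # Torque the joint must exert, then the tension that produces that torque.
        joint_torque = fingertip_force_n * LINK_LENGTH * math.cos(joint_angle_rad)
        tension = joint_torque / PULLEY_RADIUS
        return min(max(tension, 0.0), MAX_TENSION)

    # Example: 2 N on the finger pad with the joint flexed 30 degrees.
    print(cable_tension_for_force(2.0, math.radians(30)))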


Author(s): Jukka Kuusisto, Asko Ellman, Joonas Reunamo, Joonatan Kuosa

In mechanical engineering, hardware mock-ups are increasingly being replaced by virtual models. Virtual environments enable the testing of different designs with considerable savings in time and money. Haptic feedback helps the user form a realistic conception of cabin dimensions and of how different controls actually look and feel. The haptic interface must be convenient to use and give realistic feedback on the functioning of the controls. The haptic force-feedback glove "SPM Glove", with soft pneumatic muscles (SPMs) on the palm side, has been developed at the Department of Mechanics and Design at Tampere University of Technology. The glove provides force feedback to the thumb, index, and middle fingertips. In this paper, the usability of the SPM Glove for grasping, moving, and comparing the size of virtual objects is investigated. To obtain finger position information, the SPM Glove was worn over a data glove; hand position was tracked with a magnetic tracker. The results indicate that users find manipulating cylindrical objects easier, more comfortable, and more natural with the force feedback provided by the SPM Glove than without it. Moreover, all test users managed to arrange three invisible virtual cylinders of different sizes in order of increasing thickness using the SPM Glove.


Author(s): Rakesh Gupta, David Zeltzer

Abstract: This work investigates whether estimates of the ease of part handling and part insertion can be provided by multimodal simulation using virtual environment (VE) technology, rather than by conventional table-based methods such as Boothroyd and Dewhurst charts. To this end, a unified physically based model has been developed for modeling dynamic interactions among virtual objects and haptic interactions between the human designer and the virtual objects. This model is augmented with auditory events in a multimodal VE system called the "Virtual Environment for Design for Assembly" (VEDA). Currently these models are 2D in order to preserve interactive update rates, but we expect the results to generalize to 3D models. VEDA has been used to evaluate the feasibility and advantages of using multimodal virtual environments as a design tool for manual assembly. The designer sees a visual representation of the objects and can interactively sense and manipulate virtual objects through haptic interface devices with force feedback. He or she can feel these objects and hear sounds when objects collide. Objects can be interactively grasped and assembled with other parts of the assembly to prototype new designs and perform Design for Assembly analysis. Experiments with human subjects have been conducted to investigate whether multimodal virtual environments can replicate experiments linking increases in assembly time with increases in task difficulty. In particular, the effects of clearance, friction, chamfers, and distance of travel on handling and insertion time have been compared in real and virtual environments for a peg-in-hole assembly task. In addition, the effects of degrading or removing the different modes (visual, auditory, and haptic) on different phases of manual assembly have been examined.
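
A minimal Python sketch of the kind of penalty-based contact force a physically based haptic model might compute for the 2D peg-in-hole task; the stiffness, damping, and friction values are illustrative assumptions, not VEDA's parameters.

    # Penalty-based contact force for a 2D peg-in-hole simulation (assumed model).
    STIFFNESS = 2000.0    # N/m (assumed)
    DAMPING = 5.0         # N*s/m (assumed)
    FRICTION_MU = 0.3     # Coulomb friction coefficient (assumed)

    def contact_force(penetration_m, normal_velocity, tangential_velocity):
        """Normal and tangential (friction) force fed back through the haptic device."""
        if penetration_m <= 0.0:
            return 0.0, 0.0                         # no contact
        normal = STIFFNESS * penetration_m - DAMPING * normal_velocity
        normal = max(normal, 0.0)                   # contact can push but never pull
        if tangential_velocity > 0.0:
            friction = -FRICTION_MU * normal        # kinetic friction opposes sliding
        elif tangential_velocity < 0.0:
            friction = FRICTION_MU * normal
        else:
            friction = 0.0
        return normal, friction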

