An Experiment on Weight Sensation in Real and Virtual Environments

Author(s):  
Adam S. Coutee
Bert Bras

Virtual reality allows users to visualize and interact with a three-dimensional world in a computer-generated environment. Haptic technology has enhanced these environments by adding the sense of touch through force and tactile feedback devices. In the engineering domain, these devices have been applied in many areas, including product design. We have developed a real-time simulation test bed to assess the usefulness of haptic technology for assembly and disassembly planning. In this paper, we present a study conducted to characterize the perception of weight in this virtual environment. Specifically, the experiments test a user's ability to distinguish weight differences between two objects in real and virtual environments. This paper describes the experiments conducted and an analysis of the results.
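The two-object comparison task maps naturally onto Weber's law, under which the just-noticeable weight difference scales with the reference weight. A minimal sketch of that decision rule (the Weber fraction here is an illustrative placeholder, not a value measured in the paper):

```python
def is_discriminable(w_ref, w_test, weber_fraction=0.1):
    """Return True if the weight difference exceeds the just-noticeable
    difference (JND) predicted by Weber's law.

    weber_fraction is illustrative; classic lifted-weight studies report
    fractions of roughly 0.05-0.10, and experiments like the paper's ask
    how this threshold shifts between real and virtual conditions.
    """
    jnd = weber_fraction * w_ref
    return abs(w_test - w_ref) > jnd
```

Comparing discrimination thresholds fitted in each environment against a baseline like this is one way to quantify how faithfully the haptic rendering conveys weight.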

2010
pp. 180-193
Author(s):
F. Steinicke
G. Bruder
J. Jerald
H. Frenz

In recent years virtual environments (VEs) have become increasingly popular and widespread, driven by the requirements of numerous application areas, particularly 3D city visualization. Virtual reality (VR) systems, which make use of tracking technologies and stereoscopic projections of three-dimensional synthetic worlds, support better exploration of complex datasets. However, because the interaction space is usually limited by the range of the tracking sensors, users can explore only a portion of the virtual environment (VE). Redirected walking allows users to walk through large-scale immersive virtual environments (IVEs), such as virtual city models, while physically remaining in a reasonably small workspace, by intentionally injecting scene motion into the IVE. With redirected walking, users are guided on physical paths that may differ from the paths they perceive in the virtual world. The authors have conducted experiments to quantify how much humans can be redirected without noticing. In this chapter they present the results of this study and the implications for virtual locomotion user interfaces that allow users to explore arbitrary real-world locations virtually before actually traveling there.
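Redirected walking is commonly implemented by applying small gains to the user's tracked motion. A minimal sketch of two standard manipulations, rotation gain and curvature gain (the gain values are illustrative; determining which values go unnoticed is exactly what detection-threshold experiments like these measure):

```python
import math

def redirect_rotation(user_delta_yaw, rotation_gain=1.2):
    """Map a physical head rotation onto a scaled virtual rotation.

    With a gain > 1 the user turns less in the real workspace than in
    the virtual city model; gains close to 1 tend to go unnoticed.
    """
    return rotation_gain * user_delta_yaw

def redirect_translation(step_vector, curvature_radians_per_meter):
    """Bend a straight virtual path onto a curved physical path by
    injecting a small rotation per meter walked (curvature gain)."""
    x, y = step_vector
    a = curvature_radians_per_meter * math.hypot(x, y)
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a))
```

Applied every frame, gains this small accumulate into large differences between the physical and virtual paths, which is what keeps the user inside the tracked workspace.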


2005
Vol 32 (5)
pp. 777-785
Author(s):
Ebru Cubukcu
Jack L Nasar

Discrepancies between perceived and actual distance may affect people's spatial behavior. In a previous study, Nasar, using self-reports of behavior, found that segmentation (measured by the number of buildings) along a route affected the choice of parking garage and the path from the garage to a destination. We recreated that same environment as a three-dimensional virtual environment and conducted a test to see whether the same factors emerged under these more controlled conditions, and whether spatial behavior in the virtual environment accurately reflected behavior in the real environment. The results confirmed similar patterns of response in the virtual and real environments. This supports the use of virtual reality as a tool for predicting behavior in the real world and confirms that increases in segmentation are associated with increases in perceived distance.


Author(s):  
You Wu
Lara Schmidt
Matthew Parker
John Strong
Michael Bruns
...  

We present a novel, low-power, untethered pneumatic haptic device, the ACTIVE-Hand, for realistic, real-time 3D gaming experiences. Currently, body-motion-based 3D gaming systems rely primarily on visual feedback to provide partly immersive gaming experiences. Tactile feedback systems in virtual reality provide immersion with high tactile resolution, but they are expensive and difficult to set up and calibrate. The economical, modular design of the ACTIVE-Hand allows tactile feedback to be easily configured to application requirements. In contrast to commercial systems such as the Wii™, which provide global vibrations as a proxy for synthetic tactile feedback, the ACTIVE-Hand is comparably lightweight yet scalable to meet localized tactile resolution requirements. The ACTIVE-Hand provides controllable pulses for dynamic virtual interactions such as pressing virtual buttons and hitting moving virtual balls. We demonstrate the paradigm of dynamic tactile interaction in virtual environments through a 3D Pong game that integrates the ACTIVE-Hand with the Kinect™ camera.


2006
Vol 5-6
pp. 55-62
Author(s):
I.A. Jones
A.A. Becker
A.T. Glover
P. Wang
S.D. Benford
...  

Boundary element (BE) analysis is well known as a tool for assessing the stiffness and strength of engineering components. Along with finite element (FE) techniques, however, it is also finding new applications as a means of simulating the behaviour of deformable objects within virtual reality simulations, since it exploits precisely the same kind of surface-only definition used for the visual rendering of three-dimensional solid objects. This paper briefly reviews existing applications of BE and FE within virtual reality, and describes recent work on the BE-based simulation of aspects of surgical operations on the brain, making use of commercial hand-held force-feedback interfaces (haptic devices) to measure the positions of the virtual surgical tools and to provide tactile feedback to the user. The paper presents an overview of the project and then concentrates on recent developments, including the incorporation of simulated tumours in the virtual brain.
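The appeal of the surface-only formulation is that the discretized problem reduces to a dense linear system K u = f posed on surface nodes alone. A hypothetical toy sketch of that mapping for a two-node contact patch (the stiffness values are stand-ins for illustration, not the paper's brain model):

```python
def solve_2x2(K, f):
    """Solve the 2x2 surface system K u = f by Cramer's rule."""
    (a, b), (c, d) = K
    det = a * d - b * c
    return ((f[0] * d - b * f[1]) / det,
            (a * f[1] - f[0] * c) / det)

# Hypothetical two-node surface patch: K plays the role of the dense,
# precomputed boundary-element stiffness matrix relating forces and
# displacements on the surface alone -- no interior mesh is required.
K = ((4.0, 1.0), (1.0, 3.0))

def deform(tool_force):
    """Surface displacement produced by the haptic tool's contact force."""
    return solve_2x2(K, tool_force)
```

Because only surface unknowns appear, the same mesh can drive both the visual rendering and the force response fed back through the haptic device.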


Robotics
2021
Vol 10 (1)
pp. 29
Author(s):
Gowri Shankar Giri
Yaser Maddahi
Kourosh Zareinia

Recent technological development has led to the invention of many designs of haptic devices: electromechanical devices that mediate communication between the user and the computer and allow users to manipulate objects in a virtual environment while receiving tactile feedback. The main requirements for an interactive interface are generating kinesthetic feedback and actively relaying information from the haptic device. Sensors and feedback control apparatus are of paramount importance in designing and manufacturing a haptic device. In general, haptic technology can be applied in areas such as gaming, teleoperation, medical surgery, augmented reality (AR), and virtual reality (VR). This paper classifies haptic devices by construction and functionality across these fields, then addresses the major limitations of haptic technology and discusses its prospects.


2016
Vol 15 (2)
pp. 18-29
Author(s):  
Andrew Ray

Virtual environments (VEs) demonstrate the immense potential computer technology can offer society. VEs have been created for almost two decades, yet standardized tools and procedures for building them still do not exist. Numerous tool-building efforts have come and gone, but there is little consensus among tool creators on a common subset of standard features that developers can expect. Currently, developers use one of many virtual reality (VR) toolkits to create a VE; however, these toolkits interoperate poorly with other applications and with each other. This paper investigates why the development tools are in this state. A discussion of the history of VR toolkits and of developer experiences shows what developers face when they create a VE. Next, Three-Dimensional Interaction Technique (3DIT) toolkits are introduced as a new way of developing some parts of VEs. Lastly, a vision for the future of VE development that may help improve the next generation of toolkits is presented.


2009
pp. 202-210
Author(s):  
Paulo N.M. Sampaio
Ildeberto A. Rodello
Laura M. Rodríguez Peralta
Paulo Alexandre Bressan

Virtual reality (VR) represents a modern human-computer interface consisting of a computer-generated three-dimensional (3D) environment with which the user can interact in different ways. VR can be applied in several domains, such as medicine, education, and entertainment. Education is of particular interest, since a student can interact with and become involved in a 3D environment that simulates situations difficult or even impossible to carry out in the traditional educational process.


Author(s):  
William Bricken
Geoffrey Coco

Computer technology has only recently become advanced enough to solve the problems it creates with its own interface. One solution, virtual reality (VR), immediately raises fundamental issues in both semantics and epistemology. Broadly, virtual reality is that aspect of reality which people construct from information, a reality which is potentially orthogonal to the reality of mass. Within computer science, VR refers to interaction with computer-generated spatial environments, environments constructed to include and immerse those who enter them. VR affords non-symbolic experience within a symbolic environment. Since people evolved in a spatial environment, our knowledge skills are anchored to interactions within spatial environments. VR design techniques, such as scientific visualization, map digital information onto spatial concepts. When our senses are immersed in stimuli from the virtual world, our minds construct a closure to create the experience of inclusion. Participant inclusion is the defining characteristic of VR. (Participation within information is often called immersion.) Inclusion is measured by the degree of presence a participant experiences in a virtual environment.

We currently use computers as symbol processors, interacting with them through a layer of symbolic mediation. The computer user, just like the reader of books, must provide cognitive effort to convert the screen's representations into the user's meanings. VR systems, in contrast, provide interface tools which support natural behavior as input and direct perceptual recognition of output. The idea is to access digital data in the form easiest for our comprehension; this generally implies using representations that look and feel like the thing they represent. A physical pendulum, for example, might be represented by an accurate three-dimensional digital model of a pendulum which supports direct spatial interaction and dynamically behaves as would an actual pendulum.
Immersive environments redefine the relationship between experience and representation, in effect eliminating the syntax-semantics barrier. Reading, writing, and arithmetic are cast out of the computer interface, replaced by direct, non-symbolic environmental experience. Before we can explore the deeper issues of experience in virtual environments, we must develop an infrastructure of hardware and software to support “tricking the senses” into believing that representation is reality. The VEOS project was designed to provide a rapid prototyping infrastructure for exploring virtual environments.
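The pendulum example above can be made concrete with a per-frame dynamics update. A minimal sketch using semi-implicit Euler integration (the integrator choice and all parameter values are assumptions for illustration, not details from the text):

```python
import math

def step_pendulum(theta, omega, dt=0.01, length=1.0, g=9.81):
    """Advance the pendulum state (angle theta, angular velocity omega)
    by one time step of semi-implicit Euler integration:

        omega' = omega - (g / length) * sin(theta) * dt
        theta' = theta + omega' * dt

    Updating omega first keeps small oscillations stable and bounded,
    which matters when the model must behave plausibly frame after frame.
    """
    omega += -(g / length) * math.sin(theta) * dt
    theta += omega * dt
    return theta, omega
```

Run once per rendered frame, an update like this is what lets the digital pendulum "dynamically behave as would an actual pendulum" while the participant pushes it around spatially.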

