Motion planning and control: from virtual environments to the real world

2003 ◽  
Vol 36 (12) ◽  
pp. 105-110
Author(s):  
Omar A.A. Orqueda ◽  
José Figueroa ◽  
Osvaldo E. Agamennoni
1999 ◽  
Vol 8 (6) ◽  
pp. 598-617 ◽  
Author(s):  
James N. Templeman ◽  
Patricia S. Denbrook ◽  
Linda E. Sibert

This paper presents both an analysis of requirements for user control over simulated locomotion and a new control technique designed to meet these requirements. The goal is to allow the user to move through virtual environments in as similar a manner as possible to walking through the real world. We approach this problem by examining the interrelationships between motion control and the other actions people use to act, sense, and react to their environment. If the interactions between control actions and sensory feedback can be made comparable to those of actions in the real world, then there is hope for constructing an effective new technique. Candidate solutions are reviewed in light of this analysis, which leads to a promising new design for a sensor-based virtual locomotion technique called Gaiter. The new control allows users to direct their movement through virtual environments by stepping in place. The movement of a person's legs is sensed, and in-place walking is treated as a gesture indicating that the user intends to take a virtual step. More specifically, the movement of the user's legs determines the direction, extent, and timing of their movement through virtual environments. Tying virtual locomotion to leg motion allows a person to step in any direction and control the stride length and cadence of their virtual steps. The user can walk straight, turn in place, and turn while advancing. Motion is expressed in a body-centric coordinate system similar to that of actual stepping. The system can discriminate between gestural and actual steps, so both types of steps can be intermixed.
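The gestural-versus-actual step discrimination described above can be illustrated with a minimal sketch. Everything here (the function name, the thresholds, and the two-endpoint foot representation) is a hypothetical illustration rather than the Gaiter implementation: a step whose foot lifts but lands near its starting point is treated as an in-place gesture driving virtual locomotion, while a step with substantial horizontal travel is treated as an actual step.

```python
import math

def classify_step(start_xy, end_xy, peak_lift,
                  travel_thresh=0.15, lift_thresh=0.05):
    """Classify one sensed foot trajectory (units: meters).

    Hypothetical sketch of the gestural-vs-actual distinction: an
    in-place (gestural) step lifts the foot but returns it near its
    starting point; an actual step translates the foot across the floor.
    """
    dx = end_xy[0] - start_xy[0]
    dy = end_xy[1] - start_xy[1]
    travel = math.hypot(dx, dy)  # horizontal displacement of the foot
    if peak_lift > lift_thresh and travel < travel_thresh:
        # Gestural step: the small residual swing direction hints at the
        # intended body-centric heading of the virtual step.
        return "gestural", math.atan2(dy, dx)
    return "actual", None

kind, heading = classify_step((0.0, 0.0), (0.03, 0.01), peak_lift=0.12)
```

In a real system the heading, extent, and cadence of the virtual step would be derived from the full leg trajectory over time rather than from two endpoints.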


2004 ◽  
Vol 4 (2) ◽  
pp. 109-113 ◽  
Author(s):  
Thomas Reuding ◽  
Pamela Meil

The predictive value and the reliability of evaluations made in immersive projection environments are limited compared to the real world. As in other applications of numerical simulation, the acceptance of such techniques depends not only on the stability of the methods but also on the quality and credibility of the results obtained. In this paper, we investigate the predictive value of virtual reality and virtual environments when used for engineering assessment tasks. We examine the ergonomics evaluation of a vehicle interior, a complex activity that relies heavily on know-how gained from personal experience, and compare performance in a VE with performance in the real world. If one assumes that, within complex engineering processes, certain types of work will be performed by more or less the same personnel, one can infer that a fairly consistent base of experience-based knowledge exists. Under such premises, and if evaluations are conducted as comparisons within the VE, we believe that the reliability of the assessments is suitable for conceptual design work. Despite a number of questions that remain unanswered at this time, we believe this study leads to a better understanding of what determines the reliability of results obtained in virtual environments, making it useful for optimizing virtual prototyping processes and for better utilizing the potential of VR and VEs in company work processes.


Author(s):  
Fahad Iqbal Khawaja ◽  
Akira Kanazawa ◽  
Jun Kinugawa ◽  
Kazuhiro Kosuge

Human-Robot Interaction (HRI) for collaborative robots has recently become an active research topic. Collaborative robots assist human workers in their tasks and improve their efficiency, but the worker should also feel safe and comfortable while interacting with the robot. In this paper, we propose a human-following motion planning and control scheme for a collaborative robot that supplies the necessary parts and tools to a worker in a factory assembly process. In our proposed scheme, a 3-D sensing system is employed to measure the skeletal data of the worker. At each sampling time of the sensing system, an optimal delivery position is estimated using the real-time worker data. At the same time, the future positions of the worker are predicted as probabilistic distributions. A Model Predictive Control (MPC) based trajectory planner is used to calculate a robot trajectory that supplies the required parts and tools to the worker and follows the worker's predicted future positions. We have implemented our proposed scheme on a collaborative robot system with a 2-DOF planar manipulator. Experimental results show that the proposed scheme enables the robot to assist a worker moving around the workspace at any time while ensuring the worker's safety and comfort.
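The receding-horizon idea behind an MPC trajectory planner of this kind can be sketched in a few lines. This is a minimal hypothetical illustration, not the authors' controller: the manipulator is reduced to a point mass with velocity inputs, the probabilistic worker forecast is reduced to its predicted mean positions, and the horizon-tracking problem becomes a regularized linear least-squares solve at each sampling instant.

```python
import numpy as np

def mpc_follow(p0, worker_pred, dt=0.1, lam=1e-3):
    """Plan velocity commands over a horizon so a point-mass robot tracks
    predicted worker positions (hypothetical sketch, not the paper's code).

    Dynamics: p[k+1] = p[k] + dt * u[k]. Cost: sum_k ||p[k+1] - w[k]||^2
    + lam * sum_k ||u[k]||^2, solved per axis as ridge least squares.
    """
    w = np.asarray(worker_pred, dtype=float)   # (N, 2) forecast means
    N = w.shape[0]
    L = np.tril(np.ones((N, N)))               # cumulative-sum matrix
    A = np.vstack([dt * L, np.sqrt(lam) * np.eye(N)])
    U = np.zeros((N, 2))
    for axis in range(2):
        b = np.concatenate([w[:, axis] - p0[axis], np.zeros(N)])
        U[:, axis] = np.linalg.lstsq(A, b, rcond=None)[0]
    return U                                   # planned velocity commands

# Receding-horizon usage: execute only the first command, then re-plan
# at the next sampling time with a fresh worker forecast.
p = np.array([0.0, 0.0])
pred = [[0.2, 0.1], [0.4, 0.2], [0.6, 0.3]]    # assumed worker forecast
u = mpc_follow(p, pred)
p_next = p + 0.1 * u[0]
```

A real planner would also enforce joint limits, velocity bounds, and safety distances as constraints, which turns the least-squares solve into a constrained quadratic program.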


2020 ◽  
Author(s):  
Timothy F. Brady ◽  
Viola S. Störmer ◽  
Anna Shafer-Skelton ◽  
Jamal Rodgers Williams ◽  
Angus F. Chapman ◽  
...  

Both visual attention and visual working memory tend to be studied either with very simple stimuli and low-level paradigms, designed to allow us to understand the representations and processes in detail, or with fully realistic stimuli that make such precise understanding difficult but are more representative of the real world. In this chapter, we argue for an intermediate approach in which visual attention and visual working memory are studied by scaling up from the simplest settings to more complex settings that capture some of the complexity of the real world while still remaining in the realm of well-controlled stimuli and well-understood tasks. We believe this approach, which we have been taking in our labs, will yield a more generalizable body of knowledge about visual attention and visual working memory while maintaining the rigor and control typical of vision science and psychophysics studies.

