HAND–EYE COORDINATION THROUGH ENDPOINT CLOSED-LOOP AND LEARNED ENDPOINT OPEN-LOOP VISUAL SERVO CONTROL

2005 ◽  
Vol 02 (02) ◽  
pp. 203-224 ◽  
Author(s):  
CHRIS GASKETT ◽  
ALEŠ UDE ◽  
GORDON CHENG

We propose a hand-eye coordination system for a humanoid robot that supports bimanual reaching. The system combines endpoint closed-loop and open-loop visual servo control. The closed-loop component moves the eyes, head, arms, and torso, based on the position of the target and the robot's hands, as seen by the robot's head-mounted cameras. The open-loop component uses a motor-motor mapping that is learnt online to support movement when visual cues are not available.
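A minimal sketch of the idea described in this abstract: an endpoint closed-loop correction driven by the pixel error between hand and target, plus a motor-motor mapping learned online and used open-loop when visual cues are unavailable. The class name, the affine LMS-style map, and all gains are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

class HandEyeCoordinator:
    def __init__(self, n_gaze=2, n_arm=4, gain=0.5, lr=0.05):
        self.W = np.zeros((n_arm, n_gaze + 1))  # learned motor-motor map (affine)
        self.gain = gain
        self.lr = lr

    def closed_loop_step(self, hand_px, target_px, image_jacobian_pinv):
        """Endpoint closed-loop: reduce the hand-target error seen by the cameras."""
        error = np.asarray(target_px) - np.asarray(hand_px)   # pixel error
        return self.gain * image_jacobian_pinv @ error        # arm joint velocities

    def learn_map(self, gaze_cmd, arm_cmd):
        """Online update of the gaze-to-arm mapping while vision is available."""
        x = np.append(gaze_cmd, 1.0)                     # affine input
        pred = self.W @ x
        self.W += self.lr * np.outer(arm_cmd - pred, x)  # LMS-style update

    def open_loop_step(self, gaze_cmd):
        """Endpoint open-loop: use the learned map when visual cues are absent."""
        return self.W @ np.append(gaze_cmd, 1.0)
```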

Author(s):  
R. Mahony ◽  
P. Corke ◽  
T. Hamel

This paper considers the question of designing a fully image-based visual servo control for a class of dynamic systems. The work is motivated by the ongoing development of image-based visual servo control of small aerial robotic vehicles. The kinematics and dynamics of a rigid-body dynamical system (such as a vehicle airframe) maneuvering over a flat target plane with observable features are expressed in terms of an un-normalized spherical centroid and an optic flow measurement. The image-plane dynamics with respect to force input are dependent on the height of the camera above the target plane. This dependence is compensated by introducing virtual height dynamics and adaptive estimation in the proposed control. A fully nonlinear adaptive control design is provided that ensures asymptotic stability of the closed-loop system for all feasible initial conditions. The choice of control gains is based on an analysis of the asymptotic dynamics of the system. Results from a realistic simulation are presented that demonstrate the performance of the closed-loop system. To the authors' knowledge, this paper documents the first time that an image-based visual servo control has been proposed for a dynamic system using vision measurement for both position and velocity.
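To illustrate the structure described above (image-only feedback with an adaptive estimate compensating the unknown height), here is a single control step for a simplified translational vehicle over a flat plane. The centroid-like feature, the gradient-style adaptation law, and all gains are simplified stand-ins for the paper's un-normalized spherical centroid and Lyapunov-based design.

```python
import numpy as np

def adaptive_ibvs_step(q, q_dot, h_hat, dt, m=1.0, k1=2.0, k2=3.0, gamma=0.5, lam=1.0):
    """One step of a simplified adaptive image-based control law.

    q, q_dot : centroid-like image feature and its optic-flow-derived rate
    h_hat    : current estimate of the (unknown) camera height above the plane
    Returns the commanded lateral force and the updated height estimate.
    """
    s = q_dot + lam * q                      # composite image-space error
    f = -m * h_hat * (k1 * q + k2 * q_dot)   # certainty-equivalence control
    h_hat_new = h_hat - gamma * float(s @ (k1 * q + k2 * q_dot)) * dt  # gradient-style update
    return f, max(h_hat_new, 0.1)            # keep the estimate positive (projection)
```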


2018 ◽  
Vol 10 (6) ◽  
Author(s):  
Dejun Guo ◽  
Hesheng Wang ◽  
Kam K. Leang

This paper presents a nonlinear vision-based observer to estimate the 3D translational position and velocity of a quadrotor aerial robot for closed-loop, position-based, visual-servo control in global positioning system (GPS)-denied environments. The method allows for motion control in areas where GPS signals are weak or absent, for example, inside a building. Herein, the robot uses a low-cost on-board camera to observe at least two feature points fixed in the world frame to self-localize for feedback control, without constraints on the altitude of the robot. The nonlinear observer described takes advantage of the geometry of the perspective projection and is designed to update the translational position and velocity in real-time by exploiting visual information and information from an inertial measurement unit. One key advantage of the algorithm is that it does not require constraints or assumptions on the altitude and initial estimation errors. Two new controllers based on the backstepping technique that take advantage of the estimator's output are described and implemented for trajectory tracking. The Lyapunov method is used to show asymptotic stability of the closed-loop system. Simulation and experimental results from an indoor environment where GPS localization is not available are presented to demonstrate feasibility and validate the performance of the observer and control system for hovering and tracking a circular trajectory defined in the world frame.
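A minimal predictor-corrector sketch of the kind of vision/IMU observer described above: known world-frame feature points are re-projected with the current estimate, and the pixel residuals correct the position and velocity states. The Luenberger-like gains, the pinhole model, and the gradient-style back-projection are assumptions, not the paper's observer.

```python
import numpy as np

def observer_step(p_hat, v_hat, a_world, feats_world, feats_px, K, R_wc, dt,
                  l_p=2.0, l_v=4.0):
    """One step of a simplified vision/IMU translational-state observer.

    p_hat, v_hat : current position/velocity estimates (world frame)
    a_world      : IMU acceleration rotated to the world frame, gravity removed
    feats_world  : known 3-D feature positions (world frame)
    feats_px     : corresponding measured pixel coordinates
    K, R_wc      : camera intrinsics and world-to-camera rotation
    """
    corr = np.zeros(3)
    for Pw, z in zip(feats_world, feats_px):
        Pc = R_wc @ (Pw - p_hat)                          # feature in camera frame
        u = K @ Pc
        proj = u[:2] / u[2]                               # predicted pixel location
        e = np.asarray(z) - proj                          # pixel residual
        Jc = (K[:2, :] - np.outer(proj, K[2, :])) / u[2]  # d(proj)/d(Pc)
        Jp = -Jc @ R_wc                                   # d(proj)/d(p_hat)
        corr += Jp.T @ e                                  # back-projected correction

    # predictor-corrector update: integrate IMU, correct with the vision residual
    p_hat_new = p_hat + (v_hat + l_p * corr) * dt
    v_hat_new = v_hat + (a_world + l_v * corr) * dt
    return p_hat_new, v_hat_new
```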


Author(s):  
Ghananeel Rotithor ◽  
Ashwin P. Dani

Combining perception feedback control with learning-based open-loop motion generation for the robot's end-effector control is an attractive solution for many robotic manufacturing tasks. For instance, while performing a peg-in-hole or insertion task when the hole or the recipient part is not visible in the eye-in-hand camera, an open-loop learning-based motion primitive method can be used to generate the end-effector path. Once the recipient part is in the field of view (FOV), visual servo control can be used to control the motion of the robot. Inspired by such applications, this paper presents a control scheme that switches between Dynamic Movement Primitives (DMPs) and Image-based Visual Servo (IBVS) control, combining end-effector control with perception-based feedback control. Simulation results in which the controller switches between DMP and IBVS are presented to verify the performance of the proposed control methodology.
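A minimal sketch of the switching idea: roll out a movement primitive while the recipient part is outside the camera's field of view, and switch to image-based visual servoing once it becomes visible. The simplified DMP (learned forcing term omitted), the classical IBVS law, and the gains are assumptions, not the paper's formulation.

```python
import numpy as np

def dmp_step(x, x_dot, goal, dt, alpha=25.0, beta=6.25):
    """Spring-damper transformation system of a DMP (learned forcing term omitted)."""
    x_ddot = alpha * (beta * (goal - x) - x_dot)
    x_dot = x_dot + x_ddot * dt
    x = x + x_dot * dt
    return x, x_dot

def ibvs_step(features_px, desired_px, L_pinv, lam=0.8):
    """Classical IBVS velocity command from the pixel feature error."""
    e = (np.asarray(desired_px) - np.asarray(features_px)).ravel()
    return lam * L_pinv @ e                       # commanded end-effector velocity

def hybrid_step(target_visible, state, goal, features_px, desired_px, L_pinv, dt):
    if target_visible:                            # perception feedback available: IBVS
        return state, ibvs_step(features_px, desired_px, L_pinv)
    x, x_dot = dmp_step(*state, goal, dt)         # otherwise: open-loop primitive
    return (x, x_dot), x_dot
```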


Author(s):  
Dejun Guo ◽  
Kam K. Leang

This paper presents a new nonlinear adaptive vision-based observer to estimate position and linear velocity information for closed-loop position-based visual servo control of an aerial robot in GPS-denied environments. Specifically, the observer determines the position and linear velocity of the robot for closed-loop control by using a low-cost on-board camera to observe at least two feature points fixed in the world frame. The nonlinear adaptive observer takes advantage of the geometry of perspective projection, and is designed to update position and velocity information in real-time. Thus, there are no constraints or assumptions on the depth and initial estimation errors. Furthermore, the proposed parameter estimator addresses situations where GPS signals may be weak, unreliable, or nonexistent, such as in valleys, canyons, and between tall buildings, or inside a building and under dense canopy. For closed-loop tracking control using the estimated position and velocity information, a backstepping controller is employed for the underactuated aerial robot system. The Lyapunov method is used to show stability of the closed-loop system. Simulation and experimental results are presented that validate the performance of the observer and control system for hovering and tracking a circular trajectory, where both are defined in the world (lab) frame.
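To make the control side concrete, here is a simplified two-step, backstepping-style outer position loop for an underactuated aerial robot, driven by estimated position and velocity. The gains, mass, and the thrust/attitude extraction are illustrative assumptions, not the paper's full design.

```python
import numpy as np

G = np.array([0.0, 0.0, -9.81])   # gravity (world frame)

def position_backstepping(p_hat, v_hat, p_ref, v_ref, a_ref, m=1.2, k1=2.0, k2=4.0):
    e_p = p_hat - p_ref                         # step 1: position error
    e_p_dot = v_hat - v_ref
    v_des = v_ref - k1 * e_p                    # virtual control (desired velocity)
    e_v = v_hat - v_des                         # step 2: velocity error
    # commanded acceleration, including the derivative of the virtual control
    a_cmd = a_ref - k1 * e_p_dot - e_p - k2 * e_v
    f_vec = m * (a_cmd - G)                     # required specific force (world frame)
    thrust = np.linalg.norm(f_vec)              # total thrust magnitude
    z_body_des = f_vec / max(thrust, 1e-6)      # desired body z-axis (attitude reference)
    return thrust, z_body_des
```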


2006 ◽  
Vol 315-316 ◽  
pp. 809-812
Author(s):  
Shi Jie Tian ◽  
B. Li ◽  
Zhuang De Jiang ◽  
Jun Jie Guo ◽  
H.Y. Zhao

A new visual servo control system under a microscope is introduced, building on research into image-based computer visual control systems and covering camera installation and calibration, derivation of the image Jacobian, and the control algorithm. Motion-control and positioning experiments showed that a single-pixel resolution of about 2.97 μm can be reached easily along a single axis under a 5× eyepiece, and under the same conditions a 400 μm micro shaft was assembled into a micro-gear with a 415 μm inner bore. As the control algorithm shows, the accuracy of the system is independent of errors in the robot kinematics or the camera calibration. The results indicate that this method provides a fully closed-loop visual servo control system with high reliability and a simple control algorithm, and that it can be used in microassembly systems.
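A minimal sketch of the image-Jacobian control loop described above: the stage motion command is computed directly from the pixel error, so calibration and kinematic errors affect only the convergence rate, not the final accuracy. The diagonal Jacobian value (about 1 px per 2.97 μm under the 5× eyepiece) and the gain are illustrative.

```python
import numpy as np

def microassembly_servo_step(current_px, target_px, J_img, lam=0.4):
    """current_px, target_px : feature pixel coordinates;
    J_img : 2x2 image Jacobian mapping stage motion (um) to pixel motion."""
    e = np.asarray(target_px) - np.asarray(current_px)     # pixel error
    return lam * np.linalg.pinv(J_img) @ e                 # commanded stage step (um)

# example: diagonal Jacobian of 1/2.97 px per micrometre along each stage axis
J_img = np.diag([1 / 2.97, 1 / 2.97])
step_um = microassembly_servo_step([120, 80], [100, 100], J_img)
```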


2004 ◽  
Author(s):  
J. Chen ◽  
D. M. Dawson ◽  
W. E. Dixon ◽  
V. K. Chitrakaran
