Extrinsic Parameter Calibration Method for a Visual/Inertial Integrated System with a Predefined Mechanical Interface

Sensors ◽  
2019 ◽  
Vol 19 (14) ◽  
pp. 3086
Author(s):  
Ouyang ◽  
Shi ◽  
You ◽  
Zhao

For a visual/inertial integrated system, the calibration of extrinsic parameters plays a crucial role in ensuring accurate navigation and measurement. In this work, a novel extrinsic parameter calibration method is developed based on geometrical constraints in the object space and is implemented by manual swinging. The camera and IMU frames are aligned to the system body frame, which is predefined by the mechanical interface. During the swinging motion, a fixed checkerboard provides constraints for calibrating the extrinsic parameters of the camera, whereas angular velocity and acceleration provide constraints for calibrating the extrinsic parameters of the IMU. We exploit the complementary nature of the camera and the IMU: the IMU assists in checkerboard corner detection and correction, while the camera suppresses the effects of IMU drift. The results of the calibration experiment show that the extrinsic parameter accuracy reaches 0.04° for each Euler angle and 0.15 mm for each component of the position vector (1σ).
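
The core idea of aligning both sensors to a shared, mechanically defined body frame can be sketched as follows. This is a minimal illustration with made-up rotation values, not the paper's procedure: once the camera-to-body and IMU-to-body rotations are known, the camera-to-IMU extrinsic follows by composition.

```python
import numpy as np

def euler_zyx_to_R(yaw, pitch, roll):
    """Rotation matrix from Z-Y-X Euler angles (radians)."""
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    return Rz @ Ry @ Rx

# Assumed (hypothetical) calibration results: each sensor frame
# expressed in the mechanically predefined body frame.
R_body_cam = euler_zyx_to_R(0.01, -0.02, 0.005)   # camera -> body
R_body_imu = euler_zyx_to_R(-0.004, 0.015, 0.02)  # IMU -> body

# The camera-to-IMU rotation follows from the shared body frame.
R_imu_cam = R_body_imu.T @ R_body_cam

# Sanity check: the result is a proper rotation (orthonormal, det = +1).
assert np.allclose(R_imu_cam @ R_imu_cam.T, np.eye(3), atol=1e-12)
assert np.isclose(np.linalg.det(R_imu_cam), 1.0)
```

Because both alignments share the same body frame, any error in the mechanical interface cancels out of the relative camera-IMU rotation.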

2021 ◽  
Vol 2021 ◽  
pp. 1-9
Author(s):  
Qin Shi ◽  
Huansheng Song ◽  
Shijie Sun

Calibration of the extrinsic parameters of an RGB-D camera is useful in many fields, such as 3D scene reconstruction, robotics, and target detection. Many calibration methods employ a specific calibration object (e.g., a chessboard or cuboid) to calibrate the extrinsic parameters of the RGB-D color camera without using the depth map. As a result, the calibration process is difficult to simplify, and the color sensor is calibrated rather than the depth sensor. To this end, we propose a method that employs the depth map to perform extrinsic calibration automatically. In detail, the depth map is first transformed into a 3D point cloud in the camera coordinate system, and the planes in the point cloud are then detected automatically using the Maximum Likelihood Estimation Sample Consensus (MLESAC) method. After that, according to the constraint relationship between the ground plane and the world coordinate system, all planes are traversed and screened until the ground plane is obtained. Finally, the extrinsic parameters are calculated from the spatial relationship between the ground plane and the camera coordinate system. The results show a mean roll angle error of −1.14°, a mean pitch angle error of 4.57°, and a mean camera height error of 3.96 cm. The proposed method can accurately and automatically estimate the extrinsic parameters of a camera, and after parallel optimization it achieves real-time performance for automatically estimating a robot's attitude.
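
The pipeline described above can be sketched on synthetic data. Plain RANSAC stands in here for the MLESAC step, and the roll angle and camera height are recovered from the fitted ground-plane normal; all values are made up for the sketch and this is not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "depth-derived" point cloud: a ground plane 1.5 m below a
# camera rolled slightly about the x-axis, plus some outlier points.
n = 500
pts = np.column_stack([rng.uniform(-2, 2, n),
                       rng.uniform(-2, 2, n),
                       np.full(n, 1.5)])
tilt = 0.05  # rad, assumed camera roll
Rx = np.array([[1, 0, 0],
               [0, np.cos(tilt), -np.sin(tilt)],
               [0, np.sin(tilt),  np.cos(tilt)]])
pts = pts @ Rx.T
pts[:50] += rng.normal(0, 0.5, (50, 3))  # outliers

def ransac_plane(pts, iters=200, thresh=0.01):
    """Fit a plane n·p + d = 0 by maximizing the inlier count."""
    best, best_inliers = None, None
    for _ in range(iters):
        sample = pts[rng.choice(len(pts), 3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-9:
            continue  # degenerate (collinear) sample
        normal /= norm
        d = -normal @ sample[0]
        inliers = np.abs(pts @ normal + d) < thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best, best_inliers = (normal, d), inliers
    return best

normal, d = ransac_plane(pts)
if normal[2] < 0:              # orient the normal consistently
    normal, d = -normal, -d
height = abs(d)                # camera height above the ground plane
roll = np.arctan2(normal[1], normal[2])  # roll from the plane normal
```

On this synthetic cloud the recovered height is ~1.5 m and the roll magnitude ~0.05 rad, mirroring how the ground plane alone fixes roll, pitch, and height but not yaw or horizontal position.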


2012 ◽  
Vol 605-607 ◽  
pp. 859-863
Author(s):  
Yu Bo Guo ◽  
Gang Chen ◽  
Dong Ye ◽  
Feng Yuan

A field calibration method based on a virtual 1D target is proposed for the extrinsic parameters of binocular vision. A target is placed on a high-precision 1D lifting platform, and motions of the platform create a virtual 1D target. Two cameras capture images of the virtual target at different positions, and an initial extrinsic calibration of the binocular system is obtained from the epipolar constraint equation. Finally, the length of the virtual 1D target is used to optimize the extrinsic parameters. The method is easy to operate, flexible in application, and suitable for field calibration. Experimental results verify its feasibility and show that it yields high field calibration precision.
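
The epipolar constraint underpinning the initial calibration step can be verified numerically. This is a hedged sketch with a synthetic relative pose (R, t): for normalized image points x1, x2 of the same 3D point, x2ᵀEx1 = 0 with E = [t]×R.

```python
import numpy as np

def skew(t):
    """Skew-symmetric cross-product matrix [t]x."""
    return np.array([[0, -t[2], t[1]],
                     [t[2], 0, -t[0]],
                     [-t[1], t[0], 0]])

# Assumed ground-truth relative pose of camera 2 w.r.t. camera 1:
# a small yaw rotation and a baseline translation.
theta = 0.1
R = np.array([[np.cos(theta), 0, np.sin(theta)],
              [0, 1, 0],
              [-np.sin(theta), 0, np.cos(theta)]])
t = np.array([0.2, 0.0, 0.05])
E = skew(t) @ R               # essential matrix

# A 3D point seen by both cameras (normalized coordinates x = X/Z).
X1 = np.array([0.3, -0.1, 2.0])   # point in the camera-1 frame
X2 = R @ X1 + t                   # same point in the camera-2 frame
x1, x2 = X1 / X1[2], X2 / X2[2]

residual = x2 @ E @ x1            # epipolar residual, ~0 for the true pose
```

In the method above, minimizing such residuals over many virtual-target points yields the initial extrinsics, and the known 1D target length then fixes the scale that the epipolar constraint alone leaves free.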


2017 ◽  
Vol 37 (9) ◽  
pp. 0915003
Author(s):  
宋佳慧 Song Jiahui ◽  
任永杰 Ren Yongjie ◽  
杨守瑞 Yang Shourui ◽  
尹仕斌 Yin Shibin ◽  
郭 寅 Guo Yin ◽  
...  

2013 ◽  
Vol 662 ◽  
pp. 717-720 ◽  
Author(s):  
Zhen Yu Zheng ◽  
Yan Bin Gao ◽  
Kun Peng He

As an inertial sensor assembly, the FOG inertial measurement unit (FIMU) must be calibrated before use. This paper presents a one-time systematic IMU calibration method that requires only a two-axis, low-precision turntable. First, a detailed error model of the inertial sensors is established in a defined body frame. Then, with velocity as the only observation, a 33-state system equation is established that includes lever-arm effects and the nonlinear terms of the scale-factor error. Turntable experiments verify that the method can identify all the error coefficients of the FIMU on a low-precision two-axis turntable, and that navigation accuracy improves after calibration.
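
As a much smaller illustration of turntable-based identification (not the 33-state model above), a single accelerometer axis with measurement model m = (1 + s)·a + b can have its scale-factor error s and bias b solved in closed form from two positions, sensitive axis up and then down. The numbers are synthetic.

```python
# Two-position accelerometer calibration sketch (hypothetical values).
g = 9.80665                      # reference gravity, m/s^2

s_true, b_true = 0.002, 0.05     # assumed error coefficients
m_up = (1 + s_true) * g + b_true       # axis up: senses +g
m_down = (1 + s_true) * (-g) + b_true  # axis down: senses -g

# Sum cancels the scale term; difference cancels the bias.
b_est = (m_up + m_down) / 2
s_est = (m_up - m_down) / (2 * g) - 1
```

The full systematic method generalizes this idea: enough distinct turntable orientations and rates make all 33 error states observable from velocity alone.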


2018 ◽  
Vol 10 (8) ◽  
pp. 1298 ◽  
Author(s):  
Lei Yin ◽  
Xiangjun Wang ◽  
Yubo Ni ◽  
Kai Zhou ◽  
Jilong Zhang

Multi-camera systems are widely used in airborne remote sensing and unmanned aerial vehicle imaging. The measurement precision of these systems depends on the accuracy of the extrinsic parameters, so it is important to accurately calibrate the extrinsic parameters between the onboard cameras. Unlike conventional multi-camera calibration with a common field of view (FOV), calibrating a multi-camera system without overlapping FOVs presents certain difficulties. In this paper, we propose a calibration method for a multi-camera system without common FOVs for aerial photogrammetry. First, the extrinsic parameters of any two cameras in the system are calibrated, and the extrinsic matrix is optimized using the re-projection error. Then, the extrinsic parameters of each camera are unified to the system reference coordinate system by a global optimization method. A simulation experiment and a physical verification experiment were designed to validate the proposed algorithm. The experimental results show that the method is practical: the rotation error angle of the cameras' extrinsic parameters is less than 0.001 rad and the translation error is less than 0.08 mm.
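
The unification step, chaining pairwise extrinsics into one reference frame, can be sketched with 4×4 homogeneous transforms. The pairwise results below are invented for illustration; the paper additionally refines them by global optimization.

```python
import numpy as np

def make_T(R, t):
    """Assemble a 4x4 homogeneous transform from R and t."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

def rot_z(a):
    return np.array([[np.cos(a), -np.sin(a), 0],
                     [np.sin(a),  np.cos(a), 0],
                     [0, 0, 1]])

# Assumed pairwise calibration results: T_21 maps camera-1 coordinates
# into camera 2's frame, T_32 maps camera 2 into camera 3.
T_21 = make_T(rot_z(0.02), [0.5, 0.0, 0.0])
T_32 = make_T(rot_z(-0.01), [0.5, 0.1, 0.0])

# Chaining expresses camera 1's coordinates in camera 3's frame, and the
# inverse gives camera 3's pose in the reference (camera-1) frame.
T_31 = T_32 @ T_21
T_13 = np.linalg.inv(T_31)
```

Chaining alone accumulates pairwise errors along the chain, which is why a subsequent global optimization over all cameras is needed to distribute the residuals.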


Sensors ◽  
2021 ◽  
Vol 21 (14) ◽  
pp. 4643
Author(s):  
Sang Jun Lee ◽  
Jeawoo Lee ◽  
Wonju Lee ◽  
Cheolhun Jang

In intelligent vehicles, extrinsic camera calibration should be conducted on a regular basis to deal with unpredictable mechanical changes or variations in weight-load distribution. Specifically, high-precision extrinsic parameters between the camera coordinate system and the world coordinate system are essential for high-level functions in intelligent vehicles such as distance estimation and lane departure warning. However, conventional calibration methods, which solve a Perspective-n-Point problem, require laborious work to measure the positions of 3D points in the world coordinate system. To reduce this inconvenience, this paper proposes an automatic camera calibration method based on 3D reconstruction. The main contribution is a novel reconstruction method that recovers 3D points on planes perpendicular to the ground. The proposed method jointly optimizes the reprojection errors of image features projected from multiple planar surfaces and significantly reduces errors in the camera extrinsic parameters. Experiments were conducted in synthetic simulation and real calibration environments to demonstrate the effectiveness of the proposed method.
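
The reprojection error that such a joint optimization minimizes can be written down directly. This sketch projects synthetic 3D points with assumed intrinsics K and extrinsics (R, t) and measures the pixel residual against synthetic observations; none of the values come from the paper.

```python
import numpy as np

# Assumed pinhole intrinsics (focal length 800 px, principal point 320, 240).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)                    # assumed extrinsic rotation
t = np.array([0.0, 0.0, 2.0])    # assumed extrinsic translation

# Synthetic points on a plane (e.g., a surface perpendicular to the ground).
world_pts = np.array([[0.0, 0.0, 0.0],
                      [0.5, 0.0, 0.0],
                      [0.0, 0.5, 0.0]])

def project(P, K, R, t):
    """Project world points to pixel coordinates with a pinhole model."""
    cam = P @ R.T + t                        # world -> camera frame
    return (cam @ K.T)[:, :2] / cam[:, 2:3]  # perspective divide

proj = project(world_pts, K, R, t)
observed = proj + np.array([0.3, -0.2])      # synthetic observation offset
rms = np.sqrt(np.mean(np.sum((proj - observed) ** 2, axis=1)))
```

An optimizer would adjust (R, t), and in the method above also the reconstructed plane points, to drive this RMS residual down over features from all the planar surfaces jointly.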

