Easy calibration method for omni-stereo camera system

ICCAS 2010 ◽  
2010 ◽  
Author(s):  
I-Sak Choi ◽  
Jong-Eun Ha

Author(s):
A. Choinowski ◽  
D. Dahlke ◽  
I. Ernst ◽  
S. Pless ◽  
I. Rettig

This paper presents a system calibration method for a trifocal sensor that is sensitive to different spectral bands. The trifocal camera system consists of a stereo camera operating in the visual (VIS) spectrum and a thermal imaging camera operating in the long-wave infrared (LWIR) spectrum. Intrinsic parameters and spatial alignment are determined simultaneously. A passive aluminium chessboard is used as the calibration target. Corner detection and subsequent bundle adjustment are performed on all synchronized image triplets. The remaining reprojection errors are in the sub-pixel range and enable the system to generate metric point clouds, colored with thermal intensities, in real time.
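As a rough illustration of the calibration pipeline described above (chessboard corner detection followed by parameter estimation), the following Python/OpenCV sketch calibrates a single VIS/LWIR pair. It uses the common two-stage OpenCV workflow (per-camera intrinsics, then the rigid transform between cameras) rather than the paper's simultaneous bundle adjustment over all three cameras; the pattern size, square size, and the `synchronized_pairs` list are assumptions, not values from the paper.

```python
import cv2
import numpy as np

pattern_size = (9, 6)      # inner chessboard corners (assumed)
square_size = 0.04         # square edge length in metres (assumed)

# 3D board points in the chessboard coordinate frame.
board_pts = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
board_pts[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2)
board_pts *= square_size

obj_pts, vis_pts, lwir_pts = [], [], []
for vis_img, lwir_img in synchronized_pairs:   # hypothetical list of 8-bit image pairs
    ok_v, corners_v = cv2.findChessboardCorners(vis_img, pattern_size)
    ok_t, corners_t = cv2.findChessboardCorners(lwir_img, pattern_size)
    if ok_v and ok_t:
        obj_pts.append(board_pts)
        vis_pts.append(corners_v)
        lwir_pts.append(corners_t)

img_size = vis_img.shape[1], vis_img.shape[0]   # assuming equal image resolutions

# Per-camera intrinsics first, then the rigid VIS->LWIR transform (R, T).
_, K_vis, d_vis, _, _ = cv2.calibrateCamera(obj_pts, vis_pts, img_size, None, None)
_, K_lwir, d_lwir, _, _ = cv2.calibrateCamera(obj_pts, lwir_pts, img_size, None, None)
rms, K_vis, d_vis, K_lwir, d_lwir, R, T, E, F = cv2.stereoCalibrate(
    obj_pts, vis_pts, lwir_pts, K_vis, d_vis, K_lwir, d_lwir, img_size,
    flags=cv2.CALIB_FIX_INTRINSIC)
print("RMS reprojection error [px]:", rms)
```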


2017 ◽  
Vol 2017 ◽  
pp. 1-12 ◽  
Author(s):  
Pathum Rathnayaka ◽  
Seung-Hae Baek ◽  
Soon-Yong Park

We present two simple approaches to calibrate a stereo camera setup with heterogeneous lenses: a wide-angle fish-eye lens and a narrow-angle lens on the left and right sides, respectively. Instead of using a conventional black-and-white checkerboard pattern, we design an embedded checkerboard pattern by combining two differently colored patterns. In both approaches, we split the captured stereo images into RGB channels and extract the R and inverted G channels from the left and right camera images, respectively. In our first approach, we consider the checkerboard pattern as the world coordinate system and calculate the left and right transformation matrices corresponding to it. We use these two transformation matrices to estimate the relative pose of the right camera by multiplying the inverse of the left transformation with the right one. In the second approach, we calculate a planar homography transformation to identify common object points in left-right image pairs and process them with the well-known Zhang's camera calibration method. We analyze the robustness of the two approaches by comparing reprojection errors and image rectification results. Experimental results show that the second method is more accurate than the first.
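For intuition, here is a minimal sketch of the first approach under stated assumptions: the embedded pattern is separated by color channel, board-to-camera poses are recovered with PnP, and the relative pose follows by composing one pose with the inverse of the other. The OpenCV calls are real, but `pattern_size`, `board_pts`, the intrinsics `K_l`/`K_r` and distortions `d_l`/`d_r`, and the input images are placeholders, and the plain pinhole PnP model ignores the fish-eye distortion handled in the paper.

```python
import cv2
import numpy as np

def pose_to_homogeneous(rvec, tvec):
    """Build a 4x4 board-to-camera transform from solvePnP output."""
    T = np.eye(4)
    T[:3, :3], _ = cv2.Rodrigues(rvec)
    T[:3, 3] = tvec.ravel()
    return T

# Separate the embedded two-color checkerboard: R channel for the left
# (fish-eye) image, inverted G channel for the right (narrow-angle) image.
left_gray = left_bgr[:, :, 2]              # R channel (OpenCV stores BGR)
right_gray = 255 - right_bgr[:, :, 1]      # inverted G channel

ok_l, corners_l = cv2.findChessboardCorners(left_gray, pattern_size)
ok_r, corners_r = cv2.findChessboardCorners(right_gray, pattern_size)

# Board-to-camera transforms from PnP (intrinsics assumed known beforehand).
_, rvec_l, tvec_l = cv2.solvePnP(board_pts, corners_l, K_l, d_l)
_, rvec_r, tvec_r = cv2.solvePnP(board_pts, corners_r, K_r, d_r)
T_l = pose_to_homogeneous(rvec_l, tvec_l)
T_r = pose_to_homogeneous(rvec_r, tvec_r)

# Relative pose of the right camera with respect to the left, obtained by
# composing one board pose with the inverse of the other.
T_rel = T_r @ np.linalg.inv(T_l)
print("R_rel:\n", T_rel[:3, :3], "\nt_rel:", T_rel[:3, 3])
```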


2007 ◽  
Author(s):  
Satoshi Katahira ◽  
Eiji Shibata ◽  
Tatsuhiko Monji

2018 ◽  
Vol 10 (8) ◽  
pp. 1298 ◽  
Author(s):  
Lei Yin ◽  
Xiangjun Wang ◽  
Yubo Ni ◽  
Kai Zhou ◽  
Jilong Zhang

Multi-camera systems are widely used in the fields of airborne remote sensing and unmanned aerial vehicle imaging. The measurement precision of these systems depends on the accuracy of the extrinsic parameters, so it is important to accurately calibrate the extrinsic parameters between the onboard cameras. Unlike conventional multi-camera calibration with a common field of view (FOV), multi-camera calibration without overlapping FOVs presents particular difficulties. In this paper, we propose a calibration method for a multi-camera system without common FOVs, intended for aerial photogrammetry. First, the extrinsic parameters of any two cameras in the multi-camera system are calibrated, and the extrinsic matrix is refined using the re-projection error. Then, the extrinsic parameters of each camera are unified to the system reference coordinate system using a global optimization method. A simulation experiment and a physical verification experiment were designed to verify the theoretical algorithm. The experimental results show that the method is feasible: the rotation error of the cameras' extrinsic parameters is less than 0.001 rad and the translation error is less than 0.08 mm.
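A minimal sketch of the final unification step, under assumptions rather than the paper's implementation: pairwise extrinsics are chained into the reference camera's coordinate system, which would then serve as the initial value for the global (re-projection-error) optimization. The example transforms below are illustrative numbers only.

```python
import numpy as np

def make_T(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(deg):
    """Rotation about the z-axis by the given angle in degrees."""
    a = np.deg2rad(deg)
    return np.array([[np.cos(a), -np.sin(a), 0],
                     [np.sin(a),  np.cos(a), 0],
                     [0, 0, 1]])

# Illustrative pairwise results: pairwise_T[i] maps camera-(i+1) coordinates
# into camera-i coordinates (in the paper these come from the pairwise
# calibration step and are refined via the re-projection error).
pairwise_T = [make_T(rot_z(5), [0.20, 0.0, 0.0]),
              make_T(rot_z(-3), [0.25, 0.0, 0.0]),
              make_T(rot_z(8), [0.15, 0.0, 0.0])]

# Unify every camera to camera 0 (the system reference frame) by composing the
# chain; a global optimization, as in the paper, would refine these initial values.
T_to_ref = [np.eye(4)]
for T_pair in pairwise_T:
    T_to_ref.append(T_to_ref[-1] @ T_pair)

for i, T in enumerate(T_to_ref):
    print(f"camera {i} position in the reference frame:", T[:3, 3])
```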


2021 ◽  
Vol 15 (03) ◽  
pp. 337-357
Author(s):  
Alexander Julian Golkowski ◽  
Marcus Handte ◽  
Peter Roch ◽  
Pedro J. Marrón

For many application areas such as autonomous navigation, the ability to accurately perceive the environment is essential. For this purpose, a wide variety of well-researched sensor systems are available that can be used to detect obstacles or navigation targets. Stereo cameras have emerged as a very versatile sensing technology in this regard due to their low hardware cost and high fidelity. Consequently, much work has been done to integrate them into mobile robots. However, the existing literature focuses on presenting the concepts and algorithms used to implement the desired robot functions on top of a given camera setup. As a result, the rationale and impact of choosing this camera setup are usually neither discussed nor described. Thus, when designing the stereo camera system for a mobile robot, there is not much general guidance beyond isolated setups that worked for a specific robot. To close the gap, this paper studies the impact of the physical setup of a stereo camera system in indoor environments. To do this, we present the results of an experimental analysis in which we use a given software setup to estimate the distance to an object while systematically changing the camera setup. Thereby, we vary the three main parameters of the physical camera setup, namely the angle and distance between the cameras as well as the field of view and a rather soft parameter, the resolution. Based on the results, we derive several guidelines on how to choose the parameters for an application.
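To make the influence of these setup parameters concrete, the following back-of-the-envelope sketch (illustrative values only, not taken from the paper) relates field of view and resolution to the focal length in pixels, and shows how the baseline scales the first-order depth error of a rectified stereo pair, where depth is Z = f * B / d.

```python
import math

def focal_length_px(image_width_px: float, hfov_deg: float) -> float:
    """Focal length in pixels from horizontal resolution and field of view."""
    return (image_width_px / 2.0) / math.tan(math.radians(hfov_deg) / 2.0)

def depth_error(z_m: float, baseline_m: float, f_px: float,
                disparity_err_px: float = 0.5) -> float:
    """First-order depth uncertainty dZ ≈ Z^2 / (f * B) * dd at range z_m."""
    return z_m ** 2 / (f_px * baseline_m) * disparity_err_px

f = focal_length_px(image_width_px=1280, hfov_deg=90)     # wider FOV -> shorter focal length
for baseline in (0.06, 0.12, 0.24):                        # baseline in metres
    err = depth_error(z_m=3.0, baseline_m=baseline, f_px=f)
    print(f"baseline {baseline:.2f} m -> ~{err * 100:.1f} cm depth error at 3 m")
```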


Sensors ◽  
2018 ◽  
Vol 18 (11) ◽  
pp. 3706 ◽  
Author(s):  
Joong-Jae Lee ◽  
Mun-Ho Jeong

This paper presents a stereo camera-based head-eye calibration method that aims to find the globally optimal transformation between a robot’s head and its eye. This method is highly intuitive and simple, so it can be used in a vision system for humanoid robots without any complex procedures. To achieve this, we introduce an extended minimum variance approach for head-eye calibration using surface normal vectors instead of 3D point sets. The presented method considers both positional and orientational error variances between visual measurements and kinematic data in head-eye calibration. Experiments using both synthetic and real data show the accuracy and efficiency of the proposed method.
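As an illustrative building block rather than the authors' full method, the sketch below recovers the rotation that best aligns camera-measured surface normals with the same normals predicted from the head kinematics, using an SVD-based (Wahba/Kabsch) least-squares solution; the paper's extended minimum variance formulation additionally weights positional and orientational error variances. All data here are synthetic.

```python
import numpy as np

def rotation_from_normals(n_cam: np.ndarray, n_head: np.ndarray) -> np.ndarray:
    """Least-squares rotation R with n_cam[i] ≈ R @ n_head[i]; inputs are Nx3 unit vectors."""
    H = n_head.T @ n_cam                    # 3x3 correlation matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
    return Vt.T @ D @ U.T

# Synthetic check: random unit normals rotated by a known rotation are recovered.
rng = np.random.default_rng(0)
n_head = rng.normal(size=(50, 3))
n_head /= np.linalg.norm(n_head, axis=1, keepdims=True)
angle = np.deg2rad(30)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1]])
n_cam = n_head @ R_true.T
print(np.allclose(rotation_from_normals(n_cam, n_head), R_true, atol=1e-8))
```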


Mechatronics ◽  
2011 ◽  
Vol 21 (2) ◽  
pp. 390-398 ◽  
Author(s):  
Martin Lauer ◽  
Miriam Schönbein ◽  
Sascha Lange ◽  
Stefan Welker
