CALIBRATING A LENS WITH A “LOCAL” DISTORTION MODEL

Author(s):  
I. Detchev ◽  
D. Lichti

Abstract. This paper concerns camera calibration in a case where an abnormal systematic effect was discovered. The effect was first encountered in a multi-camera system used for close-range 3D photogrammetric reconstruction. The objectives of this research were two-fold: first, to identify the source of the systematic error, and second, to model the error as rigorously as possible. The first objective was met after acquiring several calibration data sets in which the camera bodies, the lenses, and the image formats were varied; it was concluded that the source of the error is the lens system. The second objective was also met: the so-called “local” lens distortion was modelled using second-order polynomials, as the plots of the residuals vs. the image coordinates resembled parabolic shapes. Overall, after applying both radial and “local” lens distortion corrections, the root mean square error of the residuals was reduced from 1/2 to 1/6 of a pixel, a threefold (200% relative) improvement.
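The core of the second objective can be sketched in a few lines: fit a second-order polynomial to the parabola-shaped residuals as a function of image coordinate, then subtract it. The data below are synthetic stand-ins, not the paper's calibration measurements.

```python
import numpy as np

# Synthetic residuals with a parabolic bias, mimicking the plots of
# residuals vs. image x-coordinate described in the abstract.
rng = np.random.default_rng(0)
x = np.linspace(-1000, 1000, 200)            # image x-coordinates [px]
bias = 1e-7 * x**2 - 2e-5 * x + 0.05         # assumed parabolic "local" distortion [px]
residuals = bias + rng.normal(0, 0.02, x.size)

# Fit a degree-2 polynomial and remove it, as the "local" model does.
coeffs = np.polyfit(x, residuals, deg=2)
corrected = residuals - np.polyval(coeffs, x)

rms_before = np.sqrt(np.mean(residuals**2))
rms_after = np.sqrt(np.mean(corrected**2))
print(rms_before, rms_after)  # the correction should shrink the RMS
```

On real data the same fit would be estimated jointly with the radial distortion terms in the bundle adjustment; this sketch only illustrates the polynomial correction step.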

2021 ◽  
Vol 13 (7) ◽  
pp. 1380
Author(s):  
Sébastien Dandrifosse ◽  
Alexis Carlier ◽  
Benjamin Dumont ◽  
Benoît Mercatoris

Multimodal image fusion has the potential to enrich the information gathered by multi-sensor plant phenotyping platforms. Fusion of images from multiple sources is, however, hampered by the technical bottleneck of image registration. The aim of this paper is to provide a solution to the registration and fusion of multimodal wheat images in field conditions and at close range. Eight registration methods were tested on nadir wheat images acquired by a pair of red, green and blue (RGB) cameras, a thermal camera and a multispectral camera array. The most accurate method, relying on a local transformation, aligned the images with an average error of 2 mm but was not reliable for thermal images. More generally, the suggested registration method and the preprocessing steps necessary before fusion (plant mask erosion, pixel intensity averaging) would depend on the application. As a consequence, the main output of this study was to identify four registration-fusion strategies: (i) the REAL-TIME strategy solely based on the cameras’ positions, (ii) the FAST strategy suitable for all types of images tested, (iii) and (iv) the ACCURATE and HIGHLY ACCURATE strategies handling local distortion but unable to deal with images of very different natures. These suggestions are, however, limited to the methods compared in this study. Further research should investigate how recent cutting-edge registration methods would perform on the specific case of wheat canopy.
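A minimal sketch of one registration step underlying such strategies: estimating a global 2D affine transform between two modalities from matched keypoint pairs by least squares. The point pairs here are synthetic; in practice they would come from feature matching between, say, an RGB and a multispectral image, and the ACCURATE strategies would use a local rather than global transformation.

```python
import numpy as np

# Matched keypoints in the reference image (synthetic example).
src = np.array([[0, 0], [100, 0], [0, 100], [100, 100], [50, 60]], float)

# An assumed ground-truth affine: slight scale/shear plus translation.
A_true = np.array([[1.02, 0.01, 3.0],
                   [-0.01, 0.98, -2.0]])
dst = src @ A_true[:, :2].T + A_true[:, 2]

# Solve dst = [src | 1] @ A.T for the 6 affine parameters.
X = np.hstack([src, np.ones((len(src), 1))])
A_est, *_ = np.linalg.lstsq(X, dst, rcond=None)
A_est = A_est.T
print(A_est)  # on clean correspondences this recovers A_true
```

With noisy real matches the same least-squares solve gives the best-fit affine, and robust estimators (e.g. RANSAC) are typically wrapped around it.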


Author(s):  
Chris Eddy ◽  
Christopher de Saxe ◽  
David Cebon

Heavy goods vehicles are overrepresented in cyclist fatality statistics in the United Kingdom relative to their proportion of total traffic volume. In particular, the statistics highlight a problem for vehicles turning left across the path of a cyclist on their inside. In this article, we present a camera-based system to detect and track cyclists in the blind spot. The system uses boosted classifiers and geometric constraints to detect cyclist wheels, and Canny edge detection to locate the ground contact point. The locations of these points are mapped into physical coordinates using a calibration system based on the ground plane. A Kalman filter is used to track and predict the future motion of the cyclist. Full-scale tests were conducted using a construction vehicle fitted with two cameras, and the results compared with measurements from an ultrasonic-sensor system. Errors were comparable to the ultrasonic system, with an average error standard deviation of 4.3 cm when the cyclist was 1.5 m from the heavy goods vehicle, and 7.1 cm at a distance of 1 m. When results were compared to manually extracted cyclist position data, errors were less than 4 cm at separations of 1.5 and 1 m. Compared to the ultrasonic system, the camera system requires simple hardware and can easily differentiate cyclists from stationary or moving background objects such as parked cars or roadside furniture. However, the cameras suffer from reduced robustness and accuracy at close range and cannot operate in low-light conditions.
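The tracking stage can be illustrated with a one-dimensional constant-velocity Kalman filter on the cyclist's lateral offset from the vehicle. The state model, noise levels, and trajectory below are assumptions for the sketch, not the authors' tuned values.

```python
import numpy as np

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])   # state: [lateral position, velocity]
H = np.array([[1.0, 0.0]])              # only position is measured
Q = np.diag([1e-4, 1e-3])               # assumed process noise
R = np.array([[0.05**2]])               # ~5 cm measurement noise (assumed)

x = np.array([1.5, 0.0])                # start 1.5 m from the truck
P = np.eye(2)
rng = np.random.default_rng(1)
for k in range(50):
    # Simulated measurement: cyclist drifting inward at 0.1 m/s.
    z = 1.5 - 0.1 * k * dt + rng.normal(0, 0.05)
    # Predict.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (np.array([z]) - H @ x)
    P = (np.eye(2) - K @ H) @ P
print(x)  # filtered position and inward velocity after 5 s
```

The predicted state one or two steps ahead is what allows the real system to warn before the cyclist enters the danger zone.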


2017 ◽  
Vol 9 (1) ◽  
pp. 211-220 ◽  
Author(s):  
Amelie Driemel ◽  
Eberhard Fahrbach ◽  
Gerd Rohardt ◽  
Agnieszka Beszczynska-Möller ◽  
Antje Boetius ◽  
...  

Abstract. Measuring temperature and salinity profiles in the world's oceans is crucial to understanding ocean dynamics and its influence on the heat budget, the water cycle, the marine environment and on our climate. Since 1983 the German research vessel and icebreaker Polarstern has been the platform of numerous CTD (conductivity, temperature, depth instrument) deployments in the Arctic and the Antarctic. We report on a unique data collection spanning 33 years of polar CTD data. In total 131 data sets (1 data set per cruise leg) containing data from 10 063 CTD casts are now freely available at doi:10.1594/PANGAEA.860066. During this long period five CTD types with different characteristics and accuracies have been used. Therefore the instruments and processing procedures (sensor calibration, data validation, etc.) are described in detail. This compilation is special not only with regard to the quantity but also the quality of the data – the latter indicated for each data set using defined quality codes. The complete data collection includes a number of repeated sections for which the quality code can be used to investigate and evaluate long-term changes. Beginning with 2010, the salinity measurements presented here are of the highest quality possible in this field owing to the introduction of the OPTIMARE Precision Salinometer.


Sensors ◽  
2020 ◽  
Vol 20 (17) ◽  
pp. 4920
Author(s):  
Lin Cao ◽  
Xinyi Zhang ◽  
Tao Wang ◽  
Kangning Du ◽  
Chong Fu

In the multi-target traffic radar scene, the clustering accuracy between vehicles driving at close distance is relatively low. In response to this problem, this paper proposes a new clustering algorithm, namely an adaptive ellipse distance density peak fuzzy (AEDDPF) clustering algorithm. Firstly, the Euclidean distance is replaced by an adaptive ellipse distance, which can more accurately describe the structure of vehicle data obtained by radar measurement. Secondly, an adaptive exponential function curve is introduced in the decision graph of the fast density peak search algorithm to accurately select the density peak points, completing the initialization of the AEDDPF algorithm. Finally, the membership matrix and the clustering centers are calculated through successive iterations to obtain the clustering result. The time complexity of the AEDDPF algorithm is analyzed. Compared with the density-based spatial clustering of applications with noise (DBSCAN), k-means, fuzzy c-means (FCM), Gustafson-Kessel (GK), and adaptive Euclidean distance density peak fuzzy (Euclid-ADDPF) algorithms, the AEDDPF algorithm has higher clustering accuracy for real measurement data sets in certain scenarios. The experimental results also prove that the proposed algorithm has a better clustering effect in some close-range vehicle scene applications. The generalization ability of the proposed AEDDPF algorithm applied to other types of data is also analyzed.
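The motivation for an ellipse distance can be shown with a Mahalanobis-style metric: radar returns from a vehicle form elongated clusters, and a covariance-weighted distance follows that shape where the Euclidean distance does not. This is an illustration of the concept only, not the authors' AEDDPF implementation.

```python
import numpy as np

# Synthetic cluster elongated along the x-axis, as vehicle radar returns
# tend to be along the direction of travel.
rng = np.random.default_rng(2)
pts = rng.normal(0, 1, (200, 2)) * np.array([5.0, 0.5])

cov_inv = np.linalg.inv(np.cov(pts.T))

def ellipse_dist(a, b):
    """Covariance-weighted (Mahalanobis-style) distance."""
    d = a - b
    return float(np.sqrt(d @ cov_inv @ d))

origin = np.zeros(2)
p = np.array([4.0, 0.0])   # far out, but along the cluster's long axis
q = np.array([0.0, 1.0])   # close in Euclidean terms, off the short axis
print(np.linalg.norm(p), np.linalg.norm(q))          # Euclidean: p is farther
print(ellipse_dist(p, origin), ellipse_dist(q, origin))  # ellipse: p is nearer
```

Under the ellipse metric, point p (in line with the cluster shape) is closer than point q, which is the behaviour that helps separate two vehicles driving close together.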


Sensors ◽  
2020 ◽  
Vol 20 (20) ◽  
pp. 5934
Author(s):  
Xiao Li ◽  
Wei Li ◽  
Xin’an Yuan ◽  
Xiaokang Yin ◽  
Xin Ma

Lens distortion is closely related to the spatial position of the depth of field (DoF), especially in close-range photography. The accurate characterization and precise calibration of DoF-dependent distortion are very important to improve the accuracy of close-range vision measurements. In this paper, to meet the needs of short-distance and small-focal-length photography, a DoF-dependent and equal-partition-based lens distortion modeling and calibration method is proposed. Firstly, considering the direction along the optical axis, a DoF-dependent yet focusing-state-independent distortion model is proposed. By this method, manual adjustment of the focus and zoom rings is avoided, thus eliminating human errors. Secondly, considering the direction perpendicular to the optical axis, to solve the problem of insufficient distortion representation caused by using only one set of coefficients, a 2D-to-3D equal-increment partitioning method for lens distortion is proposed. Accurate characterization of DoF-dependent distortion is thus realized by fusing the distortion partitioning method and the DoF distortion model. Lastly, a calibration control field is designed. After extracting line segments within a partition, the de-coupled calibration of distortion parameters and other camera model parameters is realized. Experimental results show that the maximum/average projection and angular reconstruction errors of the equal-increment-partition-based DoF distortion model are 0.11 pixels/0.05 pixels and 0.013°/0.011°, respectively. This demonstrates the validity of the lens distortion model and calibration method proposed in this paper.
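As background, the classic radial distortion model that DoF-dependent formulations extend can be written in a few lines. The coefficients k1 and k2 below are arbitrary illustrative values; in the paper's setting they would additionally vary with object distance along the optical axis.

```python
import numpy as np

def radial_distort(xy, k1, k2):
    """Apply the standard two-term radial distortion model
    x' = x * (1 + k1*r^2 + k2*r^4) to normalized image coordinates."""
    r2 = np.sum(xy**2, axis=1, keepdims=True)
    return xy * (1 + k1 * r2 + k2 * r2**2)

# Normalized image coordinates: principal point, edge, and corner points.
xy = np.array([[0.0, 0.0], [0.5, 0.0], [0.5, 0.5]])
distorted = radial_distort(xy, k1=-0.1, k2=0.01)
print(distorted)  # the principal point is unaffected; k1 < 0 pulls points inward
```

A DoF-dependent model would replace the fixed (k1, k2) with coefficients interpolated across depth partitions, which is the gap the equal-increment partitioning addresses.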


2018 ◽  
Vol 11 (7) ◽  
pp. 4239-4260 ◽  
Author(s):  
Richard Anthes ◽  
Therese Rieckh

Abstract. In this paper we show how multiple data sets, including observations and models, can be combined using the “three-cornered hat” (3CH) method to estimate vertical profiles of the errors of each system. Using data from 2007, we estimate the error variances of radio occultation (RO), radiosondes, ERA-Interim, and Global Forecast System (GFS) model data sets at four radiosonde locations in the tropics and subtropics. A key assumption is the neglect of error covariances among the different data sets, and we examine the consequences of this assumption on the resulting error estimates. Our results show that different combinations of the four data sets yield similar relative and specific humidity, temperature, and refractivity error variance profiles at the four stations, and these estimates are consistent with previous estimates where available. These results thus indicate that the correlations of the errors among all data sets are small and the 3CH method yields realistic error variance profiles. The estimated error variances of the ERA-Interim data set are smallest, a reasonable result considering the excellent model and data assimilation system and assimilation of high-quality observations. For the four locations studied, RO has smaller error variances than radiosondes, in agreement with previous studies. Part of the larger error variance of the radiosondes is associated with representativeness differences because radiosondes are point measurements, while the other data sets represent horizontal averages over scales of ∼ 100 km.
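The basic 3CH estimate can be stated compactly: given three collocated data sets with mutually uncorrelated errors, each system's error variance follows from the variances of the pairwise differences. The synthetic series below stand in for RO, radiosonde, and model profiles; they are not the study's data.

```python
import numpy as np

rng = np.random.default_rng(3)
truth = rng.normal(0, 5, 10_000)                 # unknown true signal
A = truth + rng.normal(0, 1.0, truth.size)       # e.g. RO  (error var 1.0)
B = truth + rng.normal(0, 2.0, truth.size)       # e.g. radiosonde (var 4.0)
C = truth + rng.normal(0, 1.5, truth.size)       # e.g. model (var 2.25)

def three_cornered_hat(a, b, c):
    """3CH: var_err(a) = 0.5*(var(a-b) + var(a-c) - var(b-c)),
    assuming the three error series are mutually uncorrelated."""
    return 0.5 * (np.var(a - b) + np.var(a - c) - np.var(b - c))

est_A = three_cornered_hat(A, B, C)
est_B = three_cornered_hat(B, A, C)
est_C = three_cornered_hat(C, A, B)
print(est_A, est_B, est_C)  # should be near 1.0, 4.0, 2.25
```

The truth cancels in every pairwise difference, so no reference data set is needed; the key assumption tested in the paper is precisely the neglected error covariances between the systems.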


2000 ◽  
Vol 54 (4) ◽  
pp. 608-623 ◽  
Author(s):  
Vítézslav Centner ◽  
Jorge Verdú-Andrés ◽  
Beata Walczak ◽  
Delphine Jouan-Rimbaud ◽  
Frédéric Despagne ◽  
...  

The present study compares the performance of different multivariate calibration techniques applied to four near-infrared data sets when test samples are well within the calibration domain. Three types of problems are discussed: the nonlinear calibration, the calibration using heterogeneous data sets, and the calibration in the presence of irrelevant information in the set of predictors. Recommendations are derived from the comparison, which should help to guide a nonchemometrician through the selection of an appropriate calibration method for a particular type of calibration data. A flexible methodology is proposed to allow selection of an appropriate calibration technique for a given calibration problem.
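Choosing between calibration models of different flexibility is usually done by cross-validation, which can be sketched simply. The data below are entirely synthetic (a mildly nonlinear response, in the spirit of the nonlinear-calibration problem discussed), not the study's near-infrared sets.

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.uniform(0, 1, 300)
y = 2.0 * x + 0.8 * x**2 + rng.normal(0, 0.05, x.size)  # nonlinear response

def cv_rmse(degree, folds=5):
    """5-fold cross-validated RMSE of a polynomial calibration model."""
    idx = np.arange(x.size)
    errs = []
    for f in range(folds):
        test = idx % folds == f
        coef = np.polyfit(x[~test], y[~test], degree)
        pred = np.polyval(coef, x[test])
        errs.append(np.mean((pred - y[test]) ** 2))
    return float(np.sqrt(np.mean(errs)))

print(cv_rmse(1), cv_rmse(2))  # the quadratic model should fit better here
```

The same validation logic extends to comparing PLS, neural networks, or variable-selection methods on real spectra: the method with the lowest prediction error on held-out samples within the calibration domain wins.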


Radiocarbon ◽  
2010 ◽  
Vol 52 (3) ◽  
pp. 953-961 ◽  
Author(s):  
Christopher Bronk Ramsey ◽  
Michael Dee ◽  
Sharen Lee ◽  
Takeshi Nakagawa ◽  
Richard A Staff

Calibration is a core element of radiocarbon dating and is undergoing rapid development on a number of different fronts. This is most obvious in the area of 14C archives suitable for calibration purposes, which are now demonstrating much greater coherence over the earlier age range of the technique. Of particular significance to this end is the development of purely terrestrial archives such as those from the Lake Suigetsu sedimentary profile and Kauri tree rings from New Zealand, in addition to the groundwater records from speleothems. Equally important, however, is the development of statistical tools that can be used with, and help develop, such calibration data. In the context of sedimentary deposition, age-depth modeling provides a very useful way to analyze series of measurements from cores, with or without the presence of additional varve information. New methods are under development, making use of model averaging, that generate more robust age models. In addition, all calibration requires a coherent approach to outliers, for both single samples and where entire data sets might be offset relative to the calibration curve. This paper looks at current developments in these areas.
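At its simplest, age-depth modeling assigns ages to undated core depths by interpolating between dated levels. The depths and calibrated ages below are invented for illustration; real models (e.g. those using model averaging, as discussed above) are far more sophisticated and propagate calibration uncertainty rather than interpolating point estimates.

```python
import numpy as np

# Hypothetical dated levels in a sediment core.
depths = np.array([0.0, 0.5, 1.2, 2.0])            # m below core top
ages = np.array([100.0, 1500.0, 4200.0, 7900.0])   # cal yr BP at those depths

# Assign ages to intermediate depths by piecewise-linear interpolation.
query = np.array([0.25, 1.0, 1.6])
est = np.interp(query, depths, ages)
print(est)
```

Outlier handling enters before this step: a dated level flagged as inconsistent with the rest of the sequence would be down-weighted or excluded before the model is built.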


Radiocarbon ◽  
2004 ◽  
Vol 46 (1) ◽  
pp. 325-344 ◽  
Author(s):  
Christopher Bronk Ramsey ◽  
Sturt W Manning ◽  
Mariagrazia Galimberti

The eruption of the volcano at Thera (Santorini) in the Aegean Sea undoubtedly had a profound influence on the civilizations of the surrounding region. The date of the eruption has been a subject of much controversy because it must be linked into the established and intricate archaeological phasings of both the prehistoric Aegean and the wider east Mediterranean. Radiocarbon dating of material from the volcanic destruction layer itself can provide some evidence for the date of the eruption, but because of the shape of the calibration curve for the relevant period, the value of such dates relies on there being no biases in the data sets. However, by dating the material from phases earlier and later than the eruption, some of the problems of the calibration data set can be circumvented and the chronology for the region can be resolved with more certainty. In this paper, we draw together the evidence we have accumulated so far, including new data on the destruction layer itself and for the preceding cultural horizon at Thera, and from associated layers at Miletos in western Turkey. Using Bayesian models to synthesize the data and to identify outliers, we conclude from the most reliable 14C evidence (and using the INTCAL98 calibration data set) that the eruption of Thera occurred between 1663 and 1599 BC.

