Improving Plane Fitting Accuracy with Rigorous Error Models of Structured Light-Based RGB-D Sensors

2020, Vol. 12 (2), pp. 320
Author(s): Yaxin Li, Wenbin Li, Walid Darwish, Shengjun Tang, Yuling Hu, ...

Plane fitting is a fundamental operation in point cloud data processing. Most existing plane fitting methods were developed for high-quality LiDAR data and give equal weight to every point. In recent years, using low-quality RGB-Depth (RGB-D) sensors to generate 3D models has attracted much attention. With low-quality point cloud data, however, equal-weight plane fitting is not optimal because the range errors of RGB-D sensors are distance-dependent. In this paper, we developed an accurate plane fitting method for structured light (SL)-based RGB-D sensors. First, we derived an error model for point clouds from an SL-based RGB-D sensor by propagating errors from the raw measurements to the point coordinates. We then proposed a new cost function for random sample consensus (RANSAC)-based plane fitting that minimizes the radial distances under the derived rigorous error model. The experimental results demonstrated that our method is robust and practical across different operating ranges and working conditions. For operating ranges from 1.23 m to 4.31 m, the mean plane angle errors were about one degree and the mean plane distance errors were less than six centimeters. When the dataset has a large depth-measurement scale, the proposed method significantly improves plane fitting accuracy, with a plane angle error of 0.5 degrees and a mean distance error of 4.7 cm, compared to 3.8 degrees and 16.8 cm, respectively, for the conventional unweighted RANSAC method. The experimental results also demonstrate that the proposed method is applicable to different types of SL-based RGB-D sensors. The rigorous error model of the SL-based RGB-D sensor is essential for many applications such as outlier detection and data authorization.
Meanwhile, the precise plane fitting method developed in our research will benefit algorithms based on high-accuracy plane features such as depth calibration, 3D feature-based simultaneous localization and mapping (SLAM), and the generation of indoor building information models (BIMs).
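As an illustration of the general approach this abstract describes, the following is a minimal, self-contained sketch of RANSAC plane fitting with a distance-dependent noise model: each point's inlier threshold and refinement weight scale with its assumed range error. The quadratic `sigma` model in the usage below and all function names are illustrative assumptions, not the paper's actual error model or cost function.

```python
import numpy as np

def fit_plane_weighted(points, weights):
    """Weighted least-squares plane fit: returns a unit normal n and offset d
    such that n . p = d for points p on the plane."""
    w = weights / weights.sum()
    centroid = (w[:, None] * points).sum(axis=0)
    centered = points - centroid
    # Weighted covariance; the eigenvector of the smallest eigenvalue
    # is the plane normal.
    cov = (centered * w[:, None]).T @ centered
    _, vecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
    normal = vecs[:, 0]
    return normal, normal @ centroid

def ransac_plane(points, sigma, n_iter=200, seed=0):
    """RANSAC plane fit where each point's inlier threshold scales with its
    range-dependent noise sigma (a stand-in for a rigorous error model),
    followed by a refinement weighted by the inverse error variance."""
    rng = np.random.default_rng(seed)
    best_inliers = None
    for _ in range(n_iter):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-9:                    # degenerate (collinear) sample
            continue
        n /= norm
        dist = np.abs((points - p0) @ n)
        inliers = dist < 2.0 * sigma       # per-point threshold
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Refine on the consensus set with inverse-variance weights.
    return fit_plane_weighted(points[best_inliers],
                              1.0 / sigma[best_inliers] ** 2)
```

For example, on synthetic data whose depth noise grows quadratically with range, the recovered normal stays close to the true plane normal even though far points are much noisier than near ones.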

Author(s): M. Kuschnerus, D. Schröder, R. Lindenbergh

Abstract. The advancement of permanently measuring laser scanners has opened up a wide range of new applications, but it has also created the need for more advanced approaches to error quantification and correction. Time-dependent and systematic error influences may only become visible in data from quasi-permanent measurements. During a scan experiment in February/March 2020, point clouds were acquired every thirty minutes with a Riegl VZ-2000 laser scanner, and various other sensors (inclination sensors, a weather station and GNSS sensors) were used to survey the environment of the laser scanner and the study site. Using this measurement configuration, our aim is to identify apparent displacements in multi-temporal scans due to systematic error influences and to investigate data quality for the assessment of geomorphic changes in coastal regions. We analyse scan data collected around two storm events, on 09/02/2020 (Ciara) and 22/02/2020 (Yulia), and derive the impact of heavy storms on the point cloud data through comparison with the collected auxiliary data. To investigate the systematic residuals in data acquired by permanent laser scanning, we extracted several stable flat surfaces from the point cloud data. From a plane fitted through the respective surface in each scan, we estimated the mean displacement of each plane together with the respective root mean square errors. Inclination sensors, internal and external, recorded pitch and roll values during each scan. We derived a mean inclination per scan (in pitch and roll) and the standard deviation from the mean as a measure of the stability of the laser scanner during each scan. Evaluating the data recorded by the weather station, together with knowledge of the movement behaviour, allows us to derive possible causes of displacements and/or noise, as well as correction models. The results are compared to independent measurements from GNSS sensors for validation.
For wind speeds of 10 m/s and higher, movements of the scanner considerably increase the noise level in the point cloud data.
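The plane-based displacement estimate described in the abstract can be sketched as follows: fit a plane to a stable surface in a reference scan, then report the mean signed offset of a later scan's points along the reference normal, together with the RMSE of those offsets. This is a simplified stand-in assuming a plain (unweighted) SVD plane fit; the function name is illustrative.

```python
import numpy as np

def plane_displacement(ref_points, epoch_points):
    """Fit a plane to the reference scan's surface patch via SVD, then
    return the mean signed offset of a later epoch's points along the
    reference normal, plus the RMSE of the offsets about that mean."""
    centroid = ref_points.mean(axis=0)
    _, _, vt = np.linalg.svd(ref_points - centroid, full_matrices=False)
    normal = vt[-1]   # right singular vector of smallest singular value
    offsets = (epoch_points - centroid) @ normal
    mean_disp = offsets.mean()
    rmse = np.sqrt(((offsets - mean_disp) ** 2).mean())
    return mean_disp, rmse
```

A rigid shift of the surface along its normal between two epochs then shows up directly as the mean displacement, while scatter about the plane shows up in the RMSE.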


Author(s): Rafael Radkowski

The paper introduces a method for an augmented reality (AR) assembly assistance application that allows one to quantify the alignment of two parts. Point cloud-based tracking is one method to recognize and track physical parts. However, the correct fitting of two parts cannot be determined with high fidelity from point cloud tracking data due to occlusion and other challenges. A maximum likelihood estimate (MLE) of an error model is suggested to quantify the probability that two parts are correctly aligned. An initial solution was investigated. The results of an offline simulation with point cloud data are promising and indicate the efficacy of the suggested method.
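As a minimal sketch of the kind of Gaussian error model an MLE-based alignment check could use (the paper's actual model is not reproduced here; the zero-mean Gaussian assumption and both function names are illustrative):

```python
import numpy as np

def sigma_mle(residuals):
    """MLE of the scale of a zero-mean Gaussian error model, estimated
    from observed point-to-model residuals."""
    residuals = np.asarray(residuals, dtype=float)
    return np.sqrt(np.mean(residuals ** 2))

def alignment_log_likelihood(residuals, sigma):
    """Log-likelihood of the residuals under the zero-mean Gaussian model;
    higher values support the 'parts correctly aligned' hypothesis."""
    residuals = np.asarray(residuals, dtype=float)
    n = residuals.size
    return (-0.5 * n * np.log(2.0 * np.pi * sigma ** 2)
            - 0.5 * np.sum(residuals ** 2) / sigma ** 2)
```

Comparing the log-likelihood of the observed tracking residuals against that of a hypothesized misalignment gives a simple evidence score for the aligned case.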


2021, Vol. 11 (3), pp. 1007
Author(s): Kaleel Al-Durgham, Derek D. Lichti, Eunju Kwak, Ryan Dixon

The accuracy assessment of mobile mapping system (MMS) outputs is usually reliant on manual labor to inspect the quality of a vast amount of collected geospatial data. This paper presents an automated framework for the accuracy assessment and quality inspection of point cloud data collected by MMSs operating with lightweight laser scanners and consumer-grade microelectromechanical systems (MEMS) sensors. A new, large-scale test facility has been established in a challenging navigation environment (downtown area) to support the analyses conducted in this research work. MMS point cloud data are divided into short time slices for comparison with the higher-accuracy terrestrial laser scanner (TLS) point cloud of the test facility. MMS data quality is quantified by the results of registering the point cloud of each slice with the TLS datasets. Experiments on multiple land vehicle MMS point cloud datasets using a lightweight laser scanner and three different MEMS devices are presented to demonstrate the effectiveness of the proposed method. The mean accuracy of a consumer-grade MEMS (<$100) was found to be 1.13 ± 0.47 m. The mean accuracy of two commercial MEMS (>$100) was in the range of 0.48 ± 0.23 m to 0.85 ± 0.52 m. The method presented herein can be straightforwardly implemented and adopted for the accuracy assessment of other MMS types such as unmanned aerial vehicles (UAVs).
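The slice-versus-reference comparison can be sketched as follows: for each point of an MMS time slice, measure the distance to its nearest neighbour in the TLS reference cloud and summarize the distribution. This brute-force version is illustrative only; a real pipeline would register each slice to the reference first and use a KD-tree for the nearest-neighbour search.

```python
import numpy as np

def slice_accuracy(mms_slice, tls_ref):
    """For each point of an MMS time slice, compute the distance to the
    nearest point of the TLS reference cloud; return (mean, std) of those
    distances as a simple accuracy summary.  Brute force O(N*M)."""
    d2 = ((mms_slice[:, None, :] - tls_ref[None, :, :]) ** 2).sum(axis=2)
    nn = np.sqrt(d2.min(axis=1))
    return nn.mean(), nn.std()
```

On a toy example where the slice is a rigidly offset copy of part of the reference, the mean nearest-neighbour distance recovers the offset magnitude.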


Author(s): Jiayong Yu, Longchen Ma, Maoyi Tian, Xiushan Lu

The unmanned aerial vehicle (UAV)-mounted mobile LiDAR system (ULS) is widely used for geomatics owing to its efficient data acquisition and convenient operation. However, due to the limited carrying capacity of a UAV, the sensors integrated in a ULS must be small and lightweight, which results in a decrease in the density of the collected scanning points. This affects registration between image data and point cloud data. To address this issue, the authors propose a method for registering and fusing ULS sequence images and laser point clouds, wherein they convert the problem of registering point cloud data and image data into a problem of matching feature points between two images. First, a point cloud is selected to produce an intensity image. Subsequently, the corresponding feature points of the intensity image and the optical image are matched, and exterior orientation parameters are solved using a collinearity equation based on image position and orientation. Finally, the sequence images are fused with the laser point cloud, based on the Global Navigation Satellite System (GNSS) time index of the optical image, to generate a true color point cloud. The experimental results show that the proposed method achieves higher registration accuracy and fusion speed, demonstrating its accuracy and effectiveness.
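The first step of the pipeline above, producing an intensity image from a point cloud, might be sketched as simple orthographic binning of the intensity channel. This is an assumption for illustration; the paper's actual projection model is not reproduced here, and the function name is hypothetical.

```python
import numpy as np

def intensity_image(points, intensities, resolution=0.1):
    """Rasterize a point cloud's intensity channel into a 2-D grid by
    orthographic binning in the XY plane, averaging the intensities of
    all points that fall into each cell (empty cells stay zero)."""
    xy = points[:, :2]
    origin = xy.min(axis=0)
    ij = np.floor((xy - origin) / resolution).astype(int)
    h, w = ij.max(axis=0) + 1
    acc = np.zeros((h, w))
    cnt = np.zeros((h, w))
    np.add.at(acc, (ij[:, 0], ij[:, 1]), intensities)   # unbuffered sums
    np.add.at(cnt, (ij[:, 0], ij[:, 1]), 1.0)
    return np.divide(acc, cnt, out=np.zeros_like(acc), where=cnt > 0)
```

Feature points detected in this raster can then be matched against the optical image, which is the cloud-to-image correspondence the abstract describes.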


Author(s): Keisuke YOSHIDA, Shiro MAENO, Syuhei OGAWA, Sadayuki ISEKI, Ryosuke AKOH
