Low-cost and precise phenotyping using 3D point cloud reconstruction to determine plant architecture and morphology

2021 ◽  
Author(s):  
Odin Zhaowei Guo ◽  
Bo-Sen Wu ◽  
Mark Lefsrud

2021 ◽  
Vol 11 (3) ◽  
pp. 913
Author(s):  
Chang Yuan ◽  
Shusheng Bi ◽  
Jun Cheng ◽  
Dongsheng Yang ◽  
Wei Wang

For a rotating 2D lidar, inaccurate matching between the 2D lidar and the motor is an important source of error in the 3D point cloud, appearing as errors in both shape and attitude. Existing methods measure the angular position of the motor shaft in real time to synchronize the 2D lidar data with the motor shaft angle; however, the angular sensor required for this measurement is usually expensive, which increases the cost. We therefore propose a low-cost method to calibrate the matching error between the 2D lidar and the motor without using an angular sensor. First, the sequence between the motor and the 2D lidar is optimized to eliminate the shape error of the 3D point cloud. Next, the attitude error and the associated uncertainty of the 3D point cloud are eliminated by installing a triangular plate on the prototype. Finally, the Levenberg–Marquardt method is used to calibrate the installation error of the triangular plate. Experiments verified that the accuracy of our method meets the requirements of 3D mapping for indoor autonomous mobile robots. Although the prototype uses a Hokuyo UST-10LX 2D lidar with an accuracy of ±40 mm, the mapping error stays within ±50 mm up to a distance of 2.2996 m for a 1 s scan (mode 1), and within ±50 mm at a measuring range of 10 m for a 16 s scan (mode 7). Our method reduces the cost while preserving accuracy, making a rotating 2D lidar cheaper to build.
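As a rough illustration only (not the authors' implementation), the sketch below shows how a 2D scan can be lifted to 3D points given the motor shaft angle, and how an installation error might be refined with the Levenberg–Marquardt method via SciPy. The frame conventions, the tilt-plus-offset parameterisation of the installation error, and the helper `target_plane_fn` (signed distances to a known planar target such as the triangular plate) are assumptions introduced for the example.

```python
# Minimal sketch: rotating 2D lidar geometry + LM refinement of an assumed
# installation error. Frame conventions and parameterisation are assumptions.
import numpy as np
from scipy.optimize import least_squares

def scan_to_3d(ranges, bearings, motor_angle, tilt=0.0, offset=np.zeros(3)):
    """Convert one 2D scan (polar, in the lidar scan plane) to 3D points.

    ranges, bearings : 2D lidar measurements (m, rad)
    motor_angle      : rotation of the motor shaft about its axis (rad)
    tilt, offset     : assumed installation-error parameters (rad, m)
    """
    # Points in the lidar scan plane.
    pts = np.stack([ranges * np.cos(bearings),
                    ranges * np.sin(bearings),
                    np.zeros_like(ranges)], axis=1)
    # Assumed installation error: tilt about the lidar x-axis plus a translation.
    ct, st = np.cos(tilt), np.sin(tilt)
    R_tilt = np.array([[1, 0, 0], [0, ct, -st], [0, st, ct]])
    pts = pts @ R_tilt.T + offset
    # Motor rotation about the z-axis.
    cm, sm = np.cos(motor_angle), np.sin(motor_angle)
    R_motor = np.array([[cm, -sm, 0], [sm, cm, 0], [0, 0, 1]])
    return pts @ R_motor.T

def calibrate_installation(scans, target_plane_fn, x0=np.zeros(4)):
    """Refine (tilt, offset) so that points observed on a known planar target
    become coplanar; solved with the Levenberg-Marquardt method."""
    def residuals(x):
        tilt, offset = x[0], x[1:4]
        res = []
        for ranges, bearings, motor_angle in scans:
            pts = scan_to_3d(ranges, bearings, motor_angle, tilt, offset)
            res.append(target_plane_fn(pts))  # signed distances to the target plane
        return np.concatenate(res)
    return least_squares(residuals, x0, method="lm").x
```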


2021 ◽  
Vol 12 (1) ◽  
pp. 395
Author(s):  
Ying Wang ◽  
Ki-Young Koo

3D point cloud reconstruction from photos taken by an unmanned aerial vehicle (UAV) is a promising tool for monitoring and managing the risks of cut-slopes. However, surface changes on cut-slopes are likely to be hidden by seasonal vegetation variations. This paper proposes a vegetation removal method for 3D reconstructed point clouds using (1) a 2D image segmentation deep learning model and (2) the projection matrices available from photogrammetry. Each 3D point of a given point cloud is reprojected into image coordinates using the projection matrices, and the 2D image segmentation model determines whether it belongs to vegetation. The 3D points labelled as vegetation in the 2D images are deleted from the point cloud. The effort required to build the 2D image segmentation model was significantly reduced by using U-Net with a dataset prepared by the colour index method and complemented by manual trimming. The proposed method was applied to a cut-slope at Doam Dam in South Korea, where vegetation was successfully removed from the two point clouds of the cut-slope captured in winter and summer. The M3C2 distance between the two vegetation-removed point clouds demonstrated the feasibility of the proposed method as a tool for revealing actual changes of cut-slopes without the effect of vegetation.
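The sketch below is a simplified illustration of the reprojection-and-mask step, under the assumption that each photo comes with a 3x4 projection matrix and a binary vegetation mask produced by the segmentation model; it is not the authors' exact pipeline, and the function and argument names are invented for the example.

```python
# Minimal sketch: delete 3D points that reproject onto vegetation pixels.
# Assumes one 3x4 projection matrix and one boolean vegetation mask per photo.
import numpy as np

def remove_vegetation(points, projections, masks):
    """points      : (N, 3) point cloud
       projections : list of 3x4 projection matrices, one per photo
       masks       : list of boolean images, True where the model labels vegetation
       Returns the points never observed as vegetation in any photo."""
    keep = np.ones(len(points), dtype=bool)
    homog = np.hstack([points, np.ones((len(points), 1))])  # homogeneous coords
    for P, mask in zip(projections, masks):
        uvw = homog @ P.T                        # project into the image plane
        idx = np.where(uvw[:, 2] > 1e-9)[0]      # keep points in front of the camera
        u = np.round(uvw[idx, 0] / uvw[idx, 2]).astype(int)
        v = np.round(uvw[idx, 1] / uvw[idx, 2]).astype(int)
        h, w = mask.shape
        in_img = (u >= 0) & (u < w) & (v >= 0) & (v < h)
        idx, u, v = idx[in_img], u[in_img], v[in_img]
        keep[idx[mask[v, u]]] = False            # seen as vegetation in this photo
    return points[keep]
```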


2016 ◽  
Vol 8 (1) ◽  
pp. 26-31 ◽  
Author(s):  
Francesca Murgia ◽  
Cristian Perra ◽  
Daniele Giusto

2013 ◽  
Vol 64 (9) ◽  
pp. 1099-1114 ◽  
Author(s):  
Thomas Hoegg ◽  
Damien Lefloch ◽  
Andreas Kolb
