Estimation of leaf area index at the late growth stage of crops using unmanned aerial vehicle hyperspectral images

2021 ◽  
Author(s):  
Weiping Kong ◽  
Wenjiang Huang ◽  
Lingling Ma ◽  
Binbin Chen ◽  
Chuanrong Li ◽  
...  
2021 ◽  
Author(s):  
Shuang Wu ◽  
Lei Deng ◽  
Lijie Guo ◽  
Yanjie Wu

Abstract
Background: Leaf Area Index (LAI) is half of the total leaf area per unit horizontal ground surface area, and accurate vegetation extraction from remote sensing imagery is therefore critical for LAI estimation. However, most studies do not fully exploit the advantages of Unmanned Aerial Vehicle (UAV) imagery with high spatial resolution, for example by not removing the background (soil, shadow, etc.). Furthermore, the advancement of multi-sensor synchronous observation and integration technology allows canopy spectral, structural, and thermal data to be collected simultaneously, making data fusion possible.
Methods: To investigate the potential of high-resolution UAV imagery combined with multi-sensor data fusion for LAI estimation, high-resolution UAV imagery was obtained with a multi-sensor integrated MicaSense Altum camera to extract the wheat canopy's spectral, structural, and thermal features. After removing the soil background, all features were fused, and LAI was estimated using Random Forest and Support Vector Machine regression.
Results: (1) The soil background reduced LAI prediction accuracy, and it could be effectively removed by taking advantage of the high-resolution UAV imagery. After removing the soil background, prediction accuracy improved significantly: R2 increased by about 0.27 and RMSE decreased by about 0.476. (2) Fusing the multi-sensor synchronous observation data further improved LAI prediction and achieved the best accuracy (R2 = 0.815 and RMSE = 1.023). (3) Compared with other variables, CHM, NRCT, NDRE, and BLUE were crucial for LAI estimation. Even a simple Multiple Linear Regression model achieved high prediction accuracy (R2 = 0.679 and RMSE = 1.231), suggesting a route to rapid and efficient LAI prediction.
Conclusions: The method of this study can be transferred to other sites with larger areas or similar agricultural structures, facilitating agricultural production and management.
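The fusion-then-regression workflow the abstract describes can be sketched in a few lines. The feature names (NDRE, BLUE, CHM, NRCT) follow the abstract, but the data here are synthetic and the model settings are illustrative assumptions, not the study's actual pipeline:

```python
# Hypothetical sketch: fuse per-plot spectral, structural (CHM), and thermal
# (NRCT) features, then fit Random Forest and SVR models for LAI.
# All data are synthetic; only the feature names come from the abstract.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(0)
n = 200
ndre = rng.uniform(0.1, 0.6, n)    # spectral: red-edge index
blue = rng.uniform(0.02, 0.08, n)  # spectral: blue-band reflectance
chm = rng.uniform(0.2, 0.9, n)     # structural: canopy height model (m)
nrct = rng.uniform(0.8, 1.2, n)    # thermal: normalized relative canopy temp.
# Synthetic LAI driven mostly by NDRE and CHM, plus noise.
lai = 8 * ndre + 3 * chm - 2 * (nrct - 1) + rng.normal(0, 0.2, n)

X = np.column_stack([ndre, blue, chm, nrct])  # feature-level fusion
X_tr, X_te, y_tr, y_te = train_test_split(X, lai, test_size=0.3, random_state=0)

for name, model in [("RF", RandomForestRegressor(random_state=0)),
                    ("SVR", SVR(kernel="rbf", C=10))]:
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    rmse = mean_squared_error(y_te, pred) ** 0.5
    print(f"{name}: R2={r2_score(y_te, pred):.3f} RMSE={rmse:.3f}")
```

On real data, the soil-background masking step would happen before these per-plot features are aggregated, which is where the abstract reports the largest accuracy gain.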


2013 ◽  
Vol 115 (1) ◽  
pp. 31-42 ◽  
Author(s):  
Juan I. Córcoles ◽  
Jose F. Ortega ◽  
David Hernández ◽  
Miguel A. Moreno

2020 ◽  
Vol 13 (1) ◽  
pp. 84
Author(s):  
Tomoaki Yamaguchi ◽  
Yukie Tanaka ◽  
Yuto Imachi ◽  
Megumi Yamashita ◽  
Keisuke Katsura

Leaf area index (LAI) is a vital parameter for predicting rice yield. Unmanned aerial vehicle (UAV) surveillance with an RGB camera has been shown to have potential as a low-cost and efficient tool for monitoring crop growth. Simultaneously, deep learning (DL) algorithms have attracted attention as a promising tool for the task of image recognition. The principal aim of this research was to evaluate the feasibility of combining DL and RGB images obtained by a UAV for rice LAI estimation. In the present study, an LAI estimation model developed by DL with RGB images was compared to three other practical methods: a plant canopy analyzer (PCA); regression models based on color indices (CIs) obtained from an RGB camera; and vegetation indices (VIs) obtained from a multispectral camera. The results showed that the estimation accuracy of the model developed by DL with RGB images (R2 = 0.963 and RMSE = 0.334) was higher than those of the PCA (R2 = 0.934 and RMSE = 0.555) and the regression models based on CIs (R2 = 0.802–0.947 and RMSE = 0.401–1.13), and comparable to that of the regression models based on VIs (R2 = 0.917–0.976 and RMSE = 0.332–0.644). Therefore, our results demonstrated that the estimation model using DL with an RGB camera on a UAV could be an alternative to the methods using PCA and a multispectral camera for rice LAI estimation.
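The core idea of DL-based LAI estimation is a convolutional network that regresses a single continuous LAI value from an RGB canopy patch. The paper does not specify its architecture here, so the small network below is purely an illustrative assumption of what such a regressor looks like:

```python
# Hypothetical sketch of a CNN that regresses LAI from RGB canopy patches.
# The architecture is illustrative, not the one used in the cited study.
import torch
import torch.nn as nn

class LAIRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # global pooling -> fixed-size feature
        )
        self.head = nn.Linear(32, 1)  # one continuous LAI value per patch

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

model = LAIRegressor()
batch = torch.rand(4, 3, 64, 64)  # four 64x64 RGB patches
out = model(batch)                # shape (4, 1): one LAI estimate per patch
```

Training such a model against ground-truth LAI (e.g. from destructive sampling or a plant canopy analyzer) with a mean-squared-error loss is the standard regression setup implied by the comparison above.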


2019 ◽  
Vol 11 (23) ◽  
pp. 6829 ◽  
Author(s):  
Umut Hasan ◽  
Mamat Sawut ◽  
Shuisen Chen

The leaf area index (LAI) is not only an important parameter for monitoring crop growth, but also an important input parameter for crop yield prediction models and hydrological and climatic models. Several studies have recently been conducted to estimate crop LAI using unmanned aerial vehicle (UAV) multispectral and hyperspectral data. However, there are few studies on estimating the LAI of winter wheat using UAV RGB images. In this study, we estimated the LAI of winter wheat at the jointing stage on simple farmland in Xinjiang, China, using parameters derived from UAV RGB images. According to gray correlation analysis, UAV RGB-image parameters such as the Visible Atmospherically Resistant Index (VARI), the Red Green Blue Vegetation Index (RGBVI), the digital number (DN) of the blue channel (B) and the Green Leaf Algorithm (GLA) were selected to develop models for estimating the LAI of winter wheat. The results showed that it is feasible to use UAV RGB images for inverting and mapping the LAI of winter wheat at the jointing stage on the field scale, and the partial least squares regression (PLSR) model based on the VARI, RGBVI, B and GLA had the best prediction accuracy (R2 = 0.776, root mean square error (RMSE) = 0.468, residual prediction deviation (RPD) = 1.838) among all the regression models. To conclude, UAV RGB images not only have great potential in estimating the LAI of winter wheat, but also can provide more reliable and accurate data for precision agriculture management.

