Automated Hyperspectral Vegetation Index Derivation Using A Hyperparameter Optimization Framework for High-Throughput Plant Phenotyping

2021
Author(s):
Joshua Koh
Bikram Banerjee
German Spangenberg
Surya Kant

Hyperspectral vegetation indices (VIs) are widely deployed in agricultural remote sensing and plant phenotyping to estimate plant biophysical and biochemical traits. However, existing VIs consist mainly of simple 2-band indices, which limits their performance, and they often do not generalize well to traits other than those they were originally designed for. We present an automated hyperspectral vegetation index (AutoVI) system for the rapid generation of novel 2- to 6-band trait-specific indices in a streamlined process covering model selection, optimization, and evaluation, driven by the Tree-structured Parzen Estimator algorithm. Its performance was tested in generating novel indices to estimate chlorophyll and sugar contents in wheat. Results show that AutoVI can rapidly generate complex novel VIs (≥4-band indices) that correlate strongly (R2 > 0.8) with measured chlorophyll and sugar contents in wheat. AutoVI-derived indices were used as features in simple and stepwise multiple linear regression for chlorophyll and sugar content estimation, and outperformed results achieved with 47 existing VIs and those provided by partial least squares regression. The AutoVI system can deliver novel trait-specific VIs readily adoptable in high-throughput plant phenotyping platforms and should appeal to plant scientists and breeders. A graphical user interface for AutoVI is provided.
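As a toy illustration of the search problem AutoVI automates: the actual system uses the Tree-structured Parzen Estimator over 2- to 6-band models, whereas the sketch below substitutes an exhaustive search over 2-band normalized-difference indices on synthetic data, purely to show the shape of the task.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic example (hypothetical data): reflectance spectra for 50 plots
# across 20 narrow bands, and a measured trait (e.g. chlorophyll content)
# constructed to depend on two of the bands.
n_plots, n_bands = 50, 20
spectra = rng.uniform(0.05, 0.6, size=(n_plots, n_bands))
trait = 3.0 * spectra[:, 15] - 2.0 * spectra[:, 5] + rng.normal(0, 0.05, n_plots)

def r_squared(x, y):
    """Coefficient of determination of a simple linear fit of y on x."""
    r = np.corrcoef(x, y)[0, 1]
    return r * r

best = (-1.0, None)
# Exhaustive search over ordered band pairs (i, j), i != j, scoring the
# normalized-difference index (b_i - b_j) / (b_i + b_j) against the trait.
for i in range(n_bands):
    for j in range(n_bands):
        if i == j:
            continue
        vi = (spectra[:, i] - spectra[:, j]) / (spectra[:, i] + spectra[:, j])
        score = r_squared(vi, trait)
        if score > best[0]:
            best = (score, (i, j))

print(f"best 2-band index uses bands {best[1]} with R^2 = {best[0]:.3f}")
```

In AutoVI, this brute-force loop is replaced by TPE-guided sampling, which makes 4- to 6-band index structures tractable.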

2019
Vol 2019
pp. 1-9
Author(s):
Hema S. N. Duddu
Eric N. Johnson
Christian J. Willenborg
Steven J. Shirtliffe

The traditional visual rating system is labor-intensive, time-consuming, and prone to human error. Unmanned aerial vehicle (UAV) imagery-based vegetation indices (VIs) have potential applications in high-throughput plant phenotyping. The study objective was to determine whether UAV imagery provides accurate and consistent estimates of crop injury from herbicide application and whether it can serve as an alternative to visual ratings. The study was conducted at the Kernen Crop Research Farm, University of Saskatchewan, in 2016 and 2017. Faba bean (Vicia faba L.) tolerance to nine herbicide tank mixtures, each at two rates, was evaluated in a randomized complete block design (RCBD) with four blocks. The trial was imaged one week after treatment application using a multispectral camera with a ground sample distance (GSD) of 1.2 cm. Visual ratings of growth reduction and physiological chlorosis were recorded simultaneously with imaging. The optimized soil-adjusted vegetation index (OSAVI) was calculated from the thresholded orthomosaics. The UAV-based vegetation index (OSAVI) produced more precise results than visual ratings in both years: the coefficient of variation (CV) of OSAVI was ~1%, compared to 18-43% for the visual ratings. Furthermore, Tukey’s honestly significant difference (HSD) test yielded a more precise mean separation for the UAV-based vegetation index than for visual ratings. The significant correlations between OSAVI and the visual ratings suggest that undesirable variability associated with visual assessments can be minimized with the UAV-based approach. UAV-based imagery methods had greater precision than visual ratings for crop herbicide damage and have the potential to replace visual ratings and aid in screening crops for herbicide tolerance.
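For reference, a minimal sketch of the two quantities compared above. The abstract does not restate the OSAVI formula, so the standard definition with a 0.16 soil-adjustment term (Rondeaux et al.) is an assumption, and the plot reflectances are hypothetical.

```python
import numpy as np

def osavi(nir, red):
    """Optimized soil-adjusted vegetation index (standard definition,
    assumed here): (NIR - Red) / (NIR + Red + 0.16)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + 0.16)

def cv_percent(x):
    """Coefficient of variation in percent: 100 * sample std / mean."""
    x = np.asarray(x, dtype=float)
    return 100.0 * x.std(ddof=1) / x.mean()

# Hypothetical plot-mean reflectances for four replicate plots of one treatment.
nir = [0.52, 0.50, 0.53, 0.51]
red = [0.08, 0.09, 0.08, 0.08]
values = osavi(nir, red)
print(f"OSAVI per plot: {np.round(values, 3)}, CV = {cv_percent(values):.1f}%")
```

A small CV across replicates, as here, is what the study reports for OSAVI relative to the 18-43% CV of visual ratings.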


Plant Methods
2021
Vol 17 (1)
Author(s):
Shuai Che
Guoying Du
Ning Wang
Kun He
Zhaolan Mo
...

Abstract Background Pyropia is an economically important genus of red macroalgae that has been cultivated in the coastal areas of East Asia for over 300 years. High-throughput estimation of macroalgal biomass would greatly benefit cultivation management and research on breeding and phenomics. However, the conventional method is labour-intensive, time-consuming, destructive, and prone to human error. High-throughput phenotyping using unmanned aerial vehicle (UAV)-based spectral imaging is now widely used for terrestrial crops, grassland, and forest, but no such application in marine aquaculture has been reported. Results In this study, multispectral images of cultivated Pyropia yezoensis were taken using a UAV system in the north of Haizhou Bay, on the midwestern coast of the Yellow Sea. The exposure period of P. yezoensis was utilized to avoid the significant shielding effect of seawater on the reflectance spectrum. The normalized difference vegetation index (NDVI), ratio vegetation index (RVI), difference vegetation index (DVI), and normalized difference red edge (NDRE) were derived and showed no significant difference between the time at which P. yezoensis was completely exposed to the air and 1 h later. Regression models relating the vegetation indices to P. yezoensis biomass per unit area were established and validated. The quadratic model of DVI (Biomass = −5.550 × DVI^2 + 105.410 × DVI + 7.530) was more accurate than the other indices or index combinations, with a coefficient of determination (R2) of 0.925, root mean square error (RMSE) of 8.06, and relative estimated accuracy (Ac) of 74.93%. The model was further validated by consistently predicting biomass with a high R2 of 0.918, RMSE of 8.80, and Ac of 82.25%.
Conclusions This study suggests that the biomass of Pyropia can be effectively estimated using UAV-based spectral imaging with high accuracy and consistency. It also implies that multispectral aerial imaging has the potential to assist digital management and phenomics research on cultivated macroalgae in a high-throughput way.
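The quadratic DVI model reported in the abstract can be applied directly to plot-level reflectances; the NIR and red values below are hypothetical.

```python
def dvi(nir, red):
    """Difference vegetation index: NIR reflectance minus red reflectance."""
    return nir - red

def biomass_from_dvi(d):
    """Quadratic regression model reported in the abstract:
    Biomass = -5.550 * DVI^2 + 105.410 * DVI + 7.530 (per unit area)."""
    return -5.550 * d**2 + 105.410 * d + 7.530

# Hypothetical reflectances for one sampling quadrat of P. yezoensis.
d = dvi(nir=0.45, red=0.10)
print(f"DVI = {d:.2f}, estimated biomass = {biomass_from_dvi(d):.2f}")
```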


2020
Vol 63 (4)
pp. 1133-1146
Author(s):
Beichen Lyu
Stuart D. Smith
Yexiang Xue
Katy M. Rainey
Keith Cherkauer

Highlights
- This study addresses two computational challenges in high-throughput phenotyping: scalability and efficiency.
- Specifically, we focus on extracting crop images and deriving vegetation indices using unmanned aerial systems.
- To this end, we outline a data processing pipeline, featuring a crop localization algorithm and a trie data structure.
- We demonstrate the efficacy of our approach by computing large-scale, high-precision vegetation indices in a soybean breeding experiment, where we evaluate soybean growth under water inundation and temporal change.

Abstract. In agronomy, high-throughput phenotyping (HTP) can provide key information for agronomists in genomic selection as well as for farmers in yield prediction. Recently, HTP using unmanned aerial systems (UAS) has shown advantages in both cost and efficiency. However, scalability and efficiency have not been well studied when processing images in complex contexts, such as using multispectral cameras, and when images are collected during early and late growth stages. These challenges hamper further analysis to quantify phenotypic traits for large-scale, high-precision applications in plant breeding. To address these challenges, our research team previously built a highly modular three-step data processing pipeline. In this project, we present improvements to that pipeline for canopy segmentation and crop plot localization, leading to improved accuracy in crop image extraction. Furthermore, we propose a novel workflow based on a trie data structure to compute vegetation indices efficiently and with greater flexibility. For each proposed change, we evaluate the advantages by comparison with previous models in the literature or by comparing processing results from the original and improved pipelines. The improved pipeline is implemented as two MATLAB programs: Crop Image Extraction version 2 (CIE 2.0) and Vegetation Index Derivation version 1 (VID 1.0).
Using CIE 2.0 and VID 1.0, we compute canopy coverage and normalized difference vegetation indices (NDVIs) for a soybean phenotyping experiment. We use canopy coverage to investigate excess water stress and NDVIs to evaluate temporal patterns across the soybean growth stages. Both experimental results compare favorably with previous studies, especially for approximation of soybean reproductive stage. Overall, the proposed methodology and implemented experiments provide a scalable and efficient paradigm for applying HTP with UAS to general plant breeding. Keywords: Data processing pipeline, High-throughput phenotyping, Image processing, Soybean breeding, Unmanned aerial systems, Vegetation indices.
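The paper's trie-based workflow is not detailed in the abstract; the following minimal sketch (all names and the chain-of-operations encoding are assumptions) illustrates the general idea: intermediate results are cached at trie nodes, so indices that share a prefix of operations reuse the cached value instead of recomputing it.

```python
import numpy as np

class TrieNode:
    """A trie node caching the intermediate array for the step-prefix
    leading to it."""
    def __init__(self):
        self.children = {}
        self.value = None

def evaluate(root, steps, bands):
    """Compute a chain of band operations, reusing any cached prefix.

    `steps` is a sequence like (("load", "nir"), ("sub", "red")); two indices
    that start with the same steps share the value cached at the common node.
    """
    node, value = root, None
    for step in steps:
        node = node.children.setdefault(step, TrieNode())
        if node.value is None:
            op, name = step
            band = bands[name]
            if op == "load":
                value = band.copy()
            elif op == "add":
                value = value + band
            elif op == "sub":
                value = value - band
            elif op == "div":
                value = value / band
            node.value = value
        else:
            value = node.value  # cache hit: prefix already computed
    return value

# Hypothetical per-pixel reflectances for two pixels.
bands = {
    "nir": np.array([0.50, 0.48]),
    "red": np.array([0.08, 0.10]),
}
root = TrieNode()
dvi = evaluate(root, [("load", "nir"), ("sub", "red")], bands)      # NIR - Red
sr_like = evaluate(root, [("load", "nir"), ("div", "red")], bands)  # NIR / Red
```

The second call hits the cached ("load", "nir") node, so the NIR array is fetched rather than reloaded; over many indices and large orthomosaics this kind of sharing is what makes the trie approach efficient.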


Plants
2021
Vol 10 (12)
pp. 2726
Author(s):
Yaping Xu
Vivek Shrestha
Cristiano Piasecki
Benjamin Wolfe
Lance Hamilton
...

Unmanned aerial vehicles (UAVs) provide an intermediate scale of spatial and spectral data collection, yielding greater accuracy and consistency for morphological and physiological traits than satellites, and greater flexibility and throughput than ground-based data collection. In this study, we used UAV-based remote sensing for automated phenotyping of field-grown switchgrass (Panicum virgatum), a leading bioenergy feedstock. Using vegetation indices calculated from a UAV-based multispectral camera, statistical models were developed for rust disease caused by Puccinia novopanici, as well as leaf chlorophyll, nitrogen, and lignin contents. For the first time, UAV remote sensing was used to explore multiple traits associated with sustainable switchgrass production, with one statistical model developed for each trait based on the correlation between vegetation indices and that trait. Also for the first time, lignin content in switchgrass shoots was estimated via UAV-based multispectral image analysis. The UAV-based models were verified by ground-truthing, via correlation analysis between traits measured manually on the ground and UAV-based data. The normalized difference red edge (NDRE) vegetation index outperformed the normalized difference vegetation index (NDVI) for rust disease and nitrogen content, while NDVI performed better than NDRE for chlorophyll and lignin contents. Overall, linear models were sufficient for rust disease and chlorophyll analysis, but for nitrogen and lignin contents, nonlinear models achieved better results. As the first comprehensive study to model switchgrass sustainability traits from UAV-based remote sensing, these results suggest that this methodology can be used for high-throughput phenotyping of switchgrass in the field.
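The linear-versus-nonlinear model comparison described above can be sketched with ordinary least-squares polynomial fits; the data below are synthetic, not the authors' measurements.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic example (hypothetical data): a trait (e.g. lignin content) with a
# curved response to a vegetation index, plus measurement noise.
vi = rng.uniform(0.2, 0.9, 60)
trait = 4.0 * vi**2 - 1.0 * vi + 0.5 + rng.normal(0, 0.05, 60)

def fit_r2(x, y, degree):
    """R^2 of a least-squares polynomial fit of the given degree."""
    coeffs = np.polyfit(x, y, degree)
    pred = np.polyval(coeffs, x)
    ss_res = np.sum((y - pred) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

r2_linear = fit_r2(vi, trait, 1)
r2_quadratic = fit_r2(vi, trait, 2)
print(f"linear R^2 = {r2_linear:.3f}, quadratic R^2 = {r2_quadratic:.3f}")
```

When the underlying response is curved, as simulated here, the quadratic fit recovers more variance, which mirrors the study's finding that nonlinear models worked better for nitrogen and lignin.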


2020
Vol 12 (15)
pp. 2445
Author(s):
Walter Chivasa
Onisimo Mutanga
Chandrashekhar Biradar

Accelerating crop improvement for increased yield and better adaptation to changing climatic conditions is increasingly urgent in order to satisfy growing global food demand. A major bottleneck is the absence of high-throughput plant phenotyping methods for rapid, cost-effective, data-driven variety selection and release in plant breeding. Traditional phenotyping methods that rely on trained experts are slow, costly, labor-intensive, subjective, and often require destructive sampling. We explore ways to improve the efficiency of crop phenotyping using unmanned aerial vehicle (UAV)-based multispectral remotely sensed data to assess maize (Zea mays L.) varietal response to maize streak virus (MSV) disease. Twenty-five maize varieties grown in a trial with three replications were evaluated under artificial MSV inoculation. Ground scoring for MSV infection was carried out at mid-vegetative, flowering, and mid-grain filling on a scale of 1 (resistant) to 9 (susceptible). UAV-derived spectral data were acquired at these three phenological stages in multispectral bands corresponding to Green (0.53–0.57 μm), Red (0.64–0.68 μm), Rededge (0.73–0.74 μm), and Near-Infrared (0.77–0.81 μm). The captured imagery was stitched together in Pix4Dmapper, which generates two types of multispectral orthomosaics: NoAlpha and transparent mosaics for each band. The NoAlpha imagery was used as input to QGIS to extract reflectance data. Six vegetation indices were derived for each variety: normalized difference vegetation index (NDVI), green normalized difference vegetation index (GNDVI), Rededge NDVI (NDVIrededge), Simple Ratio (SR), green Chlorophyll Index (CIgreen), and Rededge Chlorophyll Index (CIrededge). A Random Forest (RF) classifier was used to evaluate UAV-derived spectral bands and VIs with and without variable optimization. Correlations between the UAV-derived data and manual MSV scores were significant (R = 0.74–0.84).
Varieties were classified into resistant, moderately resistant, and susceptible with overall classification accuracies of 77.3% (Kappa = 0.64) with optimized variables and 68.2% (Kappa = 0.51) without, representing an improvement of ~13.3% due to variable optimization. The RF model selected GNDVI, CIgreen, CIrededge, and the Red band as the most important variables for classification. Mid-vegetative was the most suitable phenological stage for accurate varietal phenotyping and discrimination using UAV-derived multispectral data with RF under artificial MSV inoculation. The results provide a rapid UAV-based remote sensing solution that offers a step-change towards data availability at high spatial (submeter) and temporal (daily/weekly) resolution in varietal analysis, for quick and robust high-throughput plant phenotyping, important for timely and unbiased data-driven variety selection and release in plant breeding programs, especially as climate change accelerates.
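The abstract names the six indices but not their formulas; the sketch below uses their commonly cited definitions (an assumption), applied to hypothetical band reflectances for a single maize plot.

```python
def vegetation_indices(green, red, rededge, nir):
    """Six indices named in the study, using standard formulations
    (the abstract does not list the formulas, so these are assumptions)."""
    return {
        "NDVI": (nir - red) / (nir + red),
        "GNDVI": (nir - green) / (nir + green),
        "NDVIrededge": (nir - rededge) / (nir + rededge),
        "SR": nir / red,
        "CIgreen": nir / green - 1.0,
        "CIrededge": nir / rededge - 1.0,
    }

# Hypothetical plot-mean reflectances in the four camera bands.
vis = vegetation_indices(green=0.12, red=0.08, rededge=0.25, nir=0.48)
for name, value in vis.items():
    print(f"{name}: {value:.3f}")
```

Per-variety vectors of such indices are what the Random Forest classifier consumes as features.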


Sensors
2021
Vol 21 (22)
pp. 7694
Author(s):
Veronika Blank
Roman Skidanov
Leonid Doskolovich
Nikolay Kazanskiy

We propose a novel type of spectral diffractive lens that operates in the ±1st diffraction orders. Such spectral lenses generate a sharp image at the wavelengths of interest in the +1st and −1st diffraction orders, and are convenient for obtaining remotely sensed vegetation index images in place of full-fledged hyperspectral images. We discuss the design and fabrication of spectral diffractive lenses for measuring vegetation indices, including a Modified Red Edge Simple Ratio Index and a Water Band Index. We report synthesizing diffractive lenses with a microrelief thickness of 4 µm using direct laser writing in a photoresist, and discuss the use of the fabricated spectral lenses in a prototype imaging sensor for index measurements. Distributions of these spectral indices were experimentally measured by linear scanning of vegetation samples.
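For reference, the two indices targeted by the lenses are conventionally defined as below; these standard definitions are assumptions (the abstract does not restate them), and the reflectance values are hypothetical.

```python
def mresr(r750, r705, r445):
    """Modified Red Edge Simple Ratio (standard definition, assumed here):
    (R750 - R445) / (R705 - R445), using reflectance at the named
    wavelengths in nm."""
    return (r750 - r445) / (r705 - r445)

def wbi(r900, r970):
    """Water Band Index (standard definition, assumed here): R900 / R970."""
    return r900 / r970

# Hypothetical reflectances for a healthy leaf.
print(mresr(r750=0.55, r705=0.20, r445=0.04))
print(wbi(r900=0.52, r970=0.48))
```

Because each index needs only two or three narrow wavelengths, a lens that images just those wavelengths in its ±1st orders can replace a full hyperspectral data cube for this task.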


2020
Vol 7 (1)
pp. 21
Author(s):
Faradina Marzukhi
Nur Nadhirah Rusyda Rosnan
Md Azlin Md Said

The aim of this study is to analyse the relationship between the Normalized Difference Vegetation Index (NDVI) and soil nutrients of an oil palm plantation at Felcra Nasaruddin Bota in Perak, for a future sustainable environment. A satellite image was processed for the research. Using NDVI, a vegetation index ranging from −1 to +1 was obtained. Soil sampling and soil moisture analysis were then carried out to determine the nutrient values of nitrogen (N), phosphorus (P), and potassium (K). A total of seven soil samples were acquired within the oil palm plantation area. A regression model was then fitted between the physical condition of the oil palms and soil nutrients to determine the strength of the relationship. It is hoped that a risk map of oil palm health can be produced for various applications related to agricultural plantations.
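NDVI is computed per pixel from red and near-infrared reflectance, which is what bounds it to the −1 to +1 range noted above. A minimal sketch with hypothetical reflectance values:

```python
def ndvi(nir, red):
    """Normalized difference vegetation index: (NIR - Red) / (NIR + Red).
    Bounded in [-1, 1] for non-negative reflectances."""
    return (nir - red) / (nir + red)

# Dense canopy reflects strongly in NIR, pushing NDVI towards +1;
# sparse cover, bare soil, or water pushes it towards 0 or below.
print(ndvi(nir=0.50, red=0.06))  # healthy vegetation (hypothetical values)
print(ndvi(nir=0.12, red=0.10))  # sparse cover (hypothetical values)
```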


Sensors
2020
Vol 20 (16)
pp. 4550
Author(s):
Huajian Liu
Brooke Bruning
Trevor Garnett
Bettina Berger

The accurate, high-throughput quantification of nitrogen (N) content in wheat using non-destructive methods is an important step towards identifying wheat lines with high nitrogen use efficiency and informing agronomic management practices. Among various plant phenotyping methods, hyperspectral sensing has shown promise in providing accurate measurements in a fast and non-destructive manner. Past applications have utilised non-imaging instruments, such as spectrometers, while more recent approaches have expanded to hyperspectral cameras operating in different wavelength ranges and at various spectral resolutions. However, despite the success of previous hyperspectral applications, some important research questions regarding hyperspectral sensors with different wavelength centres and bandwidths remain unanswered, limiting wide application of this technology. This study evaluated the capability of hyperspectral imaging and non-imaging sensors to estimate N content in wheat leaves by comparing three hyperspectral cameras and a non-imaging spectrometer. It addressed the following questions: (1) How do hyperspectral sensors with different system setups perform in proximal sensing of N in wheat leaves, and what must be considered for optimal results? (2) What types of photonic detectors are most sensitive to N in wheat leaves? (3) How do the spectral resolutions of different instruments affect N measurement in wheat leaves? (4) What are the key wavelengths with the highest correlation to N in wheat? Our study demonstrated that hyperspectral imaging systems with suitable setups can conduct proximal sensing of N content in wheat with sufficient accuracy. The proposed approach could reduce the need for chemical analysis of leaf tissue and lead to high-throughput estimation of N in wheat. The methodologies could also be validated on other plants with different characteristics.
The results provide a reference for users wishing to measure N content at either plant or leaf scale using hyperspectral sensors.
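Question (4), identifying key wavelengths, amounts to ranking bands by their correlation with measured N. A minimal sketch on synthetic data (not the study's measurements; the wavelength grid and sample sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic example (hypothetical data): reflectance spectra for 40 leaf
# samples on a 400-990 nm grid, with N content constructed to depend on
# the band at index 15 (550 nm on this grid).
wavelengths = np.arange(400, 1000, 10)  # nm
n_samples = 40
spectra = rng.uniform(0.05, 0.6, (n_samples, wavelengths.size))
nitrogen = 2.0 * spectra[:, 15] + rng.normal(0, 0.02, n_samples)

# Pearson correlation of each band with N; the key wavelength is the band
# with the largest absolute correlation.
corr = np.array([np.corrcoef(spectra[:, k], nitrogen)[0, 1]
                 for k in range(wavelengths.size)])
best = int(np.argmax(np.abs(corr)))
print(f"strongest correlation at {wavelengths[best]} nm (r = {corr[best]:.3f})")
```

On real spectra the correlation curve is smooth across neighbouring bands, which is why sensor bandwidth and wavelength centres, questions (1)-(3), matter for recovering the same key wavelengths across instruments.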


Plant Methods
2021
Vol 17 (1)
Author(s):
Shuo Zhou
Xiujuan Chai
Zixuan Yang
Hongwu Wang
Chenxue Yang
...

Abstract Background Maize (Zea mays L.) is one of the most important food sources in the world and has been one of the main targets of plant genetics and phenotypic research for centuries. Observation and analysis of various morphological phenotypic traits during maize growth are essential for genetic and breeding studies. The typically large number of samples produces an enormous amount of high-resolution image data. While high-throughput plant phenotyping platforms are increasingly used in maize breeding trials, there is a clear need for software tools that can automatically identify visual phenotypic features of maize plants and batch-process image datasets. Results On the boundary between computer vision and plant science, we utilize advanced deep learning methods based on convolutional neural networks to empower the workflow of maize phenotyping analysis. This paper presents Maize-IAS (Maize Image Analysis Software), an integrated application supporting one-click analysis of maize phenotype and embedding multiple functions: (I) Projection, (II) Color Analysis, (III) Internode Length, (IV) Height, (V) Stem Diameter, and (VI) Leaves Counting. Taking an RGB image of maize as input, the software provides a user-friendly graphical interface and rapid calculation of multiple important phenotypic characteristics, including leaf sheath point detection and leaf segmentation. For the Leaves Counting function, the mean and standard deviation of the difference between prediction and ground truth are 1.60 and 1.625, respectively. Conclusion Maize-IAS is easy to use and demands no professional knowledge of computer vision or deep learning. All functions support batch processing, enabling automated, labor-reduced recording, measurement, and quantitative analysis of maize growth traits on large datasets.
We demonstrate the efficiency and capability of our techniques and software for image-based plant research, and the feasibility of AI technology implemented in agriculture and plant science.

