High-Throughput Corn Image Segmentation and Trait Extraction Using Chlorophyll Fluorescence Images

2021 ◽  
Vol 2021 ◽  
pp. 1-15
Author(s):  
Augusto Souza ◽  
Yang Yang

Plant segmentation and trait extraction for individual organs are two of the key challenges in high-throughput phenotyping (HTP) operations. To address these challenges, the Ag Alumni Seed Phenotyping Facility (AAPF) at Purdue University utilizes chlorophyll fluorescence images (CFIs) to enable consistent and efficient automatic segmentation of plants of different species, ages, or colors. A series of image analysis routines were also developed to facilitate the quantitative measurement of key corn plant traits. A proof-of-concept experiment was conducted to demonstrate the utility of the extracted traits in assessing the drought stress reaction of corn plants. The image analysis routines successfully measured several morphological characteristics of corn plants of different sizes, such as plant height, area, top-node height and diameter, number of leaves, leaf area, and leaf angle relative to the stem. Data from the proof-of-concept experiment showed how corn plants behaved when treated with different water regimes or grown in pots of different sizes. High-throughput image segmentation and analysis based on a plant's fluorescence image proved to be efficient and reliable. Traits extracted from the segmented stem and leaves of a corn plant demonstrated the importance and utility of this kind of trait data in evaluating the performance of corn plants under stress. Data collected from corn plants grown in pots of different volumes showed the importance of using pots of a standard size when conducting plant phenotyping experiments and reporting the data from a controlled-environment facility.
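
As a rough illustration of why a chlorophyll fluorescence image simplifies segmentation, the minimal Python sketch below thresholds a hypothetical grayscale fluorescence image and derives two whole-plant traits (projected area and height in pixels). The file name, threshold choice, and trait definitions are illustrative assumptions, not the AAPF pipeline.

```python
# Minimal sketch of fluorescence-based plant segmentation (not the AAPF's
# actual routines): in a chlorophyll fluorescence image only living plant
# tissue emits signal, so a global threshold separates the plant from the
# pot, carrier, and background.
import cv2
import numpy as np

img = cv2.imread("corn_fluorescence.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file

# Otsu threshold; the fluorescing plant is bright against a dark background.
_, mask = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Remove small specks so only the plant remains.
mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

# Whole-plant traits from the binary mask.
ys, xs = np.nonzero(mask)
plant_area_px = int(mask.sum() // 255)    # projected plant area in pixels
plant_height_px = int(ys.max() - ys.min())  # plant height in pixels
print(plant_area_px, plant_height_px)
```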

2021 ◽  
Author(s):  
Lydia Kienbaum ◽  
Miguel Correa Abondano ◽  
Raul H. Blas Sevillano ◽  
Karl J Schmid

Background: Maize cobs are an important component of crop yield that exhibit a high diversity in size, shape and color in native landraces and modern varieties. Various phenotyping approaches have been developed to measure maize cob parameters in a high-throughput fashion. More recently, deep learning methods such as convolutional neural networks (CNNs) became available and were shown to be highly useful for high-throughput plant phenotyping. We aimed to compare classical image segmentation with deep learning methods for maize cob image segmentation and phenotyping using a large image dataset of native maize landrace diversity from Peru. Results: Comparison of three image analysis methods showed that a Mask R-CNN trained on a diverse set of maize cob images was highly superior to classical image analysis using the Felzenszwalb-Huttenlocher algorithm and a Window-based CNN due to its robustness to image quality and object segmentation accuracy (r = 0.99). We integrated Mask R-CNN into a high-throughput pipeline to segment both maize cobs and rulers in images and perform an automated quantitative analysis of eight phenotypic traits, including diameter, length, ellipticity, asymmetry, aspect ratio and average RGB values for cob color. Statistical analysis identified key training parameters for efficient iterative model updating. We also show that a small number of 10–20 images is sufficient to update the initial Mask R-CNN model to process new types of cob images. To demonstrate an application of the pipeline, we analyzed phenotypic variation in 19,867 maize cobs extracted from 3,449 images of 2,484 accessions from the maize genebank of Peru to identify phenotypically homogeneous and heterogeneous genebank accessions using multivariate clustering. Conclusions: A single Mask R-CNN model and the associated analysis pipeline are widely applicable tools for maize cob phenotyping in contexts like genebank phenomics or plant breeding.
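
For readers unfamiliar with Mask R-CNN, the sketch below shows what the instance-segmentation step of such a pipeline looks like in Python, using torchvision's COCO-pretrained Mask R-CNN as a stand-in. The input file name and score threshold are illustrative assumptions; the authors' model was trained on annotated cob and ruler images rather than COCO, which this sketch does not reproduce.

```python
# Illustrative Mask R-CNN inference with torchvision's COCO-pretrained model;
# a cob-phenotyping pipeline would instead load weights fine-tuned on
# annotated cob and ruler images.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

img = Image.open("cob_photo.jpg").convert("RGB")   # hypothetical input image
with torch.no_grad():
    pred = model([to_tensor(img)])[0]

# Keep confident detections; each predicted mask is a soft [1, H, W] map in [0, 1].
keep = pred["scores"] > 0.8
masks = pred["masks"][keep, 0] > 0.5               # binary instance masks
print(f"{masks.shape[0]} instances segmented")
```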


Plant Methods ◽  
2021 ◽  
Vol 17 (1) ◽  
Author(s):  
Lydia Kienbaum ◽  
Miguel Correa Abondano ◽  
Raul Blas ◽  
Karl Schmid

Abstract Background Maize cobs are an important component of crop yield that exhibit a high diversity in size, shape and color in native landraces and modern varieties. Various phenotyping approaches have been developed to measure maize cob parameters in a high-throughput fashion. More recently, deep learning methods such as convolutional neural networks (CNNs) became available and were shown to be highly useful for high-throughput plant phenotyping. We aimed to compare classical image segmentation with deep learning methods for maize cob image segmentation and phenotyping using a large image dataset of native maize landrace diversity from Peru. Results Comparison of three image analysis methods showed that a Mask R-CNN trained on a diverse set of maize cob images was highly superior to classical image analysis using the Felzenszwalb-Huttenlocher algorithm and a Window-based CNN due to its robustness to image quality and object segmentation accuracy (r = 0.99). We integrated Mask R-CNN into a high-throughput pipeline to segment both maize cobs and rulers in images and perform an automated quantitative analysis of eight phenotypic traits, including diameter, length, ellipticity, asymmetry, aspect ratio and average values of the red, green and blue color channels for cob color. Statistical analysis identified key training parameters for efficient iterative model updating. We also show that a small number of 10–20 images is sufficient to update the initial Mask R-CNN model to process new types of cob images. To demonstrate an application of the pipeline, we analyzed phenotypic variation in 19,867 maize cobs extracted from 3,449 images of 2,484 accessions from the maize genebank of Peru to identify phenotypically homogeneous and heterogeneous genebank accessions using multivariate clustering. Conclusions A single Mask R-CNN model and the associated analysis pipeline are widely applicable tools for maize cob phenotyping in contexts like genebank phenomics or plant breeding.
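
To illustrate how traits like those listed above can be derived once an instance mask is available, the sketch below computes generic length, diameter, aspect ratio, and average color values from a single binary cob mask using scikit-image. The cob_traits helper, the trait formulas, and the ruler-derived px_per_cm scale factor are assumptions for illustration, not the paper's exact definitions.

```python
# Sketch of deriving cob traits from one binary instance mask, assuming a
# per-image scale factor (pixels per cm, e.g. from the segmented ruler) is
# already known; trait definitions here are generic stand-ins.
import numpy as np
from skimage.measure import label, regionprops

def cob_traits(mask: np.ndarray, rgb: np.ndarray, px_per_cm: float) -> dict:
    props = regionprops(label(mask.astype(np.uint8)))[0]
    length_cm = props.major_axis_length / px_per_cm
    diameter_cm = props.minor_axis_length / px_per_cm
    return {
        "length_cm": length_cm,
        "diameter_cm": diameter_cm,
        "aspect_ratio": diameter_cm / length_cm,
        "ellipticity": props.eccentricity,                 # generic stand-in measure
        "mean_rgb": rgb[mask.astype(bool)].mean(axis=0),   # average cob color
    }
```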


Plant Methods ◽  
2021 ◽  
Vol 17 (1) ◽  
Author(s):  
Shuo Zhou ◽  
Xiujuan Chai ◽  
Zixuan Yang ◽  
Hongwu Wang ◽  
Chenxue Yang ◽  
...  

Abstract Background Maize (Zea mays L.) is one of the most important food sources in the world and has been one of the main targets of plant genetics and phenotypic research for centuries. Observation and analysis of various morphological phenotypic traits during maize growth are essential for genetic and breeding studies. The generally huge number of samples produces an enormous amount of high-resolution image data. While high-throughput plant phenotyping platforms are increasingly used in maize breeding trials, there is a clear need for software tools that can automatically identify visual phenotypic features of maize plants and implement batch processing on image datasets. Results On the boundary between computer vision and plant science, we utilize advanced deep learning methods based on convolutional neural networks to empower the workflow of maize phenotyping analysis. This paper presents Maize-IAS (Maize Image Analysis Software), an integrated application supporting one-click analysis of maize phenotypes and embedding multiple functions: (I) Projection, (II) Color Analysis, (III) Internode Length, (IV) Height, (V) Stem Diameter and (VI) Leaves Counting. Taking an RGB image of maize as input, the software provides a user-friendly graphical interface and rapid calculation of multiple important phenotypic characteristics, including leaf sheath point detection and leaf segmentation. For the Leaves Counting function, the mean and standard deviation of the difference between prediction and ground truth are 1.60 and 1.625, respectively. Conclusion Maize-IAS is easy to use and requires no professional knowledge of computer vision or deep learning. All functions support batch processing, enabling automated, labor-saving recording, measurement and quantitative analysis of maize growth traits on large datasets. We demonstrate the efficiency and potential of our techniques and software for image-based plant research, which also shows the feasibility and capability of AI technology in agriculture and plant science.
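
As a hedged illustration of the kind of RGB preprocessing such a tool builds on (not Maize-IAS code), the sketch below segments a hypothetical maize RGB image with an excess-green index and reads off projection area and plant height in pixels; the file name and threshold value are assumptions.

```python
# Generic RGB plant segmentation sketch: the excess-green index separates
# green tissue from the background, after which projected area and plant
# height follow directly from the binary mask.
import cv2
import numpy as np

bgr = cv2.imread("maize_rgb.jpg").astype(np.float32)   # hypothetical image
b, g, r = cv2.split(bgr)
exg = 2 * g - r - b                                    # excess green index
mask = (exg > 20).astype(np.uint8)                     # empirical threshold

ys, _ = np.nonzero(mask)
projection_px = int(mask.sum())           # projected (projection) area in pixels
height_px = int(ys.max() - ys.min())      # plant height in pixels
print(projection_px, height_px)
```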


2011 ◽  
Vol 12 (1) ◽  
pp. 148 ◽  
Author(s):  
Anja Hartmann ◽  
Tobias Czauderna ◽  
Roberto Hoffmann ◽  
Nils Stein ◽  
Falk Schreiber

2019 ◽  
Vol 10 ◽  
Author(s):  
Sruti Das Choudhury ◽  
Ashok Samal ◽  
Tala Awada

Plant Methods ◽  
2013 ◽  
Vol 9 (1) ◽  
pp. 17 ◽  
Author(s):  
Céline Rousseau ◽  
Etienne Belin ◽  
Edouard Bove ◽  
David Rousseau ◽  
Frédéric Fabre ◽  
...  

PLoS ONE ◽  
2018 ◽  
Vol 13 (4) ◽  
pp. e0196615 ◽  
Author(s):  
Unseok Lee ◽  
Sungyul Chang ◽  
Gian Anantrio Putra ◽  
Hyoungseok Kim ◽  
Dong Hwan Kim

2019 ◽  
Author(s):  
Anand Seethepalli ◽  
Haichao Guo ◽  
Xiuwei Liu ◽  
Marcus Griffiths ◽  
Hussien Almtarfi ◽  
...  

Abstract Root crown phenotyping measures the top portion of crop root systems and can be used for marker-assisted breeding, genetic mapping, and understanding how roots influence soil resource acquisition. Several imaging protocols and image analysis programs exist, but they are not optimized for high-throughput, repeatable, and robust root crown phenotyping. The RhizoVision Crown platform integrates an imaging unit, image capture software, and image analysis software that are optimized for reliable extraction of measurements from large numbers of root crowns. The hardware platform utilizes a back light and a monochrome machine vision camera to capture root crown silhouettes. RhizoVision Imager and RhizoVision Analyzer are free, open-source software that streamline image capture and image analysis with intuitive graphical user interfaces. RhizoVision Analyzer was physically validated using copper wire, and its features were extensively validated using 10,464 ground-truth simulated images of dicot and monocot root systems. This platform was then used to phenotype soybean and wheat root crowns. A total of 2,799 soybean (Glycine max) root crowns of 187 lines and 1,753 wheat (Triticum aestivum) root crowns of 186 lines were phenotyped. Principal component analysis indicated similar correlations among features in both species. The maximum heritability was 0.74 in soybean and 0.22 in wheat, indicating that differences between species and populations need to be considered. The integrated RhizoVision Crown platform facilitates high-throughput phenotyping of crop root crowns and sets a standard by which open plant phenotyping platforms can be benchmarked.
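
As an illustration of silhouette-style feature extraction in the spirit of a back-lit root crown image (not RhizoVision Analyzer's implementation), the sketch below inverts an Otsu threshold on a hypothetical monochrome image and measures a few simple whole-crown features; the file name and feature definitions are assumptions.

```python
# Sketch of back-lit silhouette extraction: roots appear dark against the
# bright back light, so an inverted threshold yields the silhouette from
# which simple whole-crown features are measured.
import cv2
import numpy as np

gray = cv2.imread("root_crown.png", cv2.IMREAD_GRAYSCALE)   # hypothetical image
_, silhouette = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)

ys, xs = np.nonzero(silhouette)
features = {
    "network_area_px": int(silhouette.sum() // 255),   # total root pixels
    "depth_px": int(ys.max() - ys.min()),              # vertical extent
    "max_width_px": int(xs.max() - xs.min()),          # horizontal extent
    "convex_area_px": float(cv2.contourArea(
        cv2.convexHull(np.column_stack([xs, ys]).astype(np.int32)))),
}
print(features)
```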

