HTPheno: An image analysis pipeline for high-throughput plant phenotyping

2011 ◽  
Vol 12 (1) ◽  
pp. 148 ◽  
Author(s):  
Anja Hartmann ◽  
Tobias Czauderna ◽  
Roberto Hoffmann ◽  
Nils Stein ◽  
Falk Schreiber

Plant Methods ◽  
2021 ◽  
Vol 17 (1) ◽  
Author(s):  
Shuo Zhou ◽  
Xiujuan Chai ◽  
Zixuan Yang ◽  
Hongwu Wang ◽  
Chenxue Yang ◽  
...  

Abstract
Background: Maize (Zea mays L.) is one of the most important food sources in the world and has been one of the main targets of plant genetics and phenotypic research for centuries. Observation and analysis of various morphological phenotypic traits during maize growth are essential for genetic and breeding studies. The typically large number of samples produces an enormous amount of high-resolution image data. While high-throughput plant phenotyping platforms are increasingly used in maize breeding trials, there is a clear need for software tools that can automatically identify visual phenotypic features of maize plants and perform batch processing on image datasets.
Results: At the boundary between computer vision and plant science, we utilize advanced deep learning methods based on convolutional neural networks to support the workflow of maize phenotyping analysis. This paper presents Maize-IAS (Maize Image Analysis Software), an integrated application supporting one-click analysis of maize phenotypes and embedding multiple functions: (I) Projection, (II) Color Analysis, (III) Internode Length, (IV) Height, (V) Stem Diameter, and (VI) Leaves Counting. Taking RGB images of maize as input, the software provides a user-friendly graphical interface and rapid calculation of multiple important phenotypic characteristics, including leaf sheath point detection and leaf segmentation. For the Leaves Counting function, the mean and standard deviation of the difference between prediction and ground truth are 1.60 and 1.625, respectively.
Conclusion: Maize-IAS is easy to use and requires no professional knowledge of computer vision or deep learning. All functions support batch processing, enabling automated, labor-saving recording, measurement, and quantitative analysis of maize growth traits on large datasets. We demonstrate the efficiency and potential of our techniques and software for image-based plant research, which also shows the feasibility of applying AI technology in agriculture and plant science.
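As an illustration of what the simplest of the listed functions compute, the sketch below approximates Projection (projected plant area) and Color Analysis using a classical excess-green threshold. This is a minimal sketch under assumed conventions, not the CNN-based method Maize-IAS actually implements; the function name, dictionary keys, and input file are hypothetical.

```python
# Illustrative sketch only: approximates "Projection" (projected plant area)
# and "Color Analysis" with a classical excess-green threshold. Maize-IAS
# itself uses CNN-based methods; names here are hypothetical, not its API.
import cv2
import numpy as np

def analyze_maize_image(path: str) -> dict:
    bgr = cv2.imread(path)                        # OpenCV loads images as BGR
    b, g, r = [c.astype(np.float32) for c in cv2.split(bgr)]

    # Excess-green index highlights plant pixels against soil/background.
    exg = 2.0 * g - r - b
    exg_norm = cv2.normalize(exg, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    _, mask = cv2.threshold(exg_norm, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    plant = mask > 0
    if not plant.any():
        return {"projected_area_px": 0, "mean_rgb": None}
    projected_area = int(plant.sum())             # plant pixel count ~ projected area
    mean_rgb = [float(r[plant].mean()), float(g[plant].mean()), float(b[plant].mean())]
    return {"projected_area_px": projected_area, "mean_rgb": mean_rgb}

if __name__ == "__main__":
    print(analyze_maize_image("maize_rgb.png"))   # hypothetical input file
```

The deep learning functions described in the abstract (leaf sheath point detection, leaves counting) would replace the simple threshold step with predictions from trained convolutional networks.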


2018 ◽  
Author(s):  
Sarah D. Turner ◽  
Shelby L. Ellison ◽  
Douglas A. Senalik ◽  
Philipp W. Simon ◽  
Edgar P. Spalding ◽  
...  

Abstract
Carrot is a globally important crop, yet efficient and accurate methods for quantifying its most important agronomic traits are lacking. To address this problem, we developed an automated analysis platform that extracts components of size and shape for carrot shoots and roots, which are necessary to advance carrot breeding and genetics. This method reliably measured variation in shoot size and shape, leaf number, petiole length, and petiole width, as evidenced by high correlations with hundreds of manual measurements. Similarly, root length and biomass were accurately measured from the images. The platform also quantified shoot and root shapes in terms of principal components, which have no traditional, manually measurable equivalents. We applied the pipeline in a study of a six-parent diallel population and an F2 mapping population consisting of 316 individuals. We found high levels of repeatability within a growing environment, with low to moderate repeatability across environments. We also observed co-localization of quantitative trait loci for shoot and root characteristics on chromosomes 1, 2, and 7, suggesting these traits are controlled by genetic linkage and/or pleiotropy. By increasing the number of individuals and phenotypes that can be reliably quantified, this high-throughput image analysis pipeline for measuring carrot shoot and root morphology will expand the scope and scale of breeding and genetic studies.
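The shape phenotypes mentioned above are principal components of shoot and root outlines rather than manually measurable quantities. The sketch below shows the general idea, assuming each outline has already been resampled to a fixed number of boundary points; it is a generic illustration, not the authors' pipeline.

```python
# Generic sketch of shape quantification by principal components: each outline
# is centered, flattened, and projected onto PCA axes, giving per-sample shape
# scores usable as phenotypes. The `outlines` array is an assumed input with
# shape (n_samples, n_points, 2); this is not the published implementation.
import numpy as np
from sklearn.decomposition import PCA

def shape_principal_components(outlines: np.ndarray, n_components: int = 5):
    """outlines: (n_samples, n_points, 2) resampled boundary coordinates."""
    n = outlines.shape[0]
    centered = outlines - outlines.mean(axis=1, keepdims=True)  # remove position
    flat = centered.reshape(n, -1)
    pca = PCA(n_components=n_components)
    scores = pca.fit_transform(flat)          # shape scores used as phenotypes
    return scores, pca.explained_variance_ratio_
```

In such a scheme, each principal component score becomes a quantitative trait that can be mapped or analyzed for repeatability just like a manually measured trait.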


Plant Methods ◽  
2020 ◽  
Vol 16 (1) ◽  
Author(s):  
Morteza Shabannejad ◽  
Mohammad-Reza Bihamta ◽  
Eslam Majidi-Hervan ◽  
Hadi Alipour ◽  
Asa Ebrahimi

Abstract
Background: High-throughput phenotyping and genomic selection accelerate genetic gain in breeding programs through advances in phenotyping and genotyping methods. This study developed a simple, cost-effective high-throughput image analysis pipeline to quantify digital images taken of a panel of 286 Iranian bread wheat accessions under terminal drought stress and well-watered conditions. The color proportion of green to yellow (tolerance ratio) and the color proportion of yellow to green (stress ratio) were assessed for each canopy using the pipeline. The estimated tolerance and stress ratios were used as covariates in genomic prediction models to evaluate the effect of change in canopy color on the genomic prediction accuracy of different agronomic traits in wheat.
Results: The reliability of the high-throughput image analysis pipeline was demonstrated by a three- to four-fold improvement in the accuracy of genomic predictions for days to maturity when tolerance and stress ratios were used as covariates in the univariate genomic selection models. The highest prediction accuracies for days to maturity were attained when both tolerance and stress ratios were used as fixed effects in the univariate models. The results also indicated that the Bayesian ridge regression and ridge regression-best linear unbiased prediction methods were superior to the other genomic prediction methods used in this study under terminal drought stress and well-watered conditions, respectively.
Conclusions: This study provides a robust, quick, and cost-effective machine learning-enabled image phenotyping pipeline that improves genomic prediction accuracy for days to maturity in wheat. The results encourage the integration of phenomics and genomics in breeding programs.
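The tolerance and stress ratios described above are pixel-count proportions of green versus yellow canopy color. A minimal sketch of that computation is given below; the HSV hue ranges used for "green" and "yellow" are illustrative assumptions, not the thresholds of the published pipeline.

```python
# Illustrative sketch of the tolerance/stress ratios: proportion of green to
# yellow canopy pixels and its inverse. The HSV ranges below are assumptions
# for demonstration only.
import cv2
import numpy as np

def canopy_color_ratios(path: str):
    bgr = cv2.imread(path)
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)

    # OpenCV hue spans 0-179; roughly 20-35 ~ yellow, 36-85 ~ green (assumed ranges).
    yellow = cv2.inRange(hsv, (20, 40, 40), (35, 255, 255))
    green = cv2.inRange(hsv, (36, 40, 40), (85, 255, 255))

    n_yellow = int(np.count_nonzero(yellow))
    n_green = int(np.count_nonzero(green))
    tolerance_ratio = n_green / max(n_yellow, 1)   # green : yellow
    stress_ratio = n_yellow / max(n_green, 1)      # yellow : green
    return tolerance_ratio, stress_ratio
```

The resulting ratios would then enter the genomic prediction model as fixed-effect covariates alongside the marker effects.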


2019 ◽  
Vol 10 ◽  
Author(s):  
Sruti Das Choudhury ◽  
Ashok Samal ◽  
Tala Awada

2021 ◽  
Author(s):  
Lydia Kienbaum ◽  
Miguel Correa Abondano ◽  
Raul H. Blas Sevillano ◽  
Karl J. Schmid

Background: Maize cobs are an important component of crop yield that exhibit high diversity in size, shape, and color in native landraces and modern varieties. Various phenotyping approaches have been developed to measure maize cob parameters in a high-throughput fashion. More recently, deep learning methods such as convolutional neural networks (CNNs) have become available and have proven highly useful for high-throughput plant phenotyping. We aimed to compare classical image segmentation with deep learning methods for maize cob image segmentation and phenotyping, using a large image dataset of native maize landrace diversity from Peru.
Results: Comparison of three image analysis methods showed that a Mask R-CNN trained on a diverse set of maize cob images was highly superior to both classical image analysis using the Felzenszwalb-Huttenlocher algorithm and a window-based CNN, owing to its robustness to image quality and its object segmentation accuracy (r = 0.99). We integrated Mask R-CNN into a high-throughput pipeline that segments both maize cobs and rulers in images and performs an automated quantitative analysis of eight phenotypic traits, including diameter, length, ellipticity, asymmetry, aspect ratio, and average RGB values for cob color. Statistical analysis identified key training parameters for efficient iterative model updating. We also show that as few as 10-20 images are sufficient to update the initial Mask R-CNN model to process new types of cob images. To demonstrate an application of the pipeline, we analyzed phenotypic variation in 19,867 maize cobs extracted from 3,449 images of 2,484 accessions from the maize genebank of Peru, identifying phenotypically homogeneous and heterogeneous genebank accessions using multivariate clustering.
Conclusions: The single Mask R-CNN model and the associated analysis pipeline are widely applicable tools for maize cob phenotyping in contexts such as genebank phenomics or plant breeding.
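To make the trait-extraction step concrete, the sketch below derives a subset of the listed traits from a single predicted cob mask and a pixels-per-cm scale obtained from the segmented ruler. The trait names and formulas here are illustrative assumptions, not the exact definitions used in the published pipeline.

```python
# Sketch of trait extraction after Mask R-CNN segmentation: given one binary
# cob mask and a pixels-per-cm scale from the segmented ruler, derive size,
# shape, and color traits. Assumes a single cob per mask; formulas are
# illustrative, not the published definitions.
import numpy as np
from skimage.measure import label, regionprops

def cob_traits(mask: np.ndarray, rgb: np.ndarray, px_per_cm: float) -> dict:
    props = regionprops(label(mask.astype(np.uint8)))[0]   # first labelled region
    length = props.major_axis_length / px_per_cm
    diameter = props.minor_axis_length / px_per_cm
    pixels = rgb[mask.astype(bool)]                         # (N, 3) cob pixels
    return {
        "length_cm": length,
        "diameter_cm": diameter,
        "aspect_ratio": diameter / length,
        "ellipticity": props.eccentricity,                  # proxy for ellipticity
        "mean_rgb": pixels.mean(axis=0).tolist(),
    }
```

In a batch pipeline, this function would be applied to every cob instance returned by the segmentation model, producing one row of traits per cob for downstream multivariate clustering.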


PLoS ONE ◽  
2018 ◽  
Vol 13 (4) ◽  
pp. e0196615 ◽  
Author(s):  
Unseok Lee ◽  
Sungyul Chang ◽  
Gian Anantrio Putra ◽  
Hyoungseok Kim ◽  
Dong Hwan Kim
