RhizoVision Crown: An Integrated Hardware and Software Platform for Root Crown Phenotyping

2019 ◽  
Author(s):  
Anand Seethepalli ◽  
Haichao Guo ◽  
Xiuwei Liu ◽  
Marcus Griffiths ◽  
Hussien Almtarfi ◽  
...  

Abstract: Root crown phenotyping measures the top portion of crop root systems and can be used for marker-assisted breeding, genetic mapping, and understanding how roots influence soil resource acquisition. Several imaging protocols and image analysis programs exist, but they are not optimized for high-throughput, repeatable, and robust root crown phenotyping. The RhizoVision Crown platform integrates an imaging unit, image capture software, and image analysis software that are optimized for reliable extraction of measurements from large numbers of root crowns. The hardware platform utilizes a backlight and a monochrome machine vision camera to capture root crown silhouettes. RhizoVision Imager and RhizoVision Analyzer are free, open-source software that streamline image capture and image analysis with intuitive graphical user interfaces. RhizoVision Analyzer was physically validated using copper wire, and its features were extensively validated using 10,464 ground-truth simulated images of dicot and monocot root systems. The platform was then used to phenotype soybean and wheat root crowns: a total of 2,799 soybean (Glycine max) root crowns of 187 lines and 1,753 wheat (Triticum aestivum) root crowns of 186 lines were phenotyped. Principal component analysis indicated similar correlations among features in both species. The maximum heritability was 0.74 in soybean and 0.22 in wheat, indicating that differences between species and populations need to be considered. The integrated RhizoVision Crown platform facilitates high-throughput phenotyping of crop root crowns and sets a standard by which open plant phenotyping platforms can be benchmarked.
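The silhouette-based measurement idea can be illustrated with a minimal sketch (not the actual RhizoVision Analyzer code; the threshold, scale, and feature names are assumptions for illustration): a backlit image is thresholded so root pixels are dark, and whole-crown features such as projected area and bounding-box width and depth follow directly from the binary mask.

```python
import numpy as np

def crown_features(gray, threshold=128, mm_per_px=0.1):
    """Extract basic silhouette features from a backlit root crown image.

    gray: 2D uint8 array; roots appear dark against the bright backlight.
    Returns projected area (mm^2), bounding-box width/depth (mm), and
    their ratio. Threshold and scale are illustrative assumptions.
    """
    root = gray < threshold  # dark pixels = root silhouette
    if not root.any():
        return {"area_mm2": 0.0, "width_mm": 0.0, "depth_mm": 0.0, "ratio": 0.0}
    rows, cols = np.nonzero(root)
    width = (cols.max() - cols.min() + 1) * mm_per_px   # horizontal extent
    depth = (rows.max() - rows.min() + 1) * mm_per_px   # vertical extent
    return {
        "area_mm2": root.sum() * mm_per_px ** 2,
        "width_mm": width,
        "depth_mm": depth,
        "ratio": width / depth,
    }
```

With a backlit setup, a fixed global threshold is usually sufficient because the silhouette contrast is high, which is one reason such imaging rigs favor backlighting over front illumination.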

2020 ◽  
Vol 2020 ◽  
pp. 1-15 ◽  


Plant Methods ◽  
2021 ◽  
Vol 17 (1) ◽  
Author(s):  
Shuo Zhou ◽  
Xiujuan Chai ◽  
Zixuan Yang ◽  
Hongwu Wang ◽  
Chenxue Yang ◽  
...  

Abstract: Background: Maize (Zea mays L.) is one of the most important food sources in the world and has been one of the main targets of plant genetics and phenotypic research for centuries. Observation and analysis of various morphological phenotypic traits during maize growth are essential for genetic and breeding studies. The typically large number of samples produces an enormous amount of high-resolution image data. While high-throughput plant phenotyping platforms are increasingly used in maize breeding trials, there is a clear need for software tools that can automatically identify the visual phenotypic features of maize plants and batch-process image datasets. Results: At the boundary between computer vision and plant science, we utilize deep learning methods based on convolutional neural networks to empower the workflow of maize phenotyping analysis. This paper presents Maize-IAS (Maize Image Analysis Software), an integrated application supporting one-click analysis of maize phenotypes and embedding multiple functions: (I) Projection, (II) Color Analysis, (III) Internode Length, (IV) Height, (V) Stem Diameter, and (VI) Leaves Counting. Taking an RGB image of maize as input, the software provides a user-friendly graphical interface and rapid calculation of multiple important phenotypic characteristics, including leaf sheath point detection and leaf segmentation. For the Leaves Counting function, the mean and standard deviation of the difference between prediction and ground truth are 1.60 and 1.625, respectively. Conclusion: Maize-IAS is easy to use and demands no professional knowledge of computer vision or deep learning. All functions support batch processing, enabling automated, labor-reduced recording, measurement, and quantitative analysis of maize growth traits on large datasets. We demonstrate the efficiency and potential of our techniques and software for image-based plant research, which also shows the feasibility and capability of AI technology in agriculture and plant science.
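As one illustration of how a color-analysis function of this kind can work in principle (a generic excess-green sketch, not Maize-IAS code; the index and threshold are assumptions): plant pixels are segmented with a vegetation index and their mean RGB is reported.

```python
import numpy as np

def plant_color_stats(rgb, excess_green_thresh=20):
    """Segment plant pixels by the excess-green index (2G - R - B) and
    return their mean RGB plus the mask. Threshold is an assumed value.
    """
    img = rgb.astype(int)  # avoid uint8 overflow in the index
    exg = 2 * img[..., 1] - img[..., 0] - img[..., 2]
    mask = exg > excess_green_thresh
    if not mask.any():
        return None, mask
    return img[mask].mean(axis=0), mask
```

Index-based segmentation like this is a common classical baseline; CNN-based segmentation, as used by the software above, is more robust to soil and background clutter.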


2011 ◽  
Vol 12 (1) ◽  
pp. 148 ◽  
Author(s):  
Anja Hartmann ◽  
Tobias Czauderna ◽  
Roberto Hoffmann ◽  
Nils Stein ◽  
Falk Schreiber

2021 ◽  
Vol 64 (6) ◽  
pp. 1999-2010
Author(s):  
Lirong Xiang ◽  
Lie Tang ◽  
Jingyao Gai ◽  
Le Wang

Highlights:
- A custom-built camera module named PhenoStereo was developed for high-throughput field-based plant phenotyping.
- Novel integration of strobe lights facilitated application of PhenoStereo in various environmental conditions.
- Image-derived stem diameters were found to have high correlations with ground truth, outperforming any previously reported sensing approach.
- PhenoStereo showed promising potential to characterize a broad spectrum of plant phenotypes.

Abstract: The stem diameter of sorghum plants is an important trait for evaluation of stalk strength and biomass potential, but it is a challenging sensing task to automate in the field due to the complexity of the imaging object and the environment. In recent years, stereo vision has offered a viable three-dimensional (3D) solution due to its high spatial resolution and wide selection of camera modules. However, the performance of in-field stereo imaging for plant phenotyping is adversely affected by textureless regions, occlusion of plants, variable outdoor lighting, and wind conditions. In this study, a portable stereo imaging module named PhenoStereo was developed for high-throughput field-based plant phenotyping. PhenoStereo features a self-contained embedded design, which makes it capable of capturing images at 14 stereoscopic frames per second. In addition, a set of customized strobe lights is integrated to overcome lighting variations and enable the use of a high shutter speed to overcome motion blur. PhenoStereo was used to acquire a set of sorghum plant images, and an automated point cloud data processing pipeline was developed to automatically extract the stems and then quantify their diameters via an optimized 3D modeling process. The pipeline employed a mask region convolutional neural network (Mask R-CNN) for detecting stalk contours and a semi-global block matching (SGBM) stereo matching algorithm for generating disparity maps.
The correlation coefficient (r) between the image-derived stem diameters and the ground truth was 0.97 with a mean absolute error (MAE) of 1.44 mm, which outperformed any previously reported sensing approach. These results demonstrate that, with proper customization, stereo vision can be an effective sensing method for field-based plant phenotyping using high-fidelity 3D models reconstructed from stereoscopic images. Based on the results from sorghum plant stem diameter sensing, this proposed stereo sensing approach can likely be extended to characterize a broad range of plant phenotypes, such as the leaf angle and tassel shape of maize plants and the seed pods and stem nodes of soybean plants. Keywords: Field-based high-throughput phenotyping, Point cloud, Stem diameter, Stereo vision.
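The core geometry behind the disparity maps can be sketched from first principles (a simplified illustration, not the PhenoStereo pipeline; the focal length, baseline, and slice-based diameter estimate are assumptions): depth follows Z = f * B / d, after which a thin stem cross-section slice yields a crude diameter.

```python
import numpy as np

def disparity_to_depth(disparity, focal_px, baseline_m):
    """Convert a disparity map (pixels) to metric depth via Z = f * B / d.

    Zero or negative disparities are marked as infinitely far (invalid).
    """
    d = np.asarray(disparity, dtype=float)
    depth = np.full_like(d, np.inf)
    valid = d > 0
    depth[valid] = focal_px * baseline_m / d[valid]
    return depth

def slice_diameter(x_coords):
    """Approximate stem diameter from a thin horizontal point-cloud slice
    as the lateral extent of its points - a crude stand-in for the paper's
    optimized 3D modeling of the stalk.
    """
    x = np.asarray(x_coords, dtype=float)
    return float(x.max() - x.min())
```

A short baseline raises depth resolution at close range but shrinks the working volume, which is one reason purpose-built modules like the one described here are preferred over generic stereo rigs.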


2019 ◽  
Vol 10 ◽  
Author(s):  
Sruti Das Choudhury ◽  
Ashok Samal ◽  
Tala Awada

2019 ◽  
Author(s):  
Xu Wang ◽  
Hong Xuan ◽  
Byron Evers ◽  
Sandesh Shrestha ◽  
Robert Pless ◽  
...  

Abstract: Background: Measuring plant traits with precision and speed on large populations has emerged as a critical bottleneck in connecting genotype to phenotype in genetics and breeding. This bottleneck limits advancements in understanding plant genomes and the development of improved, high-yielding crop varieties. Results: Here we demonstrate the application of deep learning on proximal imaging from a mobile field vehicle to directly score plant morphology and developmental stages in wheat under field conditions. We developed and trained a convolutional neural network with image datasets labeled from expert visual scores and used this 'breeder-trained' network to directly score wheat morphology and developmental stages. For both a morphological trait (awned) and a phenological trait (flowering time), we demonstrate high heritability and extremely high accuracy against the 'ground-truth' values from visual scoring. Using the traits scored by the network, we tested genotype-to-phenotype associations and uncovered novel epistatic interactions for flowering time. Enabled by time-series high-throughput phenotyping, we describe a new phenotype, the rate of flowering, and show that it is under heritable genetic control. Conclusions: We demonstrated a field-based high-throughput phenotyping approach using deep learning that can directly score morphological and developmental phenotypes in genetic populations. Most powerfully, the deep learning approach presented here represents a conceptual advance in high-throughput plant phenotyping, as it can potentially score any trait in any plant species by leveraging expert knowledge from breeders, geneticists, pathologists, and physiologists.
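The derived rate-of-flowering phenotype can be illustrated with a minimal sketch (an assumption about the computation, not the authors' exact method): fit a line to repeated per-plot flowering scores over time and take the slope as the rate.

```python
import numpy as np

def flowering_rate(days, pct_flowering):
    """Estimate the rate of flowering as the slope (percent per day) of a
    least-squares linear fit to time-series flowering scores, e.g. repeated
    network predictions for one plot across imaging dates.
    """
    slope, _intercept = np.polyfit(days, pct_flowering, 1)
    return slope
```

Because the scores come from automated time-series imaging rather than a single visual pass, such derived rates become measurable at population scale.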


2021 ◽  
Author(s):  
Lydia Kienbaum ◽  
Miguel Correa Abondano ◽  
Raul H. Blas Sevillano ◽  
Karl J Schmid

Background: Maize cobs are an important component of crop yield that exhibit high diversity in size, shape, and color in native landraces and modern varieties. Various phenotyping approaches have been developed to measure maize cob parameters in a high-throughput fashion. More recently, deep learning methods such as convolutional neural networks (CNNs) have become available and were shown to be highly useful for high-throughput plant phenotyping. We aimed to compare classical image segmentation with deep learning methods for maize cob image segmentation and phenotyping, using a large image dataset of native maize landrace diversity from Peru. Results: Comparison of three image analysis methods showed that a Mask R-CNN trained on a diverse set of maize cob images was highly superior to classical image analysis using the Felzenszwalb-Huttenlocher algorithm and to a window-based CNN, owing to its robustness to image quality and its object segmentation accuracy (r=0.99). We integrated the Mask R-CNN into a high-throughput pipeline to segment both maize cobs and rulers in images and to perform an automated quantitative analysis of eight phenotypic traits, including diameter, length, ellipticity, asymmetry, aspect ratio, and average RGB values for cob color. Statistical analysis identified key training parameters for efficient iterative model updating. We also show that a small number of 10-20 images is sufficient to update the initial Mask R-CNN model to process new types of cob images. To demonstrate an application of the pipeline, we analyzed phenotypic variation in 19,867 maize cobs extracted from 3,449 images of 2,484 accessions from the maize genebank of Peru to identify phenotypically homogeneous and heterogeneous genebank accessions using multivariate clustering. Conclusions: The single Mask R-CNN model and the associated analysis pipeline are widely applicable tools for maize cob phenotyping in contexts such as genebank phenomics or plant breeding.
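The ruler-based calibration step can be sketched as follows (a simplified illustration, not the published pipeline; the ruler length and the extent-based trait definitions are assumptions): the ruler mask fixes the mm-per-pixel scale, and cob traits then follow from the cob mask extents.

```python
import numpy as np

def cob_traits(cob_mask, ruler_mask, ruler_length_mm=100.0):
    """Derive simple cob traits from boolean segmentation masks, e.g. as
    produced by an instance segmentation model such as Mask R-CNN.

    The ruler's longest image-axis extent is assumed to span
    `ruler_length_mm`, which calibrates the pixel scale.
    """
    r_rows, r_cols = np.nonzero(ruler_mask)
    ruler_px = max(r_rows.max() - r_rows.min(),
                   r_cols.max() - r_cols.min()) + 1
    mm_per_px = ruler_length_mm / ruler_px

    rows, cols = np.nonzero(cob_mask)
    length = (rows.max() - rows.min() + 1) * mm_per_px   # along image rows
    diameter = (cols.max() - cols.min() + 1) * mm_per_px  # across image cols
    return {"length_mm": length, "diameter_mm": diameter,
            "aspect_ratio": length / diameter}
```

Segmenting the ruler per image, rather than assuming a fixed camera height, keeps the calibration valid across heterogeneous genebank photographs.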


Author(s):  
Bikram Pratap Banerjee ◽  
German Spangenberg ◽  
Surya Kant

Phenotypic characterization of crop genotypes is an essential yet challenging aspect of crop management and agricultural research. Digital sensing technologies are rapidly advancing plant phenotyping and speeding up crop breeding outcomes. However, off-the-shelf sensors might not be fully applicable and suitable for agricultural research due to the diversity of crop species and the specific needs of plant breeding selections. Customized sensing systems with specialized sensor hardware and software architecture provide a powerful and low-cost solution. This study designed and developed a fully integrated Raspberry Pi-based LiDAR sensor named CropBioMass (CBM), enabled by the Internet of Things to provide a complete end-to-end pipeline. The CBM is a low-cost sensor that provides seamless high-throughput data collection in the field, a small data footprint, automated injection of data onto a remote server, and automated data processing. Phenotypic traits of crop fresh biomass, dry biomass, and plant height estimated from CBM data had high correlations with ground-truth manual measurements in a wheat field trial. The CBM is readily applicable for high-throughput plant phenotyping, crop monitoring, and management in precision agriculture applications.
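A percentile-based height estimate of the kind commonly applied to LiDAR returns can be sketched as follows (a generic illustration, not the CBM processing code; the percentile choices are assumptions):

```python
import numpy as np

def plant_height(z_values, ground_pct=1, canopy_pct=99):
    """Estimate plant height (same units as input) from LiDAR return
    heights over a plot. Percentiles rather than min/max are used so that
    stray ground and above-canopy outlier returns do not dominate.
    """
    z = np.asarray(z_values, dtype=float)
    ground = np.percentile(z, ground_pct)
    canopy = np.percentile(z, canopy_pct)
    return max(canopy - ground, 0.0)
```

Height percentiles of this sort also serve as inputs to biomass regression models, which is how point-cloud metrics are typically linked to fresh and dry biomass.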


PLoS ONE ◽  
2018 ◽  
Vol 13 (4) ◽  
pp. e0196615 ◽  
Author(s):  
Unseok Lee ◽  
Sungyul Chang ◽  
Gian Anantrio Putra ◽  
Hyoungseok Kim ◽  
Dong Hwan Kim
