Joint Plant Instance Detection and Leaf Count Estimation for In-Field Plant Phenotyping

Author(s):  
Jan Weyler ◽  
Andres Milioto ◽  
Tillmann Falck ◽  
Jens Behley ◽  
Cyrill Stachniss
Sensors ◽  
2020 ◽  
Vol 20 (16) ◽  
pp. 4550
Author(s):  
Huajian Liu ◽  
Brooke Bruning ◽  
Trevor Garnett ◽  
Bettina Berger

The accurate and high-throughput quantification of nitrogen (N) content in wheat using non-destructive methods is an important step towards identifying wheat lines with high nitrogen use efficiency and informing agronomic management practices. Among various plant phenotyping methods, hyperspectral sensing has shown promise in providing accurate measurements in a fast and non-destructive manner. Past applications have utilised non-imaging instruments, such as spectrometers, while more recent approaches have expanded to hyperspectral cameras operating in different wavelength ranges and at various spectral resolutions. However, despite the success of previous hyperspectral applications, some important research questions regarding hyperspectral sensors with different wavelength centres and bandwidths remain unanswered, limiting wide application of this technology. This study evaluated the capability of hyperspectral imaging and non-imaging sensors to estimate N content in wheat leaves by comparing three hyperspectral cameras and a non-imaging spectrometer. It addressed the following questions: (1) How do hyperspectral sensors with different system setups perform when conducting proximal sensing of N in wheat leaves, and what aspects have to be considered for optimal results? (2) What types of photonic detectors are most sensitive to N in wheat leaves? (3) How do the spectral resolutions of different instruments affect N measurement in wheat leaves? (4) What are the key wavelengths with the highest correlation to N in wheat? Our study demonstrated that hyperspectral imaging systems with satisfactory system setups can be used to conduct proximal sensing of N content in wheat with sufficient accuracy. The proposed approach could reduce the need for chemical analysis of leaf tissue and lead to high-throughput estimation of N in wheat. The methodologies could also be validated on other plants with different characteristics. The results provide a reference for users wishing to measure N content at either the plant or leaf scale using hyperspectral sensors.
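To illustrate the kind of analysis described above, the following sketch relates leaf reflectance spectra to N content with a partial least squares regression and ranks wavelengths by their correlation with N. It is only a hedged illustration using synthetic spectra and scikit-learn, not the instruments, data, or modelling pipeline of the study; the wavelength range, the feature band near 550 nm, and the sample sizes are assumptions.

```python
# Illustrative sketch (not the authors' pipeline): relating leaf reflectance
# spectra to nitrogen content with partial least squares regression and
# ranking wavelengths by correlation. All data here are synthetic placeholders.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
wavelengths = np.arange(400, 1000, 5)             # nm, hypothetical sensor range
n_samples = 150

# Synthetic spectra: baseline reflectance plus an N-dependent feature band.
nitrogen = rng.uniform(1.5, 4.5, n_samples)        # % N, placeholder values
spectra = 0.3 + 0.05 * rng.standard_normal((n_samples, wavelengths.size))
feature = np.exp(-((wavelengths - 550) ** 2) / (2 * 30 ** 2))
spectra += np.outer(nitrogen - nitrogen.mean(), feature) * 0.02

X_train, X_test, y_train, y_test = train_test_split(
    spectra, nitrogen, test_size=0.3, random_state=0)

pls = PLSRegression(n_components=5)
pls.fit(X_train, y_train)
print("R^2 on held-out leaves:", pls.score(X_test, y_test))

# Rank wavelengths by absolute Pearson correlation with N ("key wavelengths").
corr = np.array([np.corrcoef(spectra[:, i], nitrogen)[0, 1]
                 for i in range(wavelengths.size)])
top = wavelengths[np.argsort(-np.abs(corr))[:5]]
print("Most N-correlated wavelengths (nm):", top)
```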


Plant Methods ◽  
2021 ◽  
Vol 17 (1) ◽  
Author(s):  
Shuo Zhou ◽  
Xiujuan Chai ◽  
Zixuan Yang ◽  
Hongwu Wang ◽  
Chenxue Yang ◽  
...  

Background: Maize (Zea mays L.) is one of the most important food sources in the world and has been one of the main targets of plant genetics and phenotypic research for centuries. Observation and analysis of various morphological phenotypic traits during maize growth are essential for genetic and breeding studies. The typically huge number of samples produces an enormous amount of high-resolution image data. While high-throughput plant phenotyping platforms are increasingly used in maize breeding trials, there is a clear need for software tools that can automatically identify visual phenotypic features of maize plants and implement batch processing on image datasets. Results: On the boundary between computer vision and plant science, we utilize advanced deep learning methods based on convolutional neural networks to empower the workflow of maize phenotyping analysis. This paper presents Maize-IAS (Maize Image Analysis Software), an integrated application supporting one-click analysis of maize phenotypes and embedding multiple functions: (I) Projection, (II) Color Analysis, (III) Internode Length, (IV) Height, (V) Stem Diameter and (VI) Leaves Counting. Taking an RGB image of maize as input, the software provides a user-friendly graphical interface and rapid calculation of multiple important phenotypic characteristics, including leaf sheath point detection and leaf segmentation. In the Leaves Counting function, the mean and standard deviation of the difference between prediction and ground truth are 1.60 and 1.625, respectively. Conclusion: Maize-IAS is easy to use and requires no professional knowledge of computer vision or deep learning. All functions support batch processing, enabling automated, labor-reduced recording, measurement and quantitative analysis of maize growth traits on large datasets. We demonstrate the efficiency and potential of our techniques and software for image-based plant research, which also shows the feasibility of AI technology applied in agriculture and plant science.
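The leaf-counting accuracy above is reported as the mean and standard deviation of the difference between predicted and ground-truth counts. A minimal sketch of that metric, using placeholder counts rather than the Maize-IAS data:

```python
# Minimal sketch of the leaf-counting evaluation metric reported above:
# mean and standard deviation of (prediction - ground truth) over a batch.
# The count values below are placeholders, not the Maize-IAS results.
import numpy as np

predicted_counts    = np.array([8, 10, 7, 12, 9, 11])   # leaves per plant (model)
ground_truth_counts = np.array([7, 10, 9, 11, 9, 10])   # leaves per plant (manual)

diff = predicted_counts - ground_truth_counts
print(f"mean difference: {diff.mean():.2f}")
print(f"std of difference: {diff.std(ddof=0):.3f}")
```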


Author(s):  
Anna Langstroff ◽  
Marc C. Heuermann ◽  
Andreas Stahl ◽  
Astrid Junker

Rising temperatures and changing precipitation patterns will affect agricultural production substantially, exposing crops to extended and more intense periods of stress. Breeding varieties adapted to constantly changing conditions is therefore pivotal to enable quantitatively and qualitatively adequate crop production despite the negative effects of climate change. As it is not yet possible to select for adaptation to future climate scenarios in the field, simulations of future conditions in controlled-environment (CE) phenotyping facilities contribute to the understanding of plant responses to specific stress conditions and help breeders select ideal genotypes that cope with future conditions. CE phenotyping facilities enable the collection of traits that are not easy to measure under field conditions and the assessment of a plant's phenotype under repeatable, clearly defined environmental conditions using automated, non-invasive, high-throughput methods. However, extrapolating and translating results obtained under controlled environments to field environments remains ambiguous. This review outlines the opportunities and challenges of phenotyping approaches under controlled environments as a complement to conventional field trials. It gives an overview of general principles and introduces existing phenotyping facilities that take up the challenge of obtaining reliable and robust phenotypic data on climate-response traits to support the breeding of climate-adapted crops.


2021 ◽  
Vol 13 (13) ◽  
pp. 2622
Author(s):  
Haozhou Wang ◽  
Yulin Duan ◽  
Yun Shi ◽  
Yoichiro Kato ◽  
Seishi Ninomiya ◽  
...  

Unmanned aerial vehicle (UAV) and structure from motion (SfM) photogrammetry techniques are widely used for field-based, high-throughput plant phenotyping nowadays, but some of the intermediate processes in the workflow remain manual. For example, geographic information system (GIS) software is used to manually assess the 2D/3D field reconstruction quality and to crop regions of interest (ROIs) from the whole field. In addition, extracting phenotypic traits directly from raw UAV images can be more advantageous than extracting them from the digital orthomosaic (DOM). Currently, no easy-to-use tools are available to implement these tasks for commonly used commercial SfM software, such as Pix4D and Agisoft Metashape. Hence, an open-source software package called easy intermediate data processor (EasyIDP; MIT license) was developed to decrease the workload of the intermediate data processing mentioned above. The functions of the package include (1) an ROI cropping module, assisting in reconstruction quality assessment and cropping ROIs from the whole field, and (2) an ROI reversing module, projecting ROIs onto the corresponding raw images. The results showed that both the cropping and reversing modules work as expected. Moreover, the effects of ROI height selection and of the reversed ROI positions within the raw images on the reverse calculation are discussed. This tool shows great potential for decreasing the workload of data annotation for machine learning applications.
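As a rough illustration of what the ROI reversing step involves, the sketch below projects the corners of a field ROI into the pixel coordinates of a single raw image with a generic pinhole camera model. This is not the EasyIDP API; the camera intrinsics, pose, and ROI polygon are made-up placeholders.

```python
# Generic illustration of "reversing" a field ROI onto a raw UAV image with a
# pinhole camera model. This is NOT the EasyIDP API; the camera parameters and
# the ROI polygon below are hypothetical placeholders.
import numpy as np

def project_to_image(points_world, R, C, K):
    """Project Nx3 world points (e.g. ROI corners at a chosen canopy height)
    into pixel coordinates of one raw image."""
    cam = R @ (points_world - C).T        # world frame -> camera frame
    uvw = K @ cam                         # camera frame -> image plane
    return (uvw[:2] / uvw[2]).T           # perspective division -> (u, v)

# Hypothetical ROI: corners of a 2 m x 2 m plot at a fixed canopy height (0.8 m).
roi_world = np.array([[0.0, 0.0, 0.8],
                      [2.0, 0.0, 0.8],
                      [2.0, 2.0, 0.8],
                      [0.0, 2.0, 0.8]])

K = np.array([[3000.0,    0.0, 2000.0],   # focal lengths and principal point (px)
              [   0.0, 3000.0, 1500.0],
              [   0.0,    0.0,    1.0]])
R = np.diag([1.0, -1.0, -1.0])            # nadir camera: optical axis points down
C = np.array([1.0, 1.0, 30.0])            # camera 30 m above the plot centre

pixels = project_to_image(roi_world, R, C, K)
print(np.round(pixels, 1))                # ROI corners in raw-image pixel coords
```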


Author(s):  
A. B. Utkin ◽  
A. Cartaxana ◽  
A. Figueiredo ◽  
J. Marques da Silva

Agriculture ◽  
2021 ◽  
Vol 11 (7) ◽  
pp. 626
Author(s):  
Tinashe Zenda ◽  
Songtao Liu ◽  
Anyi Dong ◽  
Huijun Duan

Sulphur plays crucial roles in plant growth and development, with its functions ranging from being a structural constituent of macro-biomolecules to modulating several physiological processes and tolerance to abiotic stresses. Although these numerous roles of sulphur are well acknowledged, agriculture has, until recently, paid scant regard to sulphur nutrition. Serious problems related to soil sulphur deficiencies have emerged, and the intensification of food, fiber, and animal production is escalating to feed the ever-increasing human population. In the wake of the huge demand for high-quality cereal and vegetable diets, sulphur can play a key role in augmenting the production, productivity, and quality of crops. Additionally, in light of the emerging problems of soil fertility exhaustion and climate change-exacerbated environmental stresses, sulphur assumes special importance in crop production, particularly in intensively cropped areas. Here, citing several relevant examples, we highlight, in addition to its plant biological and metabolic functions, how sulphur can significantly enhance crop productivity and quality, as well as acclimation to abiotic stresses. With this appraisal, we also aim to stimulate readers' interest in crop sulphur research by providing priorities for future work, including bettering our understanding of the molecular processes and dynamics of sulphur availability and utilization in plants, dissecting the role of soil rhizosphere microbes in plant sulphur transformations, enhancing plant phenotyping and diagnosis of nutrient deficiencies, and matching site-specific crop sulphur demands with fertilizer amendments in order to reduce nutrient use inefficiencies in both crop and livestock production systems. This will facilitate the proper utilization of sulphur in crop production and eventually enhance sustainable and environmentally friendly food production.


2021 ◽  
Vol 13 (7) ◽  
pp. 1380
Author(s):  
Sébastien Dandrifosse ◽  
Alexis Carlier ◽  
Benjamin Dumont ◽  
Benoît Mercatoris

Multimodal image fusion has the potential to enrich the information gathered by multi-sensor plant phenotyping platforms. Fusion of images from multiple sources is, however, hampered by the technical bottleneck of image registration. The aim of this paper is to provide a solution to the registration and fusion of multimodal wheat images in field conditions and at close range. Eight registration methods were tested on nadir wheat images acquired by a pair of red, green and blue (RGB) cameras, a thermal camera and a multispectral camera array. The most accurate method, relying on a local transformation, aligned the images with an average error of 2 mm but was not reliable for thermal images. More generally, the appropriate registration method and the preprocessing steps necessary before fusion (plant mask erosion, pixel intensity averaging) depend on the application. As a consequence, the main output of this study was the identification of four registration-fusion strategies: (i) the REAL-TIME strategy, based solely on the cameras' positions; (ii) the FAST strategy, suitable for all types of images tested; and (iii) and (iv) the ACCURATE and HIGHLY ACCURATE strategies, which handle local distortion but cannot deal with images of very different natures. These suggestions are, however, limited to the methods compared in this study. Further research should investigate how recent cutting-edge registration methods perform in the specific case of wheat canopies.
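For readers unfamiliar with image registration, the sketch below shows a generic feature-based alignment step with OpenCV (ORB features plus a RANSAC homography). It is only illustrative and is not one of the eight methods benchmarked in the paper; a single global homography would not capture the local canopy distortions that the ACCURATE strategies address, and the file names are hypothetical.

```python
# Generic feature-based registration sketch with OpenCV (ORB + homography).
# Illustration only: not one of the eight methods compared in the paper.
import cv2
import numpy as np

def register_to_reference(moving_gray, reference_gray):
    """Estimate a homography mapping `moving_gray` onto `reference_gray`."""
    orb = cv2.ORB_create(nfeatures=4000)
    kp1, des1 = orb.detectAndCompute(moving_gray, None)
    kp2, des2 = orb.detectAndCompute(reference_gray, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:500]

    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H

# Usage (hypothetical file names): warp a multispectral band onto the RGB frame.
rgb = cv2.imread("wheat_rgb.png", cv2.IMREAD_GRAYSCALE)
band = cv2.imread("wheat_band_800nm.png", cv2.IMREAD_GRAYSCALE)
H = register_to_reference(band, rgb)
aligned = cv2.warpPerspective(band, H, (rgb.shape[1], rgb.shape[0]))
```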


Sensors ◽  
2020 ◽  
Vol 20 (11) ◽  
pp. 3208 ◽  
Author(s):  
Liangju Wang ◽  
Yunhong Duan ◽  
Libo Zhang ◽  
Tanzeel U. Rehman ◽  
Dongdong Ma ◽  
...  

The normalized difference vegetation index (NDVI) is widely used in remote sensing to monitor plant growth and chlorophyll levels. Usually, a multispectral camera (MSC) or hyperspectral camera (HSC) is required to obtain the near-infrared (NIR) and red bands for calculating NDVI. However, these cameras are expensive, heavy, difficult to geo-reference, and require professional training in imaging and data processing. On the other hand, RGBN cameras (NIR-sensitive RGB cameras, simply modified from standard RGB cameras by removing the NIR rejection filter) have also been explored to measure NDVI, but the results did not exactly match the NDVI from the MSC or HSC solutions. This study demonstrates an improved NDVI estimation method using an RGBN camera-based imaging system (Ncam) and machine learning algorithms. The Ncam consisted of an RGBN camera, a filter, and a microcontroller, with a total cost of only $70 to $85. This new NDVI estimation solution was compared with a high-end hyperspectral camera in an experiment with corn plants under different nitrogen and water treatments. The results showed that the Ncam with a two-band-pass filter achieved high performance (R2 = 0.96, RMSE = 0.0079) at estimating NDVI with the machine learning model. Additional tests showed that, besides NDVI, this low-cost Ncam was also capable of precisely predicting corn plant nitrogen content. Thus, the Ncam is a potential alternative to MSCs and HSCs in plant phenotyping projects.
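The NDVI referred to above is defined per pixel as NDVI = (NIR - Red) / (NIR + Red). The sketch below computes it from two band images and fits a simple linear calibration against a reference instrument; the arrays and the regression are placeholders standing in for the Ncam bands and the paper's machine learning model, which are not reproduced here.

```python
# Minimal sketch of the NDVI definition, NDVI = (NIR - Red) / (NIR + Red),
# applied per pixel, plus a generic linear calibration against a reference
# sensor. All values are synthetic placeholders, not the Ncam data or model.
import numpy as np
from sklearn.linear_model import LinearRegression

def ndvi(nir, red, eps=1e-9):
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)   # eps avoids division by zero

# Placeholder 2x2 band images (reflectance-like values).
nir_band = np.array([[0.60, 0.55], [0.48, 0.62]])
red_band = np.array([[0.08, 0.10], [0.15, 0.07]])
print(ndvi(nir_band, red_band))

# Calibrating camera-derived NDVI against a reference instrument (synthetic data).
rng = np.random.default_rng(1)
cam_ndvi = rng.uniform(0.2, 0.9, (50, 1))
ref_ndvi = 0.95 * cam_ndvi[:, 0] + 0.02 + rng.normal(0, 0.01, 50)
model = LinearRegression().fit(cam_ndvi, ref_ndvi)
print("R^2 vs. reference NDVI:", model.score(cam_ndvi, ref_ndvi))
```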

