Image based species identification of Globodera quarantine nematodes using computer vision and deep learning

Author(s):  
Romain Thevenoux ◽  
Van Linh LE ◽  
Heloïse Villessèche ◽  
Alain Buisson ◽  
Marie Beurton-Aimar ◽  
...  
IAWA Journal ◽  
2020 ◽  
Vol 41 (4) ◽  
pp. 660-680 ◽  
Author(s):  
Frederic Lens ◽  
Chao Liang ◽  
Yuanhao Guo ◽  
Xiaoqin Tang ◽  
Mehrdad Jahanbanifard ◽  
...  

Abstract Wood anatomy is one of the most important methods for timber identification. However, training wood anatomy experts is time-consuming, while at the same time the number of senior wood anatomists with broad taxonomic expertise is declining. Therefore, we want to explore how a more automated, computer-assisted approach can support accurate wood identification based on microscopic wood anatomy. For our exploratory research, we used an available image dataset that has been applied in several computer vision studies, consisting of 112 (mainly neotropical) tree species, with 20 images of transverse sections per species. Our study aims to review existing computer vision methods and compare the success of species identification using (1) several image classifiers based on manually adjusted texture features and (2) a state-of-the-art deep learning approach for image classification, specifically Convolutional Neural Networks (CNNs). In line with previous studies, deep learning yields a considerable increase in correct identifications, reaching an accuracy of up to 95.6%. This remarkably high success rate highlights the fundamental potential of wood anatomy in species identification and motivates us to expand the existing database to an extensive, worldwide reference database with transverse and tangential microscopic images from the most traded timber species and their look-alikes. This global reference database could serve as a valuable future tool for stakeholders involved in combating illegal logging and would boost the societal value of wood anatomy along with its collections and experts.
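The abstract does not specify which CNN architecture or training setup produced the 95.6% accuracy. As a hedged illustration only, the sketch below shows how a transfer-learning classifier could be set up for a small microscopic-image dataset of this shape (112 classes, 20 images each) using PyTorch; the backbone, augmentation, folder layout and hyperparameters are assumptions, not the authors' configuration.

# Hypothetical sketch: transfer-learning CNN for wood cross-section images.
# Architecture and hyperparameters are assumptions, not the setup reported
# in the abstract (which only states an accuracy of up to 95.6%).
import torch
import torch.nn as nn
from torchvision import models, transforms, datasets
from torch.utils.data import DataLoader

NUM_SPECIES = 112  # dataset described in the abstract: 112 species, 20 images each

train_tf = transforms.Compose([
    transforms.RandomResizedCrop(224),      # random crops act as cheap augmentation
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

# Assumes images are stored as one folder per species (ImageFolder convention).
train_ds = datasets.ImageFolder("wood_images/train", transform=train_tf)
train_dl = DataLoader(train_ds, batch_size=16, shuffle=True)

# Start from an ImageNet-pretrained backbone and replace the classification head.
model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, NUM_SPECIES)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(10):
    for images, labels in train_dl:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

With only 20 images per species, heavy augmentation and a frozen or lightly fine-tuned pretrained backbone are the usual ways to avoid overfitting; this is a design choice of the sketch, not a claim about the published method.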


2021 ◽  
Vol 109 (5) ◽  
pp. 863-890
Author(s):  
Yannis Panagakis ◽  
Jean Kossaifi ◽  
Grigorios G. Chrysos ◽  
James Oldfield ◽  
Mihalis A. Nicolaou ◽  
...  

Plant Methods ◽  
2021 ◽  
Vol 17 (1) ◽  
Author(s):  
Shuo Zhou ◽  
Xiujuan Chai ◽  
Zixuan Yang ◽  
Hongwu Wang ◽  
Chenxue Yang ◽  
...  

Abstract Background Maize (Zea mays L.) is one of the most important food sources in the world and has been one of the main targets of plant genetics and phenotypic research for centuries. Observation and analysis of various morphological phenotypic traits during maize growth are essential for genetic and breeding studies. The typically large number of samples produces an enormous amount of high-resolution image data. As high-throughput plant phenotyping platforms are increasingly used in maize breeding trials, there is a clear need for software tools that can automatically identify visual phenotypic features of maize plants and run batch processing on image datasets. Results On the boundary between computer vision and plant science, we utilize advanced deep learning methods based on convolutional neural networks to support the maize phenotyping workflow. This paper presents Maize-IAS (Maize Image Analysis Software), an integrated application supporting one-click analysis of maize phenotypes with multiple functions: (I) Projection, (II) Color Analysis, (III) Internode Length, (IV) Height, (V) Stem Diameter and (VI) Leaves Counting. Taking RGB images of maize as input, the software provides a user-friendly graphical interface and rapid calculation of multiple important phenotypic characteristics, including leaf sheath point detection and leaf segmentation. For the Leaves Counting function, the mean and standard deviation of the difference between prediction and ground truth are 1.60 and 1.625, respectively. Conclusion Maize-IAS is easy to use and requires no expert knowledge of computer vision or deep learning. All functions support batch processing, enabling automated, labor-saving recording, measurement and quantitative analysis of maize growth traits on large datasets. We demonstrate the efficiency and potential of our techniques and software for image-based plant research, and show the feasibility of applying AI technology in agriculture and plant science.
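For reference on how the Leaves Counting error figures could be computed, the following minimal sketch (assumed, not taken from Maize-IAS) derives the mean and standard deviation of the per-image difference between predicted and ground-truth leaf counts; the abstract does not say whether the difference is signed or absolute, so the signed version is shown.

# Minimal sketch (not Maize-IAS code): error statistics of the kind reported
# for the Leaves Counting function, i.e. mean and standard deviation of the
# difference between predicted and ground-truth leaf counts per image.
import numpy as np

def count_error_stats(predicted_counts, true_counts):
    """Return (mean, std) of the per-image count difference."""
    diff = np.asarray(predicted_counts, dtype=float) - np.asarray(true_counts, dtype=float)
    return diff.mean(), diff.std()

# Example with made-up counts for five plants:
pred = [12, 10, 11, 13, 9]
truth = [11, 10, 12, 11, 9]
mean_err, std_err = count_error_stats(pred, truth)
print(f"mean difference = {mean_err:.2f}, std = {std_err:.3f}")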


Sensors ◽  
2021 ◽  
Vol 21 (2) ◽  
pp. 343
Author(s):  
Kim Bjerge ◽  
Jakob Bonde Nielsen ◽  
Martin Videbæk Sepstrup ◽  
Flemming Helsing-Nielsen ◽  
Toke Thomas Høye

Insect monitoring methods are typically very time-consuming and involve substantial investment in species identification following manual trapping in the field. Insect traps are often only serviced weekly, resulting in low temporal resolution of the monitoring data, which hampers the ecological interpretation. This paper presents a portable computer vision system capable of attracting and detecting live insects. More specifically, the paper proposes detection and classification of species by recording images of live individuals attracted to a light trap. An Automated Moth Trap (AMT) with multiple light sources and a camera was designed to attract and monitor live insects during twilight and night hours. A computer vision algorithm referred to as Moth Classification and Counting (MCC), based on deep learning analysis of the captured images, tracked and counted the number of insects and identified moth species. Observations over 48 nights resulted in the capture of more than 250,000 images, with an average of 5675 images per night. A customized convolutional neural network was trained on 2000 labeled images of live moths representing eight different classes, achieving a high validation F1-score of 0.93. The algorithm achieved an average classification and tracking F1-score of 0.71 and a tracking detection rate of 0.79. Overall, the proposed computer vision system and algorithm showed promising results as a low-cost solution for non-destructive and automatic monitoring of moths.
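The abstract summarizes but does not detail the tracking step of MCC. As a hedged illustration of how insects detected in consecutive frames might be linked and counted, the sketch below uses greedy matching of detection centroids by Euclidean distance; this is a generic tracking-by-detection scheme chosen for illustration, not the published MCC algorithm, and the distance threshold is an assumption.

# Hedged sketch: greedy centroid matching of insect detections between
# consecutive frames. Generic tracking-by-detection, not the published MCC
# algorithm; new track IDs correspond to newly counted individuals.
import math
from itertools import count

_track_ids = count(1)

def centroid(box):
    """box = (x_min, y_min, x_max, y_max) from an object detector."""
    x0, y0, x1, y1 = box
    return ((x0 + x1) / 2.0, (y0 + y1) / 2.0)

def match_detections(prev_tracks, detections, max_dist=50.0):
    """prev_tracks: dict {track_id: box}; detections: list of boxes.
    Returns the updated dict; unmatched detections start new tracks."""
    tracks = {}
    unmatched = list(detections)
    for tid, old_box in prev_tracks.items():
        if not unmatched:
            break
        ox, oy = centroid(old_box)
        # pick the nearest new detection for this existing track
        best = min(unmatched, key=lambda b: math.dist((ox, oy), centroid(b)))
        if math.dist((ox, oy), centroid(best)) <= max_dist:
            tracks[tid] = best
            unmatched.remove(best)
    for box in unmatched:  # detections with no match are treated as new individuals
        tracks[next(_track_ids)] = box
    return tracks

Counting the distinct track IDs issued over a night would then give the number of individuals, while per-frame CNN classification of each tracked crop would assign the species label.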


2021 ◽  
Vol 172 ◽  
pp. 105685
Author(s):  
Dillam Díaz-Romero ◽  
Wouter Sterkens ◽  
Simon Van den Eynde ◽  
Toon Goedemé ◽  
Wim Dewulf ◽  
...  

Author(s):  
Phakawat Pattarapongsin ◽  
Bipul Neupane ◽  
Jirayus Vorawan ◽  
Harit Sutthikulsombat ◽  
Teerayut Horanont
