On the Classification of a Greenhouse Environment for a Rose Crop Based on AI-Based Surrogate Models

2021 ◽  
Vol 13 (21) ◽  
pp. 12166
Author(s):  
Showkat Ahmad Bhat ◽  
Nen-Fu Huang ◽  
Imtiyaz Hussain ◽  
Farzana Bibi ◽  
Uzair Sajjad ◽  
...  

Precise microclimate control under the dynamic climate changes of a greenhouse requires industry and researchers to develop a simple, robust, reliable, and intelligent model. Accordingly, the objective of this investigation was to develop a method that can accurately define the most suitable environment in the greenhouse for an optimal yield of roses. Herein, an optimal and highly accurate Bayesian optimization–deep neural network (BO-DNN) surrogate model was developed (based on 300 experimental data points) for quick and reliable classification of the rose yield environment, incorporating some of the most influential variables, namely soil humidity, air temperature and humidity, CO2 concentration, and light intensity (lux), into its architecture. Initially, two BO techniques (Gaussian process, GP, and gradient-boosted regression trees, GBRT) were used to tune the hyper-parameters (learning rate, batch size, number of dense layers, number of dense neurons, number of input nodes, activation function, etc.). After that, an optimal and simple combination of the hyper-parameters was selected to develop a DNN algorithm based on the 300 data points, which was then used to classify the rose yield environment into four classes: soil without water, correct environment, too hot, and very cold. The very high accuracy of the proposed surrogate model (0.98) stems from the introduction of the most vital soil and meteorological parameters as model inputs. The proposed method can help in identifying intelligent greenhouse environments for efficient crop yields.
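
As a rough illustration of this tune-then-train workflow, the sketch below uses Gaussian-process Bayesian optimization (via scikit-optimize) to select hyper-parameters for a small dense classifier over the five environmental inputs. The placeholder data, search ranges, and the use of scikit-learn's MLPClassifier are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): GP-based Bayesian optimization of a
# small dense classifier's hyper-parameters, in the spirit of the BO-DNN
# surrogate described above. Data, ranges, and class labels are placeholders.
import numpy as np
from skopt import gp_minimize
from skopt.space import Real, Integer, Categorical
from skopt.utils import use_named_args
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# X columns: soil humidity, air temperature, air humidity, CO2, light intensity (lux)
# y classes: 0 = soil without water, 1 = correct environment, 2 = too hot, 3 = very cold
X = np.random.rand(300, 5)            # placeholder for the 300 experimental points
y = np.random.randint(0, 4, 300)      # placeholder labels
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.2, random_state=0)

space = [
    Real(1e-4, 1e-1, prior="log-uniform", name="learning_rate_init"),
    Integer(8, 64, name="batch_size"),
    Integer(1, 4, name="n_layers"),
    Integer(8, 128, name="n_neurons"),
    Categorical(["relu", "tanh"], name="activation"),
]

@use_named_args(space)
def objective(learning_rate_init, batch_size, n_layers, n_neurons, activation):
    clf = MLPClassifier(hidden_layer_sizes=(n_neurons,) * n_layers,
                        activation=activation,
                        learning_rate_init=learning_rate_init,
                        batch_size=batch_size,
                        max_iter=500, random_state=0)
    clf.fit(X_tr, y_tr)
    return 1.0 - clf.score(X_va, y_va)     # minimize validation error

result = gp_minimize(objective, space, n_calls=30, random_state=0)
print("best hyper-parameters:", result.x, "validation accuracy:", 1 - result.fun)
```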

2021 ◽  
Vol 34 (1) ◽  
Author(s):  
Zhe Yang ◽  
Dejan Gjorgjevikj ◽  
Jianyu Long ◽  
Yanyang Zi ◽  
Shaohui Zhang ◽  
...  

Abstract Supervised fault diagnosis typically assumes that all the types of machinery failures are known. However, in practice, unknown types of defects, i.e., novelties, may occur, and their detection is a challenging task. In this paper, a novel fault diagnostic method is developed for both diagnostics and detection of novelties. To this end, a sparse autoencoder-based multi-head Deep Neural Network (DNN) is presented to jointly learn a shared encoding representation for both unsupervised reconstruction and supervised classification of the monitoring data. The detection of novelties is based on the reconstruction error. Moreover, the computational burden is reduced by directly training the multi-head DNN with the rectified linear unit activation function, instead of performing the pre-training and fine-tuning phases required for classical DNNs. The proposed method is applied to a benchmark bearing case study and to experimental data acquired from a delta 3D printer. The results show that its performance is satisfactory in both novelty detection and fault diagnosis, outperforming other state-of-the-art methods. This research thus proposes a fault diagnostic method that can not only diagnose known types of defects but also detect unknown ones.
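
The sketch below shows the general shape of such a shared-encoder, multi-head network in PyTorch, with a reconstruction head, a classification head, and a reconstruction-error test for novelties. The layer sizes, the L1 sparsity proxy, and the thresholding rule are illustrative assumptions rather than the authors' exact design.

```python
# Minimal sketch (not the authors' code): a multi-head network with a shared
# encoder, a decoder head for unsupervised reconstruction, and a softmax head
# for supervised classification; novelties are flagged by reconstruction error.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiHeadDNN(nn.Module):
    def __init__(self, n_features, n_known_classes, latent_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(               # shared encoding representation
            nn.Linear(n_features, 128), nn.ReLU(),
            nn.Linear(128, latent_dim), nn.ReLU(),
        )
        self.decoder = nn.Sequential(               # unsupervised reconstruction head
            nn.Linear(latent_dim, 128), nn.ReLU(),
            nn.Linear(128, n_features),
        )
        self.classifier = nn.Linear(latent_dim, n_known_classes)  # supervised head

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), self.classifier(z), z

def joint_loss(x, x_hat, logits, y, z, alpha=1.0, beta=1e-3):
    recon = F.mse_loss(x_hat, x)                    # reconstruction term
    clf = F.cross_entropy(logits, y)                # classification term
    sparsity = z.abs().mean()                       # simple L1 proxy for sparsity
    return recon + alpha * clf + beta * sparsity

def is_novelty(model, x, threshold):
    # Flag samples whose reconstruction error exceeds a threshold chosen from
    # errors observed on known (training) conditions, e.g. a high percentile.
    with torch.no_grad():
        x_hat, _, _ = model(x)
        err = ((x_hat - x) ** 2).mean(dim=1)
    return err > threshold
```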


2018 ◽  
Vol 2018 ◽  
pp. 1-13 ◽  
Author(s):  
Chao-qun Zhao ◽  
Long Chen ◽  
Hong Cai ◽  
Wei-li Yao ◽  
Qun Zhou ◽  
...  

Objective. This study aimed to analyze the differential metabolites and their metabolic pathways in the serum of patients with hepatitis B cirrhosis presenting two typical patterns, Gan Dan Shi Re (GDSR) and Gan Shen Yin Xu (GSYX), based on the theory of traditional Chinese medicine (TCM). It also investigated the variation in the internal material basis of the two patterns and provided an objective basis for classifying TCM patterns using metabolomic techniques. Methods. Serum samples taken from 111 qualified patients (40 GDSR cases, 41 GSYX cases, and 30 Latent Pattern (LP) cases with no obvious pattern characteristics) and 60 healthy volunteers were tested to identify the differential substances relevant to hepatitis B cirrhosis and the two typical TCM patterns on a gas chromatography–time-of-flight mass spectrometry platform. The relevant metabolic pathways of the differential substances were analyzed using multidimensional statistical analysis. Results. After excluding the influence of the LP group, six common substances were found in the GDSR and GSYX patterns, mainly involved in the metabolic pathways of glycine, serine, threonine, and phenylalanine. Eight pattern-specific metabolites, involved in the metabolic pathways of linoleic acid, glycine, threonine, and serine, were also present in the two patterns. Conclusions. Metabolomic analysis showed that the differential substances between the two typical TCM patterns of patients with hepatitis B cirrhosis were well separated on the metabolic spectrum. The differential expression of these substances between the GDSR and GSYX patterns provides an important objective basis for the scientific nature of TCM pattern classification at the metabolic level.


2004 ◽  
Vol 26 (3) ◽  
pp. 125-134
Author(s):  
Armin Gerger ◽  
Patrick Bergthaler ◽  
Josef Smolle

Aims. In tissue counter analysis (TCA), digital images of complex histologic sections are dissected into elements of equal size and shape, and digital information comprising grey-level, colour, and texture features is calculated for each element. In this study we assessed the feasibility of TCA for the quantitative description of both the amount and the distribution of immunostained material. Methods. In a first step, our system was trained to differentiate between background and tissue on the one hand, and between immunopositive and so-called other tissue on the other. In a second step, immunostained slides were automatically screened and the procedure was tested for the quantitative description of the amount of cytokeratin (CK)- and leukocyte common antigen (LCA)-immunopositive structures. Additionally, fractal analysis was applied to all cases to describe the architectural distribution of the immunostained material. Results. The procedure yielded reproducible assessments of the relative amounts of immunopositive tissue components when the number and percentage of CK- and LCA-stained structures were assessed. Furthermore, a reliable classification of immunopositive patterns was achieved by means of fractal dimensionality. Conclusions. Tissue counter analysis combined with classification trees and fractal analysis is a fully automated and reproducible approach for quantitative description in immunohistology.
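
For the fractal-analysis step, a common way to quantify the architectural distribution of positive elements is a box-counting estimate of the fractal dimension. The sketch below is a minimal illustration of that idea on a binary mask of immunopositive tiles; it assumes the TCA element classification has already been performed upstream and is not the authors' implementation.

```python
# Minimal sketch (not the authors' code): box-counting estimate of the fractal
# dimension of a binary mask marking immunopositive image elements. Tile
# classification (background / other tissue / immunopositive) is assumed done.
import numpy as np

def box_counting_dimension(mask, box_sizes=(1, 2, 4, 8, 16, 32)):
    counts = []
    for s in box_sizes:
        h, w = mask.shape
        # count boxes of side s that contain at least one positive element
        trimmed = mask[:h - h % s, :w - w % s]
        boxes = trimmed.reshape(trimmed.shape[0] // s, s, trimmed.shape[1] // s, s)
        occupied = boxes.any(axis=(1, 3)).sum()
        counts.append(max(occupied, 1))
    # slope of log(count) vs log(1/size) approximates the fractal dimension
    coeffs = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
    return coeffs[0]

# Example: a synthetic 256x256 mask where ~5% of tiles are immunopositive
mask = np.random.rand(256, 256) < 0.05
print("estimated fractal dimension:", round(box_counting_dimension(mask), 2))
```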


2020 ◽  
Author(s):  
Zhe Yang ◽  
Dejan Gjorgjevikj ◽  
Jian-Yu Long ◽  
Yan-Yang Zi ◽  
Shao-Hui Zhang ◽  
...  

Abstract Novelty detection is a challenging task in machinery fault diagnosis. A novel fault diagnostic method is developed that deals not only with diagnosing known types of defects, but also with detecting novelties, i.e., the occurrence of new types of defects that have never been recorded. To this end, a sparse autoencoder-based multi-head Deep Neural Network (DNN) is presented to jointly learn a shared encoding representation for both unsupervised reconstruction and supervised classification of the monitoring data. The detection of novelties is based on the reconstruction error. Moreover, the computational burden is reduced by directly training the multi-head DNN with the rectified linear unit activation function, instead of performing the pre-training and fine-tuning phases required for classical DNNs. The proposed method is applied to a benchmark bearing case study and to experimental data acquired from a delta 3D printer. The results show that it is able to accurately diagnose known types of defects, as well as to detect unknown defects, outperforming other state-of-the-art methods.


Foods ◽  
2020 ◽  
Vol 9 (3) ◽  
pp. 355 ◽  
Author(s):  
Sara Barbieri ◽  
Karolina Brkić Bubola ◽  
Alessandra Bendini ◽  
Milena Bučar-Miklavčič ◽  
Florence Lacoste ◽  
...  

A set of 334 commercial virgin olive oil (VOO) samples was evaluated by six sensory panels during the H2020 OLEUM project. Sensory data were elaborated with two main objectives: (i) to classify and characterize the samples for possible correlations with physical–chemical data and (ii) to monitor and improve the performance of the panels. Following the revision of the IOC guidelines in 2018, this work represents the first published attempt to verify some of the recommended quality control tools intended to increase harmonization among panels. Specifically, a new “decision tree” scheme was developed, and some IOC quality control procedures were applied. The adoption of these tools allowed reliable classification of 289 of the 334 VOOs; for the remaining 45, misalignments between panels of the first type (on the category, 21 cases) or the second type (on the main perceived defect, 24 cases) occurred. In these cases, a “formative reassessment” was necessary. In the end, 329 of the 334 VOOs (98.5%) were classified, confirming the effectiveness of this approach in achieving better proficiency. The panels showed good performance, but the need also emerged to adopt new, stable, and reproducible reference materials to further improve the panels’ skills and agreement.


Insects ◽  
2020 ◽  
Vol 11 (8) ◽  
pp. 458
Author(s):  
Sijing Ye ◽  
Shuhan Lu ◽  
Xuesong Bai ◽  
Jinfeng Gu

Locusts are agricultural pests found in many parts of the world. Developing efficient and accurate locust information acquisition techniques helps in understanding the relation between locust distribution density and structural changes in locust communities, as well as the hydrothermal and vegetation growth conditions that affect locusts in their habitats in various parts of the world, and in providing rapid and accurate warnings of locust plague outbreaks. This study is a preliminary attempt to explore whether a batch normalization-based convolutional neural network (CNN) model can be used to automatically classify the East Asian migratory locust (AM locust), Oxya chinensis (rice locusts), and cotton locusts. In this paper, we present a way of applying the CNN technique to identify species and instars of locusts using the proposed ResNet-Locust-BN model. This model is based on the ResNet architecture and introduces a BatchNorm function before each convolution layer to improve the network's stability, convergence speed, and classification accuracy. Subsequently, locust image data collected in the field were used as input to train the model. Through comparison experiments on the activation function, initial learning rate, and batch size, we selected ReLU as the preferred activation function and set the initial learning rate and batch size to 0.1 and 32, respectively. Experiments performed to evaluate the accuracy of the proposed ResNet-Locust-BN model show that it can effectively distinguish AM locust from rice locusts (93.60% accuracy) and cotton locusts (97.80% accuracy). The model also performed well in identifying the growth status of AM locusts (third instar: 77.20% accuracy; fifth instar: 88.40% accuracy; adult: 93.80% accuracy), with an overall accuracy of 90.16%. This is higher than the accuracy obtained with other typical models: AlexNet (73.68%), GoogLeNet (69.12%), ResNet-18 (67.60%), ResNet-50 (80.84%), and VggNet (81.70%). Further, the model has good robustness and a fast convergence rate.
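
The sketch below is a toy illustration of the "BatchNorm before each convolution" idea described for ResNet-Locust-BN, written as a pre-activation-style residual block in PyTorch. The channel counts, block structure, and overall depth are assumptions, not the published architecture.

```python
# Minimal sketch (not the published ResNet-Locust-BN): a residual block that
# places BatchNorm before each convolution, as described above, with ReLU
# activations. Channel counts and network depth are illustrative assumptions.
import torch
import torch.nn as nn

class BNFirstResidualBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.BatchNorm2d(channels), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(channels), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False),
        )

    def forward(self, x):
        return x + self.body(x)     # identity shortcut

# Training settings reported in the abstract: ReLU activation, initial learning
# rate 0.1, batch size 32 (the optimizer choice here is an assumption).
block = BNFirstResidualBlock(64)
x = torch.randn(32, 64, 56, 56)
print(block(x).shape)               # torch.Size([32, 64, 56, 56])
```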


2020 ◽  
Vol 497 (4) ◽  
pp. 4843-4856 ◽  
Author(s):  
James S Kuszlewicz ◽  
Saskia Hekker ◽  
Keaton J Bell

ABSTRACT Long, high-quality time-series data provided by previous space missions such as CoRoT and Kepler have made it possible to derive the evolutionary state of red giant stars, i.e. whether the stars are hydrogen-shell burning around an inert helium core or helium-core burning, from their individual oscillation modes. We utilize data from the Kepler mission to develop a tool to classify the evolutionary state for the large number of stars being observed in the current era of K2 and TESS, and for the future PLATO mission. These missions pose new challenges for evolutionary state classification given the large number of stars being observed and the shorter observing duration of the data. We propose a new method, Clumpiness, based upon a supervised classification scheme that uses 'summary statistics' of the time series, combined with distance information from the Gaia mission, to predict the evolutionary state. Applying this to red giants in the APOKASC catalogue, we obtain a classification accuracy of ~91 per cent for the full 4 yr of Kepler data, for those stars that are either only hydrogen-shell burning or also helium-core burning. We also applied the method to shorter Kepler data sets mimicking CoRoT, K2, and TESS, achieving an accuracy of >91 per cent even for the 27 d time series. This work paves the way towards fast, reliable classification of vast amounts of relatively short-time-span data with a few, well-engineered features.
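
As a rough sketch of the summary-statistics-plus-classifier idea behind such a method, the snippet below computes a few simple statistics of a light curve, appends a Gaia-based distance, and trains a supervised classifier. The specific statistics, the placeholder data, and the random-forest choice are illustrative assumptions, not the published, well-engineered feature set.

```python
# Rough sketch (not the Clumpiness code): summary statistics of a light curve
# plus a Gaia distance, fed to a supervised classifier of evolutionary state.
import numpy as np
from scipy.stats import kurtosis, skew
from sklearn.ensemble import RandomForestClassifier

def summary_features(flux, distance_pc):
    flux = (flux - np.nanmean(flux)) / np.nanstd(flux)   # normalize the light curve
    diffs = np.diff(flux)
    return np.array([
        np.nanvar(flux),                     # overall variability
        skew(flux, nan_policy="omit"),
        kurtosis(flux, nan_policy="omit"),
        np.nanvar(diffs),                    # point-to-point scatter
        np.log10(distance_pc),               # distance information from Gaia
    ])

# X_train rows are feature vectors for stars of known evolutionary state;
# y_train is 0 for hydrogen-shell burning, 1 for helium-core burning (placeholders).
rng = np.random.default_rng(0)
X_train = rng.normal(size=(100, 5))
y_train = rng.integers(0, 2, size=100)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print(clf.predict(summary_features(rng.normal(size=1000), 500.0).reshape(1, -1)))
```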


Zootaxa ◽  
2010 ◽  
Vol 2400 (1) ◽  
pp. 66 ◽  
Author(s):  
D. J. WILLIAMS ◽  
P. J. GULLAN

Since Cockerell (1905) erected the family-group name Pseudococcini, the name has become widely used for all mealybugs. Lobdell (1930) raised the status of the group to family level as the Pseudococcidae, but it was not until Borchsenius (1949) and Ferris (1950) accepted this rank that the Pseudococcidae became more widely recognized as a family within the superfamily Coccoidea. Various tribes and subtribes have since been introduced, but without any reliable classification of the family.

