2019, Vol 2019, pp. 1-10
Author(s): Jintao Wang, Mingxia Shen, Longshen Liu, Yi Xu, Cedric Okinda

Digestive diseases are among the most common broiler diseases and significantly affect production and animal welfare in broiler breeding. Examination and observation of droppings are the most precise techniques for detecting digestive disease infections in birds. This study proposes an automated broiler digestive disease detector based on a deep Convolutional Neural Network model that classifies fine-grained broiler droppings images as normal or abnormal (by shape, color, water content, and combined shape and water content). Droppings images were collected from 10,000 Ross broiler birds aged 25-35 days, reared in multilayer cages with automatic droppings conveyor belts. For comparative purposes, Faster R-CNN and YOLO-V3 deep Convolutional Neural Networks were developed, and the performance of YOLO-V3 was improved by optimizing its anchor boxes. On the testing data set, Faster R-CNN achieved 99.1% recall and 93.3% mean average precision, while YOLO-V3 achieved 88.7% recall and 84.3% mean average precision. The proposed detector can provide technical support for detecting digestive diseases in broiler production by automatically and nonintrusively recognizing and classifying chicken droppings.
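The abstract does not describe how the YOLO-V3 anchor boxes were optimized; a common approach is to cluster the ground-truth box sizes with k-means under a 1 - IoU distance, as popularized by the YOLO authors. The following is a minimal pure-Python sketch of that idea on synthetic box sizes (all data and values here are illustrative, not from the study):

```python
import random

def iou_wh(box, anchor):
    # IoU of two boxes aligned at a common corner (width/height only).
    inter = min(box[0], anchor[0]) * min(box[1], anchor[1])
    union = box[0] * box[1] + anchor[0] * anchor[1] - inter
    return inter / union

def kmeans_anchors(boxes, k, iters=100, seed=0):
    """Cluster (w, h) pairs with k-means under the 1 - IoU distance."""
    random.seed(seed)
    anchors = random.sample(boxes, k)
    for _ in range(iters):
        # Assign each box to the anchor with the highest IoU.
        clusters = [[] for _ in range(k)]
        for b in boxes:
            best = max(range(k), key=lambda i: iou_wh(b, anchors[i]))
            clusters[best].append(b)
        # Move each anchor to the median width/height of its cluster.
        new_anchors = []
        for i, c in enumerate(clusters):
            if not c:
                new_anchors.append(anchors[i])
                continue
            ws = sorted(b[0] for b in c)
            hs = sorted(b[1] for b in c)
            new_anchors.append((ws[len(ws) // 2], hs[len(hs) // 2]))
        if new_anchors == anchors:
            break
        anchors = new_anchors
    return sorted(anchors, key=lambda a: a[0] * a[1])

# Synthetic ground-truth box sizes in pixels: two size groups of objects.
boxes = [(20 + i % 5, 22 + i % 4) for i in range(30)] + \
        [(60 + i % 6, 55 + i % 5) for i in range(30)]
anchors = kmeans_anchors(boxes, k=2)
print(anchors)  # one small and one large anchor
```

Using IoU rather than Euclidean distance keeps large boxes from dominating the clustering, so the resulting anchors match the dataset's box-size distribution.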


2021, pp. 1-10
Author(s): Wei Liu, Wenlong Feng, Mengxing Huang, Huirui Han, Guilai Han, ...

Hainan Island has generally high biological diversity with a wide variety of plant species, some of which are endemic to the island. Determining the name of a species from observations is time-consuming and difficult, even for expert botanists. Automated plant identification enables experts to process significantly greater numbers of plants more efficiently and in less time. However, plant recognition is a fine-grained visual recognition problem, which is considerably harder than conventional image recognition. In this paper, we employ a Deep Convolutional Neural Network (DCNN) trained on the ImageNet database, which contains millions of images, and transfer the learned representation to automated plant identification based on flower and fruit images. First, we modify the last three layers of the pre-trained network to adapt the ResNet-50 model to our classification task, replacing the fully connected layer of the original pre-trained network with a new fully connected layer whose output size equals the number of plant classes. Second, we fine-tune the pre-trained DCNN in experiments on flower and fruit images. Finally, we evaluate the proposed network on two available botanical datasets, the Oxford flowers dataset with 102 classes and the HNPlant flowers and fruits dataset with 20 classes, and determine the optimal values of the associated hyperparameters to improve overall performance. Experimental results show that the highest classification accuracies achieved by the proposed model on the Oxford-102 and HNPlant-20 datasets are 92.4% and 95.0%, respectively, demonstrating its effectiveness.
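In practice the head replacement above is done with a framework layer (e.g., swapping ResNet-50's final fully connected layer in a deep learning library). The pure-Python sketch below only illustrates the core idea, that the new head maps the backbone's feature vector to one softmax probability per plant class; the feature dimension and class count follow ResNet-50 and the HNPlant-20 dataset, but the weights and features are random placeholders:

```python
import math
import random

def new_head(features, weights, biases):
    """A fully connected layer followed by softmax. `weights` has one row
    per plant class, so the output size equals the number of classes."""
    logits = [sum(w * x for w, x in zip(row, features)) + b
              for row, b in zip(weights, biases)]
    m = max(logits)                        # subtract max for numerical stability
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

random.seed(0)
feature_dim, num_classes = 2048, 20        # ResNet-50 feature size, HNPlant classes
features = [random.gauss(0, 1) for _ in range(feature_dim)]      # placeholder backbone output
weights = [[random.gauss(0, 0.01) for _ in range(feature_dim)]   # freshly initialized head
           for _ in range(num_classes)]
biases = [0.0] * num_classes

probs = new_head(features, weights, biases)
print(len(probs))  # 20, one probability per class
```

During fine-tuning, only this new layer starts from random initialization; the transferred ImageNet weights in the earlier layers give the network a strong starting point.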


2020, Vol 2020 (4), pp. 4-14
Author(s): Vladimir Budak, Ekaterina Ilyina

The article proposes a classification of lenses with different symmetrical beam angles and offers a scale as a spotlight palette. A collection of spotlight images was created and classified according to the proposed scale. An analysis of 788 existing lenses and reflectors with different LEDs and COBs was carried out, and the dependence of the axial light intensity on the beam angle was obtained. Transfer training of a new deep convolutional neural network (CNN) based on the pre-trained GoogLeNet was performed using this collection. Grad-CAM analysis showed that the trained network correctly identifies the features of the objects. This work allows arbitrary spotlights to be classified with an accuracy of about 80%. Thus, a lighting designer can use this new CNN-based model to determine the class of a spotlight and the corresponding type of lens with its technical parameters.
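The empirical intensity-versus-angle dependence found in the article is not reproduced here, but the qualitative trend can be sketched from the standard photometric relation for an idealized uniform conical beam, I = Φ/Ω with solid angle Ω = 2π(1 − cos(θ/2)). This is an assumption-laden simplification (real spotlights are not uniform within the beam), shown only to illustrate why narrower beam angles yield higher axial intensity:

```python
import math

def axial_intensity(flux_lm, beam_angle_deg):
    """Idealized axial luminous intensity (cd) of a uniform conical beam:
    I = flux / omega, where omega = 2*pi*(1 - cos(theta/2)) steradians."""
    half = math.radians(beam_angle_deg) / 2.0
    omega = 2.0 * math.pi * (1.0 - math.cos(half))
    return flux_lm / omega

flux = 1000.0  # lumens; illustrative COB output, not from the article
for angle in (10, 25, 60):
    print(angle, round(axial_intensity(flux, angle)))
```

The same flux concentrated into a 10° beam gives roughly 35 times the axial intensity of a 60° beam, which is why beam angle is a natural axis for a spotlight classification scale.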

