A Deep Learning-Based Vision System Combining Detection and Tracking for Fast On-Line Citrus Sorting

2021 ◽  
Vol 12 ◽  
Author(s):  
Yaohui Chen ◽  
Xiaosong An ◽  
Shumin Gao ◽  
Shanjun Li ◽  
Hanwen Kang

Defective citrus fruits are currently sorted manually, a time-consuming and costly process with unsatisfactory accuracy. In this paper, we introduce a deep learning-based vision system implemented on a citrus processing line for fast on-line sorting. For the citrus fruits rotating randomly on the conveyor, a convolutional neural network-based detector was developed to detect and tentatively classify the defective ones, and a SORT algorithm-based tracker was adopted to record the classification information along their paths. The true categories of the citrus fruits were identified from the tracked historical information, resulting in a high detection precision of 93.6%. Moreover, a linear Kalman filter model was applied to predict the future path of the fruits, which can be used to guide robot arms to pick out the defective ones. Ultimately, this research presents a practical solution for on-line citrus sorting featuring low cost, high efficiency, and high accuracy.
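The path prediction described above can be illustrated with a minimal constant-velocity Kalman filter. This is a hedged sketch, not the authors' implementation: the state layout, `dt`, and noise values are assumptions for illustration only.

```python
import numpy as np

# Minimal linear Kalman filter for a fruit moving along a conveyor.
# State: [position, velocity]. dt and noise magnitudes are assumed values.

def kalman_predict(state, cov, dt=0.04, process_noise=1e-3):
    """One prediction step: extrapolate position by the current velocity."""
    F = np.array([[1.0, dt],
                  [0.0, 1.0]])        # constant-velocity transition matrix
    Q = process_noise * np.eye(2)     # process noise covariance
    state = F @ state                 # predicted state
    cov = F @ cov @ F.T + Q           # uncertainty grows between measurements
    return state, cov

def kalman_update(state, cov, measured_pos, meas_noise=1e-2):
    """Correction step: blend the prediction with a measured position."""
    H = np.array([[1.0, 0.0]])        # we observe position only
    R = np.array([[meas_noise]])      # measurement noise covariance
    y = measured_pos - H @ state      # innovation (measurement residual)
    S = H @ cov @ H.T + R
    K = cov @ H.T @ np.linalg.inv(S)  # Kalman gain
    state = state + (K @ y).ravel()
    cov = (np.eye(2) - K @ H) @ cov
    return state, cov
```

Repeated `predict` steps without updates give the extrapolated future path used to time a downstream picking action.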

2020 ◽  
Vol 286 ◽  
pp. 110102 ◽  
Author(s):  
Shuxiang Fan ◽  
Jiangbo Li ◽  
Yunhe Zhang ◽  
Xi Tian ◽  
Qingyan Wang ◽  
...  

2021 ◽  
Vol 12 (1) ◽  
Author(s):  
Ling-Ping Cen ◽  
Jie Ji ◽  
Jian-Wei Lin ◽  
Si-Tong Ju ◽  
Hong-Jie Lin ◽  
...  

Abstract: Retinal fundus diseases can lead to irreversible visual impairment without timely diagnosis and appropriate treatment. Single-disease deep learning algorithms have been developed for the detection of diabetic retinopathy, age-related macular degeneration, and glaucoma. Here, we developed a deep learning platform (DLP) capable of detecting multiple common referable fundus diseases and conditions (39 classes) by using 249,620 fundus images marked with 275,543 labels from heterogeneous sources. Our DLP achieved a frequency-weighted average F1 score of 0.923, sensitivity of 0.978, specificity of 0.996 and area under the receiver operating characteristic curve (AUC) of 0.9984 for multi-label classification in the primary test dataset, reaching the average level of retina specialists. External multi-hospital tests, public data tests, and a tele-reading application also showed high performance in detecting multiple retinal diseases and conditions. These results indicate that our DLP can be applied to retinal fundus disease triage, especially in remote areas around the world.
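The frequency-weighted average F1 reported above weights each class's F1 score by how often that class's label occurs. A minimal sketch of this metric (the function name and inputs are illustrative, not from the paper):

```python
# Frequency-weighted average F1: each class's F1 is weighted by its
# label count, so common classes dominate the average.
def weighted_f1(per_class_f1, label_counts):
    total = sum(label_counts)
    return sum(f1 * n for f1, n in zip(per_class_f1, label_counts)) / total
```

For example, two classes with F1 scores 1.0 and 0.5 and label counts 3 and 1 give (3·1.0 + 1·0.5)/4 = 0.875, whereas the unweighted macro average would be 0.75.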


Sensors ◽  
2021 ◽  
Vol 21 (2) ◽  
pp. 343
Author(s):  
Kim Bjerge ◽  
Jakob Bonde Nielsen ◽  
Martin Videbæk Sepstrup ◽  
Flemming Helsing-Nielsen ◽  
Toke Thomas Høye

Insect monitoring methods are typically very time-consuming and involve substantial investment in species identification following manual trapping in the field. Insect traps are often only serviced weekly, resulting in low temporal resolution of the monitoring data, which hampers the ecological interpretation. This paper presents a portable computer vision system capable of attracting and detecting live insects. More specifically, the paper proposes detection and classification of species by recording images of live individuals attracted to a light trap. An Automated Moth Trap (AMT) with multiple light sources and a camera was designed to attract and monitor live insects during twilight and night hours. A computer vision algorithm referred to as Moth Classification and Counting (MCC), based on deep learning analysis of the captured images, tracked and counted the number of insects and identified moth species. Observations over 48 nights resulted in the capture of more than 250,000 images with an average of 5675 images per night. A customized convolutional neural network was trained on 2000 labeled images of live moths represented by eight different classes, achieving a high validation F1-score of 0.93. The algorithm measured an average classification and tracking F1-score of 0.71 and a tracking detection rate of 0.79. Overall, the proposed computer vision system and algorithm showed promising results as a low-cost solution for non-destructive and automatic monitoring of moths.
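The tracking-and-counting step of an MCC-style pipeline can be sketched with a naive nearest-centroid tracker: per-frame detections are linked into tracks so each insect is counted once rather than once per frame. This is a hypothetical simplification (function names, the distance gate, and the matching rule are assumptions), not the published algorithm.

```python
import math

# Link per-frame detections into tracks by nearest-centroid matching,
# counting a new individual only when no existing track is close enough.
def link_detections(frames, max_dist=50.0):
    """frames: list of per-frame lists of (x, y) detection centroids.
    Returns the number of distinct tracks (i.e., counted individuals)."""
    tracks = []  # last known centroid of each track
    count = 0
    for dets in frames:
        unmatched = list(range(len(tracks)))
        for (x, y) in dets:
            best, best_d = None, max_dist
            for i in unmatched:
                tx, ty = tracks[i]
                d = math.hypot(x - tx, y - ty)
                if d < best_d:
                    best, best_d = i, d
            if best is None:
                tracks.append((x, y))   # new individual enters the scene
                count += 1
            else:
                tracks[best] = (x, y)   # continue an existing track
                unmatched.remove(best)
    return count
```

A real system would also handle track termination and use appearance or IoU cues, but the counting principle is the same: identity across frames prevents double counting.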


2003 ◽  
Vol 47 (10) ◽  
pp. 175-181 ◽  
Author(s):  
G. Buitrón ◽  
M.-E. Schoeb ◽  
J. Moreno

The operation of a sequencing batch bioreactor is evaluated when high concentration peaks of a toxic compound (4-chlorophenol, 4CP) are introduced into the reactor. A control strategy based on the dissolved oxygen concentration, measured on line, is utilized. To detect the end of the reaction period, the automated system searches for the moment when the dissolved oxygen has passed through a minimum, caused by the metabolic activity of the microorganisms, and immediately afterwards through a maximum due to re-saturation of the water (similar to the self-cycling fermentation, SCF, strategy). The dissolved oxygen signal was sent to a personal computer for data acquisition and control using MATLAB and the SIMULINK package. The system operating under the automated strategy remained stable when microorganisms acclimated to an initial concentration of 350 mg 4CP/L were exposed to occasional concentration peaks of 600 mg 4CP/L. 4CP concentration peaks at or above 1,050 mg/L disturbed the system only over the short to medium term (one month). A 1,400 mg/L peak shut down the metabolic activity of the microorganisms and led to reactor failure. Biomass acclimated with the SCF strategy can therefore only partially withstand variations in the toxic influent: once the influent becomes inhibitory, the system fails.
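The end-of-reaction criterion above (a dissolved-oxygen minimum followed by a maximum) can be sketched as a simple scan over the sampled DO signal. This is an illustrative reconstruction, not the authors' MATLAB/SIMULINK code; the window size and equality-based extremum test are assumptions.

```python
# Scan an on-line dissolved-oxygen (DO) signal for a local minimum
# (peak metabolic activity) followed by a local maximum (water
# re-saturation), the SCF-style cue that the reaction period has ended.
def end_of_reaction(do_signal, window=3):
    """Return the index where DO has passed through a minimum and then a
    maximum, or None if the cycle has not yet completed."""
    n = len(do_signal)
    min_idx = None
    for i in range(window, n - window):
        seg = do_signal[i - window:i + window + 1]
        if min_idx is None and do_signal[i] == min(seg):
            min_idx = i          # DO minimum: substrate still being consumed
        elif min_idx is not None and do_signal[i] == max(seg):
            return i             # DO maximum: reaction finished, water re-saturates
    return None
```

In the automated reactor, reaching this point would trigger the draw phase of the sequencing batch cycle; a production controller would additionally filter sensor noise before applying the extremum test.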


Mechatronics ◽  
2006 ◽  
Vol 16 (5) ◽  
pp. 243-247 ◽  
Author(s):  
Zhenwei Su ◽  
Gui Yun Tian ◽  
Chunhua Gao

Author(s):  
Ahmad Jahanbakhshi ◽  
Yousef Abbaspour-Gilandeh ◽  
Kobra Heidarbeigi ◽  
Mohammad Momeny

2003 ◽  
Vol 75 (14) ◽  
pp. 3596-3605 ◽  
Author(s):  
Yufeng Shen ◽  
Ronald J. Moore ◽  
Rui Zhao ◽  
Josip Blonder ◽  
Deanna L. Auberry ◽  
...  

2019 ◽  
Author(s):  
Y. Miyatake ◽  
N. Sekine ◽  
K. Toprasertpong ◽  
S. Takagi ◽  
M. Takenaka
