Spatial Resolution Enhancement of Brillouin Optical Correlation-Domain Reflectometry Using Convolutional Neural Network: Proof of Concept

IEEE Access ◽  
2021 ◽  
pp. 1-1
Author(s):  
Jelah N. Caceres ◽  
Kohei Noda ◽  
Guangtao Zhu ◽  
Heeyoung Lee ◽  
Kentaro Nakamura ◽  
...  

2019 ◽  
Vol 11 (7) ◽  
pp. 771 ◽  
Author(s):  
Weidong Hu ◽  
Yade Li ◽  
Wenlong Zhang ◽  
Shi Chen ◽  
Xin Lv ◽  
...  

Satellite microwave radiometer data are affected by many degradation factors during the imaging process, such as the sampling interval, antenna pattern, and scan mode, leading to reduced spatial resolution. In this paper, a deep residual convolutional neural network (CNN) is proposed to address these degradations by learning an end-to-end mapping between low- and high-resolution images. Unlike traditional methods that handle each degradation factor separately, our network jointly learns both the sampling-interval limitation and the comprehensive degradation factors, including the antenna pattern, receiver sensitivity, and scan mode, during training. Moreover, owing to the powerful mapping capability of the deep residual CNN, our method achieves better resolution-enhancement results, both quantitatively and qualitatively, than existing methods in the literature. Microwave radiation imager (MWRI) data from the Fengyun-3C (FY-3C) satellite are used to demonstrate the validity and effectiveness of the method.
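The residual-learning idea described in this abstract can be sketched as a minimal PyTorch model: a stack of convolutional layers predicts the missing high-frequency detail, which is added back to the degraded input. The layer count, feature width, and patch size below are illustrative choices, not the authors' architecture.

```python
import torch
import torch.nn as nn

class ResidualSRNet(nn.Module):
    """Minimal residual CNN: learns the residual (detail) between a
    degraded radiometer image and its high-resolution counterpart."""
    def __init__(self, channels=1, features=32, num_layers=4):
        super().__init__()
        layers = [nn.Conv2d(channels, features, 3, padding=1), nn.ReLU(inplace=True)]
        for _ in range(num_layers - 2):
            layers += [nn.Conv2d(features, features, 3, padding=1), nn.ReLU(inplace=True)]
        layers += [nn.Conv2d(features, channels, 3, padding=1)]
        self.body = nn.Sequential(*layers)

    def forward(self, x):
        # Residual learning: output = degraded input + predicted detail.
        return x + self.body(x)

net = ResidualSRNet()
lowres = torch.randn(1, 1, 64, 64)  # one single-channel brightness-temperature patch
out = net(lowres)
print(out.shape)  # same spatial size as the input patch
```

Training such a network would minimize a pixel-wise loss (e.g. MSE) between `out` and the corresponding high-resolution patch; the residual formulation lets the convolutions model only the jointly learned degradation effects rather than the whole image.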


2021 ◽  
Vol 13 (10) ◽  
pp. 250
Author(s):  
Luis A. Corujo ◽  
Emily Kieson ◽  
Timo Schloesser ◽  
Peter A. Gloor

Creating intelligent systems capable of recognizing emotions is a difficult task, especially for emotions in animals. This paper describes the design of a "proof of concept" system to recognize emotions in horses. The system comprises two elements: a detector and a model. The detector is a fast region-based convolutional neural network that locates horses in an image, and the model is a convolutional neural network that predicts the emotions of those horses. Both elements were trained on multiple images of horses until they achieved high accuracy in their tasks. In total, 400 images of horses were collected and labeled to train the detector and the model, while 40 were used to test the system. Once the two components were validated, they were combined into a testable system that detects equine emotions based on established behavioral ethograms indicating emotional affect through head, neck, ear, muzzle, and eye position. The system achieved an accuracy of 80% on the validation set and 65% on the test set, demonstrating that it is possible to predict emotions in animals using autonomous intelligent systems. Such a system has multiple applications, including further studies in the growing field of animal emotions as well as veterinary use in assessing the physical welfare of horses or other livestock.
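The two-stage detect-then-classify pipeline described above can be sketched as follows. The detector is stubbed out as a list of bounding boxes, and the emotion labels and network shape are hypothetical placeholders, not the categories or architecture used in the paper.

```python
import torch
import torch.nn as nn

# Illustrative labels only; the paper's ethogram-based categories differ.
EMOTIONS = ["relaxed", "alert", "stressed"]

class EmotionCNN(nn.Module):
    """Tiny stand-in for the second-stage emotion classifier."""
    def __init__(self, num_classes=len(EMOTIONS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, num_classes)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

def classify_detections(image, boxes, model, size=64):
    """Crop each detected horse box, resize it, and predict an emotion."""
    crops = []
    for (x1, y1, x2, y2) in boxes:
        crop = image[:, y1:y2, x1:x2].unsqueeze(0)
        crops.append(nn.functional.interpolate(crop, size=(size, size)))
    logits = model(torch.cat(crops))
    return [EMOTIONS[i] for i in logits.argmax(dim=1).tolist()]

image = torch.rand(3, 480, 640)                     # stand-in for a photo
boxes = [(100, 50, 300, 400), (350, 60, 600, 420)]  # stand-in detector output
labels = classify_detections(image, boxes, EmotionCNN())
print(labels)  # one predicted label per detected horse
```

In the real system the box list would come from the trained region-based detector, and the classifier would be trained on the 400 labeled crops before use.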


2017 ◽  
Vol 56 (3) ◽  
Author(s):  
Kenneth P. Smith ◽  
Anthony D. Kang ◽  
James E. Kirby

Microscopic interpretation of stained smears is one of the most operator-dependent and time-intensive activities in the clinical microbiology laboratory. Here, we investigated application of an automated image acquisition and convolutional neural network (CNN)-based approach for automated Gram stain classification. Using an automated microscopy platform, uncoverslipped slides were scanned with a 40× dry objective, generating images of sufficient resolution for interpretation. We collected 25,488 images from positive blood culture Gram stains prepared during routine clinical workup. These images were used to generate 100,213 crops containing Gram-positive cocci in clusters, Gram-positive cocci in chains/pairs, Gram-negative rods, or background (no cells). These categories were targeted for proof-of-concept development as they are associated with the majority of bloodstream infections. Our CNN model achieved a classification accuracy of 94.9% on a test set of image crops. Receiver operating characteristic (ROC) curve analysis indicated a robust ability to differentiate between categories, with an area under the curve of >0.98 for each. After training and validation, we applied the classification algorithm to new images collected from 189 whole slides without human intervention. Sensitivity and specificity were 98.4% and 75.0% for Gram-positive cocci in chains/pairs, 93.2% and 97.2% for Gram-positive cocci in clusters, and 96.3% and 98.1% for Gram-negative rods. Taken together, our data support a proof of concept for a fully automated classification methodology for blood culture Gram stains. Importantly, the algorithm was highly adept at identifying image crops with organisms and could be used to present prescreened, classified crops to technologists to accelerate smear review. This concept could potentially be extended to all Gram stain interpretive activities in the clinical laboratory.
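The per-category sensitivity and specificity figures reported above are one-vs-rest statistics derived from a multi-class confusion matrix of predicted versus true crop labels. A minimal sketch of that computation, using illustrative counts rather than the paper's data:

```python
def sensitivity_specificity(confusion, cls):
    """One-vs-rest sensitivity and specificity for class `cls`.

    `confusion[t][p]` is the number of crops with true label `t`
    predicted as label `p`.
    """
    classes = list(confusion)
    tp = confusion[cls][cls]
    fn = sum(confusion[cls][p] for p in classes if p != cls)
    fp = sum(confusion[t][cls] for t in classes if t != cls)
    tn = sum(confusion[t][p] for t in classes for p in classes
             if t != cls and p != cls)
    return tp / (tp + fn), tn / (tn + fp)

# Rows = true class, columns = predicted class (made-up counts).
confusion = {
    "gpc_clusters": {"gpc_clusters": 95, "gpc_chains": 3,  "gnr": 2},
    "gpc_chains":   {"gpc_clusters": 4,  "gpc_chains": 94, "gnr": 2},
    "gnr":          {"gpc_clusters": 1,  "gpc_chains": 2,  "gnr": 97},
}
sens, spec = sensitivity_specificity(confusion, "gnr")
print(round(sens, 3), round(spec, 3))  # 0.97 0.98
```

Sensitivity is the fraction of true-`cls` crops recovered (TP / (TP + FN)); specificity is the fraction of non-`cls` crops correctly rejected (TN / (TN + FP)), which is why a category can have high sensitivity but lower specificity, as seen for chains/pairs above.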

