MAXIMASK and MAXITRACK: Two new tools for identifying contaminants in astronomical images using convolutional neural networks

2020 ◽  
Vol 634 ◽  
pp. A48
Author(s):  
M. Paillassa ◽  
E. Bertin ◽  
H. Bouy

In this work, we propose two convolutional neural network classifiers for detecting contaminants in astronomical images. Once trained, our classifiers are able to identify various contaminants, such as cosmic rays, hot and bad pixels, persistence effects, satellite or plane trails, residual fringe patterns, nebulous features, saturated pixels, diffraction spikes, and tracking errors in images. They encompass a broad range of ambient conditions, such as seeing, image sampling, detector type, optics, and stellar density. The first classifier, MAXIMASK, performs semantic segmentation and generates bad pixel maps for each contaminant, based on the probability that each pixel belongs to a given contaminant class. The second classifier, MAXITRACK, classifies entire images and mosaics by computing the probability for the focal plane to be affected by tracking errors. We gathered training and testing data from real data obtained with various modern charge-coupled devices and near-infrared cameras, augmented with image simulations. We quantified the performance of both classifiers and show that MAXIMASK achieves state-of-the-art performance for the identification of cosmic ray hits. Thanks to a built-in Bayesian update mechanism, both classifiers can be tuned to meet specific science goals in various observational contexts.
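The Bayesian update mechanism mentioned above can be illustrated in general terms: when a classifier has been trained under one set of class priors, its output posteriors can be re-weighted to reflect the priors of a new observational context. The sketch below is a generic illustration of such a prior re-weighting, not MAXIMASK's actual interface; the class names and prior values are made up for the example.

```python
import numpy as np

def reweight_posteriors(probs, train_priors, new_priors):
    """Re-weight per-pixel class posteriors for a new set of class priors.

    probs: array of shape (..., C) with class probabilities produced by a
    network trained under `train_priors`; returns posteriors renormalized
    to be consistent with `new_priors`.
    """
    w = np.asarray(new_priors, dtype=float) / np.asarray(train_priors, dtype=float)
    adjusted = probs * w                                  # broadcast over the class axis
    return adjusted / adjusted.sum(axis=-1, keepdims=True)

# Toy example: one pixel, two classes ("clean", "contaminant").
# If contaminants are rare in the target data (prior 1%), the raw
# 40% contaminant probability drops sharply after the update.
probs = np.array([[[0.6, 0.4]]])
updated = reweight_posteriors(probs, train_priors=[0.5, 0.5],
                              new_priors=[0.99, 0.01])
```

The update follows directly from Bayes' rule: dividing out the training prior recovers the (scaled) likelihood, which is then multiplied by the new prior and renormalized.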

2005 ◽  
Vol 22 (3) ◽  
pp. 249-256 ◽  
Author(s):  
Catherine L. Farage ◽  
Kevin A. Pimbblet

Abstract. To maximise data output from single-shot astronomical images, the rejection of cosmic rays is important. We present the results of a benchmark trial comparing various cosmic ray rejection algorithms. The procedures assess the relative performance and characteristics of each process in cosmic ray detection, rates of false detections of true objects, and the quality of image cleaning and reconstruction. The cosmic ray rejection algorithms developed by Rhoads (2000, PASP, 112, 703), van Dokkum (2001, PASP, 113, 1420), Pych (2004, PASP, 116, 148), and the IRAF task XZAP by Dickinson are tested using both simulated and real data. It is found that detection efficiency is independent of the density of cosmic rays in an image, being more strongly affected by the density of real objects in the field. As expected, spurious detections and alterations to real data in the cleaning process are also significantly increased by high object densities. We find Rhoads' linear filtering method to produce the best performance in the detection of cosmic ray events; however, the popular van Dokkum algorithm exhibits the highest overall performance in terms of detection and cleaning.
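The basic idea shared by single-shot rejectors is to flag pixels that deviate sharply from their local surroundings. The toy sigma-clipping detector below conveys that idea only; it is not one of the benchmarked algorithms, which are considerably more robust near real objects (e.g. van Dokkum's Laplacian edge detection).

```python
import numpy as np
from scipy.ndimage import median_filter

def flag_cosmic_rays(image, nsigma=5.0, box=5):
    """Toy detector: flag pixels deviating strongly from a local median.

    Real algorithms must also avoid flagging the cores of stars and
    galaxies, which this naive residual threshold does not attempt.
    """
    residual = image - median_filter(image, size=box)
    return residual > nsigma * residual.std()

# A flat frame with a single synthetic cosmic ray hit.
frame = np.zeros((20, 20))
frame[10, 10] = 100.0
mask = flag_cosmic_rays(frame)
```

The benchmark's finding that false detections rise with object density follows naturally here: any sharp real feature (a stellar core) also produces a large local residual.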


2017 ◽  
Vol 10 (6) ◽  
pp. 2077-2091 ◽  
Author(s):  
Sabina Assan ◽  
Alexia Baudic ◽  
Ali Guemri ◽  
Philippe Ciais ◽  
Valerie Gros ◽  
...  

Abstract. Due to increased demand for an understanding of CH4 emissions from industrial sites, the subject of cross sensitivities caused by absorption from multiple gases on δ13CH4 and C2H6 measured in the near-infrared spectral domain using CRDS has become increasingly important. Extensive laboratory tests are presented here, which characterize these cross sensitivities and propose corrections for the biases they induce. We found methane isotopic measurements to be subject to interference from elevated C2H6 concentrations, resulting in heavier δ13CH4 by +23.5 ‰ per ppm C2H6 / ppm CH4. Measured C2H6 is subject to absorption interference from a number of other trace gases, predominantly H2O (with an average linear sensitivity of 0.9 ppm C2H6 per % H2O in ambient conditions). However, this sensitivity was found to be discontinuous, with a strong hysteresis effect, and we suggest removing H2O from gas samples prior to analysis. The C2H6 calibration factor was determined against a gas chromatograph (GC) as 0.5 (confirmed up to 5 ppm C2H6). Field tests at a natural gas compressor station demonstrated that the presence of C2H6 in gas emissions at an average level of 0.3 ppm shifted the isotopic signature by 2.5 ‰, whilst after calibration we find that the average C2H6 : CH4 ratio shifts by +0.06. These results indicate that, when using such a CRDS instrument in conditions of elevated C2H6 for CH4 source determination, it is imperative to account for the biases discussed in this study.
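The reported linear sensitivity makes the first-order bias correction straightforward to apply in post-processing. The sketch below uses the +23.5 ‰ per (ppm C2H6 / ppm CH4) figure from the abstract; the δ13CH4 value and CH4 mole fraction in the example are hypothetical, chosen so the shift matches the ~2.5 ‰ observed in the field test.

```python
def correct_delta13(delta_measured, c2h6_ppm, ch4_ppm, sensitivity=23.5):
    """Remove the ethane-induced bias from a measured delta13-CH4 value.

    The measurement is shifted heavier by `sensitivity` permil per unit
    of the C2H6/CH4 mole-fraction ratio, so that bias is subtracted.
    """
    return delta_measured - sensitivity * (c2h6_ppm / ch4_ppm)

# Hypothetical field sample: 0.3 ppm C2H6 in 2.82 ppm CH4 gives
# a ~2.5 permil heavy bias, consistent with the abstract's field test.
corrected = correct_delta13(-42.5, c2h6_ppm=0.3, ch4_ppm=2.82)
```

Note this only addresses the ethane interference on δ13CH4; the H2O interference on C2H6 itself is hysteretic, which is why the authors recommend drying the sample rather than correcting numerically.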


10.14311/1023 ◽  
2008 ◽  
Vol 48 (3) ◽  
Author(s):  
M. Řeřábek

The properties of UWFC (Ultra Wide-Field Camera) astronomical systems, along with specific visual data in astronomical images, contribute to a comprehensive evaluation of the acquired image data. These systems suffer from many different kinds of optical aberrations, which have a negative effect on image quality and on the transfer characteristics of the imaging system, and which reduce the precision of astronomical measurements. Two main questions must be answered. First: how do astrometric measurements depend on optical aberrations? Second: how do optical aberrations affect the transfer characteristics of the whole optical system? If we can determine the PSF (Point Spread Function) [2] of an optical system, we can use suitable methods to restore the original image. Optical aberration models for LSI/LSV (Linear Space Invariant/Variant) [2] systems are presented in this paper. These models are based on Seidel and Zernike approximating polynomials [1]. Optical aberration models serve as a suitable tool for estimating and fitting the wavefront aberration of a real optical system. Real data from the BOOTES (Burst Observer and Optical Transient Exploring System) experiment are used for our simulations. Problems related to UWFC imaging systems, especially a restoration method in the presence of a space-variant PSF, are described in this paper. A model of the space-variant imaging system, and partially of the space-variant optical system, has been implemented in MATLAB. The "brute force" method has been used for restoration of the test images. The results of different deconvolution algorithms are demonstrated in this paper. This approach could help to improve the precision of astronomical measurements.
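To make the restoration step concrete: once the PSF is known, a standard iterative deconvolution can recover much of the original image. The sketch below implements classic space-invariant Richardson-Lucy deconvolution in Python as a deliberately simplified stand-in; the space-variant case treated in the paper requires a locally varying PSF (e.g. restoring the image tile by tile), and the paper's own implementation is in MATLAB.

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(blurred, psf, n_iter=30):
    """Space-invariant Richardson-Lucy deconvolution.

    Iteratively refines a positivity-preserving estimate by comparing
    the re-blurred estimate against the observed (blurred) image.
    """
    psf = psf / psf.sum()
    psf_mirror = psf[::-1, ::-1]
    estimate = np.full_like(blurred, blurred.mean())
    for _ in range(n_iter):
        conv = fftconvolve(estimate, psf, mode="same")
        ratio = blurred / np.maximum(conv, 1e-12)
        estimate = estimate * fftconvolve(ratio, psf_mirror, mode="same")
    return estimate

# Blur a synthetic point source with a Gaussian PSF, then restore it.
y, x = np.mgrid[-3:4, -3:4]
psf = np.exp(-(x**2 + y**2) / 2.0)
truth = np.zeros((32, 32))
truth[16, 16] = 1.0
blurred = fftconvolve(truth, psf / psf.sum(), mode="same")
restored = richardson_lucy(blurred, psf)
```

After a few dozen iterations the flux re-concentrates toward the original point source, sharpening the peak well above its blurred value.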


2021 ◽  
pp. 393-403
Author(s):  
Carmelo Pino ◽  
Renato Sortino ◽  
Eva Sciacca ◽  
Simone Riggi ◽  
Concetto Spampinato

1987 ◽  
Vol 120 ◽  
pp. 425-425
Author(s):  
B. N. Khare ◽  
B.G.J.P.T. Murray ◽  
C. Sagan ◽  
W. R. Thompson ◽  
E. T. Arakawa

There is now evidence that at least some cometary nuclei are dark and red. Cometary ices prepared from combinations of CH4 with H2O and sometimes NH3 were irradiated at 77 K by corona discharge. CH4-containing ice reddened and darkened at a dose of ~10¹¹ erg cm⁻² over a period of ~1 hour. Upon evaporation of the now yellowish, irradiated ice, a slightly yellowish colored solid film remains on the walls of the container at room temperature. Transmission measurements of this organic film (called cometary tholin) were made from 0.2 μm to 50 μm wavelength. Strong UV absorption is seen from 0.45 μm to 0.2 μm. Above 0.45 μm, the spectrum remains flat to ~1.3 μm in the near infrared, except for a very small feature near 1.15 μm. A medium-sized feature appears centered at 1.4 μm with shoulders at both sides and a nearby weaker feature at 1.52 μm. A strong feature appears at 1.9 μm accompanied by a smaller feature at 1.78 μm. In the region 2.5 μm to 50 μm, the infrared spectrum was taken by dispersing the film in a CsI matrix. Bands are found at 2.92(M), 3.36(S), 3.40(S), 3.46(M), 3.48(M), 5.75(M), 5.99(S), 6.21(M), 6.83(M), 7.30(S), 7.81(W), 8.89(M), 9.26(M), 20.00(W), 22.22(W), and 28.57(W) micrometers, suggesting complex organics including alkane, alkene, aldehyde, and carboxylic acid functional groups. These results are also relevant to UV and cosmic ray processing of interstellar grains, and to icy bodies in the outer solar system.


Sensors ◽  
2020 ◽  
Vol 20 (8) ◽  
pp. 2244
Author(s):  
J. M. Jurado ◽  
J. L. Cárdenas ◽  
C. J. Ogayar ◽  
L. Ortega ◽  
F. R. Feito

The characterization of natural spaces by the precise observation of their material properties is in high demand in remote sensing and computer vision. The production of novel sensors enables the collection of heterogeneous data to gain comprehensive knowledge of the living and non-living entities in the ecosystem. The high resolution of consumer-grade RGB cameras is frequently used for the geometric reconstruction of many types of environments. Nevertheless, the understanding of natural spaces is still challenging. The automatic segmentation of homogeneous materials in nature is a complex task: there are many overlapping structures and indirect illumination, which makes object recognition difficult. In this paper, we propose a method based on fusing spatial and multispectral characteristics for the unsupervised classification of natural materials in a point cloud. A high-resolution camera and a multispectral sensor are mounted on a custom camera rig in order to simultaneously capture RGB and multispectral images. Our method is tested in a controlled scenario, where different natural objects coexist. Initially, the input RGB images are processed to generate a point cloud by applying the structure-from-motion (SfM) algorithm. Then, the multispectral images are mapped onto the three-dimensional model to characterize the geometry with the reflectance captured in four narrow bands (green, red, red-edge and near-infrared). The reflectance, the visible colour and the spatial component are combined to extract key differences among all existing materials. For this purpose, a hierarchical cluster analysis is applied to pool the point cloud and identify the feature pattern of every material. As a result, the tree trunk, the leaves, different species of low plants, the ground and rocks can be clearly recognized in the scene.
These results demonstrate the feasibility of performing semantic segmentation by considering multispectral and spatial features, with an unknown number of clusters to be detected on the point cloud. Moreover, our solution is compared with another method based on supervised learning to assess the improvement offered by the proposed approach.
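The fusion step described above, combining spatial position, visible colour, and narrow-band reflectance into one feature vector before hierarchical clustering, can be sketched as follows. This is a generic illustration, not the authors' pipeline: the feature scaling, the spatial weight, and the synthetic data layout are all assumptions made for the example.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def cluster_materials(xyz, rgb, reflectance, n_clusters, w_spatial=0.5):
    """Agglomerative (Ward) clustering on fused spatial + spectral features.

    Each feature column is z-scored, then the spatial coordinates are
    down-weighted so that material (spectral) differences dominate the
    linkage distances. Weights and band layout are illustrative.
    """
    feats = np.hstack([xyz, rgb, reflectance]).astype(float)
    feats = (feats - feats.mean(axis=0)) / (feats.std(axis=0) + 1e-9)
    feats[:, :3] *= w_spatial                       # soften geometry's influence
    tree = linkage(feats, method="ward")
    return fcluster(tree, t=n_clusters, criterion="maxclust")

# Two synthetic "materials" at random positions with distinct
# colour and 4-band reflectance signatures.
rng = np.random.default_rng(0)
xyz = rng.uniform(0, 10, size=(60, 3))
rgb = np.vstack([np.full((30, 3), 0.8), np.full((30, 3), 0.2)])
refl = np.vstack([np.full((30, 4), 0.7), np.full((30, 4), 0.1)])
labels = cluster_materials(xyz, rgb, refl, n_clusters=2)
```

Because the spectral signatures differ far more than the (down-weighted) positions, the two material groups separate cleanly even though their points are spatially intermixed.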


2021 ◽  
Vol 2021 ◽  
pp. 1-13
Author(s):  
Bin Huang ◽  
Jiaqi Lin ◽  
Jinming Liu ◽  
Jie Chen ◽  
Jiemin Zhang ◽  
...  

Separating printed or handwritten characters from a noisy background is valuable for many applications, including test paper autoscoring. The complex structure of Chinese characters makes this goal difficult to achieve, because fine details and the overall structure of reconstructed characters are easily lost. This paper proposes a method for separating Chinese characters based on a generative adversarial network (GAN). We used ESRGAN as the basic network structure and applied dilated convolutions and a novel loss function to improve the quality of reconstructed characters. Four popular Chinese fonts (Hei, Song, Kai, and Imitation Song) were tested on a real data collection, and the proposed design was compared with other semantic segmentation approaches. The experimental results showed that the proposed method effectively separates Chinese characters from a noisy background. In particular, our method achieves better results in terms of Intersection over Union (IoU) and optical character recognition (OCR) accuracy.
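The IoU metric used for evaluation is simple to state for binary character masks: the ratio of overlapping foreground pixels to the union of foreground pixels. A minimal sketch:

```python
import numpy as np

def iou(pred, target):
    """Intersection over Union between two binary character masks."""
    pred, target = pred.astype(bool), target.astype(bool)
    union = np.logical_or(pred, target).sum()
    if union == 0:
        return 1.0  # both masks empty: treat as a perfect match
    return np.logical_and(pred, target).sum() / union

# Tiny example: one overlapping pixel, three pixels in the union.
pred = np.array([[1, 1], [0, 0]])
target = np.array([[1, 0], [1, 0]])
score = iou(pred, target)
```

An IoU of 1.0 means the separated character exactly matches the ground-truth mask; values drop quickly when strokes are thinned, thickened, or broken, which is why it complements OCR accuracy as a structural measure.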


2020 ◽  
Vol 2020 (1) ◽  
pp. 82-86
Author(s):  
Sorour Mohajerani ◽  
Mark S. Drew ◽  
Parvaneh Saeedi

Removing the effect of illumination variation in images has proved beneficial in many computer vision applications, such as object recognition and semantic segmentation. Although generating illumination-invariant images has been studied in the literature before, it has not been investigated on real 4-channel (4D) data. In this study, we examine the quality of illumination-invariant images generated from red, green, blue, and near-infrared (RGBN) data. Our experiments show that the near-infrared channel contributes substantially toward removing illumination. As shown in our numerical and visual results, the illumination-invariant image obtained from RGBN data is superior to that obtained from RGB data alone.
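For readers unfamiliar with the underlying idea, a common way to build an illumination-invariant image from RGB is the log-chromaticity projection: band ratios cancel the overall illumination intensity, and projecting the log-ratios along a camera-specific "invariant angle" also suppresses illuminant colour changes. The sketch below shows only this classic 3-channel construction; the paper's contribution, extending the analysis to a fourth (NIR) channel, is not reproduced here, and the angle value in the example is arbitrary.

```python
import numpy as np

def invariant_image(rgb, theta):
    """1-D illumination-invariant image via log-chromaticity projection.

    `theta` is the camera-specific invariant angle (found by
    calibration in practice; arbitrary in this sketch).
    """
    eps = 1e-6  # guard against log(0)
    log_rg = np.log(rgb[..., 0] + eps) - np.log(rgb[..., 1] + eps)
    log_bg = np.log(rgb[..., 2] + eps) - np.log(rgb[..., 1] + eps)
    return np.cos(theta) * log_rg + np.sin(theta) * log_bg

# Scaling all channels by a constant (a pure brightness change)
# leaves the invariant image essentially unchanged.
img = np.array([[[0.5, 0.25, 0.4]]])
a = invariant_image(img, theta=0.6)
b = invariant_image(2.0 * img, theta=0.6)
```

The intensity invariance holds for any angle because scaling cancels inside each ratio; the correct theta additionally removes the illuminant colour term for a Planckian light source.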

