Creating RGB Images from Hyperspectral Images Using a Color Matching Function

Author(s):  
Magnus Magnusson ◽  
Jakob Sigurdsson ◽  
Sveinn Eirikur Armansson ◽  
Magnus O. Ulfarsson ◽  
Hilda Deborah ◽  
...  
2021 ◽  
Vol 65 (5) ◽  
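The technique named in this entry's title, rendering an RGB image from a hyperspectral cube via color matching functions, can be sketched as follows. This is a minimal illustration, not the authors' method: it uses the well-known multi-lobe Gaussian approximation to the CIE 1931 2° color matching functions rather than tabulated CMF data, and a simplified gamma encoding in place of the full sRGB transfer function.

```python
import numpy as np

def gaussian(x, mu, s1, s2):
    # piecewise Gaussian with different widths left/right of the peak
    s = np.where(x < mu, s1, s2)
    return np.exp(-0.5 * ((x - mu) / s) ** 2)

def cmf_approx(wl):
    # Analytic multi-lobe approximation of the CIE 1931 2-deg CMFs
    # (an approximation, not the official tabulated values)
    x = (1.056 * gaussian(wl, 599.8, 37.9, 31.0)
         + 0.362 * gaussian(wl, 442.0, 16.0, 26.7)
         - 0.065 * gaussian(wl, 501.1, 20.4, 26.2))
    y = (0.821 * gaussian(wl, 568.8, 46.9, 40.5)
         + 0.286 * gaussian(wl, 530.9, 16.3, 31.1))
    z = (1.217 * gaussian(wl, 437.0, 11.8, 36.0)
         + 0.681 * gaussian(wl, 459.0, 26.0, 13.8))
    return np.stack([x, y, z], axis=-1)          # (B, 3)

def hsi_to_rgb(cube, wavelengths):
    """cube: (H, W, B) radiance/reflectance; wavelengths: (B,) in nm."""
    cmf = cmf_approx(wavelengths)                # (B, 3)
    xyz = cube @ cmf                             # integrate over bands -> (H, W, 3)
    xyz /= xyz[..., 1].max() + 1e-12             # normalize by peak luminance
    m = np.array([[ 3.2406, -1.5372, -0.4986],   # linear XYZ -> linear sRGB
                  [-0.9689,  1.8758,  0.0415],
                  [ 0.0557, -0.2040,  1.0570]])
    rgb = np.clip(xyz @ m.T, 0, 1)
    return rgb ** (1 / 2.2)                      # simple gamma encoding

# toy example: a flat-spectrum 2x2 cube over 400-700 nm
wl = np.linspace(400, 700, 31)
cube = np.ones((2, 2, 31))
print(hsi_to_rgb(cube, wl).shape)  # (2, 2, 3)
```

In practice one would interpolate the tabulated CIE CMFs onto the sensor's band centers instead of using the analytic fit, but the projection-then-matrix structure is the same.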
Author(s):  
Shaohui Mei ◽  
Yunhao Geng ◽  
Junhui Hou ◽  
Qian Du

1983 ◽  
Vol 8 (2) ◽  
pp. 121-121
Author(s):  
David L. Macadam

2021 ◽  
Vol 12 ◽  
Author(s):  
Tianying Yan ◽  
Wei Xu ◽  
Jiao Lin ◽  
Long Duan ◽  
Pan Gao ◽  
...  

Cotton is a significant economic crop. It is vulnerable to aphids (Aphis gossypii Glover) during the growth period, so rapid, early detection has become an important means of managing aphid infestation in cotton. In this study, a visible/near-infrared (Vis/NIR) hyperspectral imaging system (376–1044 nm) and machine learning methods were used to identify aphid infection in cotton leaves. Both tall and short cotton plants (Lumianyan 24) were inoculated with aphids, and corresponding plants without aphids were used as controls. Hyperspectral images (HSIs) were acquired five times at 5-day intervals. The healthy and infected leaves were used to establish the datasets, with each leaf as a sample. The spectra and RGB images of each cotton leaf were extracted from the hyperspectral images for one-dimensional (1D) and two-dimensional (2D) analysis, and the hyperspectral image of each leaf was used for three-dimensional (3D) analysis. Convolutional Neural Networks (CNNs) were used for identification and compared with conventional machine learning methods. For the extracted spectra, the 1D CNN performed well, reaching a classification accuracy of 98%. For RGB images, the 2D CNN gave better classification performance. For HSIs, the 3D CNN performed moderately well and outperformed the 2D CNN. Overall, the CNNs performed better than the conventional machine learning methods. In visualizing the 1D, 2D, and 3D CNNs, the important wavelength ranges were identified from the 1D and 3D CNN visualizations, and the importance of wavelength ranges and spatial regions was analyzed in the 2D and 3D CNN visualizations. The overall results illustrate the feasibility of using hyperspectral imaging combined with multi-dimensional CNNs to detect aphid infection in cotton leaves, providing a new alternative for pest-infection detection in plants.
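The 1D branch of the pipeline above, a CNN operating directly on per-leaf reflectance spectra, can be sketched with plain NumPy. The band count, filter sizes, and layer layout here are illustrative assumptions, not the authors' architecture; a trained network would learn the kernels and weights rather than use random ones.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, kernels):
    # valid cross-correlation of a spectrum x (B,) with each kernel (K, k)
    k = kernels.shape[1]
    windows = np.lib.stride_tricks.sliding_window_view(x, k)  # (B-k+1, k)
    return windows @ kernels.T                                # (B-k+1, K)

def predict(spectrum, kernels, w, b):
    h = np.maximum(conv1d(spectrum, kernels), 0)  # ReLU feature maps
    pooled = h.mean(axis=0)                       # global average pooling
    return pooled @ w + b                         # class logits

# hypothetical sizes: 128 bands, 8 filters of width 7, 2 classes
# (healthy vs. aphid-infested)
n_bands, k, n_filters, n_classes = 128, 7, 8, 2
kernels = rng.normal(size=(n_filters, k)) * 0.1
w = rng.normal(size=(n_filters, n_classes)) * 0.1
b = np.zeros(n_classes)

logits = predict(rng.normal(size=n_bands), kernels, w, b)
print(logits.shape)  # (2,)
```

The 2D and 3D variants replace the 1D convolutions with 2D convolutions over RGB images and 3D convolutions over the full spectral-spatial cube, respectively, but keep the same convolve/pool/classify structure.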


2016 ◽  
Vol 16 (4) ◽  
pp. 39
Author(s):  
Kunihiro Hatakeyama ◽  
Tsubasa Kamei ◽  
Yuki Kawashima ◽  
Takehiro Nagai ◽  
Yasuki Yamauchi

2021 ◽  
Author(s):  
Min Huang ◽  
Yu Li ◽  
Yu Wang ◽  
Xiu Li ◽  
Minchen Wei

2006 ◽  
Vol 23 (3-4) ◽  
pp. 351-356 ◽  
Author(s):  
KINJIRO AMANO ◽  
DAVID H. FOSTER ◽  
SÉRGIO M.C. NASCIMENTO

Observers can generally make reliable judgments of surface color in natural scenes despite changes in an illuminant that is out of view. This ability has sometimes been attributed to observers' estimating the spectral properties of the illuminant in order to compensate for its effects. To test this hypothesis, two surface-color-matching experiments were performed with images of natural scenes obtained from high-resolution hyperspectral images. In the first experiment, the sky illuminating the scene was directly visible to the observer, and its color was manipulated. In the second experiment, a large gray sphere was introduced into the scene so that its illumination by the sun and sky was also directly visible to the observer, and the color of that illumination was manipulated. Although the degree of color constancy varied across this and other variations of the images, there was no reliable effect of illuminant color. Even when the sky was eliminated from view, color constancy did not worsen. Judging surface color in natural scenes seems to be independent of an explicit illuminant cue.


Author(s):  
Pratish Pushparaj ◽  
Nishthavan Dahiya ◽  
Mayank Dabas

2019 ◽  
Vol 2019 (1) ◽  
pp. 320-325 ◽  
Author(s):  
Wenyu Bao ◽  
Minchen Wei

Great efforts have been made to develop color appearance models that predict the color appearance of stimuli under various viewing conditions. CIECAM02, the most widely used color appearance model, and many others were all developed from corresponding-color datasets, including the LUTCHI data. Though the effect of the adapting light level on color appearance, known as the "Hunt Effect", is well established, most corresponding-color datasets were collected within a limited range of light levels (i.e., below 700 cd/m²), much lower than daylight levels. A recent study investigating the color preference of an artwork under light levels from 20 to 15000 lx suggested that existing color appearance models may not accurately characterize the color appearance of stimuli under extremely high light levels, based on the assumption that identical preference judgements imply identical color appearance. This article reports a psychophysical study designed to directly collect corresponding colors under two light levels, 100 and 3000 cd/m² (i.e., ≈ 314 and 9420 lx). Human observers completed haploscopic color matching for four color stimuli (i.e., red, green, blue, and yellow) under the two light levels at 2700 or 6500 K. Though the results supported the Hunt Effect, CIECAM02 was found to have large errors under the extremely high light levels, especially when the CCT was low.
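The parenthetical conversion in the abstract (100 and 3000 cd/m² ≈ 314 and 9420 lx) follows from the standard relation for a perfectly diffuse (Lambertian) white surface, where illuminance E = π · L. A one-liner makes the arithmetic explicit:

```python
import math

def luminance_to_illuminance(L_cd_m2):
    """Illuminance (lx) that produces luminance L (cd/m^2)
    on an ideal Lambertian white reflector: E = pi * L."""
    return math.pi * L_cd_m2

for L in (100, 3000):
    print(f"{L} cd/m^2 -> {luminance_to_illuminance(L):.0f} lx")
# 100 cd/m^2 -> 314 lx
# 3000 cd/m^2 -> 9425 lx
```

The abstract's "≈ 9420 lx" is simply a rounding of the same π·3000 ≈ 9425 lx figure.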


2013 ◽  
Vol 11 (1) ◽  
pp. 8-13
Author(s):  
V. Behar ◽  
V. Bogdanova

Abstract. In this paper, a set of nonlinear edge-preserving filters is proposed as a pre-processing stage to improve the quality of hyperspectral images before object detection. The capability of each nonlinear filter to improve images corrupted by spatially and spectrally correlated Gaussian noise is evaluated in terms of the average Improvement factor in the Peak Signal-to-Noise Ratio (IPSNR) estimated at the filter output. The simulation results demonstrate that this pre-processing procedure is efficient only when the spatial and spectral correlation coefficients of the noise do not exceed 0.6.
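The IPSNR figure of merit described above can be illustrated on a single synthetic band. This sketch uses a 3×3 median filter as a stand-in nonlinear edge-preserving filter (the paper evaluates a set of such filters, not necessarily this one) and uncorrelated Gaussian noise for simplicity:

```python
import numpy as np

def psnr(ref, img):
    # peak signal-to-noise ratio in dB, peak taken from the reference
    mse = np.mean((ref - img) ** 2)
    return 10 * np.log10(ref.max() ** 2 / mse)

def median3(img):
    # 3x3 median filter: nonlinear and edge-preserving; zero-padded borders
    p = np.pad(img, 1)
    stack = [p[i:i + img.shape[0], j:j + img.shape[1]]
             for i in range(3) for j in range(3)]
    return np.median(np.stack(stack), axis=0)

rng = np.random.default_rng(1)
clean = np.zeros((64, 64))
clean[16:48, 16:48] = 1.0                         # one synthetic band with an edge
noisy = clean + rng.normal(scale=0.2, size=clean.shape)
filtered = median3(noisy)

# improvement factor: PSNR after filtering minus PSNR before, in dB
ipsnr = psnr(clean, filtered) - psnr(clean, noisy)
print(f"IPSNR: {ipsnr:.1f} dB")
```

With correlated noise (as in the paper), neighboring pixels carry similar noise values, so local-window filters like this one gain less, which is consistent with the reported breakdown above a correlation coefficient of 0.6.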

