reflectance map
Recently Published Documents

TOTAL DOCUMENTS: 23 (FIVE YEARS: 6)
H-INDEX: 6 (FIVE YEARS: 1)

2021 ◽  
pp. 1060-1063
Author(s):  
Robert J. Woodham
Author(s):  
Kayan K. Katrak ◽  
Rithvik Chandan ◽  
Sirisha Lanka ◽  
G. M. Chitra ◽  
S. S. Shylaja
Author(s):  
J. Shin ◽  
Y. Cho ◽  
H. Lee ◽  
S. Yoon ◽  
H. Ahn ◽  
...  

Abstract. Radiometric calibration has become an important pre-processing step as unmanned aerial vehicle (UAV) images are increasingly used in various applications. To convert digital numbers (DN) to reflectance, vicarious radiometric calibration is widely used together with relative radiometric calibration. Some UAV sensor systems can measure irradiance for precise relative radiometric calibration, but most cannot; for those systems, precise relative radiometric calibration is needed to produce a reflectance map by vicarious calibration. In this study, an optimal image selection method is proposed to improve the quality of relative radiometric calibration. The method, relative calibration by the optimal path (RCOP), uses filtered tie points acquired during geometric calibration and selects optimal images with Dijkstra's algorithm. About 100 multispectral images were acquired with a RedEdge-M camera on a fixed-wing UAV. The reflectance map was produced using RCOP together with vicarious calibration based on ground reference panels. Validation data were processed using irradiance measurements for precise relative radiometric calibration. The RCOP method showed a root mean square error (RMSE) of 0.03–0.10 reflectance relative to the validation data. The proposed method can therefore produce a precise reflectance map by vicarious calibration.
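The core of the image selection described above is a shortest-path search over the image network. The abstract does not give the edge weights, so the sketch below assumes a hypothetical weight of 1/(tie-point count) between overlapping images, so that strongly connected image pairs are preferred; only the use of Dijkstra's algorithm itself comes from the source.

```python
import heapq

def dijkstra_path(graph, start, goal):
    """Shortest path by Dijkstra's algorithm.
    graph: {node: {neighbor: weight}}; here the weight is assumed to be
    the inverse of the tie-point count between two overlapping images."""
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    visited = set()
    while heap:
        d, u = heapq.heappop(heap)
        if u in visited:
            continue
        visited.add(u)
        if u == goal:
            break
        for v, w in graph.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    # walk the predecessor chain back from the goal
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[goal]

# Toy image network: edge weight = 1 / (number of shared tie points)
graph = {
    "img0": {"img1": 1 / 50, "img2": 1 / 10},
    "img1": {"img0": 1 / 50, "img3": 1 / 40},
    "img2": {"img0": 1 / 10, "img3": 1 / 5},
    "img3": {"img1": 1 / 40, "img2": 1 / 5},
}
path, cost = dijkstra_path(graph, "img0", "img3")
# path prefers the tie-point-rich route img0 -> img1 -> img3
```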


2020 ◽  
Vol 12 (11) ◽  
pp. 1726 ◽  
Author(s):  
Jung-Il Shin ◽  
Yeong-Min Cho ◽  
Pyung-Chae Lim ◽  
Hae-Min Lee ◽  
Ho-Yong Ahn ◽  
...  

As the use of unmanned aerial vehicle (UAV) images rapidly increases, so does the need for precise radiometric calibration. For UAV images, relative radiometric calibration is required in addition to traditional vicarious radiometric calibration because of the small field of view. Some UAVs carry irradiance sensors for relative radiometric calibration, but most do not; for those, an intelligent relative radiometric calibration scheme must be applied. In this study, a relative radiometric calibration method is proposed to improve the quality of a reflectance map without irradiance measurements. The proposed method, termed relative calibration by the optimal path (RCOP), uses tie points acquired during geometric calibration to define the optimal paths. A calibrated image from RCOP was compared to validation data calibrated with irradiance measurements. The RCOP method produces seamless mosaicked images with uniform brightness and reflectance patterns, and can therefore serve as a precise relative radiometric calibration method for UAV images.
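Once an optimal path through the image network is chosen, relative calibration amounts to harmonising brightness along it. The paper's exact formulation is not given in the abstract; the sketch below assumes a simple multiplicative gain model, estimating each pairwise gain by least squares from DN values sampled at shared tie points and chaining the gains along the path so every image is normalised to the first (reference) image.

```python
import numpy as np

def pairwise_gain(dn_a, dn_b):
    """Relative gain between two overlapping images from DN values
    sampled at shared tie points (least-squares fit dn_b ~ g * dn_a)."""
    dn_a = np.asarray(dn_a, dtype=float)
    dn_b = np.asarray(dn_b, dtype=float)
    return float(np.dot(dn_a, dn_b) / np.dot(dn_a, dn_a))

def chain_gains(path_pairs):
    """Accumulate pairwise gains along the optimal path, so image i can
    be divided by gains[i] to match the reference (first) image."""
    g = 1.0
    gains = [g]
    for dn_a, dn_b in path_pairs:
        g *= pairwise_gain(dn_a, dn_b)
        gains.append(g)
    return gains

# Toy example: each image along the path is ~10% brighter than the last
gains = chain_gains([
    ([100, 200, 300], [110, 220, 330]),
    ([100, 200], [110, 220]),
])
```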


2020 ◽  
Author(s):  
Holger Sihler ◽  
Steffen Beirle ◽  
Christian Borger ◽  
Thomas Wagner

We present results of effective cloud fractions retrieved from measurements of the TROPOspheric Monitoring Instrument (TROPOMI) using the Mainz Iterative Cloud Retrieval Utilities (MICRU) algorithm. Cloud fraction (CF) data are used to study the distribution of clouds in general. Furthermore, CF is a crucial input parameter for retrievals of tropospheric trace gases from satellite measurements in the UV/vis spectral region, because CF errors may even dominate the vertical column density (VCD) retrieval errors of tropospheric trace gases. The MICRU algorithm has been specifically developed to retrieve small cloud fractions (CF < 20%) at high accuracy in order to improve retrievals of tropospheric trace gases. Here, MICRU is applied to TROPOMI data, which offer a more than 100 times higher spatial resolution than GOME-2 (Global Ozone Monitoring Experiment-2), to which it was previously applied. Hence, MICRU CF can be used as an alternative to the operational CF product. The most important feature of MICRU is the derivation of the minimum reflectance map from the measurements themselves. The algorithm builds on the assumption that the surface is dark compared to clouds, and it is therefore limited to regions not permanently covered by clouds, ice, or snow. In particular, the MICRU algorithm applies four parameters to constrain interferences with surface BRDF effects such as sun glitter and shadowing: our approach features a lower-threshold map parameterised by time, viewing zenith angle, scattering angle, and reflection angle. We demonstrate that MICRU, compared to the operational cloud fraction algorithms OCRA and FRESCO, interferes less with viewing angle, sun glitter, and shorelines and hence significantly improves the determination of cloud fractions. Furthermore, CF features made visible by the unprecedented spatial resolution of TROPOMI are discussed.
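The "minimum reflectance map derived from the measurements themselves" can be illustrated with a much-simplified model: take a low percentile of each pixel's reflectance time series as the cloud-free surface background, then map a new measurement onto an effective cloud fraction with a linear mixing model. The percentile choice and the fully-cloudy reflectance `r_cloud` below are illustrative assumptions, not MICRU's actual parameterisation (which additionally depends on time and viewing geometry).

```python
import numpy as np

def minimum_reflectance_map(series, percentile=5.0):
    """Per-pixel lower-envelope reflectance from a measurement time
    series (axis 0 = time). A low percentile instead of the strict
    minimum suppresses noise and outliers."""
    return np.percentile(series, percentile, axis=0)

def effective_cloud_fraction(refl, r_min, r_cloud=0.8):
    """Linear mixing model: CF = 0 at the dark surface background,
    CF = 1 at an assumed fully cloudy reflectance r_cloud."""
    cf = (refl - r_min) / (r_cloud - r_min)
    return np.clip(cf, 0.0, 1.0)

# Toy time series for one pixel: two cloud-free scenes at 0.05
series = np.array([[0.10], [0.05], [0.30], [0.05]])
r_min = minimum_reflectance_map(series, percentile=0.0)
cf = effective_cloud_fraction(np.array([0.425]), r_min)
```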


2014 ◽  
pp. 671-674
Author(s):  
Robert J. Woodham
2011 ◽  
Vol 121-126 ◽  
pp. 887-891
Author(s):  
Bin Xie ◽  
Fan Guo ◽  
Zi Xing Cai

In this paper, we propose a new defog algorithm based on fog veil subtraction to remove fog from a single image. The proposed algorithm first estimates the illumination component of the image by smoothing the degraded image, and then obtains a uniformly distributed fog veil as the mean of the illumination component. Next, we multiply the uniform veil by the original image to obtain a depth-like map, and extract its intensity component to produce a fog veil whose distribution follows the real fog density of the scene. Once the fog veil is calculated, the reflectance map is obtained by subtracting the veil from the degraded image. Finally, we apply adaptive contrast stretching to the reflectance map to obtain an enhanced result. The algorithm extends easily to video and is verified on both real-scene photographs and videos.
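The pipeline above can be sketched end to end for a single grayscale image in [0, 1]. This is a loose approximation under stated assumptions: a box filter stands in for the paper's unspecified smoothing, the intensity-component extraction (done in a colour space in the paper) reduces to the image itself for grayscale, and the contrast stretch is a plain min-max stretch rather than an adaptive one.

```python
import numpy as np

def box_smooth(img, k=15):
    """Box-filter smoothing, used here as the illumination estimate."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def defog(img):
    """Fog-veil-subtraction sketch for one grayscale image in [0, 1]."""
    illumination = box_smooth(img)           # 1. smooth the degraded image
    uniform_veil = illumination.mean()       # 2. uniform fog veil
    depth_like = uniform_veil * img          # 3. depth-like map
    veil = box_smooth(depth_like)            # 4. scene-adaptive fog veil
    reflectance = np.clip(img - veil, 0.0, 1.0)  # 5. subtract the veil
    lo, hi = reflectance.min(), reflectance.max()
    return (reflectance - lo) / (hi - lo + 1e-9)  # 6. contrast stretch

rng = np.random.default_rng(0)
foggy = 0.6 + 0.3 * rng.random((32, 32))  # synthetic bright, low-contrast scene
clear = defog(foggy)
```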


2010 ◽  
Vol 1 (3) ◽  
pp. 464-464
Author(s):  
J. Seyama ◽  
T. Sato

2006 ◽  
Vol 73 (2) ◽  
pp. 123-138 ◽  
Author(s):  
Tianli Yu ◽  
Ning Xu ◽  
Narendra Ahuja
