A Distributed Modular Data Processing Chain Applied to Simulated Satellite Ozone Observations

2021 ◽  
Vol 13 (2) ◽  
pp. 210
Author(s):  
Marco Gai ◽  
Flavio Barbara ◽  
Simone Ceccherini ◽  
Ugo Cortesi ◽  
Samuele Del Bianco ◽  
...  

Remote sensing of the atmospheric composition from current and future satellites, such as the Sentinel missions of the Copernicus programme, yields an unprecedented amount of data to monitor air quality, ozone, UV radiation and other climate variables. Full exploitation of this growing wealth of information delivered by spaceborne observing systems therefore requires addressing the technological challenges of developing new strategies and tools capable of dealing with these huge data volumes. The H2020 AURORA (Advanced Ultraviolet Radiation and Ozone Retrieval for Applications) project investigated a novel approach for the synergistic use of ozone profile measurements acquired at different frequencies (ultraviolet, visible, thermal infrared) by sensors onboard Geostationary Equatorial Orbit (GEO) and Low Earth Orbit (LEO) satellites in the framework of the Copernicus Sentinel-4 and Sentinel-5 missions. This paper outlines the main features of the technological infrastructure designed and developed to support the AURORA data processing chain as a distributed data processing system, and describes in detail the key components of the infrastructure and the software prototype. The latter demonstrates the technical feasibility of the automatic execution of the full processing chain with simulated data. The Data Processing Chain (DPC) presented in this work thus replicates a processing system that, starting from the operational satellite retrievals, carries out their fusion and the assimilation of the fused products. These consist of ozone vertical profiles, from which further modules of the chain deliver tropospheric ozone and UV radiation at the Earth’s surface. The conclusions highlight the relevance of this novel approach to the synergistic use of operational satellite data and underline that the infrastructure uses general-purpose technologies and is open to applications in different contexts.

2020 ◽  
Author(s):  
Nicola Zoppetti ◽  
Simone Ceccherini ◽  
Flavio Barbara ◽  
Samuele Del Bianco ◽  
Marco Gai ◽  
...  

<p>Remote sounding of atmospheric composition makes use of satellite measurements with very heterogeneous characteristics. In particular, the determination of vertical profiles of gases in the atmosphere can be performed using measurements acquired in different spectral bands and with different observation geometries. The most rigorous way to combine heterogeneous measurements of the same quantity into a single Level 2 (L2) product is simultaneous retrieval. The main drawback of simultaneous retrieval is its complexity, due to the necessity of embedding the forward models of the different instruments in the same retrieval application. To overcome this shortcoming, we developed a data fusion method, referred to as Complete Data Fusion (CDF), to provide an efficient and adaptable alternative to simultaneous retrieval. In general, the CDF input is any number of profiles retrieved with the optimal estimation technique, characterized by their a priori information, covariance matrix (CM), and averaging kernel (AK) matrix. The output of the CDF is a single product, also characterized by an a priori, a CM and an AK matrix, which together collect all the available information content. To account for the geo-temporal differences and the different vertical grids of the profiles being fused, a coincidence error and an interpolation error have to be included in the error budget.<br>In the first part of the work, the CDF method is applied to ozone profiles simulated in the thermal infrared and ultraviolet bands, according to the specifications of the Sentinel-4 (geostationary) and Sentinel-5 (low Earth orbit) missions of the Copernicus program. The simulated data have been produced in the context of the Advanced Ultraviolet Radiation and Ozone Retrieval for Applications (AURORA) project, funded by the European Commission in the framework of the Horizon 2020 program.
The use of synthetic data and the assumption of negligible systematic error in the simulated measurements allow studying the behavior of the CDF in ideal conditions. The use of synthetic data also allows evaluating the performance of the algorithm in terms of differences between the products of interest and the reference truth, represented by the atmospheric scenario used in the procedure to simulate the L2 products. This analysis aims at demonstrating the potential benefits of the CDF for the synergy of products measured by different platforms in a realistic near-future scenario, when the Sentinel-4 and Sentinel-5/5P ozone profiles will be available.<br>In the second part of this work, the CDF is applied to a set of real measurements of ozone acquired by GOME-2 onboard the MetOp-B platform. The quality of the CDF products, obtained for the first time from operational products, is compared with that of the original GOME-2 products. This aims to demonstrate the concrete applicability of the CDF to real data and its possible use to generate Level-3 (or higher) gridded products.<br>The results discussed in this presentation offer a first consolidated picture of the actual and potential value of an innovative technique for post-retrieval processing and generation of Level-3 (or higher) products from the atmospheric Sentinel data.</p>
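Under strong simplifying assumptions (a common vertical grid, a shared a priori, negligible coincidence and interpolation errors), the combination of optimal-estimation products described above can be sketched in a few lines of NumPy. The grid size, a priori values, toy diagonal averaging kernels and noise levels below are illustrative stand-ins, not AURORA data, and the function is a schematic reading of the CDF idea rather than the project's operational implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5  # number of vertical grid levels (illustrative)

# Hypothetical shared a priori and "true" profile (arbitrary units)
x_a = np.full(n, 300.0)
S_a = np.eye(n) * 100.0          # a priori covariance matrix (CM)
x_true = x_a + rng.normal(0.0, 5.0, n)

def simulate_retrieval(ak_diag, noise_sd):
    """Toy optimal-estimation L2 product: x = A x_true + (I - A) x_a + noise."""
    A = np.diag(ak_diag)                     # toy averaging kernel (AK) matrix
    Sn = np.eye(n) * noise_sd**2             # retrieval noise CM
    x = A @ x_true + (np.eye(n) - A) @ x_a + rng.normal(0.0, noise_sd, n)
    return x, A, Sn

# Two heterogeneous measurements, e.g. TIR-like (lower levels) and
# UV-like (upper levels) sensitivity
x1, A1, S1 = simulate_retrieval([0.9, 0.8, 0.5, 0.3, 0.1], 2.0)
x2, A2, S2 = simulate_retrieval([0.1, 0.3, 0.5, 0.8, 0.9], 2.0)

def complete_data_fusion(products, x_a, S_a):
    """Fuse [(x, A, Sn), ...] into one profile with its own CM and AK matrix."""
    S_a_inv = np.linalg.inv(S_a)
    M = S_a_inv.copy()                       # accumulated information matrix
    v = S_a_inv @ x_a
    for x, A, Sn in products:
        Sn_inv = np.linalg.inv(Sn)
        z = x - (np.eye(len(x)) - A) @ x_a   # remove the a priori contribution
        M += A.T @ Sn_inv @ A
        v += A.T @ Sn_inv @ z
    S_f = np.linalg.inv(M)                   # fused CM
    x_f = S_f @ v                            # fused profile
    A_f = S_f @ (M - S_a_inv)                # fused AK matrix
    return x_f, S_f, A_f

x_f, S_f, A_f = complete_data_fusion([(x1, A1, S1), (x2, A2, S2)], x_a, S_a)
```

Because the information matrices of the two products add, the fused covariance is never larger than the a priori covariance, which is one way of seeing that the fused product collects the information content of both measurements.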


2016 ◽  
Author(s):  
Thierry Leblanc ◽  
Robert J. Sica ◽  
J. Anne E. van Gijsel ◽  
Alexander Haefele ◽  
Guillaume Payen ◽  
...  

Abstract. A standardized approach for the definition, propagation and reporting of uncertainty in the temperature lidar data products contributing to the Network for the Detection of Atmospheric Composition Change (NDACC) database is proposed. One important aspect of the proposed approach is the ability to propagate all independent uncertainty components in parallel through the data processing chain. The individual uncertainty components are then combined at the very last stage of processing to form the temperature combined standard uncertainty. The identified individual uncertainty components comprise signal detection uncertainty, uncertainty due to saturation correction, background noise extraction, the merging of multiple channels, the absorption cross-sections of ozone and NO2, the molecular extinction cross-sections, the a priori use of ancillary air, ozone, and NO2 number density, the a priori use of ancillary temperature to tie on the top of the profile, the acceleration of gravity, and the molecular mass of air. The expression of the individual uncertainty components and their step-by-step propagation through the temperature data processing chain are thoroughly described. All sources of uncertainty except detection noise imply correlated terms in the vertical dimension, which means that covariance terms must be taken into account when vertical filtering is applied and when temperature is integrated from the top of the profile. Quantitatively, the uncertainty budget is presented in a generic form (i.e., as a function of instrument performance and wavelength), so that any NDACC temperature lidar investigator can easily estimate the expected impact of individual uncertainty components for their own instrument. An example of a full uncertainty budget obtained from actual measurements by the JPL lidar at the Mauna Loa Observatory is also provided.
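The parallel-propagation principle, and the role of vertical covariance terms when filtering, can be illustrated schematically (this is a made-up sketch, not the paper's algorithm or numbers): an uncorrelated detection-noise component and a fully correlated systematic component pass through the same vertical smoothing filter but propagate differently, and are combined only at the last stage. Profile length, filter weights and uncertainty magnitudes below are arbitrary.

```python
import numpy as np

# 3-point boxcar smoothing filter (coefficients sum to 1)
w = np.array([0.25, 0.5, 0.25])

def filter_uncorrelated(u):
    """Detection noise: uncorrelated in altitude, so variances propagate
    through the filter (covariance terms vanish)."""
    return np.sqrt(np.convolve(u**2, w**2, mode="same"))

def filter_fully_correlated(u):
    """Systematic components (e.g. cross-section uncertainty): fully
    correlated in altitude, so the covariance terms make the filtered
    uncertainty equal to the filtered uncertainty profile itself."""
    return np.convolve(u, w, mode="same")

u_noise = np.full(10, 1.0)   # hypothetical detection-noise uncertainty [K]
u_xsec = np.full(10, 0.5)    # hypothetical cross-section uncertainty [K]

# Propagate each component in parallel through the chain...
u_noise_f = filter_uncorrelated(u_noise)
u_xsec_f = filter_fully_correlated(u_xsec)

# ...and combine only at the very last stage (root-sum-square of
# independent components -> combined standard uncertainty)
u_combined = np.sqrt(u_noise_f**2 + u_xsec_f**2)
```

Note how smoothing reduces the uncorrelated component but leaves a constant fully correlated component unchanged in the profile interior, which is exactly why treating all components as uncorrelated would understate the filtered uncertainty.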


Author(s):  
K. Pramod Kumar ◽  
P. Mahendra ◽  
V. Ramakrishna Reddy ◽  
T. Tirupathi ◽  
A. Akilan ◽  
...  

In the last decade, the remote sensing community has observed a significant growth in the number of satellites, sensors and their resolutions, thereby increasing the volume of data to be processed each day. Satellite data processing is a complex and time-consuming activity. It consists of various tasks, such as decoding, decryption, decompression, radiometric normalization, stagger corrections, ephemeris data processing for geometric corrections, etc., and finally the writing of the product in the form of an image file. Each task in the processing chain is sequential in nature and has different computing needs. Conventionally, the processes are cascaded in a well-organized workflow to produce the data products, which is executed on general-purpose high-end servers / workstations in an offline mode. Hence, these systems are considered ineffective for real-time applications that require quick response and just-in-time decision making, such as disaster management, homeland security and so on. <br><br> This paper discusses a novel approach to process the data online (as the data is being acquired) using a heterogeneous computing platform, namely XSTREAM, which has COTS hardware of CPUs, GPUs and an FPGA. The paper focuses on the process architecture, re-engineering aspects and the mapping of tasks to the right computing device within the XSTREAM system, which make it an ideal cost-effective platform for acquiring and processing satellite payload data in real time and displaying the products at original resolution for quick response. The system has been tested for the IRS CARTOSAT and RESOURCESAT series of satellites, which have a maximum data downlink speed of 210 Mbps.
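The contrast between offline cascading and online processing can be mimicked with a thread-per-stage pipeline in which frames flow through queues as they arrive, so each stage works on frame N while the previous stage already ingests frame N+1. This is a purely illustrative stdlib sketch; the stage names and the string-tagging "work" are placeholders, far removed from the actual XSTREAM CPU/GPU/FPGA task mapping.

```python
import queue
import threading

def stage(name, fn, q_in, q_out):
    """Run one processing stage in its own thread: pull a frame from q_in,
    apply fn, push the result to q_out. A None sentinel shuts the stage
    down and is forwarded so downstream stages stop too."""
    def run():
        while True:
            item = q_in.get()
            if item is None:
                q_out.put(None)
                break
            q_out.put(fn(item))
    t = threading.Thread(target=run, name=name)
    t.start()
    return t

q0, q1, q2 = queue.Queue(), queue.Queue(), queue.Queue()
threads = [
    stage("decode", lambda f: f + "|decoded", q0, q1),
    stage("normalize", lambda f: f + "|normalized", q1, q2),
]

# Frames are fed in as they are "acquired"; processing overlaps acquisition.
for frame in ("frame0", "frame1", "frame2"):
    q0.put(frame)
q0.put(None)  # end of downlink pass

results = []
while (item := q2.get()) is not None:
    results.append(item)
for t in threads:
    t.join()
# results: ["frame0|decoded|normalized", "frame1|decoded|normalized", ...]
```

FIFO queues preserve frame order through the chain, and because each stage has different computing needs, this shape is also where a heterogeneous platform can bind each stage to the device that suits it best.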


1969 ◽  
Vol 08 (04) ◽  
pp. 192-197 ◽  
Author(s):  
R. D. Yoder

General Purpose Information Processing Systems provide the capabilities of filing and retrieving discrete data. These capabilities are independent of the source or nature of the data, and of the format and content of the reports to be generated from it. MEDATA is an example of such a system. The concepts of the MEDATA system were implemented at UCSD in PL/1, for processing on a time-shared interactive utility computing service. PL/1 proved to be a desirable programming language for this application. The service provided by the time-shared utility has been satisfactory, and the concept of performing this type of data processing on such a utility appears to be sound. The system is now operational on a service basis and is being used for a variety of purposes. In addition to providing service, it will serve as a model for teaching and as a basis for further research into information processing systems.


Atmosphere ◽  
2018 ◽  
Vol 9 (11) ◽  
pp. 454 ◽  
Author(s):  
Ugo Cortesi ◽  
Simone Ceccherini ◽  
Samuele Del Bianco ◽  
Marco Gai ◽  
Cecilia Tirelli ◽  
...  

With the launch of the Sentinel-5 Precursor (S-5P, lifted off on 13 October 2017), Sentinel-4 (S-4) and Sentinel-5 (S-5) (from 2021 and 2023 onwards, respectively) operational missions of the ESA/EU Copernicus program, a massive amount of atmospheric composition data with unprecedented quality will become available from geostationary (GEO) and low Earth orbit (LEO) observations. Enhanced observational capabilities are expected to foster deeper insight than ever before into key issues relevant for air quality, stratospheric ozone, solar radiation, and climate. A major potential strength of the Sentinel observations lies in the exploitation of the complementary information that originates from simultaneous and independent satellite measurements of the same air mass. The core purpose of the AURORA (Advanced Ultraviolet Radiation and Ozone Retrieval for Applications) project is to investigate this exploitation through a novel approach for merging data acquired in different spectral regions on board the GEO and LEO platforms. A data processing chain is implemented and tested on synthetic observations. A new data fusion algorithm combines the ultraviolet, visible and thermal infrared ozone products into S-4 and S-5(P) fused profiles. These fused products are then ingested into state-of-the-art data assimilation systems to obtain a unique ozone profile in analysis and forecast modes. A comparative evaluation and validation of the assimilation of fused products versus the assimilation of the operational products will seek to demonstrate the improvements achieved by the proposed approach. This contribution provides a first general overview of the project, and discusses both the challenges of developing a technological infrastructure for implementing the AURORA concept and the potential for applications of AURORA-derived products, such as tropospheric ozone and UV surface radiation, in sectors such as air quality monitoring and health.


1974 ◽  
Vol 13 (03) ◽  
pp. 125-140 ◽  
Author(s):  
Ch. Mellner ◽  
H. Selander ◽  
J. Wolodarski

The paper gives a report on the Karolinska Hospital Information System in three parts. In part I, the information problems in health care delivery are discussed, and the approach to systems design at the Karolinska Hospital is reported and contrasted with the traditional approach. In part II, the data base and the data processing system, named T1-J5, are described. In part III, the applications of the data base and the data processing system are illustrated by a broad description of the contents and use of the patient data base at the Karolinska Hospital.


2014 ◽  
Vol 28 (2) ◽  
pp. 337-360
Author(s):  
Inyong Shin ◽  
Hyunho Kim

2010 ◽  
Vol 24 (6) ◽  
pp. 569-573
Author(s):  
Changhai Zhao ◽  
Qiuhua Wan ◽  
Shujie Wang ◽  
Xinran Lu
