The Euclid Data Processing Challenges

2016 ◽  
Vol 12 (S325) ◽  
pp. 73-82 ◽  
Author(s):  
Pierre Dubath ◽  
Nikolaos Apostolakos ◽  
Andrea Bonchi ◽  
Andrey Belikov ◽  
Massimo Brescia ◽  
...  

Abstract. Euclid is a Europe-led cosmology space mission dedicated to a visible and near-infrared survey of the entire extragalactic sky. Its purpose is to deepen our knowledge of the dark content of our Universe. After an overview of the Euclid mission and science, this contribution describes how the community is getting organized to face the data analysis challenges, both in software development and in operational data processing. It ends with a more specific account of some of the main contributions of the Swiss Science Data Center (SDC-CH).

2016 ◽  
Vol 12 (S325) ◽  
pp. 253-258
Author(s):  
R. A. Street

Abstract. Despite a flood of discoveries over the last ~ 20 years, our knowledge of the exoplanet population is incomplete owing to a gap between the sensitivities of different detection techniques. However, a census of exoplanets at all separations from their host stars is essential to fully understand planet formation mechanisms. Microlensing offers an effective way to bridge the gap around 1–10 AU and is therefore one of the major science goals of the Wide Field Infrared Survey Telescope (WFIRST) mission. WFIRST’s survey of the Galactic Bulge is expected to discover ~ 20,000 microlensing events, including ~ 3000 planets, which represents a substantial data analysis challenge with the modeling software currently available. This paper highlights areas where further work is needed. The community is encouraged to join new software development efforts aimed at making the modeling of microlensing events both more accessible and rigorous.
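The modeling challenge described above starts from the point-source point-lens (Paczyński) light curve that every microlensing code must evaluate; a minimal Python sketch (parameter names are conventional and not tied to any particular package):

```python
import math

def magnification(u):
    """Point-source point-lens (Paczynski) magnification for impact parameter u,
    in units of the Einstein radius: A(u) = (u^2 + 2) / (u * sqrt(u^2 + 4))."""
    return (u * u + 2.0) / (u * math.sqrt(u * u + 4.0))

def light_curve(t, t0, tE, u0):
    """Magnification at time t for an event peaking at t0, with Einstein
    crossing time tE and minimum impact parameter u0."""
    u = math.sqrt(u0 * u0 + ((t - t0) / tE) ** 2)
    return magnification(u)

# Peak magnification for u0 = 0.1 is about 10.04; far from the peak, A -> 1.
print(round(light_curve(0.0, 0.0, 25.0, 0.1), 2))  # → 10.04
```

Real event modeling adds blending, parallax, and binary-lens effects on top of this kernel, which is where the computational cost arises.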


2018 ◽  
Vol 62 ◽  
pp. 02002
Author(s):  
Yuryi Polozov ◽  
Nadezhda Fetisova

Algorithms for ionospheric data processing are presented in the paper. The algorithms run in real-time mode for ionospheric parameter analysis and are a component of the “Aurora” software system for geophysical data analysis. They allow us to estimate the state of the ionosphere in the region of the Kamchatka Peninsula and to detect ionospheric anomalies. Assessment of the algorithms’ efficiency has shown that they can be used to detect ionospheric anomalies that may occur on the eve of magnetic storms. The research is supported by the Russian Science Foundation Grant (Project No. 14-11-00194).
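The paper does not spell out the algorithms themselves; purely as an illustration of the general idea behind such detectors — flagging samples that deviate from a trailing baseline by more than k standard deviations — here is a minimal sketch with invented window and threshold values:

```python
import statistics

def detect_anomalies(values, window=24, k=3.0):
    """Flag indices whose value deviates from the trailing-window mean
    by more than k standard deviations (a generic threshold detector,
    not the Aurora system's actual algorithm)."""
    anomalies = []
    for i in range(window, len(values)):
        past = values[i - window:i]
        mean = statistics.fmean(past)
        sd = statistics.pstdev(past)
        if sd > 0 and abs(values[i] - mean) > k * sd:
            anomalies.append(i)
    return anomalies

# A gently varying series with one spike: only the spike is flagged.
series = [10.0 + 0.1 * (i % 2) for i in range(30)]
series[27] = 25.0
print(detect_anomalies(series))  # → [27]
```

A real-time variant would update the window incrementally as each new sample arrives rather than recomputing the statistics from scratch.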


2019 ◽  
Author(s):  
Michel M. Verstraete ◽  
Linda A. Hunt ◽  
Hugo De Lemos ◽  
Larry Di Girolamo

Abstract. The Multi-angle Imaging SpectroRadiometer (MISR) is one of the five instruments hosted on-board the NASA Terra platform, launched on 18 December 1999. This instrument has been operational since 24 February 2000 and is still acquiring Earth Observation data as of this writing. The primary missions of MISR are to document the state and properties of the atmosphere, and in particular the clouds and aerosols it contains, as well as the planetary surface, on the basis of 36 data channels gathered by each of its nine cameras (pointing in different directions along the orbital track) in four spectral bands (blue, green, red and near-infrared). The Radiometric Camera-by-Camera Cloud Mask (RCCM) is derived from the calibrated measurements at the nominal top of the atmosphere, and is provided separately for each of the nine cameras. This RCCM data product is permanently archived at the NASA Atmospheric Science Data Center (ASDC) in Langley, VA, USA and is openly accessible (Diner et al., 1999 and https://doi.org/10.5067/Terra/MISR/MIRCCM_L2.004). For various technical reasons described in this paper, this RCCM product exhibits missing data, even though an estimate of the clear or cloudy status of the environment at each individual observed location can be deduced from the available measurements. The aims of this paper are (1) to describe how to replace most missing values by estimates and (2) to briefly describe the software to process MISR RCCM data products, which is openly available to the community from the GitHub web site (https://github.com/mmverstraete or https://doi.org/10.5281/zenodo.3240018). Limited amounts of updated MISR RCCM data products are also archived in South Africa and can be made available upon request.
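The paper describes its own replacement scheme in detail; purely as an illustrative sketch of the general neighbourhood-based idea behind filling gaps in a categorical cloud mask (the value convention below is invented, not MISR's), missing pixels can be assigned the majority vote of their valid neighbours:

```python
from collections import Counter

def fill_missing(mask, missing=0):
    """Replace missing cloud-mask values with the most common valid value
    among the 8 surrounding pixels (single-pass majority vote).
    Hypothetical convention: 0 = missing, 1 = cloudy, 2 = clear."""
    rows, cols = len(mask), len(mask[0])
    out = [row[:] for row in mask]
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] != missing:
                continue
            neighbours = [
                mask[rr][cc]
                for rr in range(max(0, r - 1), min(rows, r + 2))
                for cc in range(max(0, c - 1), min(cols, c + 2))
                if (rr, cc) != (r, c) and mask[rr][cc] != missing
            ]
            if neighbours:  # leave the pixel missing if no valid neighbour exists
                out[r][c] = Counter(neighbours).most_common(1)[0][0]
    return out

grid = [
    [1, 1, 2],
    [1, 0, 2],
    [1, 1, 2],
]
print(fill_missing(grid))  # centre pixel becomes 1 (five cloudy vs three clear neighbours)
```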


2020 ◽  
Author(s):  
Nick Cox ◽  
Jeronimo Bernard-Salas ◽  
Stephane Ferron ◽  
Jean-Luc Vergely ◽  
Laurent Blanot ◽  
...  

In the era of big data and cloud storage and computing, new ways for scientists to approach their research are emerging, which directly impact how science progresses and discoveries are made. This development has led the European Space Agency (ESA) to establish a reference framework for space mission operation and exploitation by scientific communities: the ESA Datalabs (EDL). The guiding principle of the EDL concept is to move the user to the data and tools, and to enable users to publish applications (e.g. processors, codes, pipelines, analysis and visualisation tools) within a trusted environment, close to the scientific data, permitting the whole scientific community to discover new science products in an open and FAIR approach.

In this context we will present a prototype science application (aka Sci-App) for the exploration and visualisation of Mars and Venus using the SPICAM/V Level-2 data available from the ESA Planetary Science Archive (PSA). This demonstrator facilitates the extraction and compilation of scientific data from the PSA and eases their integration with other tools through VO interoperability, thus increasing their scientific impact. The tool’s key modular functionalities are (1) interactive data query and retrieval (i.e. search of archive metadata), (2) interactive visualisation (i.e. geospatial info of query results, display of spectra and atmospheric vertical profiles), (3) data manipulation (i.e. creation of local maps or data cubes), and (4) data analysis (in combination with other connected VO tools). The application allows users to select, visualise and analyse both Level 2A products, which consist of e.g. transmission and radiance spectra, and Level 2B products, which consist of retrieved physical parameters, such as atmospheric aerosol properties and vertical density profiles for (trace) gases in the Martian or Venusian atmosphere.

Our goal is to deploy the (containerised) Sci-App to the EDL and similar initiatives for uptake by the space science community. In the future, we expect to incorporate access to other Mars/Venus atmospheric data sets, particularly the measurements obtained with the NOMAD and ACS instruments on the ExoMars Trace Gas Orbiter. The community can also use this application as a starting point for their own tool development for other data products/missions.
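The interactive query-and-retrieval step can be illustrated with a minimal ADQL builder in the EPN-TAP style used by VO planetary archives such as the PSA; the table and column names below follow the EPN-TAP convention but are assumptions here, and should be checked against the live service before use:

```python
def build_epntap_query(target, instrument, level, limit=100):
    """Build an ADQL query string against an EPN-TAP table.
    Table and column names (epn_core, granule_uid, target_name, ...)
    follow the EPN-TAP convention; verify them against the actual service."""
    return (
        f"SELECT TOP {limit} granule_uid, time_min, time_max, access_url "
        f"FROM epn_core "
        f"WHERE target_name = '{target}' "
        f"AND instrument_name = '{instrument}' "
        f"AND processing_level = {level}"
    )

query = build_epntap_query("Mars", "SPICAM", 2)
print(query)
```

Such a string would typically be submitted to the archive's TAP endpoint (e.g. with a VO client library), and the returned `access_url` values used to fetch the Level-2 granules for visualisation.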


2020 ◽  
pp. 3-8
Author(s):  
Jala Aghazada

A data warehouse (DW) is the basis of systems for operational data analysis (OLAP, Online Analytical Processing). Data extracted from different sources is transformed and loaded into the DW. Proper organization of this process, called ETL (Extract, Transform, Load), is of key significance in the creation of a DW and in analytical data processing. Forms of organization, methods of realization, and modeling of ETL processes are considered in this paper.
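The three ETL stages named above can be sketched in miniature (the sources, field names, and in-memory "warehouse" are all invented for illustration):

```python
def extract(sources):
    """Extract: pull raw records from each source (here, in-memory lists)."""
    for source in sources:
        yield from source

def transform(record):
    """Transform: normalise field names and types before loading."""
    return {"id": int(record["id"]), "name": record["name"].strip().title()}

def load(records, warehouse):
    """Load: append transformed records to the warehouse table."""
    warehouse.extend(records)

crm = [{"id": "1", "name": " alice "}]
billing = [{"id": "2", "name": "BOB"}]
warehouse = []
load((transform(r) for r in extract([crm, billing])), warehouse)
print(warehouse)  # → [{'id': 1, 'name': 'Alice'}, {'id': 2, 'name': 'Bob'}]
```

Production ETL tools add scheduling, incremental loads, and error handling around this same extract-transform-load skeleton.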


2019 ◽  
Vol 631 ◽  
pp. A116 ◽  
Author(s):  
P. Giommi ◽  
C. H. Brandt ◽  
U. Barres de Almeida ◽  
A. M. T. Pollock ◽  
F. Arneodo ◽  
...  

Aims. Open Universe for Blazars is a set of high-transparency multi-frequency data products for blazar science, and the tools designed to generate them. Blazars are drawing growing interest following the consolidation of their position as the most abundant type of source in the extragalactic very high-energy γ-ray sky, and because of their status as prime candidate sources in the nascent field of multi-messenger astrophysics. As such, blazar astrophysics is becoming increasingly data driven, depending on the integration and combined analysis of large quantities of data from the entire span of observational astrophysics techniques. The project was therefore chosen as one of the pilot activities within the United Nations Open Universe Initiative, whose objective is to stimulate a large increase in the accessibility and ease of utilisation of space science data for the worldwide benefit of scientific research, education, capacity building, and citizen science. Methods. Our aim is to deliver innovative data science tools for multi-messenger astrophysics. In this work we report on a data analysis pipeline called Swift-DeepSky based on the Swift XRTDAS software and the XIMAGE package, encapsulated into a Docker container. Swift-DeepSky downloads and reads low-level data, generates higher level products, detects X-ray sources, and estimates several intensity and spectral parameters for each detection, thus facilitating the generation of complete and up-to-date science-ready catalogues from an entire space-mission data set. Results. As a first application of our innovative approach, we present the results of a detailed X-ray image analysis based on Swift-DeepSky that was run on all Swift-XRT observations including a known blazar, carried out during the first 14 years of operations of the Neil Gehrels Swift Observatory. 
Short exposures executed within one week of each other have been added to increase sensitivity, which ranges between ∼1 × 10⁻¹² and ∼1 × 10⁻¹⁴ erg cm⁻² s⁻¹ (0.3–10.0 keV). After cleaning for problematic fields, the resulting database includes over 27 000 images integrated in different X-ray bands, and a catalogue, called 1OUSXB, that provides intensity and spectral information for 33 396 X-ray sources, 8896 of which are single or multiple detections of 2308 distinct blazars. All the results can be accessed online in a variety of ways, from the Open Universe portal through Virtual Observatory services, via the VOU-Blazar tool and the SSDC SED builder. One of the most innovative aspects of this work is that the results can be easily reproduced and extended by anyone using the Docker version of the Swift-DeepSky pipeline, which runs on Linux, Mac, and Windows machines, and does not require any specific experience in X-ray data analysis.


Author(s):  
Hong-Bo Jin ◽  
Peng Xu ◽  

TAIJI-1 is a micro-gravity experiment spacecraft. The mission target is to verify key techniques of the spacecraft payloads for gravitational wave detection, including the laser interferometer, the gravity reference sensor, drag-free control technology, the micro-propulsion system, the super-quiescent spacecraft platform, etc. The data processing pipeline required by the next stage of the TAIJI Program is also verified. To benefit from a future joint observation campaign between TAIJI and LISA, the science operations follow the existing ESA and NASA standard models, which include a Mission Operations Center (MOC), a Science Operations Center (SOC), and a Data Processing Center (DPC). The data processing pipeline connects the SOC and the DPC: the SOC obtains the level-0 data from the MOC, while the DPC performs the data processing and distributes the level-2 and level-3 data to the SOC. For the TAIJI-1 mission, the SOC and DPC are two subsystems of what is called the science application system (SAS), one of the six functional systems that operate a Chinese space mission. The MOC corresponds to the ground support system (GSS) and the spacecraft control system, which are also among the six functional systems of a Chinese space mission. The on-orbit experiment plans are transferred from the SAS to the GSS, analogous to the flow from SOC to MOC in the NASA standard model. The computer infrastructure and software, the basic elements of the SAS, were completed before TAIJI-1 was launched. After TAIJI-1 entered orbit, the data processing pipeline began operating and the experimental items of TAIJI-1 were carried out in the pipeline. The basic functions, performance, and optimization functions of the detection devices in the payloads were fully verified within three months of launch. At the same time, the methods of data analysis and processing were also verified.
As a result, the required indicators of the key spacecraft techniques for gravitational wave detection were confirmed, and the data processing pipeline was shown to be sound. The relevant codes for data analysis and processing will benefit the next stage of the TAIJI Program.


2013 ◽  
Vol 53 (A) ◽  
pp. 641-645 ◽  
Author(s):  
Carlotta Pittori

We present an overview of the main AGILE Data Center activities and the AGILE scientific highlights during the first 5 years of operations. AGILE is an ASI space mission in joint collaboration with INAF, INFN and CIFS, dedicated to the observation of the gamma-ray Universe. The AGILE satellite was launched on April 23rd, 2007, and is devoted to gamma-ray astrophysics in the 30 MeV–50 GeV energy range, with simultaneous X-ray imaging capability in the 18–60 keV band. Despite its small size and budget, AGILE has produced several important scientific results, including the unexpected discovery of strong and rapid gamma-ray flares from the Crab Nebula over daily timescales. This discovery won the AGILE PI and the AGILE Team the prestigious Bruno Rossi Prize for 2012, an international award in the field of high-energy astrophysics. Thanks to its sky monitoring capability and fast ground segment alert system, AGILE is substantially improving our knowledge of the gamma-ray sky, also making a crucial contribution to the study of the terrestrial gamma-ray flashes (TGFs) detected in the Earth's atmosphere. The AGILE Data Center, part of the ASI Science Data Center (ASDC) located in Frascati, Italy, is in charge of all the science-oriented activities related to the analysis, archiving and distribution of AGILE data.

