The research unit NEROGRAV: first results on stochastic modeling for gravity field determination with real GRACE and GRACE-FO data

Author(s):  
Michael Murböck ◽  
Panafidina Natalia ◽  
Dahle Christoph ◽  
Neumayer Karl-Hans ◽  
Flechtner Frank ◽  
...  

The central hypothesis of the Research Unit (RU) NEROGRAV (New Refined Observations of Climate Change from Spaceborne Gravity Missions), funded for three years by the German Research Foundation (DFG), reads: only by concurrently improving, and better understanding, sensor data, background models, and processing strategies of satellite gravimetry can the resolution, accuracy, and long-term consistency of mass transport series from satellite gravimetry be significantly increased; and only in that case can the potential of future technological sensor developments be fully exploited. Two of the individual projects (IPs) within the RU work on stochastic modeling for GRACE and GRACE-FO gravity field determination. TU München and TU Berlin are responsible for IP4 (OSTPAG: optimized space-time parameterization for GRACE and GRACE-FO data analysis), where, besides optimal parameterization, the focus is on the stochastic modeling of the key observations, i.e. GRACE and GRACE-FO inter-satellite ranging and accelerometer observations, in a simulation (TU München) and real-data (TU Berlin) environment. IP5 (ISTORE: improved stochastic modeling in GRACE/GRACE-FO real data processing), for which GFZ is responsible, works on the optimal utilization of the stochastic properties of the main GRACE and GRACE-FO observation types and the main background models.

This presentation gives first insights into the TU Berlin and GFZ results of these two IPs, both of which concern stochastic modeling for real data processing based on GFZ GRACE and GRACE-FO RL06 processing. We present analyses of K-band inter-satellite range observations and the corresponding residuals of three test years of GRACE and GRACE-FO real data in the time and frequency domain. Based on the residual analysis, we show the effects of different filter matrices, which take the stochastic properties of the range observations into account in order to decorrelate them. The stochastic modeling of the background models starts with Monte-Carlo simulations of background model errors of atmospheric and oceanic mass variations. Different representations of variance-covariance matrices of this model information are tested as input for real GRACE data processing, and their effect on gravity field determination is analyzed.
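To illustrate what such a decorrelating filter matrix does, the following minimal sketch (Python/NumPy, with a synthetic toy problem rather than actual K-band range data) whitens correlated observations with the Cholesky factor of an assumed noise covariance before solving the least-squares problem; this is equivalent to generalized least squares with that covariance.

```python
import numpy as np

# Minimal sketch (not the authors' processing code): decorrelating ranging
# observations with a filter matrix derived from an assumed noise covariance.
# The exponentially correlated covariance below is a placeholder for one
# estimated from real K-band range residuals (e.g. via their PSD).

rng = np.random.default_rng(42)
n_obs, n_par = 500, 5

# Placeholder noise covariance: correlation decaying with lag
lag = np.abs(np.subtract.outer(np.arange(n_obs), np.arange(n_obs)))
cov = 0.8 ** lag

# Filter (whitening) matrix W such that W @ cov @ W.T = I
L = np.linalg.cholesky(cov)
W = np.linalg.inv(L)

# Toy linear observation model y = A x + correlated noise
A = rng.standard_normal((n_obs, n_par))
x_true = rng.standard_normal(n_par)
noise = L @ rng.standard_normal(n_obs)
y = A @ x_true + noise

# Decorrelated least squares: apply the filter to both sides, then solve
x_hat, *_ = np.linalg.lstsq(W @ A, W @ y, rcond=None)
print(np.abs(x_hat - x_true).max())
```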

2020 ◽  
Author(s):  
Natalia Panafidina ◽  
Michael Murböck ◽  
Christoph Dahle ◽  
Karl Hans Neumayer ◽  
Frank Flechtner ◽  
...  

The central hypothesis of the Research Unit (RU) NEROGRAV reads: only by concurrently improving, and better understanding, sensor data, background models, and processing strategies of satellite gravimetry can the resolution, accuracy, and long-term consistency of mass transport series from satellite gravimetry be significantly increased; and only in that case can the potential of future technological sensor developments be fully exploited. Two of the individual projects (IPs) within the RU work on stochastic modeling for GRACE and GRACE-FO gravity field determination. TU München and TU Berlin are responsible for IP4 (OSTPAG: optimized space-time parameterization for GRACE and GRACE-FO data analysis), where, besides optimal parameterization, the focus is on the stochastic modeling of the key observations, i.e. GRACE and GRACE-FO inter-satellite ranging and accelerometer observations, in a simulation (TU München) and real-data (TU Berlin) environment. IP5 (ISTORE: improved stochastic modeling in GRACE/GRACE-FO real data processing), for which GFZ is responsible, works on the optimal utilization of the stochastic properties of the main GRACE and GRACE-FO observation types and the main background models.

This presentation gives first insights into the TU Berlin and GFZ results of these two IPs, both of which concern stochastic modeling for real data processing based on GFZ GRACE and GRACE-FO RL06 processing. We present analyses of ranging observations and the corresponding residuals of three test years of GRACE and GRACE-FO real data in the time and frequency domain. Based on the residual analysis, we show the effects of different filter matrices, which take the stochastic properties of the ranging observations into account in order to decorrelate them. The stochastic modeling of the background models in IP5 starts with Monte-Carlo simulations of background model errors of atmospheric and oceanic mass variations. Different representations of variance-covariance matrices of this model information are tested as input for real GRACE data processing, and their effect on gravity field determination is analyzed.
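The Monte-Carlo step can be pictured with the following hedged sketch: an ensemble of background-model error realizations, here purely synthetic, is turned into an empirical variance-covariance matrix of spherical-harmonic coefficients; in the real analysis the ensemble would come from differences between atmospheric and oceanic de-aliasing model variants.

```python
import numpy as np

# Minimal sketch (assumptions, not the IP5 code): build an empirical
# variance-covariance matrix (VCM) of background-model errors from a
# Monte-Carlo ensemble of error realizations. Here the ensemble is random;
# in practice each realization would be a vector of spherical-harmonic
# coefficient errors of the atmospheric/oceanic background model.

rng = np.random.default_rng(0)
n_coeff = 100      # number of spherical-harmonic coefficients considered
n_samples = 2000   # number of Monte-Carlo realizations

# Hypothetical ensemble of coefficient-error vectors (n_samples x n_coeff)
mixing = rng.standard_normal((n_coeff, n_coeff))
ensemble = rng.standard_normal((n_samples, n_coeff)) @ mixing * 1e-12

# Empirical VCM of the background-model errors
mean_err = ensemble.mean(axis=0)
vcm = (ensemble - mean_err).T @ (ensemble - mean_err) / (n_samples - 1)

# A simpler representation that could also be tested as a weight matrix:
diag_only = np.diag(np.diag(vcm))   # variances only, correlations ignored
print(vcm.shape, np.allclose(vcm, vcm.T))
```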


2021 ◽  
Author(s):  
Natalia Panafidina ◽  
Rolf Koenig ◽  
Karl Neumayer ◽  
Christoph Dahle ◽  
Frank Flechtner

In GRACE data processing the geophysical background models, which are needed to compute the monthly gravity field solutions, usually enter as error-free. This means that model errors could influence and distort the gravity field solution.

The geophysical models which influence the solution the most are the atmosphere and ocean de-aliasing product (AOD1B) and the ocean tide model. In this presentation we focus on the ocean tide model and on incorporating its stochastic information in data processing.

We use the FES2014 ocean tide model, represented as a spherical harmonic expansion up to degree and order 180. The information about its uncertainties and the correlations between different spherical harmonics is provided by the research unit NEROGRAV (New Refined Observations of Climate Change from Spaceborne Gravity Missions). In a first step, the stochastic properties of the tide model are considered to be static and are expressed as variance-covariance matrices (VCMs) of the spherical harmonics of the 8 main tidal waves up to degree and order 30. This stochastic information is incorporated by setting up the respective ocean tide harmonics as parameters to be solved for. Since ocean tides cannot be freely estimated within monthly GRACE solutions, the provided VCMs for the 8 tidal waves are used to constrain the tidal parameters.

This procedure was used to compute monthly gravity field solutions for the year 2007. For comparison, we also computed monthly gravity fields without taking the stochastic information on ocean tides into account. In this contribution we present and discuss the first results of this comparison.
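A minimal sketch of the constraining step, assuming a toy linear model and a hypothetical prior VCM (this is not GFZ's processing software): the co-estimated tidal parameters receive the inverse VCM as a constraint block in the normal equations, so they can only deviate from the background tide model within the uncertainty the VCM allows.

```python
import numpy as np

# Illustrative sketch: constrain co-estimated ocean-tide parameters with a
# prior variance-covariance matrix (VCM). Only the tidal block of the normal
# equations receives the inverse VCM, pulling the tidal corrections toward
# zero within the prior uncertainty.

rng = np.random.default_rng(1)
n_obs, n_grav, n_tide = 400, 20, 10

# Toy design matrix: gravity-field parameters plus tidal correction parameters
A = rng.standard_normal((n_obs, n_grav + n_tide))
y = rng.standard_normal(n_obs)

# Hypothetical prior VCM for the tidal parameters (symmetric positive definite)
B = rng.standard_normal((n_tide, n_tide))
vcm_tide = B @ B.T * 1e-2

# Normal equations with the constraint added to the tidal block only
N = A.T @ A
b = A.T @ y
N[n_grav:, n_grav:] += np.linalg.inv(vcm_tide)

x_hat = np.linalg.solve(N, b)
grav_params, tide_params = x_hat[:n_grav], x_hat[n_grav:]
print(tide_params)
```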


Author(s):  
Frank Flechtner ◽  
Christoph Reigber ◽  
Reiner Rummel ◽  
Georges Balmino

Abstract: Since Kepler, Newton and Huygens in the seventeenth century, geodesy has been concerned with determining the figure, orientation and gravitational field of the Earth. With the beginning of the space age in 1957, a new branch of geodesy was created: satellite geodesy. Only with satellites did geodesy become truly global. Oceans were no longer obstacles, and the Earth as a whole could be observed and measured in consistent series of measurements. Of particular interest is the determination of the spatial structures and, finally, the temporal changes of the Earth's gravitational field. Knowledge of the gravitational field represents the natural bridge to the study of the physics of the Earth's interior, the circulation of our oceans and, more recently, the climate. Today, key findings on climate change are derived from the temporal changes in the gravitational field: on ice mass loss in Greenland and Antarctica, on sea level rise and generally on changes in the global water cycle. This has only become possible with dedicated gravity satellite missions, which opened up a method known as satellite gravimetry. In the first forty years of the space age, satellite gravimetry was based on the analysis of the orbital motion of satellites. Due to the uneven distribution of observatories over the globe, the initially inaccurate measuring methods and the inadequacies of the evaluation models, the reconstruction of global models of the Earth's gravitational field was a great challenge. The transition from passive satellites for gravity field determination to satellites equipped with special sensor technology, initiated in the last decade of the twentieth century, brought decisive progress. In the chronological sequence of the launch of such new satellites, the history, mission objectives and measuring principles of the missions CHAMP, GRACE and GOCE, flown since 2000, are outlined and essential scientific results of the individual missions are highlighted. The special features of the GRACE Follow-On mission, which was launched in 2018, and the plans for a next generation of gravity field missions are also discussed.


2017 ◽  
Vol 24 (6) ◽  
pp. 1283-1295 ◽  
Author(s):  
Tomáš Faragó ◽  
Petr Mikulík ◽  
Alexey Ershov ◽  
Matthias Vogelgesang ◽  
Daniel Hänschke ◽  
...  

An open-source framework for conducting a broad range of virtual X-ray imaging experiments, syris, is presented. The simulated wavefield created by a source propagates through an arbitrary number of objects until it reaches a detector. The objects in the light path and the source are time-dependent, which enables simulations of dynamic experiments, e.g. four-dimensional time-resolved tomography and laminography. The high-level interface of syris is written in Python, and its modularity makes the framework very flexible. The computationally demanding parts behind this interface are implemented in OpenCL, which enables fast calculations on modern graphics processing units. The combination of flexibility and speed opens new possibilities for studying novel imaging methods and for systematically searching for optimal combinations of measurement conditions and data processing parameters. This can help to increase the success rates and efficiency of valuable synchrotron beam time. To demonstrate the capabilities of the framework, various experiments have been simulated and compared with real data. To show the use case of optimizing measurement and data processing parameters based on simulation, a virtual counterpart of a high-speed radiography experiment was created and the simulated data were used to select a suitable motion estimation algorithm; one of its parameters was optimized in order to achieve the best motion estimation accuracy when applied to the real data. syris was also used to simulate tomographic data sets under various imaging conditions which impact the tomographic reconstruction accuracy, and it is shown how the accuracy may guide the selection of imaging conditions for particular use cases.
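As a generic illustration of the kind of wavefield propagation such simulations are built on (this does not use the syris API; it is a plain NumPy sketch with arbitrary example values), the following computes single-distance free-space propagation of a monochromatic exit wavefield with a Fresnel transfer function evaluated via FFT:

```python
import numpy as np

# Generic illustration (not the syris API): single-distance free-space
# propagation of a monochromatic wavefield with the Fresnel propagator.
# All numbers are arbitrary example values; sign conventions vary.

n = 512                  # grid size (pixels)
pixel_size = 1e-6        # m
wavelength = 0.5e-10     # m (~25 keV)
distance = 0.5           # propagation distance in m

# Toy exit wavefield behind an object: a weakly absorbing, phase-shifting disk
y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
disk = (x**2 + y**2) < (n // 8)**2
wavefield = np.exp(-0.01 * disk) * np.exp(-1j * 0.5 * disk)

# Fresnel transfer function H(f) = exp(-i * pi * lambda * z * |f|^2)
f = np.fft.fftfreq(n, d=pixel_size)
fx, fy = np.meshgrid(f, f)
transfer = np.exp(-1j * np.pi * wavelength * distance * (fx**2 + fy**2))

propagated = np.fft.ifft2(np.fft.fft2(wavefield) * transfer)
intensity = np.abs(propagated)**2   # what an ideal detector would record
print(intensity.shape, intensity.max())
```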


2007 ◽  
Vol 16 (01) ◽  
pp. 138-140
Author(s):  
S. Diouny ◽  
K. Balar ◽  
M. Bennani Othmani

Summary: In 2005, the Medical Informatics Laboratory (CMIL) became an independent research unit within the Faculty of Medicine and Pharmacy of Casablanca. CMIL is currently run by three persons (a university professor, a data processing specialist and a pedagogical assistant). The objectives of CMIL are to promote research and develop quality in the field of biomedical data processing and health, and to integrate new technologies into medical education and biostatistics. It has four units: a Telehealth Unit, a Network Unit, a Biostatistics Unit and a Medical Data Processing Unit. The present article seeks to give a comprehensive account of the activities of the Casablanca Medical Informatics Laboratory (CMIL). For ease of exposition, the article consists of four sections: Section I discusses the background of CMIL; Section II is devoted to educational activities; Section III addresses professional activities; and Section IV lists projects that CMIL is involved in. Since its creation, CMIL has been involved in a number of national and international projects, which have a bearing on telemedicine applications, e-learning skills and data management in medical studies in Morocco. It is our belief that the skills and knowledge gained in the past few years will certainly enrich our research activities and improve the situation of research in medical informatics in Morocco.


Author(s):  
Júlio Hoffimann ◽  
Maciel Zortea ◽  
Breno de Carvalho ◽  
Bianca Zadrozny

Statistical learning theory provides the foundation for applied machine learning and its many successful applications in computer vision, natural language processing and other scientific domains. The theory, however, does not take into account the unique challenges of performing statistical learning in geospatial settings. For instance, it is well known that model errors cannot be assumed to be independent and identically distributed in geospatial (a.k.a. regionalized) variables due to spatial correlation, and trends caused by geophysical processes lead to covariate shifts between the domain where the model was trained and the domain where it will be applied, which in turn undermine classical learning methodologies that rely on random samples of the data. In this work, we introduce the geostatistical (transfer) learning problem and illustrate the challenges of learning from geospatial data by assessing widely used methods for estimating the generalization error of learning models under covariate shift and spatial correlation. Experiments with synthetic Gaussian process data as well as with real data from geophysical surveys in New Zealand indicate that none of the methods are adequate for model selection in a geospatial context. We provide general guidelines regarding the choice of these methods in practice while new methods are being actively researched.
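The core issue can be reproduced with a small synthetic sketch (not the paper's experiments; the model and data are illustrative): with spatially autocorrelated data, shuffled k-fold cross-validation lets near-duplicate neighbours appear in both training and test folds and therefore reports an optimistic error, whereas leaving out contiguous spatial blocks forces the model to generalize to unseen regions.

```python
import numpy as np
from sklearn.model_selection import KFold, cross_val_score
from sklearn.ensemble import RandomForestRegressor

# Illustrative sketch: random k-fold CV vs. spatial block CV on spatially
# correlated synthetic data. The spatial coordinate is used as the feature to
# mimic spatially autocorrelated covariates.

rng = np.random.default_rng(0)
n = 600
s = np.sort(rng.uniform(0, 10, n))          # 1-D spatial coordinate

X = s.reshape(-1, 1)
y = np.sin(s) + 0.3 * np.sin(5 * s) + 0.1 * rng.standard_normal(n)

model = RandomForestRegressor(n_estimators=100, random_state=0)

# Random k-fold CV (shuffled) vs. spatial block CV (contiguous folds;
# the data are sorted by s, so unshuffled folds are spatial blocks)
random_cv = KFold(n_splits=5, shuffle=True, random_state=0)
block_cv = KFold(n_splits=5, shuffle=False)

for name, cv in [("random k-fold", random_cv), ("spatial blocks", block_cv)]:
    scores = cross_val_score(model, X, y, cv=cv,
                             scoring="neg_mean_squared_error")
    print(f"{name}: MSE = {-scores.mean():.4f}")
```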


2009 ◽  
pp. 34-37
Author(s):  
Niraj Manandhar ◽  
Rene Forsberg

This paper sets out to describe the development of geopotential models and their role in gravity field determination. It also reviews the different geopotential models that have been available and in use from 1980 to the present, with major emphasis placed on the WGS84 EGM96 geopotential model.
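For context, a geopotential model such as EGM96 is, in its standard form, a set of normalized spherical-harmonic coefficients of the Earth's external gravitational potential,

$$ V(r,\varphi,\lambda) = \frac{GM}{r}\left[1 + \sum_{n=2}^{N}\sum_{m=0}^{n}\left(\frac{a}{r}\right)^{n}\bar{P}_{nm}(\sin\varphi)\left(\bar{C}_{nm}\cos m\lambda + \bar{S}_{nm}\sin m\lambda\right)\right], $$

where $r,\varphi,\lambda$ are the geocentric radius, latitude and longitude, $a$ is the reference radius, $GM$ the geocentric gravitational constant, $\bar{P}_{nm}$ the fully normalized associated Legendre functions, $\bar{C}_{nm},\bar{S}_{nm}$ the model coefficients, and $N$ the maximum degree (360 for EGM96).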


2019 ◽  
Vol 2019 ◽  
pp. 1-12 ◽  
Author(s):  
Le-tian Zeng ◽  
Chun-hui Yang ◽  
Mao-sheng Huang ◽  
Yue-long Zhao

In signal processing software testing for synthetic aperture radar (SAR), the verification of algorithms is highly specialized and accounts for a very high proportion of the effort. However, existing methods can only partially validate the algorithms, which adversely affects the effectiveness of the software testing. This paper proposes a procedure-based approach to algorithm validation. First, it describes the processing procedure of the polar format algorithm (PFA) in the presence of motion errors and, on this basis, analyzes the problems that may arise in practice. Through data simulation, the SAR echoes are generated flexibly and efficiently. Then, algorithm simulation is used to examine the approximations adopted in the algorithm. Combined with real data processing, concealed bugs are uncovered, providing a comprehensive validation of the PFA. Simulated experiments and real data processing validate the correctness and effectiveness of the proposed approach.


2014 ◽  
Vol 89 (1) ◽  
pp. 33-48 ◽  
Author(s):  
Adrian Jäggi ◽  
H. Bock ◽  
U. Meyer ◽  
G. Beutler ◽  
J. van den IJssel

Geophysics ◽  
2006 ◽  
Vol 71 (1) ◽  
pp. V1-V6 ◽  
Author(s):  
Moshe Reshef ◽  
Shahar Arad ◽  
Evgeny Landa

Multiple attenuation during data processing does not guarantee a multiple-free final section. Multiple identification plays an important role in seismic interpretation. A target-oriented method for predicting 3D multiples on stacked or migrated cubes in the time domain is presented. The method does not require detailed knowledge of the subsurface geological model or access to prestack data and is valid for both surface-related and interbed multiples. The computational procedure is based on kinematic properties of the data and uses Fermat's principle to define the multiples. Since no prestack data are required, the method can calculate 3D multiples even when only multi-2D survey data are available. The accuracy and possible use of the method are demonstrated on synthetic and real data examples.
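The kinematic idea behind such a Fermat-based prediction can be sketched as follows (an illustration of the principle for a flat, constant-velocity layer, not the authors' implementation): a first-order surface-related multiple travels source, reflector, surface bounce point, reflector, receiver, so its traveltime is approximated by minimizing the sum of two primary traveltimes over candidate surface bounce points.

```python
import numpy as np

# Kinematic sketch: predict the traveltime of a first-order surface-related
# multiple from primary traveltimes via Fermat's principle. Toy model:
# constant velocity, one flat reflector.

v = 2000.0            # velocity in m/s
depth = 500.0         # reflector depth in m
xs, xr = 0.0, 800.0   # source and receiver positions along the line (m)

def primary_time(xa, xb):
    """Two-way primary traveltime between surface points xa and xb
    (flat reflector: the reflection point lies at the midpoint)."""
    half_offset = 0.5 * np.abs(xb - xa)
    return 2.0 * np.sqrt(depth**2 + half_offset**2) / v

# Scan candidate surface bounce points and apply Fermat's principle
x_candidates = np.linspace(-1000.0, 2000.0, 3001)
total = np.array([primary_time(xs, x) + primary_time(x, xr)
                  for x in x_candidates])
i_min = np.argmin(total)
print(f"bounce point ~ {x_candidates[i_min]:.1f} m, "
      f"multiple time ~ {total[i_min]:.3f} s")
# For this symmetric flat-layer case the minimum lies at the midpoint (400 m)
# and the predicted time matches the analytic surface-multiple traveltime.
```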

