Improved Retrieval Methods for Sentinel-3 SAR Altimetry over Coastal and Open Ocean and recommendations for implementation: ESA SCOOP Project Results

Author(s):  
David Cotton ◽  
Thomas Moreau ◽  
Mònica Roca ◽  
Christine Gommenginger ◽  
Mathilde Cancet ◽  
...  

SCOOP (SAR Altimetry Coastal & Open Ocean Performance) is a project funded under the ESA SEOM (Scientific Exploitation of Operational Missions) Programme Element to characterise the expected performance of Sentinel-3 SRAL SAR mode altimeter products, and then to develop and evaluate enhancements to the baseline processing scheme in terms of improvements to ocean measurements. Another objective is to develop and evaluate an improved Wet Troposphere Correction (WTC) for Sentinel-3.

The SCOOP studies are based on two 2-year test data sets derived from CryoSat-2 FBR data, produced for 10 regions. The first Test Data Set was processed with algorithms equivalent to the Sentinel-3 baseline, and the second with algorithms expected to provide an improved performance.

We present results from the SCOOP project that demonstrate the excellent performance of SRAL at the coast in terms of measurement precision, with noise in 20 Hz Sea Surface Height measurements of less than 5 cm to within 5 km of the coast.

We then report the development and testing of new processing approaches designed to improve performance (the first two options are illustrated in the sketch after this abstract), including, for FBR to L1B:

- Application of zero-padding
- Application of intra-burst Hamming windowing
- Exact beam forming in the azimuthal direction
- Restriction of stack processing to within a specified range of look angles
- Along-track antenna compensation

and, for L1B to L2:

- Application of alternative re-trackers for SAR and RDSAR.

Based on the results of this assessment, a second test data set was generated; we present an assessment of its performance and compare it to that of the original Test Data Set.

Regarding the WTC for Sentinel-3A, the correction from the on-board MWR has been assessed by means of comparison with independent data sets such as the GPM Microwave Imager (GMI), Jason-2, Jason-3 and Global Navigation Satellite Systems (GNSS) derived WTC at coastal stations. GNSS-derived Path Delay Plus (GPD+) corrections have been derived for S3A. Results indicate good overall performance of the S3A MWR, and improvements of the GPD+ WTC over the MWR-derived WTC, particularly in coastal and polar regions.

Based on the outcomes of this study we provide recommendations for improving SAR mode altimeter processing and priorities for future research.
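To make the zero-padding and Hamming-windowing options concrete, here is a minimal, self-contained sketch; it is our illustration, not the SCOOP processing chain, and the burst dimensions, variable names and the `range_compress` helper are assumptions. It shows how an intra-burst Hamming window and range zero-padding might be applied to one burst of SAR altimeter pulses before range compression:

```python
import numpy as np

def range_compress(burst, zero_pad_factor=2, hamming=True):
    """Illustrative range compression of one SAR altimeter burst.

    burst : complex array of shape (n_pulses, n_samples), e.g. (64, 128)
    zero_pad_factor : 2 doubles the FFT length, halving the range-bin spacing
    hamming : apply an intra-burst Hamming window along the pulse (azimuth)
              axis, lowering side lobes at the cost of a wider main lobe
    """
    n_pulses, n_samples = burst.shape

    if hamming:
        # Weight across the pulses of the burst (intra-burst windowing).
        w = np.hamming(n_pulses)[:, np.newaxis]
        burst = burst * w

    # Zero-padding in range before the FFT over-samples the waveform,
    # which helps retrackers resolve sharp leading edges (e.g. at the coast).
    n_fft = zero_pad_factor * n_samples
    waveforms = np.fft.fftshift(np.fft.fft(burst, n=n_fft, axis=1), axes=1)

    # Power waveforms (one per look), later stacked and multi-looked.
    return np.abs(waveforms) ** 2

# Example with synthetic data
rng = np.random.default_rng(0)
burst = rng.standard_normal((64, 128)) + 1j * rng.standard_normal((64, 128))
looks = range_compress(burst)
print(looks.shape)  # (64, 256)
```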

2021 ◽  
Vol 95 (2) ◽  
Author(s):  
Mirjam Bilker-Koivula ◽  
Jaakko Mäkinen ◽  
Hannu Ruotsalainen ◽  
Jyri Näränen ◽  
Timo Saari

Abstract Postglacial rebound in Fennoscandia causes striking trends in gravity measurements of the area. We present time series of absolute gravity data collected between 1976 and 2019 at 12 stations in Finland with different types of instruments. First, we determine the trends at each station and analyse the effect of the instrument types. We estimate, for example, an offset of 6.8 µGal for the JILAg-5 instrument with respect to the FG5-type instruments. Applying the offsets in the trend analysis strengthens the agreement of the trends with the NKG2016LU_gdot model of gravity change. The trends at seven stations were found to be robust and were used to analyse the stabilization of the trends in time and to determine the relationship between gravity change rates and land uplift rates as measured with global navigation satellite systems (GNSS) as well as from the NKG2016LU_abs land uplift model. Trends calculated from combined and offset-corrected measurements of JILAg-5- and FG5-type instruments stabilized in 15 to 20 years, and at some stations even faster. The trends from FG5-type instrument data alone generally stabilized within 10 years. The ratio between gravity change rates and vertical rates from the different data sets yields values between −0.206 ± 0.017 and −0.227 ± 0.024 µGal/mm and axis intercept values between 0.248 ± 0.089 and 0.335 ± 0.136 µGal/yr. These values are larger than previous estimates for Fennoscandia.
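The offset-corrected trend estimation can be pictured as a small joint least-squares problem. The sketch below uses entirely synthetic numbers and a hypothetical single-station layout, not the Finnish series, to show how a linear gravity trend and a JILAg-5 instrument offset could be estimated together:

```python
import numpy as np

# Hypothetical absolute-gravity observations at one station:
# decimal year, observed gravity (µGal, relative to an arbitrary reference),
# and a flag marking the instrument (1 = JILAg-5, 0 = FG5-type).
t    = np.array([1988.5, 1993.2, 1998.7, 2004.1, 2009.6, 2015.3, 2019.0])
g    = np.array([  12.0,    8.5,    1.9,    0.4,   -1.1,   -2.9,   -3.7])
jila = np.array([     1,      1,      0,      0,      0,      0,      0])

# Design matrix for g(t) = g0 + rate*(t - t0) + offset*jila,
# so the JILAg-5 bias is estimated jointly with the trend.
t0 = t.mean()
A = np.column_stack([np.ones_like(t), t - t0, jila])
(g0, rate, offset), *_ = np.linalg.lstsq(A, g, rcond=None)

print(f"gravity rate : {rate:6.2f} µGal/yr")
print(f"JILAg-5 bias : {offset:6.2f} µGal relative to FG5")
```

The gravity-rate-versus-uplift-rate ratio quoted in the abstract would then come from regressing the fitted station rates against GNSS or model uplift rates.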


2021 ◽  
Author(s):  
David Cotton ◽  

Introduction

HYDROCOASTAL is a two-year project funded by ESA, with the objective to maximise exploitation of SAR and SARin altimeter measurements in the coastal zone and inland waters, by evaluating and implementing new approaches to process SAR and SARin data from CryoSat-2, and SAR altimeter data from Sentinel-3A and Sentinel-3B. Optical data from the Sentinel-2 MSI and Sentinel-3 OLCI instruments will also be used in generating River Discharge products.

New SAR and SARin processing algorithms for the coastal zone and inland waters will be developed, implemented and evaluated through an initial Test Data Set for selected regions. From the results of this evaluation a processing scheme will be implemented to generate global coastal zone and river discharge data sets.

A series of case studies will assess these products in terms of their scientific impacts.

All the produced data sets will be available on request to external researchers, and full descriptions of the processing algorithms will be provided.

Objectives

The scientific objectives of HYDROCOASTAL are to enhance our understanding of interactions between inland waters and the coastal zone, between the coastal zone and the open ocean, and of the small-scale processes that govern these interactions. The project also aims to improve our capability to characterize the variation at different time scales of inland water storage, exchanges with the ocean and the impact on regional sea-level changes.

The technical objectives are to develop and evaluate new SAR and SARin altimetry processing techniques in support of the scientific objectives, including stack processing, filtering and retracking. An improved Wet Troposphere Correction will also be developed and evaluated.

Project Outline

The project comprises four tasks:

- Scientific Review and Requirements Consolidation: review the current state of the art in SAR and SARin altimeter data processing as applied to the coastal zone and to inland waters.
- Implementation and Validation: new processing algorithms will be implemented to generate Test Data Sets, which will be validated against models, in-situ data and other satellite data sets. Selected algorithms will then be used to generate global coastal zone and river discharge data sets.
- Impacts Assessment: the impact of these global products will be assessed in a series of Case Studies.
- Outreach and Roadmap: outreach material will be prepared and distributed to engage the wider scientific community and to provide recommendations for the development of future missions and future research.

Presentation

The presentation will provide an overview of the project, present the different SAR altimeter processing algorithms that are being evaluated in the first phase of the project, and show early results from the evaluation of the initial test data set.


2021 ◽  
Author(s):  
Estel Cardellach ◽  
Weiqiang Li ◽  
Dallas Masters ◽  
Takayuki Yuasa ◽  
Franck Borde ◽  
...  

Recently, different studies have shown evidence of signals transmitted by the Global Navigation Satellite Systems (GNSS), coherently reflected over some parts of the ocean, and received by cubesats. In particular, strong coherent scattering has been reported in regions with low water surface roughness, such as those near continental masses and in atolls. Over the open ocean, few coherent signals were reported, although the data sets were somewhat limited and certainly not exhaustive. The level of coherence in reflected GNSS signals depends on the roughness of the surface (i.e. significant wave height and the small-scale ripples and waves induced by the wind), the viewing geometry (i.e. incidence angle, or equivalently, elevation angle of the GNSS satellite as seen from the point of reflection), propagation effects (namely ionospheric disturbances) and the frequency (i.e. the particular GNSS band, such as L1/E1, L2 or L5/E5). These coherent measurements over the ocean follow earlier evidence of coherent GNSS reflections over sea ice, which dates back to 2005 and the UK-DMC mission. More recently, Sea Ice Thickness (SIT) retrievals have also been carried out with this technique, at an accuracy comparable to that of SMOS.

All the observations referred to so far were made at a single frequency, L1/E1. There is therefore interest in exploring the coherence at the other main GNSS bands, i.e. L2 and L5/E5, as well as at the widelane combinations between them (linear combinations of carrier-phase measurements with a longer effective wavelength). Spire Global radio occultation cubesats work at the L1 and L2 frequency bands, and therefore provide unique dual-frequency raw data sets of reflected signals over the open ocean, sea ice and inland water bodies. With these, it is possible to study the coherence of these targets at each of the bands and at their widelane combination, as well as the performance of altimetric retrievals at grazing angles of observation (very slant geometries, which favour coherent scattering). The dual-frequency observations can correct the ionospheric effects, and their widelane combinations, of longer effective wavelength, might expand the conditions under which coherence is found. Because this new approach is fully compatible with small GNSS radio occultation payloads and missions, it might represent a low-cost source of precise altimetry to complement larger dedicated missions.

An ESA research study involving Spire Global and IEEC aims at studying this new potential altimetric technique. Raw data acquisitions from limb-looking antennas of Spire’s cubesat constellation were selected to be collocated in space and time with ESA Sentinel-3A and 3B passes in order to compare the results of coherence and altimetry. For this study, the raw data at two frequencies, acquired at 6.2 Mbps, are shifted to intermediate frequencies and downloaded to the ground without any further processing. In-house software receivers are then applied to generate the reflected echoes or waveforms, and to track the phase of the carrier signals. Precise altimetry (a few cm in 20 ms integration) is then possible from these observables. The results of this activity will be shown, focusing on altimetric retrievals over large lakes.
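As a small numeric aside, the dual-frequency quantities the abstract refers to can be written down directly. The sketch below uses the standard GPS L1/L2 frequencies; the function names (`widelane_phase`, `iono_free_range`) are ours and do not describe Spire's or IEEC's actual processing:

```python
# Carrier-phase combinations mentioned in the abstract, for GPS L1/L2.
C  = 299_792_458.0      # speed of light, m/s
F1 = 1575.42e6          # L1 frequency, Hz
F2 = 1227.60e6          # L2 frequency, Hz

# Effective wavelengths
lam1 = C / F1                       # ~0.190 m
lam2 = C / F2                       # ~0.244 m
lam_widelane = C / (F1 - F2)        # ~0.862 m: the longer effective wavelength
                                    # that may relax the coherence conditions

def widelane_phase(phi1_cycles, phi2_cycles):
    """Widelane combination of carrier phases given in cycles."""
    return phi1_cycles - phi2_cycles          # cycles of wavelength lam_widelane

def iono_free_range(L1_m, L2_m):
    """First-order ionosphere-free combination of carrier ranges in metres."""
    g = (F1 / F2) ** 2
    return (g * L1_m - L2_m) / (g - 1.0)

print(f"widelane wavelength: {lam_widelane:.3f} m")
```

The widelane wavelength of roughly 0.86 m, compared with about 19 cm at L1, is what might relax the surface-roughness conditions under which carrier-phase coherence survives.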


Geosciences ◽  
2018 ◽  
Vol 8 (8) ◽  
pp. 292 ◽  
Author(s):  
Daniele Sampietro ◽  
Ahmed Mansi ◽  
Martina Capponi

Airborne gravimetry represents nowadays probably the most efficient technique to collect gravity observations close to the Earth’s surface. In the 1990s, thanks to the development of the Global Navigation Satellite Systems (GNSS), which made accurate navigational data available, this technique started to spread worldwide because of its capability to provide measurements in a fast and cost-effective way. Differently from other techniques such as shipborne gravimetry, it has the advantage of providing gravity measurements also in challenging environments that are otherwise difficult to access, such as mountainous areas, rain forests and polar regions. For such reasons, airborne gravimetry is used for various applications related to regional gravity field modelling: from the computation of highly accurate local geoids for geodetic applications to geophysical ones, specifically related to oil and gas exploration activities or, more in general, to regional geological studies. Depending on the kind of application and the final required accuracy, the definition of the main characteristics of the airborne survey, e.g., the planar distance between consecutive flight tracks, the aircraft velocity, etc., can be a difficult task. In this work, we present a new software package that helps in properly accomplishing the survey design task. Basically, the developed software solution allows one to generate a realistic (from the observation noise point of view) gravimetric signal and, after that, to predict the accuracy and spatial resolution of the final retrievable gravimetric field, in terms of gravity disturbances, given the main characteristics of the flight. The proposed procedure is suited to airborne survey planning, making it possible to optimize the design of the survey according to the required final accuracy. With the aim of evaluating the influence of the various survey parameters on the expected accuracy of the airborne survey, different numerical tests have been performed on simulated and real datasets. For instance, it has been shown that if the observation noise is not properly modelled in the data filtering step, the survey results degrade by about 25%, while not acquiring control lines during the survey basically reduces the final accuracy by a factor of two.
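The core idea (simulate a noisy along-track gravity signal, filter it, and see what accuracy and spatial resolution survive) can be sketched in a few lines. The example below is our own illustration with made-up numbers, not the package described in the paper:

```python
import numpy as np

# Illustrative along-track simulation: a smooth gravity-disturbance signal
# sampled along a flight line, plus white observation noise, then a simple
# moving-average low-pass filter.
rng = np.random.default_rng(42)

v = 50.0                      # aircraft velocity, m/s
dt = 1.0                      # sampling interval, s
n = 3600                      # one hour of data
x = np.arange(n) * v * dt     # along-track coordinate, m

# Synthetic "true" gravity disturbance (mGal): a few long-wavelength features.
signal = 5.0 * np.sin(2 * np.pi * x / 80e3) + 2.0 * np.sin(2 * np.pi * x / 25e3)
noise = 10.0 * rng.standard_normal(n)        # airborne gravimeter noise, mGal
obs = signal + noise

def moving_average(y, seconds):
    k = int(seconds / dt)
    kernel = np.ones(k) / k
    return np.convolve(y, kernel, mode="same")

# Longer filters suppress more noise but also smear short spatial wavelengths:
for length in (60, 120, 240):                 # filter length in seconds
    rmse = np.sqrt(np.mean((moving_average(obs, length) - signal) ** 2))
    res = v * length / 2 / 1000               # rough half-wavelength resolution, km
    print(f"filter {length:4d} s -> RMSE {rmse:4.1f} mGal, ~resolution {res:4.1f} km")
```

Longer filters suppress more of the gravimeter noise but smear shorter spatial wavelengths, which is exactly the accuracy-versus-resolution trade-off a survey plan has to settle.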


2021 ◽  
Author(s):  
Louise Bloch ◽  
Christoph M. Friedrich

Abstract Background: The prediction of whether subjects with Mild Cognitive Impairment (MCI) will prospectively develop Alzheimer's Disease (AD) is important for the recruitment and monitoring of subjects for therapy studies. Machine Learning (ML) is suitable to improve early AD prediction. The etiology of AD is heterogeneous, which leads to noisy data sets. Additional noise is introduced by multicentric study designs and varying acquisition protocols. This article examines whether an automatic and fair data valuation method based on Shapley values can identify subjects with noisy data. Methods: An ML workflow was developed and trained for a subset of the Alzheimer's Disease Neuroimaging Initiative (ADNI) cohort. The validation was executed for an independent ADNI test data set and for the Australian Imaging, Biomarker and Lifestyle Flagship Study of Ageing (AIBL) cohort. The workflow included volumetric Magnetic Resonance Imaging (MRI) feature extraction, subject sample selection using data Shapley, Random Forest (RF) and eXtreme Gradient Boosting (XGBoost) for model training, and Kernel SHapley Additive exPlanations (SHAP) values for model interpretation. This model interpretation enables clinically relevant explanation of individual predictions. Results: On the independent ADNI test data set, the XGBoost models that excluded 116 of the 467 subjects from the training data set, based on their Logistic Regression (LR) data Shapley values, outperformed the models trained on the entire training data set, which reached a mean classification accuracy of 58.54 %, by 14.13 % (8.27 percentage points). The XGBoost models trained on the entire training data set reached a mean accuracy of 60.35 % for the AIBL data set. An improvement of 24.86 % (15.00 percentage points) could be reached for the XGBoost models if the 72 subjects with the smallest RF data Shapley values were excluded from the training data set. Conclusion: The data Shapley method was able to improve the classification accuracies for the test data sets. Noisy data was associated with the number of ApoEϵ4 alleles and volumetric MRI measurements. Kernel SHAP showed that the black-box models learned biologically plausible associations.
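The training-set pruning step can be sketched as follows. The data are synthetic, the data Shapley values are random placeholders standing in for values produced by a separate valuation run (e.g. a Monte Carlo data Shapley procedure with an LR or RF utility model), and only the exclusion, XGBoost training and Tree SHAP explanation steps are shown:

```python
import numpy as np
import xgboost as xgb
import shap
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic stand-in for the volumetric MRI feature table (not ADNI data).
rng = np.random.default_rng(0)
X = rng.standard_normal((467, 20))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.standard_normal(467) > 0).astype(int)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Placeholder per-subject data Shapley values (assumed to be precomputed).
data_shapley = rng.standard_normal(len(X_train))

# Exclude the lowest-valued (presumably noisy) training subjects.
keep = data_shapley > np.quantile(data_shapley, 0.25)

model = xgb.XGBClassifier(n_estimators=200, max_depth=3, eval_metric="logloss")
model.fit(X_train[keep], y_train[keep])
print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))

# Explain individual predictions with (Tree) SHAP values.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_test)
print("SHAP matrix shape:", np.shape(shap_values))
```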


Author(s):  
Gihong Kim ◽  
Bonghee Hong

The testing of RFID information services requires a test data set of business events comprising object, aggregation, quantity and transaction events. To generate business events, we need to address the performance issues in creating a large volume of event data. This paper proposes a new model for the tag life cycle and a fast generation algorithm for this model. We present the results of experiments with the generation algorithm, showing that it outperforms previous methods.
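As a rough illustration of what such a generator produces, here is our own simplified sketch, not the tag life-cycle model or the algorithm proposed in the paper; the EPC identifiers and event fields are made up. Each tag emits an object, an aggregation and a transaction event along its life cycle:

```python
import random
import uuid
from datetime import datetime, timedelta

# Illustrative generator for a simplified tag life cycle: each tag is
# commissioned (ObjectEvent), packed onto a pallet (AggregationEvent)
# and finally shipped against an order (TransactionEvent).
def generate_tag_events(n_tags, start=datetime(2024, 1, 1)):
    events = []
    for i in range(n_tags):
        epc = f"urn:epc:id:sgtin:0614141.107346.{i:06d}"
        t0 = start + timedelta(minutes=random.randint(0, 10_000))
        pallet = f"urn:epc:id:sscc:0614141.{random.randint(0, 9999):010d}"
        order = f"urn:epcglobal:cbv:bt:0614141:PO-{uuid.uuid4().hex[:8]}"
        events.append(("ObjectEvent",      t0,                      epc, None))
        events.append(("AggregationEvent", t0 + timedelta(hours=1), epc, pallet))
        events.append(("TransactionEvent", t0 + timedelta(hours=6), epc, order))
    events.sort(key=lambda e: e[1])          # business events in time order
    return events

for ev in generate_tag_events(2)[:6]:
    print(ev)
```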


2013 ◽  
Vol 31 (4) ◽  
pp. 231-252 ◽  
Author(s):  
Rajat Gupta ◽  
Matthew Gregg ◽  
Hu Du ◽  
Katie Williams

Purpose – To critically compare three future weather year (FWY) downscaling approaches, based on the 2009 UK Climate Projections, used for climate change impact and adaptation analysis in building simulation software. Design/methodology/approach – The validity of these FWYs is assessed through dynamic building simulation modelling to project future overheating risk in typical English homes in the 2050s and 2080s. Findings – The modelling results show that the variation in overheating projections is far too significant to consider the tested FWY data sets equally suitable for the task. Research and practical implications – It is recommended that future research should consider harmonisation of the downscaling approaches so as to generate a unified data set of FWYs to be used for a given location and climate projection. If FWYs are to be used in practice, live projects will need viable and reliable FWYs on which to base their adaptation decisions. The difference between the data sets tested could potentially lead to different adaptation priorities, specifically with regard to time series and adaptation phasing through the life of a building. Originality/value – The paper investigates the different results derived from FWY application to building simulation. The outcome and implications are important considerations for research and practice involved in FWY data use in building simulation intended for climate change adaptation modelling.


2015 ◽  
Vol 5 (3) ◽  
pp. 350-380 ◽  
Author(s):  
Abdifatah Ahmed Haji ◽  
Sanni Mubaraq

Purpose – The purpose of this paper is to examine the impact of corporate governance and ownership structure attributes on firm performance following the revised code on corporate governance in Malaysia. The study presents a longitudinal assessment of the compliance and implications of the revised code on firm performance. Design/methodology/approach – Two data sets, consisting of before (2006) and after (2008-2010) the revised code, are examined. Drawing from the largest companies listed on Bursa Malaysia (BM), the first data set contains 92 observations in the year 2006 while the second data set comprises 282 observations drawn from the largest companies listed on BM over a three-year period, from 2008 to 2010. Both accounting (return on assets and return on equity) and market performance (Tobin’s Q) measures were used to measure firm performance. Multiple and panel data regression analyses were adopted to analyze the data. Findings – The study shows that there were still cases of non-compliance with the basic requirements of the code, such as the one-third independent non-executive director (INDs) requirement, even after the revised code. While the regression models indicate marginal significance of board size and independent directors before the revised code, the results indicate that all corporate governance variables have a significant negative relationship with at least one of the measures of corporate performance. The independent chairperson, however, showed a consistent positive impact on firm performance both before and after the revised code. In addition, ownership structure elements were found to have a negative relationship with either accounting or market performance measures, with institutional ownership showing a consistent negative impact on firm performance. Firm size and leverage, as control variables, were significant in determining corporate performance. Research limitations/implications – One limitation is the use of separate measures of corporate governance attributes, as opposed to a corporate governance index (CGI). As a result, the study constructs a CGI based on the recommendations of the revised code and proposes it for future research use. Practical implications – Some of the largest companies did not even comply with basic requirements such as the “one-third INDs” mandatory requirement. Hence, the regulators may want to reinforce the requirements of the code and also detail examples of good governance practices. The results, which show a consistent positive relationship between the presence of an independent chairperson and firm performance in both data sets, suggest that listed companies consider appointing an independent chairperson in the corporate leadership. The regulatory authorities may also wish to note this phenomenon when drafting any future corporate governance codes. Originality/value – This study offers new insights into the implications of regulatory changes on the relationship between corporate governance attributes and firm performance from the perspective of a developing country. The development of a CGI for future research is a novel approach of this study.
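For readers less familiar with the method, the kind of panel regression the abstract refers to can be approximated with a dummy-variable (fixed-effects) OLS. The sketch below runs on a synthetic firm-year panel; the variable names and the specification are our own choices, not the paper's data or exact models:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic firm-year panel standing in for a 2008-2010 sample
# (variable names are ours, not the paper's exact measures).
rng = np.random.default_rng(1)
firms, years = 94, [2008, 2009, 2010]
df = pd.DataFrame({
    "firm": np.repeat(np.arange(firms), len(years)),
    "year": np.tile(years, firms),
    "board_size": rng.integers(5, 13, firms * len(years)),
    "indep_dirs": rng.uniform(0.2, 0.6, firms * len(years)),   # share of INDs
    "indep_chair": rng.integers(0, 2, firms * len(years)),
    "inst_own": rng.uniform(0.0, 0.8, firms * len(years)),
    "size": rng.normal(6, 1, firms * len(years)),               # log assets
    "leverage": rng.uniform(0.1, 0.7, firms * len(years)),
})
df["roa"] = (0.02 * df["indep_chair"] - 0.03 * df["inst_own"]
             + 0.01 * df["size"] + rng.normal(0, 0.05, len(df)))

# Pooled OLS with firm and year fixed effects (dummy-variable approach),
# with standard errors clustered by firm.
model = smf.ols(
    "roa ~ board_size + indep_dirs + indep_chair + inst_own + size + leverage"
    " + C(firm) + C(year)",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["firm"]})
print(model.params[["indep_chair", "inst_own"]])
```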


2019 ◽  
Vol 11 (24) ◽  
pp. 2973 ◽  
Author(s):  
Telmo Vieira ◽  
M. Joana Fernandes ◽  
Clara Lázaro

The wet path delay (WPD) for satellite altimetry has been provided from external sources, raising the need to convert this value between different altitudes. The only expression available for this purpose applies the same altitude reduction irrespective of geographic location and time. The focus of this study is the modelling of the WPD altitude dependence, aiming at developing improved expressions. Using ERA5 pressure level fields (2010–2013), WPD vertical profiles were computed globally. At each location and for each vertical profile, an exponential function was fitted using least squares, determining the corresponding decay coefficient. The time evolution of these coefficients reveals regions where they are highly variable, making this modelling more difficult, and regions where an annual signal exists. The output of this modelling consists of a set of so-called University of Porto (UP) coefficients, dependent on geographic location and time. An assessment with ERA5 data (2014) shows that, at the location where the Kouba coefficient results in a maximum Root Mean Square (RMS) error of 3.2 cm, the use of the UP coefficients reduces this value to 1.2 cm. Independent comparisons with WPD derived from Global Navigation Satellite Systems and radiosondes show that the use of UP coefficients instead of Kouba’s leads to a decrease in the RMS error larger than 1 cm.
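The fitting step described in the abstract amounts to estimating a decay coefficient per profile. The sketch below fits WPD(h) = WPD(0)·exp(−c·h) to a synthetic profile and uses the fitted coefficient to move a WPD value between altitudes; the numbers and the helper `reduce_wpd` are ours, not the UP coefficients themselves:

```python
import numpy as np

# Fit an exponential decay WPD(h) = WPD(0) * exp(-c * h) to a synthetic
# vertical profile of wet path delay.
h = np.linspace(0.0, 8000.0, 40)            # altitude above the surface, m
true_c = 1.0 / 2000.0                       # synthetic decay coefficient, 1/m
wpd = 0.25 * np.exp(-true_c * h)            # synthetic WPD profile, m
wpd += np.random.default_rng(0).normal(0, 1e-3, h.size)   # small noise

# Linearise: ln(WPD) = ln(WPD0) - c*h, then ordinary least squares.
mask = wpd > 0
A = np.column_stack([np.ones(mask.sum()), -h[mask]])
(ln_wpd0, c), *_ = np.linalg.lstsq(A, np.log(wpd[mask]), rcond=None)

def reduce_wpd(wpd_ref, h_ref, h_target, c=c):
    """Convert a WPD value from altitude h_ref to h_target (metres)."""
    return wpd_ref * np.exp(-c * (h_target - h_ref))

print(f"fitted decay coefficient: {c:.2e} 1/m")
print(f"WPD at 0 m -> 1000 m: {reduce_wpd(0.25, 0.0, 1000.0):.3f} m")
```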


2021 ◽  
Vol 79 (1) ◽  
Author(s):  
Romana Haneef ◽  
Sofiane Kab ◽  
Rok Hrzic ◽  
Sonsoles Fuentes ◽  
Sandrine Fosse-Edorh ◽  
...  

Abstract Background The use of machine learning techniques is increasing in healthcare, as it allows health outcomes to be estimated and predicted from large administrative data sets more efficiently. The main objective of this study was to develop a generic machine learning (ML) algorithm to estimate the incidence of diabetes based on the number of reimbursements over the last 2 years. Methods We selected a final data set from a population-based epidemiological cohort (CONSTANCES) linked with the French National Health Database (SNDS). To develop this algorithm, we adopted a supervised ML approach. The following steps were performed: (i) selection of the final data set, (ii) target definition, (iii) coding of variables for a given window of time, (iv) splitting of the final data into training and test data sets, (v) variable selection, (vi) model training, (vii) validation of the model with the test data set and (viii) selection of the model. We used the area under the receiver operating characteristic curve (AUC) to select the best algorithm. Results The final data set used to develop the algorithm included 44,659 participants from CONSTANCES. Out of the 3468 coded variables from the SNDS linked to the CONSTANCES cohort, 23 variables were selected to train the different algorithms. The final algorithm to estimate the incidence of diabetes was a Linear Discriminant Analysis model based on the number of reimbursements of selected variables related to biological tests, drugs, medical acts and hospitalization without a procedure over the last 2 years. This algorithm has a sensitivity of 62%, a specificity of 67% and an accuracy of 67% [95% CI: 0.66–0.68]. Conclusions Supervised ML is an innovative tool for the development of new methods to exploit large health administrative databases. In the context of the InfAct project, we have developed and applied for the first time a generic ML algorithm to estimate the incidence of diabetes for public health surveillance. The ML algorithm we have developed has moderate performance. The next step is to apply this algorithm to the SNDS to estimate the incidence of type 2 diabetes cases. More research is needed to apply various ML techniques to estimate the incidence of various health conditions.
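The model-selection step of such a workflow (train several candidate algorithms, keep the one with the highest AUC on held-out data) can be sketched as follows; the features and target are synthetic stand-ins for the reimbursement counts, not SNDS or CONSTANCES data:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the reimbursement-count features:
# 23 selected variables, binary target = incident diabetes.
rng = np.random.default_rng(0)
X = rng.poisson(2.0, size=(5000, 23)).astype(float)
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 2, 5000) > 4).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Train several candidate algorithms and keep the one with the highest AUC,
# mirroring the selection criterion described in the abstract.
candidates = {
    "LDA": LinearDiscriminantAnalysis(),
    "LogReg": LogisticRegression(max_iter=1000),
    "RF": RandomForestClassifier(n_estimators=200, random_state=0),
}
scores = {}
for name, clf in candidates.items():
    clf.fit(X_tr, y_tr)
    scores[name] = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])

best = max(scores, key=scores.get)
print(scores, "-> selected:", best)
```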

