GNSS data quality check in the EPN network

Author(s):  
Andras Fabian ◽  
Carine Bruyninx ◽  
Juliette Legrand ◽  
Anna Miglio

Global Navigation Satellite Systems (GNSS) are a widespread, cost-effective technique for geodetic applications and for monitoring the Earth's atmosphere. As a result, the density of GNSS networks has grown considerably over the last decade. Each of these networks collects huge amounts of data from permanently operating GNSS stations. The quality of the data is variable, depending on the evaluated time period and satellite system. Conventionally, quality information is extracted from daily estimates of different types of GNSS parameters, such as the number of data gaps, the multipath level, the number of cycle slips, the number of dual-frequency observations with respect to the expected number, and from combinations of these.

The EUREF Permanent GNSS Network Central Bureau (EPN CB, Bruyninx et al., 2019) operationally collects and analyses the quality of more than 300 GNSS stations and investigates the main causes of any quality degradation. EPN CB currently operates a semi-automatic (followed by manual) data-monitoring tool to detect quality degradations and investigate the source of the problems. In the upcoming years, this data-monitoring tool will also be used to monitor the GNSS component of the European Plate Observing System (EPOS), expected to include more than 3000 GNSS stations. This anticipated growth in the number of GNSS stations to be monitored will make it increasingly challenging to select high-quality GNSS data. EPN CB's current system requires time-consuming semi-automatic inspection of data quality, and it is not designed to handle these larger amounts of data. In addition, the current system does not exploit correlations between the daily data quality, the time series, and the GNSS station metadata (such as equipment type and receiver firmware) often common to many stations.

In this poster, we will first present the currently used method of GNSS data quality checking and its limitations. Based on more than 20 years of GNSS observations collected in the EPN, we will show typical cases of correlations between the time series of data quality metrics and GNSS station metadata. Then, we will set up the requirements and design the new GNSS data quality monitoring system capable of handling more than 3000 stations. Based on the collected EPN samples and the typical cases, we will introduce ongoing improvements taking advantage of artificial intelligence techniques, show a possible design of the neural network, and present the supervised training of the neural network.

Bruyninx C., Legrand J., Fabian A., Pottiaux E. (2019) GNSS Metadata and Data Validation in the EUREF Permanent Network. GPS Sol., 23(4), https://doi.org/10.1007/s10291-019-0880-9
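To make the screening step concrete, here is a minimal sketch of one way a daily quality metric could be tested against a station's recent baseline; the threshold, window length, and the `detect_degradation` helper are illustrative assumptions, not EPN CB's actual tool.

```python
from statistics import mean, stdev

def detect_degradation(history, today, n_sigma=3.0):
    """Flag a station-day whose quality metric departs from its baseline.

    history: metric values (e.g., multipath level) over preceding days
    today:   metric value for the day under test
    Returns True when today lies more than n_sigma standard deviations
    above the historical mean (one-sided: larger metric = worse quality).
    """
    if len(history) < 10:          # too little history to judge
        return False
    mu, sigma = mean(history), stdev(history)
    if sigma == 0.0:
        return today > mu
    return (today - mu) / sigma > n_sigma

# Example: 30 days of a stable multipath metric, then a jump,
# e.g. after an (unreported) equipment change at the station.
baseline = [0.35 + 0.01 * (i % 3) for i in range(30)]
print(detect_degradation(baseline, 0.36))  # False: within baseline
print(detect_degradation(baseline, 0.80))  # True: degradation
```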

2021 ◽  
Author(s):  
Katarzyna Stępniak ◽  
Grzegorz Krzan

Water vapour is a key variable of the water cycle and plays a special role in many atmospheric processes controlling weather and climate. Nowadays, extreme weather events such as storms, floods, landslides, heat waves, and droughts are major concerns for society. The Global Navigation Satellite System (GNSS) is one of the few tools that can be used as an atmospheric water vapour sensor and, simultaneously, provide continuous, unbiased, precise, and robust information on atmospheric conditions. The GNSS antenna phase center model undoubtedly has a significant impact on the determination of tropospheric parameters in the processing of satellite observations.

Therefore, the aim of our study is to investigate the impact of different GNSS antenna calibration models on the quality of tropospheric parameter series used in climate applications. We analyse the zenith total delays (ZTD) obtained from GNSS data processing and afterwards converted to integrated water vapour (IWV). Three years of GNSS data collected at 40 European Reference Frame (EUREF) Permanent GNSS Network (EPN) stations were processed with the NAPEOS software. The Precise Point Positioning (PPP) technique, utilizing European Space Agency (ESA) precise satellite orbits and clocks, was used to estimate the parameters. Several processing variants were computed and inter-compared. The first group of solutions was obtained by applying the International GNSS Service (IGS) type-mean Phase Center Correction (PCC) models. In the second and third groups of solutions, PCC models from individual field robot calibration and from calibration in an anechoic chamber, respectively, were used. All solutions were processed using GPS and Galileo observations. Moreover, in order to validate and assess the quality of the GNSS solutions, the tropospheric parameters obtained from the ERA5 reanalysis were compared with the GNSS estimates.

In general, the results of the study show that the NAPEOS software can provide high-quality GNSS tropospheric delay time series. The initial results indicate that the impact of applying different PCC model calibrations is not negligible. ZTD estimates obtained from variants using ROBOT and IGS14 calibrations are closer to ERA5 than estimates from variants that used calibrations in an anechoic chamber. In addition, multi-GNSS processing variants are closer to ERA5 than GPS-only or Galileo-only processing variants. The results also depend on the equipment (receiver and antenna) of the stations. Validation against data from the climate reanalysis confirms that all GNSS approaches provide high-quality ZTD estimates. Furthermore, there is high agreement between the IWV distributions from GNSS and ERA5.
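For readers unfamiliar with the conversion step, a minimal sketch of the standard ZTD-to-IWV path (Saastamoinen zenith hydrostatic delay plus the Bevis et al. mean-temperature scaling) follows; the constants are the commonly published ones, while the surface values in the example are invented for illustration and do not come from this study.

```python
import math

def zhd_saastamoinen(p_hpa, lat_rad, h_km):
    """Zenith hydrostatic delay [m] from surface pressure (Saastamoinen)."""
    return 0.0022768 * p_hpa / (1.0 - 0.00266 * math.cos(2.0 * lat_rad)
                                - 0.00028 * h_km)

def iwv_from_ztd(ztd_m, p_hpa, ts_k, lat_rad, h_km):
    """Convert ZTD [m] to IWV [kg/m^2] via the weighted-mean temperature Tm."""
    k2p, k3, rw = 22.1, 3.739e5, 461.5   # K/hPa, K^2/hPa, J/(kg K)
    tm = 70.2 + 0.72 * ts_k              # Bevis Tm-Ts regression
    zwd = ztd_m - zhd_saastamoinen(p_hpa, lat_rad, h_km)
    return 1.0e8 * zwd / (rw * (k2p + k3 / tm))

# Illustrative numbers only: ZTD 2.40 m, 1000 hPa, 288 K, 52 deg, 0.1 km
print(round(iwv_from_ztd(2.40, 1000.0, 288.0, math.radians(52.0), 0.1), 1))
```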


2017 ◽  
Vol 21 (3) ◽  
pp. 147-156 ◽  
Author(s):  
Ibrahim Tiryakioglu ◽  
Hakan Yavasoglu ◽  
Mehmet Ali Ugur ◽  
Caglar Ozkaymak ◽  
Mustafa Yilmaz ◽  
...  

Eastern Anatolia provides one of the best examples of an area of rapid deformation and intense contraction resulting from the active continental collision between the Arabian and Eurasian plates, which leads to large and devastating earthquakes. The latest evidence of active tectonism in the region is revealed by two remarkable seismic events: the Van-Tabanli (Mw 7.2, October 23, 2011) and Van-Edremit (Mw 5.6, November 9, 2011) earthquakes. Studying the earthquake cycle and observing geodetic and seismic deformation in this region is very important for hazard assessment. In this study, the inter-seismic, co-seismic, and post-seismic movements caused by the above-mentioned earthquakes were investigated using time series of 2300 days of Global Navigation Satellite Systems (GNSS) observations from local stations selected from the network of the Continuously Operating Reference Stations, Turkey (CORS-TR). For the inter-seismic period, approximately 1100 daily data sets were obtained from 21 CORS-TR stations (prior to the earthquakes, between October 1, 2008 and October 23, 2011) and evaluated using the GAMIT/GLOBK software. The behaviour of these stations during the earthquakes was investigated by processing 1 Hz data from the GNSS stations with the GAMIT/TRACK software. In addition to October 23 and November 9, GNSS data from one day before and one day after the earthquakes were assessed to determine the co-seismic deformations. During the October 23 earthquake, hanging-wall deformation of about 60 mm in the SW direction was detected at the MURA station, whereas at the VAAN station a deformation of 200 mm (value predicted by the time series) was observed in the footwall block in the NW direction. There were no significant changes at the stations during the November 9 earthquake. For the post-seismic period, the GNSS data from 2012 to 2015 were evaluated. According to the observations, post-seismic deformation continued at the stations close to the epicenter of the earthquake.
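As a schematic of how a co-seismic offset can be read off a daily position series, the sketch below differences short averaging windows on either side of the event epoch; the window length and synthetic series are illustrative and do not reproduce the GAMIT/TRACK processing used in the study.

```python
from statistics import mean

def coseismic_offset(days, positions, event_day, window=5):
    """Estimate a co-seismic jump [mm] in one position component.

    Differences the mean position over `window` days immediately
    after the event against the mean over `window` days before it.
    """
    before = [p for d, p in zip(days, positions)
              if event_day - window <= d < event_day]
    after = [p for d, p in zip(days, positions)
             if event_day < d <= event_day + window]
    return mean(after) - mean(before)

# Synthetic east component [mm]: flat, then a -60 mm jump at day 100,
# loosely mimicking the SW-directed offset reported for MURA.
series = [0.0] * 100 + [-60.0] * 100
days = list(range(200))
print(coseismic_offset(days, series, event_day=100))  # ~ -60.0
```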


2020 ◽  
Author(s):  
Periklis-Konstantinos Diamantidis ◽  
Grzegorz Klopotek ◽  
Rüdiger Haas

The emergence of BeiDou and Galileo as operational Global Navigation Satellite Systems (GNSS), in addition to the Global Positioning System (GPS) and GLONASS which are already in use, opens up possibilities for delivering geodetic products with higher precision. Apart from ensuring the homogeneity of the derived products, multi-GNSS analysis takes advantage of new frequencies and improved sky coverage. This should lead to better phase ambiguity resolution and improved estimation of target parameters such as zenith wet delays (ZWD), troposphere gradients (GRD), and station positions. The International GNSS Service (IGS) has recognised this potential by initiating the Multi-GNSS Experiment (MGEX), which provides orbit, clock, and observation data for all operational GNSS. Correspondingly, the multi-technique space geodetic analysis software c5++ has been augmented with an MGEX-compliant GNSS module. Based on this new module and the Precise Point Positioning (PPP) approach, using six months of data, an assessment of the derived geodetic products is carried out for several GNSS receivers located at the Onsala core site. More specifically, we perform both single- and multi-GNSS data analysis using Kalman filter and least-squares methods and assess the quality of the derived station positions, ZWD, and GRD. A combined solution using all GNSS together is carried out, and the improvement in station position repeatabilities is assessed for each station. Inter-system biases, which homogenise the different time scales on which the GNSS operate and are necessary for the multi-GNSS combination, are estimated and presented. Finally, the applied inter-system weighting is discussed, as well as its impact on the derived geodetic products.
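To illustrate why inter-system biases appear as extra unknowns in a multi-GNSS combination, the following toy least-squares adjustment estimates a common receiver clock plus one bias per additional constellation, with GPS fixing the time scale; the observation model and numbers are synthetic and unrelated to the c5++ implementation.

```python
import numpy as np

# Each constellation's observations see the common receiver clock plus a
# per-system inter-system bias (ISB), here expressed relative to GPS.
rng = np.random.default_rng(0)
systems = ["G"] * 8 + ["E"] * 6 + ["C"] * 5           # GPS, Galileo, BeiDou
clock_m = 12.0                                        # true clock [m]
isb_m = {"G": 0.0, "E": 1.7, "C": -0.9}               # true ISBs [m]
obs = (np.array([clock_m + isb_m[s] for s in systems])
       + 0.05 * rng.standard_normal(len(systems)))    # noisy observations

# Unknowns: [clock, ISB_E, ISB_C]; GPS defines the time scale (ISB_G = 0).
A = np.zeros((len(systems), 3))
A[:, 0] = 1.0
A[:, 1] = [1.0 if s == "E" else 0.0 for s in systems]
A[:, 2] = [1.0 if s == "C" else 0.0 for s in systems]
x, *_ = np.linalg.lstsq(A, obs, rcond=None)
print(x)   # ~ [12.0, 1.7, -0.9]
```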


2008 ◽  
Vol 47 (4) ◽  
pp. 1006-1016 ◽  
Author(s):  
Guang-Yu Shi ◽  
Tadahiro Hayasaka ◽  
Atsumu Ohmura ◽  
Zhi-Hua Chen ◽  
Biao Wang ◽  
...  

Abstract Solar radiation is one of the most important factors affecting climate and the environment. Routine measurements of irradiance are valuable for climate change research because of their long time series and areal coverage. In this study, a set of quality assessment (QA) algorithms is used to test the quality of daily solar global, direct, and diffuse radiation measurements taken at 122 observatories in China during 1957–2000. The QA algorithms include a physical threshold test (QA1), a global radiation sunshine duration test (QA2), and a standard deviation test applied to time series of annually averaged solar global radiation (QA3). The results show that the percentages of global, direct, and diffuse solar radiation data that fail to pass QA1 are 3.07%, 0.01%, and 2.52%, respectively; the percentages of global solar radiation data that fail to pass QA2 and QA3 are 0.77% and 0.49%, respectively. The method implemented by the Global Energy Balance Archive is also applied to check the quality of the solar radiation data in China. Of the 84 stations with a time series longer than 20 yr, suspect data were found at 35 of the sites. Based on the data that passed the QA tests, trends in ground solar radiation and the effect of the data quality assessment on those trends are analyzed. There is a decrease in ground solar global and direct radiation in China over the years under study. Although the quality assessment process has significant effects on the data from individual stations and/or time periods, it does not affect the long-term trends in the data.
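A rough sketch of what the three QA tests could look like in code is given below; the ceiling in QA1, the Angstrom-Prescott coefficients and tolerance in QA2, and the 3-sigma rule in QA3 are illustrative assumptions, not the thresholds used in the study.

```python
from statistics import mean, stdev

def qa1_physical_threshold(global_r, direct_r, diffuse_r, ceiling=1361.0):
    """QA1 sketch: daily irradiances [W/m^2] must be non-negative, below
    a physical ceiling (solar constant used here for illustration), and
    the global value must not fall below its diffuse component."""
    components = (global_r, direct_r, diffuse_r)
    if any(v < 0.0 or v > ceiling for v in components):
        return False
    return global_r >= diffuse_r

def qa2_sunshine_consistency(global_r, extraterrestrial_r, sunshine_frac,
                             a=0.25, b=0.50, tol=0.35):
    """QA2 sketch: Angstrom-Prescott consistency between the clearness
    index G/G0 and the sunshine fraction; a, b, and tol are illustrative."""
    return abs(global_r / extraterrestrial_r - (a + b * sunshine_frac)) <= tol

def qa3_annual_stddev(annual_means, n_sigma=3.0):
    """QA3 sketch: flag years whose annual-mean global radiation departs
    from the series mean by more than n_sigma standard deviations."""
    mu, sigma = mean(annual_means), stdev(annual_means)
    return [abs(v - mu) > n_sigma * sigma for v in annual_means]
```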


JAMIA Open ◽  
2021 ◽  
Vol 4 (3) ◽  
Author(s):  
Ali S Afshar ◽  
Yijun Li ◽  
Zixu Chen ◽  
Yuxuan Chen ◽  
Jae Hun Lee ◽  
...  

Abstract Physiological data, such as heart rate and blood pressure, are critical to clinical decision-making in the intensive care unit (ICU). Vital signs data, which are available from electronic health records, can be used to diagnose and predict important clinical outcomes. While there have been some reports on the data quality of nurse-verified vital sign data, little has been reported on the data quality of the higher-frequency time-series vital signs acquired in ICUs that would enable such predictive modeling. In this study, we assessed the data quality issues, defined as completeness, accuracy, and timeliness, of minute-by-minute time series vital signs data within the MIMIC-III data set, captured from 16,009 patient-ICU stays and corresponding to 9,410 unique adult patients. We measured the data quality of four time-series vital signs data streams in the MIMIC-III data set: heart rate (HR), respiratory rate (RR), blood oxygen saturation (SpO2), and arterial blood pressure (ABP). Approximately 30% of patient-ICU stays did not have at least 1 min of data during the time frame of the ICU stay for HR, RR, and SpO2. The percentage of patient-ICU stays that did not have at least 1 min of ABP data was ∼56%. We observed ∼80% coverage of the total duration of the ICU stay for HR, RR, and SpO2. Finally, only 12.5%, 9.9%, 7.5%, and 4.4% of ICU lengths of stay had ≥99% data available for HR, RR, SpO2, and ABP, respectively, and would thus meet the three data quality requirements examined in this study. Our findings on data completeness, accuracy, and timeliness have important implications for data scientists and informatics researchers who use time series vital signs data to develop predictive models of ICU outcomes.
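As an illustration of the completeness metrics described here (any data at all, and fractional coverage of the stay), a minimal sketch follows; the function name and the minute-index representation are assumptions, not the authors' code.

```python
def stay_completeness(minute_timestamps, stay_start_min, stay_end_min):
    """Completeness sketch for one vital-sign stream in one ICU stay.

    minute_timestamps: minute indices at which a valid sample exists
    Returns (has_any_data, coverage_fraction) over the stay duration,
    mirroring the per-stay metrics described above.
    """
    duration = stay_end_min - stay_start_min
    in_stay = {t for t in minute_timestamps
               if stay_start_min <= t < stay_end_min}
    coverage = len(in_stay) / duration if duration > 0 else 0.0
    return (len(in_stay) > 0, coverage)

# Example: a 480-minute stay with HR samples for the first 384 minutes
has_data, cov = stay_completeness(list(range(384)), 0, 480)
print(has_data, round(cov, 2))   # True 0.8  (~80% coverage)
```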


2018 ◽  
Vol 35 (10) ◽  
pp. 1897-1911 ◽  
Author(s):  
Ya-Chien Feng ◽  
Frédéric Fabry

Abstract To properly use radar refractivity data quantitatively, good knowledge of its errors is required. The data quality of refractivity critically depends on the phase measurements of the ground targets used for the refractivity estimation. In this study, the observational error structure of refractivity is first estimated by quantifying the uncertainties of the phase measurements, the data processing, and the refractivity estimation method. New correlations between the time series of phase measurements at different elevation angles and between polarizations are developed to assess the bulk phase variability of individual targets. Then, the observational error of refractivity is obtained by simulating the uncertainties of the phase measurements through the original refractivity estimation method. The resulting errors in refractivity are found to be smaller than 1 N-unit in areas densely populated with reliable point-like stationary ground targets but grow as the target density becomes sparse.
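A minimal sketch of correlating two ground-target phase time series (for example, two elevation angles or two polarizations) is shown below; the unwrapping-then-Pearson approach is a plausible stand-in, not necessarily the exact statistic developed in the paper.

```python
import numpy as np

def phase_series_correlation(phase_a_deg, phase_b_deg):
    """Correlate two ground-target phase time series after unwrapping;
    a rough stand-in for the target-reliability screening described above."""
    a = np.unwrap(np.deg2rad(np.asarray(phase_a_deg, dtype=float)))
    b = np.unwrap(np.deg2rad(np.asarray(phase_b_deg, dtype=float)))
    return float(np.corrcoef(a, b)[0, 1])

# Example: two series sharing the same slow refractivity-driven drift
t = np.arange(100, dtype=float)
drift = 3.0 * t                       # shared phase drift [deg]
print(round(phase_series_correlation(drift + 5.0, drift + 2.0), 3))  # ~1.0
```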


2017 ◽  
Author(s):  
Weikai Li ◽  
Lishan Qiao ◽  
Zhengxia Wang ◽  
Dinggang Shen

Abstract Functional brain networks (FBN) have become an increasingly important measurement for exploring the cerebral working mechanism and for mining informative biomarkers that assist the diagnosis of some neurodegenerative disorders. Despite their potential for discovering valuable patterns hidden in the brain, the estimated FBNs are often heavily influenced by the quality of the observed data (e.g., BOLD signal series). In practice, a preprocessing pipeline is usually employed to improve data quality prior to FBN estimation; but even so, some data points in the time series are still not clean enough, possibly including original artifacts (e.g., micro head motion), non-resting-state functional disturbances (e.g., mind-wandering), and new "noise" introduced by the preprocessing pipeline itself. Therefore, not all data points in the time series can contribute to the subsequent FBN estimation. To address this issue, in this paper we propose a novel FBN estimation method that introduces a latent variable as an indicator of data quality, and we develop an alternating optimization algorithm for scrubbing the data and estimating the FBN simultaneously in a single framework. As a result, we can obtain more accurate FBNs with the self-scrubbed data. To illustrate the effectiveness of the proposed method, we conduct experiments on two publicly available datasets to identify mild cognitive impairment (MCI) patients from normal control (NC) subjects based on the estimated FBNs. Experimental results show that the proposed FBN modelling method achieves higher classification accuracy, significantly outperforming the baseline methods.
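The following toy sketch conveys the flavour of the alternating scheme (re-estimate the network from currently trusted time points, then re-score time points against the network); the objective, scoring rule, and `keep` fraction are invented for illustration and are not the paper's actual formulation.

```python
import numpy as np

def scrubbed_fbn(X, n_iter=10, keep=0.8):
    """Toy alternating optimization in the spirit of the proposed method.

    X: (T, N) array of BOLD signals, T time points, N regions.
    Alternates: (1) estimate the FBN as a correlation matrix over the
    currently trusted time points; (2) re-score each time point by a
    crude reconstruction residual and keep the best `keep` fraction.
    """
    T, N = X.shape
    w = np.ones(T, dtype=bool)                 # latent quality indicator
    for _ in range(n_iter):
        C = np.corrcoef(X[w].T)                # (1) FBN from trusted points
        recon = X @ C / N                      # network-smoothed signal
        score = np.sum((X - recon) ** 2, axis=1)   # (2) fit residual
        w = score <= np.quantile(score, keep)
    return C, w

# Example on random data: shapes and trusted-point count only
rng = np.random.default_rng(1)
X = rng.standard_normal((120, 10))             # 120 time points, 10 regions
C, trusted = scrubbed_fbn(X)
print(C.shape, int(trusted.sum()))             # (10, 10) ~96 trusted points
```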


2017 ◽  
Vol 2017 ◽  
pp. 1-11 ◽  
Author(s):  
Angel J. Lopez ◽  
Ivana Semanjski ◽  
Sidharta Gautama ◽  
Daniel Ochoa

Human travel behaviour has been addressed in many transport studies, where travel survey methods have been widely used to collect self-reported insights into daily mobility patterns. However, since the introduction of Global Navigation Satellite Systems (GNSS) and, more recently, smartphones with built-in GNSS, researchers have adopted these ubiquitous devices as tools for collecting mobility behaviour data. Although most studies recognize the applicability of this technology, it still has limitations, and these are rarely addressed in a quantified manner. Often the quality of the collected data is overestimated, and the errors propagate into the aggregated results, leaving incomplete knowledge of the confidence levels of the results and conclusions. In this study, we focus on the completeness aspect of data quality using GNSS data from four campaigns in the Flanders region of Belgium. The empirical results are based on mobility behaviour data collected through smartphones and include more than 450 participants over a period of twenty-nine months. Our findings show which transport mode is affected the most and how land use affects the quality of the collected data. In addition, we provide insights into the time to first fix that can be used for a better estimation of travel patterns.
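To make the completeness notion concrete, a minimal sketch for a single recorded trip is given below, including a simple time-to-first-fix (TTFF) computation; the 1 Hz logging assumption and function names are illustrative, not the campaigns' actual processing.

```python
def trip_completeness(fix_times_s, trip_start_s, trip_end_s, epoch_s=1.0):
    """Completeness sketch for one recorded trip.

    Returns the fraction of expected GNSS epochs actually logged
    (assuming one fix per `epoch_s` seconds) and the time to first
    fix (TTFF) relative to the trip start, or None if no fix at all.
    """
    expected = int((trip_end_s - trip_start_s) / epoch_s)
    logged = [t for t in fix_times_s if trip_start_s <= t < trip_end_s]
    ttff = (min(logged) - trip_start_s) if logged else None
    return (len(logged) / expected if expected else 0.0), ttff

# Example: a 10-minute trip, first fix after 35 s, then one fix per second
fixes = list(range(35, 600))
print(trip_completeness(fixes, 0, 600))   # (~0.94, 35)
```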



2017 ◽  
Vol 4 (1) ◽  
pp. 25-31 ◽  
Author(s):  
Diana Effendi

The Information Product Approach (IP Approach) is an information management approach. It can be used to manage information products and to analyse data quality. The IP-Map can be used by organizations to facilitate the management of knowledge in collecting, storing, maintaining, and using data in an organized manner. The data management process for academic activities at X University has not yet used the IP Approach. X University has not yet given attention to managing the quality of its information. So far, X University has concerned itself only with the system applications used to support the automation of data management in its academic activities. The IP-Map produced in this paper can be used as a basis for analysing the quality of data and information. With the IP-Map, X University is expected to identify which parts of the process need improvement in the quality of data and information management.

Index terms: IP Approach, IP-Map, information quality, data quality.

