Creation and Error Analysis of High Resolution DEM Based on Source Data Sets of Various Accuracy

Author(s):  
Jari Pohjola ◽  
Jari Turunen ◽  
Tarmo Lipping ◽  
Ari Ikonen
2020 ◽  
Author(s):  
Dick M. A. Schaap ◽  
Thierry Schmitt

Access to marine data is a key issue for the EU Marine Strategy Framework Directive and the EU Marine Knowledge 2020 agenda, and includes the European Marine Observation and Data Network (EMODnet) initiative. EMODnet aims at assembling European marine data, data products and metadata from diverse sources in a uniform way.

The EMODnet Bathymetry project has been active since 2008 and has developed Digital Terrain Models (DTM) for the European seas, which are published at regular intervals, each time improving quality and precision and expanding functionalities for viewing, using, and downloading. The DTMs are produced from survey and aggregated data sets that are referenced with metadata adopting the SeaDataNet Catalogue services. SeaDataNet is a network of major oceanographic data centres around the European seas that manage, operate and further develop a pan-European infrastructure for marine and ocean data management. The latest EMODnet Bathymetry DTM release also includes Satellite Derived Bathymetry (SDB) and has a grid resolution of 1/16 arcminute (circa 125 metres), covering all European sea regions. Use has been made of circa 9,400 gathered survey data sets, composite DTMs and SDB bathymetry. The catalogues and the EMODnet DTM are published at the dedicated EMODnet Bathymetry portal, which includes a versatile DTM viewing and downloading service.

As part of the expansion and innovation, more focus has been directed towards bathymetry for near-coastal waters and coastal zones, and Satellite Derived Bathymetry data have been produced and included to fill gaps in coverage of the coastal zones. The Bathymetry Viewing and Download service has been upgraded to provide a multi-resolution map and versatile 3D viewing. Moreover, best estimates of the European coastline have been determined for a range of tidal levels (HAT, MHW, MSL, Chart Datum, LAT), making use of a tidal model for Europe. In addition, a Quality Index layer has been formulated with indicators derived from the source data, which can be queried in the Bathymetry Viewing and Download service. Finally, extra functionality has been added to the mechanism for downloading DTM tiles in various formats, together with special high-resolution DTMs for areas of interest.

As a result, many users visit the portal, browse the DTM Viewer, download DTM tiles and make use of the OGC Web services to apply the EMODnet Bathymetry in their own applications.

The presentation will highlight key details of the EMODnet Bathymetry DTM production process and of the Bathymetry portal with its extensive functionality.


2021 ◽  
Vol 4 (1) ◽  
pp. 251524592092800
Author(s):  
Erin M. Buchanan ◽  
Sarah E. Crain ◽  
Ari L. Cunningham ◽  
Hannah R. Johnson ◽  
Hannah Stash ◽  
...  

As researchers embrace open and transparent data sharing, they will need to provide information about their data that effectively helps others understand their data sets’ contents. Without proper documentation, data stored in online repositories such as OSF will often be rendered unfindable and unreadable by other researchers and indexing search engines. Data dictionaries and codebooks provide a wealth of information about variables, data collection, and other important facets of a data set. This information, called metadata, provides key insights into how the data might be further used in research and facilitates search-engine indexing to reach a broader audience of interested parties. This Tutorial first explains terminology and standards relevant to data dictionaries and codebooks. Accompanying information on OSF presents a guided workflow of the entire process from source data (e.g., survey answers on Qualtrics) to an openly shared data set accompanied by a data dictionary or codebook that follows an agreed-upon standard. Finally, we discuss freely available Web applications to assist this process of ensuring that psychology data are findable, accessible, interoperable, and reusable.
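As a minimal illustration of the kind of per-variable metadata a codebook captures (this is a generic sketch, not the Tutorial's workflow or any particular standard; the variable names and responses are made up), a few lines of Python can derive a data dictionary from a tabular export:

```python
def build_codebook(rows, descriptions):
    """Build one codebook entry per variable in a list-of-dicts data set."""
    codebook = []
    for var in rows[0]:
        values = [r[var] for r in rows]
        numeric = all(isinstance(v, (int, float)) for v in values)
        entry = {
            "variable": var,
            "description": descriptions.get(var, ""),
            "type": "numeric" if numeric else "text",
            "n_unique": len(set(values)),
        }
        if numeric:  # record the observed range for numeric variables
            entry["min"], entry["max"] = min(values), max(values)
        codebook.append(entry)
    return codebook

# Hypothetical survey export (e.g. two Qualtrics responses)
rows = [
    {"participant_id": 1, "age": 24, "condition": "control"},
    {"participant_id": 2, "age": 31, "condition": "treatment"},
]
codebook = build_codebook(rows, {"age": "Age in years at survey time"})
```

Writing such entries out as a CSV or JSON file alongside the data set is what makes the repository contents machine-readable and indexable.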


2021 ◽  
Author(s):  
Xin Xu ◽  
Zongren Dai ◽  
Yifang Wang ◽  
Mingfang Li ◽  
Yidong Tan

An optical rotary sensor based on laser self-mixing interferometry is proposed, which enables noncontact, full-circle rotation measurement of non-cooperative targets with high resolution and sensitivity. The prototype demonstrates a resolution of 0.1 µrad and a linearity of 2.33×10⁻⁴. The stability of the prototype is 2 µrad over 3600 s, and the repeatability error is below 0.84° in nine-group full-circle tests. The theoretical resolution reaches 16 nrad. Random rotation has been successfully traced with a bionic hand simulating a tremor process. Error analysis and a discussion of limitations are also presented. Although the accuracy needs further improvement compared with the best rotary sensors, the method has unique advantages: non-cooperative target sensing, high sensitivity and electromagnetic immunity. Hence, the optical rotary sensor provides a promising alternative for precise rotation measurement, tremor tracing and nano-motion monitoring.
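The paper's full-circle sensing scheme is not reproduced here, but as a generic back-of-envelope sketch of the bookkeeping in self-mixing interferometry (the wavelength and effective radius below are invented, and the prototype's actual geometry certainly differs): each interference fringe corresponds to λ/2 of optical path change, so a fringe count maps to a small rotation angle via the lever arm to the target surface.

```python
def fringes_to_angle_rad(n_fringes, wavelength_m, radius_m):
    """Map a self-mixing fringe count to a rotation angle (radians).

    Each fringe corresponds to lambda/2 of optical path change; if the
    beam meets the rotating surface at effective radius r, the
    small-angle rotation is N * (lambda / 2) / r.
    """
    return n_fringes * (wavelength_m / 2.0) / radius_m

# Hypothetical numbers: a 1550 nm laser, 10 mm effective radius
theta = fringes_to_angle_rad(1, 1550e-9, 10e-3)  # one fringe ~ 77.5 urad
```

With sub-fringe phase interpolation, resolutions far below one fringe (into the sub-microradian range reported above) become plausible.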


2011 ◽  
Vol 30 (3) ◽  
pp. 13-18
Author(s):  
Piotr Dzieszko

Reconstruction of the external orientation of analogue aerial photographs based on geoportal data. Acquiring source data for geoinformation analyses normally requires field work, which is time-consuming. Photogrammetric and remote-sensing methods can be a more effective choice; in particular, orthophotomap extraction is an efficient way to create geodata, providing a good foundation for further analysis and a useful extension of existing geographical information systems. Despite the fast growth of digital photogrammetry, there are plenty of analogue, archival air photos that can be used for geoinformation analysis. They are often quite up to date and scanned at very high resolution, which means they can support reliable analyses. The problem is that many of these analogue, archival air photos lack photogrammetric orientation data. The aim of this paper is to demonstrate how the Geoportal web service, established under the INSPIRE directive, can be used for external orientation reconstruction when no other georeference data are available.
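One elementary building block of such a reconstruction is fitting a transformation between scanned-photo pixel coordinates and ground coordinates of control points read off the geoportal. The sketch below fits a 2D affine transform from three ground control points; this is a deliberate simplification of full photogrammetric space resection, and all coordinates are hypothetical.

```python
def solve3(A, b):
    """Gauss-Jordan solve of a 3x3 linear system (assumes non-zero pivots)."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(3):
        piv = M[i][i]
        M[i] = [v / piv for v in M[i]]
        for j in range(3):
            if j != i:
                f = M[j][i]
                M[j] = [vj - f * vi for vj, vi in zip(M[j], M[i])]
    return [M[k][3] for k in range(3)]

def affine_from_gcps(pixel_pts, ground_pts):
    """Fit X = a*x + b*y + c and Y = d*x + e*y + f from three GCPs."""
    A = [[x, y, 1.0] for x, y in pixel_pts]
    abc = solve3(A, [gx for gx, _ in ground_pts])
    def_ = solve3(A, [gy for _, gy in ground_pts])
    return abc, def_

# Hypothetical GCPs: pixel (x, y) -> ground (X, Y) read off the geoportal
pixel = [(1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
ground = [(102.0, 200.0), (100.0, 198.0), (102.0, 198.0)]
abc, def_ = affine_from_gcps(pixel, ground)
```

In practice one would use more than three points and a least-squares (or full collinearity-equation) adjustment, so that residuals at the control points can flag bad geoportal matches.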


2021 ◽  
Author(s):  
Jouke de Baar ◽  
Gerard van der Schrier ◽  
Irene Garcia-Marti ◽  
Else van den Besselaar

Objective

The purpose of the European Copernicus Climate Change Service (C3S) is to support society by providing information about the past, present and future climate. For the service related to in-situ observations, one of the objectives is to provide high-resolution (0.1×0.1 and 0.25×0.25 degree) gridded wind speed fields. The gridded wind fields are based on ECA&D daily average station observations for the period 1970-2020.

Research question

We address the following research questions: [1] How efficiently can we provide the gridded wind fields as a statistically reliable ensemble, in order to represent the uncertainty of the gridding? [2] How efficiently can we exploit high-resolution geographical auxiliary variables (e.g. digital elevation model, terrain roughness) to augment the station data from a sparse network, in order to provide gridded wind fields with high-resolution local features?

Approach

In our analysis, we apply greedy forward-selection linear regression (FSLR) to include the high-resolution effects of the auxiliary variables on monthly-mean data. These data provide a 'background' for the daily estimates. We apply cross-validation to avoid FSLR over-fitting and use full-cycle bootstrapping to create FSLR ensemble members. Then, we apply Gaussian process regression (GPR) to regress the daily anomalies. We consider the effect of the spatial distribution of station locations on the GPR gridding uncertainty.

The goal of this work is to produce several decades of daily gridded wind fields; hence, computational efficiency is of utmost importance. We alleviate the computational cost of the FSLR and GPR analyses by incorporating greedy algorithms and sparse matrix algebra.

Novelty

The gridded wind fields are calculated as a statistical ensemble of realizations. In the present analysis, the ensemble spread is based on uncertainties arising from the auxiliary variables as well as from the spatial distribution of stations.

Cross-validation is used to tune the GPR hyperparameters. Where conventional GPR hyperparameter tuning aims at an optimal prediction of the gridded mean, we instead tune the GPR hyperparameters for optimal prediction of the gridded ensemble spread.

Building on our experience with providing similar gridded climate data sets, this set of gridded wind fields is a novel addition to the E-OBS climate data sets.
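A heavily simplified sketch of the greedy forward-selection idea (synthetic data, pure Python; the operational C3S/E-OBS implementation is not reproduced here): at each step, add the auxiliary variable whose one-dimensional least-squares fit most reduces the residual sum of squares, then regress the next candidate on the updated residuals.

```python
def fit_1d(x, y):
    """Ordinary least squares y ~ a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    return a, my - a * mx

def rss(x, y, a, b):
    """Residual sum of squares of the fit y ~ a*x + b."""
    return sum((yi - (a * xi + b)) ** 2 for xi, yi in zip(x, y))

def forward_select(candidates, y, n_select):
    """Greedily add the variable that best fits the current residuals."""
    residual, chosen = list(y), []
    for _ in range(n_select):
        name, x = min(
            candidates.items(),
            key=lambda kv: rss(kv[1], residual, *fit_1d(kv[1], residual)))
        a, b = fit_1d(x, residual)
        residual = [ri - (a * xi + b) for xi, ri in zip(x, residual)]
        chosen.append(name)
        candidates = {k: v for k, v in candidates.items() if k != name}
    return chosen

# Synthetic stations: wind speed is mostly explained by elevation
aux = {"elevation": [0.0, 1.0, 2.0, 3.0, 4.0],
       "roughness": [1.0, 0.0, 1.0, 0.0, 1.0]}
wind = [0.1, 3.0, 6.1, 9.0, 12.1]
order = forward_select(aux, wind, 2)  # picks "elevation" first
```

In the operational setting the stopping point of the greedy loop would be chosen by cross-validation rather than a fixed `n_select`, which is what guards against over-fitting.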


2021 ◽  
Vol 53 (1) ◽  
Author(s):  
Bambang Sukresno ◽  
Dinarika Jatisworo ◽  
Rizki Hanintyo

Sea surface temperature (SST) is an important variable in oceanography. One source of SST data is the Global Change Observation Mission-Climate (GCOM-C) satellite, and these data need to be validated before being applied in various fields. This study aimed to validate SST data from the GCOM-C satellite in the Indonesian seas. Validation was performed against the Multi-scale Ultra-high Resolution sea surface temperature (MUR-SST) analysis and the in-situ SST Quality Monitor (iQuam) data set. The data used are the daily GCOM-C SST data set from January to December 2018, together with the daily MUR-SST and iQuam data sets for the same period. The validation was carried out using the three-way error analysis method. The results showed that the accuracy of the GCOM-C SST was 0.37 °C.
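Three-way error analysis is commonly implemented as triple collocation: assuming the three collocated data sets have mutually independent errors, each error variance can be estimated from cross-products of the pairwise differences, since the common true signal cancels. A self-contained sketch on synthetic numbers (not the actual GCOM-C/MUR-SST/iQuam data):

```python
import random

def triple_collocation(x1, x2, x3):
    """Estimate each series' error variance from cross-differences."""
    def mean(v):
        return sum(v) / len(v)
    # E[(x1-x2)(x1-x3)] = var(e1) when errors are mutually independent
    v1 = mean([(a - b) * (a - c) for a, b, c in zip(x1, x2, x3)])
    v2 = mean([(b - a) * (b - c) for a, b, c in zip(x1, x2, x3)])
    v3 = mean([(c - a) * (c - b) for a, b, c in zip(x1, x2, x3)])
    return v1, v2, v3

# Synthetic collocated SST series (deg C); error std devs 0.4, 0.2, 0.1
random.seed(0)
truth = [25.0 + random.gauss(0, 1.0) for _ in range(20000)]
x1 = [t + random.gauss(0, 0.4) for t in truth]  # e.g. a satellite retrieval
x2 = [t + random.gauss(0, 0.2) for t in truth]  # e.g. an analysis product
x3 = [t + random.gauss(0, 0.1) for t in truth]  # e.g. in-situ observations
v1, v2, v3 = triple_collocation(x1, x2, x3)
# v1, v2, v3 approximate 0.16, 0.04, 0.01 (the squared error std devs)
```

Taking the square root of each estimated variance gives an error standard deviation per data set, which is the kind of figure behind an accuracy statement such as 0.37 °C.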


2014 ◽  
Vol 11 (6) ◽  
pp. 6139-6166 ◽  
Author(s):  
T. R. Marthews ◽  
S. J. Dadson ◽  
B. Lehner ◽  
S. Abele ◽  
N. Gedney

Abstract. Modelling land surface water flow is of critical importance for simulating land-surface fluxes, predicting runoff and water table dynamics, and for many other applications of Land Surface Models. Many approaches are based on the popular hydrology model TOPMODEL, whose most important parameter is the well-known topographic index. Here we present new, high-resolution parameter maps of the topographic index for all ice-free land pixels, calculated from hydrologically conditioned HydroSHEDS data sets using the GA2 algorithm. At 15 arcsec resolution, these layers are 4× finer than the previously best-available topographic index layers, the Compound Topographic Index (CTI) of HYDRO1k. For the largest river catchments on each continent, we found that CTI values were up to 20% higher than our revised values, e.g. in the Amazon. The highest catchment means were found for the Murray-Darling and Nelson-Saskatchewan basins, rather than for the Amazon and St. Lawrence as derived from the CTI. We believe these new index layers represent the most robust existing global-scale topographic index values and hope that they will be widely used in land surface modelling applications in the future.
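The topographic index itself has the standard TOPMODEL definition ln(a / tan β), where a is the specific upslope contributing area (upslope area per unit contour width) and tan β is the local slope. A toy single-cell computation is sketched below; the numbers are invented, and the paper's GA2 processing of HydroSHEDS involves far more (flow routing, hydrological conditioning) than this formula alone.

```python
import math

def topographic_index(upslope_area_m2, contour_width_m, slope_tan):
    """TOPMODEL topographic index ln(a / tan(beta)) for one grid cell."""
    a = upslope_area_m2 / contour_width_m  # specific catchment area (m)
    return math.log(a / slope_tan)

# e.g. four upslope cells of 450 m x 450 m (roughly 15 arcsec) draining
# across one 450 m cell edge, on a 5% slope:
ti = topographic_index(4 * 450.0 * 450.0, 450.0, 0.05)
```

High index values (large drained area, gentle slope) mark cells prone to soil saturation, which is why the index is the key control on TOPMODEL's water table dynamics.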

