SPATIAL DATA QUALITY AND A WORKFLOW TOOL

Author(s):  
M. Meijer ◽  
L. A. E. Vullings ◽  
J. D. Bulens ◽  
F. I. Rip ◽  
M. Boss ◽  
...  

Although perceived as important by many, spatial data quality has hardly ever taken centre stage unless something went wrong because of bad quality. However, we think this is about to change. We rely more and more on data-driven processes, and the increased availability of data means there is a choice in which data to use. How should that choice be made? We think spatial data quality has potential as a selection criterion.

In this paper we focus on how a workflow tool can help the consumer as well as the producer gain a better understanding of which product characteristics are important. For this purpose, we have developed a framework in which we define different roles (consumer, producer and intermediary) and differentiate between product specifications and quality specifications. A number of requirements are stated that can be translated into quality elements. We used case studies to validate our framework, which is designed following the fitness-for-use principle. The framework also includes software that in some cases can help ascertain the quality of datasets.

2019 ◽  
Vol 1 ◽  
pp. 1-2
Author(s):  
Nils Mesterton ◽  
Mari Isomäki ◽  
Antti Jakobsson ◽  
Joonas Jokela

<p><strong>Abstract.</strong> The Finnish National Topographic Database (NTDB) is currently developed by the National Land Survey of Finland (NLS) together with municipalities and other governmental agencies. It will be a harmonized database for topographic data in Finland provided by municipalities, the NLS and other agencies. The NTDB has been divided into several themes, of which the buildings theme was the focus of the first stage of development. Data collection for the NTDB is performed by different municipalities and governmental organizations. Having many supplying organizations can lead to inconsistencies in spatial data; without a robust quality process this could lead to chaos. Fortunately, data quality can be controlled with an automated data quality evaluation process. Reaching a better degree of harmonization across the database is one of the main goals of the NTDB in the future, besides reducing the amount of overlapping work and making national topographic data more accessible to all potential users.</p><p>The aim of the NTDB spatial data management system architecture is to be modular. Therefore, the Data Quality Module, named QualityGuard, can also be utilized in the National Geospatial Platform, which will be a key component in the future Spatial Data Infrastructure of Finland. The National Geospatial Platform will include the NTDB data themes but also addresses, detailed plans and other land use information. FME was chosen as the implementation platform of the QualityGuard because it is robust and highly adaptable, allowing development of even the most complicated ETL workflows and spatial applications. This approach allows effortless communication with different applications via various types of interfaces, thus efficiently enabling the modularity requirement in all stages of development and integration.</p><p>The QualityGuard works in two modes: a) as part of the import process to the NTDB, and b) independently. 
Users can validate their data with the independent QualityGuard to find possible errors and fix them. Once the data is validated and fixed, data producers can import it using the import option. Users receive a data quality report containing statistics and a quality error dataset for their imported data, which can be inspected in any GIS software, e.g. overlaid on the original features. Geographical locations of quality errors are displayed as points. Each error finding produces a row in the error dataset, containing information about the type and cause of the error as short descriptions.</p><p>Data quality evaluation is based on validating conformance against data product specifications expressed as quality rules. Three ISO 19157 quality elements are utilized: format consistency, domain consistency and topological consistency. The quality rules have been defined in co-operation between specialists in the field and the technical development team. The definition work is based on the concept developed in the ESDIN project, the quality specifications of INSPIRE, national topographic database quality specifications, national and international quality recommendations and standards, quality rules developed in the European Location Framework (ELF) project, and interviews with experts from the National Land Survey of Finland and municipalities. In fact, the NLS was one of the first agencies in the world to publish a quality model for digital topographic data, in 1995.</p><p>Quality rules are currently documented in spreadsheets, one per theme. Each quality rule has been defined using RuleSpeak, a structured notation for expressing business rules. RuleSpeak provides a consistent structure for each definition. The rules are divided into general rules and feature-specific rules. 
General rules are relevant for all feature types of a specific theme, although exceptions can be defined.</p><p>A nation-wide, centralized, automated spatial data quality process is one of the key elements in the effort towards better harmonization of the NTDB. In principle, the greater aim is to achieve compliance with the auditing process described in ISO 19158. This process is meant to ensure that the supplying organizations are capable of delivering data of expected quality. However, implementing a nation-wide process is rather challenging because municipalities and other organizations might not have the capability or resources to repair the quality issues identified by the QualityGuard. Inconsistent data quality is not desirable, and data quality requirements will be less strict in the first phases of implementation. Some of the issues will be repaired automatically by the software once the process has been established, but the organizations will still receive a notification about data quality issues in any conflicting features.</p><p>The Finnish NTDB is in a continuous state of development, and effort is currently directed towards automation, improved data quality and less overlapping work in co-operation with municipalities and other data producers. The QualityGuard has enabled an automated spatial data quality validation process for incoming data, and it is currently being evaluated in practice. The results have already been well received by the users. Automated data quality validation is no longer fiction; as indicated earlier, we believe it will become common practice for all SDI datasets in Finland.</p>
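The validation pattern described above, checking features against quality rules and emitting point-located error records with a type and cause, can be sketched in a few lines. The following is a minimal, hypothetical illustration of a domain-consistency check in the spirit of the QualityGuard; the feature structure, field names and allowed code list are assumptions for the example, not the actual NTDB schema or rule set.

```python
# Hypothetical sketch of an automated domain-consistency check (ISO 19157):
# validate each feature's attribute value against an allowed code list and
# emit one error record per violation, located as a point. All names and
# values here are illustrative assumptions, not the NTDB specification.

ALLOWED_BUILDING_USES = {"residential", "commercial", "industrial", "public"}

def check_domain_consistency(features):
    """Return one error record per feature whose 'use' attribute
    falls outside the allowed code list."""
    errors = []
    for f in features:
        if f["use"] not in ALLOWED_BUILDING_USES:
            errors.append({
                "feature_id": f["id"],
                "location": f["point"],  # error displayed as a point in GIS
                "quality_element": "domain consistency",
                "description": f"value '{f['use']}' not in allowed code list",
            })
    return errors

features = [
    {"id": 1, "point": (384500.0, 6674200.0), "use": "residential"},
    {"id": 2, "point": (384620.0, 6674310.0), "use": "warehouse"},  # invalid
]
report = check_domain_consistency(features)
for err in report:
    print(err["feature_id"], err["quality_element"], err["description"])
```

In a production setting such checks would be authored as rules (e.g. in RuleSpeak) and executed by an ETL platform such as FME rather than hand-written code, but the input/output shape, features in, point-located error rows out, is the same.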


2013 ◽  
Vol 7 (10) ◽  
pp. 771-788 ◽  
Author(s):  
Krista Jones ◽  
Rodolphe Devillers ◽  
Yvan Bédard ◽  
Olaf Schroth

Spatium ◽  
2004 ◽  
pp. 77-83 ◽  
Author(s):  
Dusan Joksic ◽  
Branislav Bajat

The last decade of the past century, as well as the first years of the present one, brought an expansion of technologies for gathering and processing spatial data, which form a physical basis for the management of spatial development. This has resulted in an enlargement of the spatial data market. New technologies, embodied in computer applications, have greatly expanded the number of users of these products. The philosophy of spatial data collection has changed: analogue maps and plans printed on paper have been replaced by digital databases, which can present the data in whatever way best suits a particular user and which users can further upgrade themselves. Two aspects are particularly important in the production and distribution of such databases. Firstly, the users of these databases should be the ones who decide which of the available databases can satisfy their requirements, in other words, what level of data quality is necessary for a certain application. On the other hand, the visualization of digital databases can often mislead, since a rendering of a database may suggest better accuracy than the data actually have. Users should therefore have access to methods that indicate the quality of the selected data during analysis. Adopted international standards, or specially developed procedures and methodologies (so-called de facto standards), can be used in this processing to enable the estimation of data quality. The development of the Open GIS concept requires the adoption of widely accepted standards for spatial data quality. It is recommended that ISO standards be adopted, primarily the ISO/TC 211 standards on geographic information and geomatics. 
The ISO standardization projects are expected to be finished by 2006, so all parties involved with these databases should be familiar with the project and ready to adapt to its solutions. This paper explains the basic components defining the quality of databases and presents the results obtained so far in standardizing the procedures and methodology of quality assessment.


Author(s):  
J.-F. Hangouët

The many facets of what is encompassed by an expression such as “quality of spatial data” can be considered a specific domain of reality worthy of formal description, i.e. of ontological abstraction. Various ontologies for data quality elements have already been proposed in the literature. Today, the system of quality elements is most often used and discussed according to the configuration set out in the “data dictionary for data quality” of the international standard ISO 19157. Our communication proposes an alternative view, founded on a perspective that focuses on the specificity of spatial data as a product: the representation perspective, where data in the computer are meant to show things of the geographic world and to be interpreted as such. The resulting ontology introduces new elements, the usefulness of which is illustrated with orthoimagery examples.


2021 ◽  
Vol 10 (4) ◽  
pp. 265
Author(s):  
Godwin Yeboah ◽  
João Porto de Albuquerque ◽  
Rafael Troilo ◽  
Grant Tregonning ◽  
Shanaka Perera ◽  
...  

This paper examines OpenStreetMap data quality at different stages of a participatory mapping process in seven slums in Africa and Asia. Data were drawn from an OpenStreetMap-based participatory mapping process developed as part of a research project focusing on understanding inequalities in healthcare access of slum residents in the Global South. Descriptive statistics and qualitative analysis were employed to examine the following research question: What is the spatial data quality of collaborative remote mapping achieved by volunteer mappers in morphologically complex urban areas? Findings show that the completeness achieved by remote mapping largely depends on the morphology and characteristics of slums, such as building density and rooftop architecture, varying from 84% in the best case to zero in the most difficult site. The major scientific contribution of this study is to provide evidence on the spatial data quality of data produced remotely through volunteer mapping efforts in morphologically complex urban areas such as slums; the results give insights into how much fieldwork is needed at each level of complexity and to what extent the involvement of local volunteers in these efforts is required.
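The completeness figures quoted above reduce to a simple ratio: the share of reference buildings that remote volunteer mapping actually captured. A minimal sketch, with illustrative counts rather than figures from the study sites:

```python
# Minimal sketch of the completeness measure discussed above: the share of
# reference buildings captured by remote volunteer mapping, in percent.
# The counts below are illustrative assumptions, not data from the study.

def completeness(mapped_count, reference_count):
    """Completeness in percent; 0 if the reference set is empty."""
    if reference_count == 0:
        return 0.0
    return 100.0 * mapped_count / reference_count

# e.g. a site where remote mappers captured 84 of 100 reference buildings
best_case = completeness(84, 100)
print(best_case)
```

In practice the harder part is establishing the reference count, which typically requires fieldwork or high-resolution imagery interpretation, which is exactly why completeness varies so strongly with building density and rooftop architecture.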


Author(s):  
G. Vosselman ◽  
S. J. Oude Elberink ◽  
M. Y. Yang

<p><strong>Abstract.</strong> The ISPRS Geospatial Week 2019 is a combination of 13 workshops organised by 30 ISPRS Working Groups active in areas of interest of ISPRS. The Geospatial Week 2019 is held from 10–14 June 2019 and is convened by the University of Twente acting as local organiser. The Geospatial Week 2019 is the fourth edition, after Antalya, Turkey in 2013, La Grande Motte, France in 2015 and Wuhan, China in 2017.</p><p>The following 13 workshops provide excellent opportunities to discuss the latest developments in the fields of sensors, photogrammetry, remote sensing, and spatial information sciences:</p> <ul> <li>C3M&amp;GBD – Collaborative Crowdsourced Cloud Mapping and Geospatial Big Data</li> <li>CHGCS – Cryosphere and Hydrosphere for Global Change Studies</li> <li>EuroCow-M3DMaN – Joint European Calibration and Orientation Workshop and Workshop on Multi-sensor Systems for 3D Mapping and Navigation</li> <li>HyperMLPA – Hyperspectral Sensing meets Machine Learning and Pattern Analysis</li> <li>Indoor3D</li> <li>ISSDQ – International Symposium on Spatial Data Quality</li> <li>IWIDF – International Workshop on Image and Data Fusion</li> <li>Laser Scanning</li> <li>PRSM – Planetary Remote Sensing and Mapping</li> <li>SarCon – Advances in SAR: Constellations, Signal Processing, and Applications</li> <li>Semantics3D – Semantic Scene Analysis and 3D Reconstruction from Images and Image Sequences</li> <li>SmartGeoApps – Advanced Geospatial Applications for Smart Cities and Regions</li> <li>UAV-g – Unmanned Aerial Vehicles in Geomatics</li> </ul> <p>Many of the workshops are part of well-established series of workshops convened in the past. 
They cover topics like UAV photogrammetry, laser scanning, spatial data quality, scene understanding, hyperspectral imaging, and crowdsourcing and collaborative mapping, with applications ranging from indoor mapping and smart cities to global cryosphere and hydrosphere studies and planetary mapping.</p><p>In total, 143 full papers and 357 extended abstracts were submitted by authors from 63 countries. 1250 reviews were delivered by 295 reviewers. A total of 81 full papers have been accepted for volume IV-2/W5 of the International Annals of Photogrammetry, Remote Sensing and Spatial Information Sciences. Another 289 papers are published in volume XLII-2/W13 of the International Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences.</p><p>The editors would like to thank all contributing authors, reviewers and workshop organizers for their role in preparing and organizing the Geospatial Week 2019. Thanks to their contributions, we can offer an extensive and varied collection in the Annals and the Archives.</p><p>We hope you enjoy reading the proceedings.</p><p>George Vosselman, Geospatial Week Director 2019, General Chair<br /> Sander Oude Elberink, Programme Chair<br /> Michael Ying Yang, Programme Chair</p>


Author(s):  
Philip Rocco ◽  
Jessica A. J. Rich ◽  
Katarzyna Klasa ◽  
Kenneth A. Dubin ◽  
Daniel Béland

Abstract. Context: While the World Health Organization (WHO) has established guidance on COVID-19 surveillance, little is known about implementation of these guidelines in federations, which fragment authority across multiple levels of government. This study examines how subnational governments in federal democracies collect and report data on COVID-19 cases and mortality associated with COVID-19. Methods: We collected data from subnational government websites in 15 federal democracies to construct indices of COVID-19 data quality. Using bivariate and multivariate regression, we analyzed the relationship between these indices and indicators of state capacity, the decentralization of resources and authority, and the quality of democratic institutions. We supplement these quantitative analyses with qualitative case studies of subnational COVID-19 data in Brazil, Spain, and the United States. Findings: Subnational governments in federations vary in their collection of data on COVID-19 mortality, testing, hospitalization, and demographics. There are statistically significant associations (p < 0.05) between subnational data quality and key indicators of public health system capacity, fiscal decentralization, and the quality of democratic institutions. Case studies illustrate the importance of both governmental and civil-society institutions that foster accountability. Conclusions: The quality of subnational COVID-19 surveillance data in federations depends in part on public health system capacity, fiscal decentralization, and the quality of democracy.

