MEASURES AND INDICATORS OF VGI QUALITY: AN OVERVIEW

Author(s):  
V. Antoniou ◽  
A. Skopeliti

The evaluation of VGI quality has been a very interesting and popular issue amongst academics and researchers. Various metrics and indicators have been proposed for evaluating VGI quality elements, and various efforts have focused on the use of well-established methodologies to evaluate these elements against authoritative data. In this paper, a number of research papers are reviewed and summarized in a detailed report on measures for each spatial data quality element. Emphasis is given to the methodology followed and the data used to assess and evaluate the quality of the VGI datasets. However, as the use of authoritative data is not always possible, many researchers have turned their focus to the analysis of new quality indicators that can function as proxies for understanding VGI quality. In this paper, the difficulties in using authoritative datasets are briefly presented and newly proposed quality indicators, as recorded through the literature review, are discussed. We classify these new indicators into four main categories that relate to: i) data, ii) demographics, iii) socio-economic situation and iv) contributors. This paper presents a dense yet comprehensive overview of the research in this field and provides a basis for the ongoing academic effort to create a practical quality evaluation method through the use of appropriate quality indicators.
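The abstract gives no concrete scoring scheme, but the four-category classification can be illustrated with a small sketch. The indicator names, values and weights below are hypothetical assumptions, chosen only to show how proxy indicators from the four categories might be combined into a single quality proxy.

```python
from dataclasses import dataclass

# The four proxy-indicator categories named in the paper.
CATEGORIES = ("data", "demographics", "socio-economic", "contributors")

@dataclass
class Indicator:
    name: str          # hypothetical indicator, e.g. edit count per feature
    category: str      # one of CATEGORIES
    value: float       # normalized to [0, 1]; higher = better quality proxy
    weight: float      # assumed relative importance

def proxy_quality_score(indicators: list[Indicator]) -> float:
    """Weighted average of normalized proxy indicators (illustrative only)."""
    total_weight = sum(i.weight for i in indicators)
    return sum(i.value * i.weight for i in indicators) / total_weight

indicators = [
    Indicator("edits_per_feature", "data", 0.8, 2.0),
    Indicator("population_density", "demographics", 0.6, 1.0),
    Indicator("mean_income_level", "socio-economic", 0.5, 1.0),
    Indicator("contributor_experience", "contributors", 0.7, 2.0),
]
print(f"proxy quality score: {proxy_quality_score(indicators):.2f}")
```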

2019 ◽  
Vol 1 ◽  
pp. 1-2
Author(s):  
Nils Mesterton ◽  
Mari Isomäki ◽  
Antti Jakobsson ◽  
Joonas Jokela

Abstract. The Finnish National Topographic Database (NTDB) is currently being developed by the National Land Survey of Finland (NLS) together with municipalities and other governmental agencies. It will be a harmonized database for topographic data in Finland provided by municipalities, the NLS and other agencies. The NTDB has been divided into several themes, of which the buildings theme was the focus of the first stage of development. Data collection for the NTDB is performed by different municipalities and governmental organizations. Having many supplying organizations can lead to inconsistencies in spatial data; without a robust quality process this could lead to chaos. Fortunately, data quality can be controlled with an automated data quality evaluation process. Reaching a better degree of harmonization across the database is one of the main goals of the NTDB in the future, besides reducing the amount of overlapping work and making national topographic data more accessible to all potential users.

The aim of the NTDB spatial data management system architecture is to be modular. Therefore, the Data Quality Module, named QualityGuard, can also be utilized in the National Geospatial Platform, which will be a key component of the future Spatial Data Infrastructure of Finland. The National Geospatial Platform will include the NTDB data themes but also addresses, detailed plans and other land use information. FME was chosen as the implementation platform of the QualityGuard because it is robust and highly adaptable, allowing development of even the most complicated ETL workflows and spatial applications. This approach allows effortless communication with different applications via various types of interfaces, thus efficiently enabling the modularity requirement in all stages of development and integration.

The QualityGuard works in two modes: a) as part of the import process to the NTDB, and b) independently. Users can validate their data using the independent QualityGuard to find possible errors in their data and fix them. Once the data has been validated and fixed, data producers can import it using the import option. The users receive a data quality report containing statistics and a quality error dataset for their imported data, which can be inspected in any GIS software, e.g. overlaid on the original features. Geographical locations of quality errors are displayed as points. Each error finding produces a row in the error dataset, containing short descriptions of the type and cause of the error.

Data quality evaluation is based on validating conformance against data product specifications expressed as quality rules. Three ISO 19157 quality elements are utilized: format consistency, domain consistency and topological consistency. The quality rules have been defined in co-operation between specialists in the field and the technical development team. The definition work is based on the concept developed in the ESDIN project, the quality specifications of INSPIRE, national topographic database quality specifications, national and international quality recommendations and standards, quality rules developed in the European Location Framework (ELF) project, and interviews with experts from the National Land Survey of Finland and municipalities. In fact, the NLS was one of the first agencies in the world to publish a quality model for digital topographic data, in 1995.

Quality rules are currently documented in spreadsheet documents, one per theme. Each quality rule has been defined using RuleSpeak, a structured notation for expressing business rules. RuleSpeak provides a consistent structure for each definition. The rules are divided into general rules and feature-specific rules. General rules are relevant for all feature types of a specific theme, although exceptions can be defined.

A nation-wide, centralized, automated spatial data quality process is one of the key elements in the effort towards better harmonization of the NTDB. In principle, the greater aim is to achieve compliance with the auditing process described in ISO 19158, which is meant to ensure that the supplying organizations are capable of delivering data of the expected quality. However, implementing a nation-wide process is rather challenging because municipalities and other organizations might not have the capability or resources to repair the quality issues identified by the QualityGuard. Inconsistent data quality is not desirable, and data quality requirements will be less strict in the first phases of implementation. Some issues will be repaired automatically by the software once the process has been established, but the organizations will still receive a notification about data quality issues in any conflicting features.

The Finnish NTDB is in a continuous state of development, and effort is currently directed towards automation, improved data quality and less overlapping work, in co-operation with municipalities and other data producers. The QualityGuard has enabled an automated spatial data quality validation process for incoming data and is currently being evaluated in practice. The results have already been well received by the users. Automated data quality validation is no longer fiction; we believe it will become common practice for all SDI datasets in Finland.
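The paper does not publish the QualityGuard rule implementations, but the sketch below illustrates the general pattern it describes: validating features against the three ISO 19157 elements used (format, domain and topological consistency) and emitting point-located error records with short descriptions of type and cause. The feature attributes, code list and rules here are illustrative assumptions, not the NTDB's actual specifications.

```python
from dataclasses import dataclass

@dataclass
class QualityError:
    feature_id: str
    element: str      # ISO 19157 element: format/domain/topological consistency
    description: str
    location: tuple   # (x, y) point where the error is reported

# Hypothetical building features: id, attributes, and a polygon outline ring.
features = [
    {"id": "b1", "attrs": {"use": "residential", "floors": 3},
     "ring": [(0, 0), (10, 0), (10, 8), (0, 8), (0, 0)]},
    {"id": "b2", "attrs": {"use": "castle", "floors": -1},
     "ring": [(20, 0), (30, 0), (30, 8)]},          # ring not closed
]

ALLOWED_USE = {"residential", "commercial", "industrial"}  # assumed code list

def validate(feature) -> list[QualityError]:
    errors = []
    fid, attrs, ring = feature["id"], feature["attrs"], feature["ring"]
    anchor = ring[0]
    # Format consistency: required attributes must be present.
    for required in ("use", "floors"):
        if required not in attrs:
            errors.append(QualityError(fid, "format consistency",
                                       f"missing attribute '{required}'", anchor))
    # Domain consistency: attribute values must come from the code list / range.
    if attrs.get("use") not in ALLOWED_USE:
        errors.append(QualityError(fid, "domain consistency",
                                   f"'use' value {attrs.get('use')!r} not in code list", anchor))
    if attrs.get("floors", 0) < 1:
        errors.append(QualityError(fid, "domain consistency",
                                   "'floors' must be a positive integer", anchor))
    # Topological consistency: the building outline must be a closed ring.
    if ring[0] != ring[-1]:
        errors.append(QualityError(fid, "topological consistency",
                                   "outline ring is not closed", ring[-1]))
    return errors

# Each error becomes one row of the error dataset, inspectable in any GIS.
error_dataset = [e for f in features for e in validate(f)]
for e in error_dataset:
    print(e)
```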


2019 ◽  
Vol 90 (7-8) ◽  
pp. 838-846
Author(s):  
Meng Wang ◽  
Yinlan Zhan ◽  
Leon Yao

The theory of purification is proposed in this article. Within the framework of this theory, several models can be built to give a synthetic quality evaluation of interlacing yarns. In this paper, three models are given. The first is the most popular method for evaluating the quality of interlacing yarn, and we prove that it conforms to the theory of purification. The second is modified according to the coefficient of variation. The third is a new evaluation method. We also give a test algorithm to compare how well the three models reflect common cognition. Based on the comparison and analysis, the new evaluation method is recommended: it has a comparatively complete evaluation value for every state in universe M. Compared to other modeling methods, it can give every level of interlacing yarn a relatively pertinent evaluation.
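The paper's three models are not reproduced here, but its second model builds on the coefficient of variation, a standard evenness measure. A minimal sketch, assuming hypothetical nip-spacing measurements for two interlaced yarns (the data and its interpretation are illustrative assumptions):

```python
import statistics

def coefficient_of_variation(samples):
    """CV = standard deviation / mean, a standard evenness measure."""
    mean = statistics.fmean(samples)
    return statistics.stdev(samples) / mean

# Hypothetical nip-to-nip spacing measurements (mm) for two yarns.
yarn_a = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0]
yarn_b = [7.0, 12.5, 9.0, 14.0, 8.5, 11.0]

for name, samples in (("A", yarn_a), ("B", yarn_b)):
    cv = coefficient_of_variation(samples)
    print(f"yarn {name}: CV = {cv:.3f} (lower = more even interlacing)")
```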


Algorithms ◽  
2020 ◽  
Vol 13 (10) ◽  
pp. 257
Author(s):  
Wu Dong ◽  
Hongxia Bie ◽  
Likun Lu ◽  
Yeli Li

Screen content images (SCIs) are now widely used. However, since SCIs have distinctly different properties from natural images, traditional quality assessment methods for natural images cannot precisely evaluate the quality of SCIs. Thus, we propose a blind quality evaluation method for SCIs based on regionalized structural features that are closely related to the intrinsic quality of SCIs. Firstly, the features of textual and pictorial regions of SCIs are extracted separately. For textual regions, which contain noticeable structural information, we propose improved histograms of oriented gradients extracted from multi-order derivatives as structural features. For pictorial regions, since human vision is sensitive to texture information and luminance variation, we adopt texture as the structural feature, with luminance as an auxiliary feature. The local derivative pattern and the shearlet local binary pattern are used to extract texture in the spatial and shearlet domains, respectively. Secondly, to derive the quality of textual and pictorial regions, two mapping functions are trained to map the respective features to subjective quality values. Finally, an activity weighting strategy is proposed to combine the quality of textual and pictorial regions. Experimental results show that the proposed method outperforms state-of-the-art methods.
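The paper defines its own features and weighting function; the sketch below only illustrates the final combination step, assuming per-region quality scores have already been predicted and using mean gradient magnitude as a simple stand-in "activity" measure. The function names and the activity definition are assumptions, not the authors' formulation.

```python
import numpy as np

def activity(region: np.ndarray) -> float:
    """Mean gradient magnitude as a simple proxy for region activity."""
    gy, gx = np.gradient(region.astype(float))
    return float(np.mean(np.hypot(gx, gy)))

def combine_scores(textual: np.ndarray, pictorial: np.ndarray,
                   q_text: float, q_pict: float) -> float:
    """Activity-weighted combination of textual and pictorial quality scores."""
    w_text, w_pict = activity(textual), activity(pictorial)
    return (w_text * q_text + w_pict * q_pict) / (w_text + w_pict)

rng = np.random.default_rng(0)
text_region = rng.integers(0, 256, (64, 64))   # stand-in for a textual region
pict_region = rng.integers(0, 256, (64, 64))   # stand-in for a pictorial region
print(f"overall quality: {combine_scores(text_region, pict_region, 72.0, 65.0):.1f}")
```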


Author(s):  
M. Meijer ◽  
L. A. E. Vullings ◽  
J. D. Bulens ◽  
F. I. Rip ◽  
M. Boss ◽  
...  

Although perceived by many as important, spatial data quality has hardly ever taken centre stage unless something went wrong due to bad quality. However, we think this is going to change soon. We rely more and more on data-driven processes, and with the increased availability of data there is a choice of which data to use. How should that choice be made? We think spatial data quality has potential as a selection criterion.

In this paper we focus on how a workflow tool can help both the consumer and the producer get a better understanding of which product characteristics are important. For this purpose, we have developed a framework in which we define different roles (consumer, producer and intermediary) and differentiate between product specifications and quality specifications. A number of requirements are stated that can be translated into quality elements. We used case studies to validate our framework. The framework is designed following the fitness-for-use principle. Part of the framework is software that can, in some cases, help ascertain the quality of datasets.
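A minimal sketch of how the framework's pieces could be represented, assuming quality requirements translate to ISO-style quality elements with thresholds and that fitness for use means meeting every stated requirement. The element names, scale and data are hypothetical, not the paper's framework definition.

```python
from dataclasses import dataclass
from enum import Enum

class Role(Enum):            # the three roles distinguished by the framework
    CONSUMER = "consumer"
    PRODUCER = "producer"
    INTERMEDIARY = "intermediary"

@dataclass
class QualityRequirement:
    element: str             # e.g. "positional accuracy", "completeness"
    threshold: float         # minimum acceptable level, assumed 0..1 scale

@dataclass
class DatasetQuality:
    levels: dict             # measured level per quality element

def fit_for_use(requirements: list[QualityRequirement],
                dataset: DatasetQuality) -> bool:
    """A dataset fits the intended use if it meets every stated requirement."""
    return all(dataset.levels.get(r.element, 0.0) >= r.threshold
               for r in requirements)

consumer_needs = [QualityRequirement("positional accuracy", 0.9),
                  QualityRequirement("completeness", 0.8)]
candidate = DatasetQuality({"positional accuracy": 0.95, "completeness": 0.85})
print(Role.CONSUMER.value, "view - fit for use:", fit_for_use(consumer_needs, candidate))
```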


2019 ◽  
Vol 1 ◽  
pp. 1-8
Author(s):  
Vaclav Talhofer ◽  
Šárka Hošková-Mayerová

Abstract. Multi-criteria analysis is becoming one of the main methods for evaluating the influence of the geographic environment on human activity, and vice versa. Analysis results are often used in command and control systems, especially in the armed forces and in rescue services. The analyses use digital geographic data, whose quality significantly influences the results. Results are usually visualized in command and control systems as thematic layers over raster images of topographic maps, so this visualization must follow the cartographic principles used for the creation of thematic maps. The article presents the problems an analyst encounters in evaluating the quality of the data used, performing the analysis itself, and preparing data files for transfer and publication in command and control systems.
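The abstract does not detail its analysis model; as a minimal sketch of the weighted-sum form such a multi-criteria analysis often takes, with entirely hypothetical terrain criteria and weights:

```python
# A minimal weighted-sum multi-criteria score over normalized criteria.
def mca_score(criteria: dict, weights: dict) -> float:
    """criteria: criterion -> value in [0, 1]; weights: criterion -> importance."""
    total = sum(weights.values())
    return sum(criteria[c] * w for c, w in weights.items()) / total

# Hypothetical terrain-suitability criteria for one location.
criteria = {"slope": 0.7, "soil_bearing": 0.9, "vegetation_cover": 0.4}
weights = {"slope": 3.0, "soil_bearing": 2.0, "vegetation_cover": 1.0}
print(f"suitability: {mca_score(criteria, weights):.2f}")
```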


Information ◽  
2020 ◽  
Vol 11 (10) ◽  
pp. 477
Author(s):  
Yiping Zhu ◽  
Zan Zhou

High-quality power demand side information is necessary for scientific decision-making on power grid construction projects. A literature review shows that current demand side management (DSM) information quality theories and methods need improvement, and that information quality indicators and their evaluation are essential. In this paper, based on grounded theory, about 250 relevant publications and interview records are reviewed. Through open coding, axial coding, and selective coding, 105 initial concepts are extracted and condensed into 35 categories and 10 main categories. On this basis, four information dimensions are summarized: load extraction, monitoring, management, and government planning. An index system containing 34 indicators for DSM information quality evaluation is constructed. Finally, using the matter-element extension evaluation method, a case study in China is performed to verify the feasibility and scientific validity of the indicators. The results show that the DSM information quality evaluation indicators are effective and the evaluation method is applicable. The DSM information quality indicators and evaluation method established in this paper can serve as a reference for similar information quality evaluation work in power systems.
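The matter-element extension method grades each indicator by its correlation (dependence) degree with each quality grade interval. The sketch below implements the standard extension correlation function; the grade intervals, indicator scale and measured value are assumptions for illustration, not the paper's actual index data.

```python
def rho(x: float, a: float, b: float) -> float:
    """Extension distance of x from the interval [a, b]."""
    return abs(x - (a + b) / 2) - (b - a) / 2

def correlation(x: float, classical: tuple, joint: tuple) -> float:
    """Correlation degree of x with a grade's classical domain [a, b],
    given the joint domain covering all grades."""
    a, b = classical
    d0 = rho(x, a, b)
    if a <= x <= b:                    # x inside the classical domain
        return -d0 / (b - a)
    d1 = rho(x, *joint)
    return d0 / (d1 - d0)

# Assumed grade intervals on a 0..100 indicator scale.
grades = {"excellent": (80, 100), "good": (60, 80), "poor": (0, 60)}
joint = (0, 100)

x = 73.0                               # a measured indicator value
degrees = {g: correlation(x, iv, joint) for g, iv in grades.items()}
best = max(degrees, key=degrees.get)   # grade with the highest correlation
print(degrees, "->", best)
```

In the full method, per-indicator degrees are aggregated with indicator weights before the maximum-correlation grade is chosen.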


2013 ◽  
Vol 706-708 ◽  
pp. 2095-2098
Author(s):  
Cheng Zan Chu ◽  
Li Wei Zhu ◽  
Ran Na

The quality of general mechanical and electrical products for highways is one of the important factors in guaranteeing efficient and safe highway operation. Over the past decade, with the rapid development of highway construction and mechanical equipment manufacturing technology, more and more mechanical and electrical products have been applied in highways. To evaluate the quality of these products objectively and scientifically, a quality index system and a quality evaluation model for highway mechanical and electrical products were developed based on a generic product quality evaluation method, and then verified with an actual product case.


Author(s):  
Chen Zhuo ◽  
Xiaoming Dong

MOOC-based education is an important means of improving the quality of education as the internet continues to develop. Meanwhile, the assessment of teaching quality is an indispensable aspect of teaching management, and it has become more and more important as student numbers grow. In order to deal effectively with the challenges of big data processing in the field of education, we designed a teaching quality assessment model for the MOOC platform based on comprehensive fuzzy evaluation. To verify the effectiveness of our method, a control experiment was conducted to explore the significance of our evaluation method. The results show that it can help teachers prepare their teaching content and help students improve their learning efficiency.
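The abstract does not reproduce the model's weights or membership data, but fuzzy comprehensive evaluation follows a standard pattern: a criteria weight vector W is combined with a membership matrix R to yield a grade membership vector B. A minimal sketch with assumed criteria, grades and numbers:

```python
import numpy as np

# Assumed criteria weights (W) and membership matrix (R): each row gives the
# share of raters assigning one criterion (content, delivery, interaction,
# assessment) to each grade (excellent, good, fair, poor).
W = np.array([0.3, 0.3, 0.2, 0.2])
R = np.array([
    [0.5, 0.3, 0.2, 0.0],
    [0.4, 0.4, 0.1, 0.1],
    [0.2, 0.5, 0.2, 0.1],
    [0.3, 0.4, 0.2, 0.1],
])

B = W @ R                              # weighted-average operator M(., +)
grades = ("excellent", "good", "fair", "poor")
print(dict(zip(grades, np.round(B, 3))))
print("overall grade:", grades[int(np.argmax(B))])  # maximum-membership principle
```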


2015 ◽  
Vol 1092-1093 ◽  
pp. 67-71 ◽  
Author(s):  
Dong Wang ◽  
Mao Sheng Yang

In order to further promote and develop photovoltaic power generation projects, this paper applies extension theory to photovoltaic power generation project quality evaluation and builds an extension evaluation index system and model on the basis of matter-element theory and extension analysis. The evaluated projects are ranked by calculating the correlation degrees of their indicators with the evaluation grades. The paper takes three photovoltaic power generation projects in Shaanxi in 2012 as examples, showing that the model is reasonable and reliable. The empirical results reflect the quality of the photovoltaic power generation projects well and provide a scientific basis for project decisions. The results offer a new approach to the quality assessment of photovoltaic power generation projects, enriching the available evaluation methods.
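A minimal sketch of the final grading-and-ranking step, assuming the weighted correlation degrees of each project with each quality grade have already been computed from the index system; all project names and numbers are hypothetical:

```python
# Assumed per-project correlation degrees with three quality grades,
# aggregated over the weighted indicators of the index system.
projects = {
    "project_A": {"excellent": 0.21, "good": -0.05, "poor": -0.40},
    "project_B": {"excellent": -0.10, "good": 0.30, "poor": -0.25},
    "project_C": {"excellent": 0.35, "good": -0.12, "poor": -0.50},
}

# Each project is assigned the grade with which it correlates most strongly.
for name, degrees in projects.items():
    grade = max(degrees, key=degrees.get)
    print(f"{name}: grade = {grade} (K = {degrees[grade]:+.2f})")

# Rank projects by their correlation with the top grade.
ranking = sorted(projects, key=lambda p: projects[p]["excellent"], reverse=True)
print("ranking:", " > ".join(ranking))
```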

