Point-of-Interest (POI) Data Validation Methods: An Urban Case Study

2021 ◽  
Vol 10 (11) ◽  
pp. 735
Author(s):  
Lih Wei Yeow ◽  
Raymond Low ◽  
Yu Xiang Tan ◽  
Lynette Cheah

Point-of-interest (POI) data from map sources are increasingly used in a wide range of applications, including real estate, land use, and transport planning. However, uncertainties in data quality arise because some of these data are crowdsourced and proprietary validation workflows lack transparency. Comparing data quality between POI sources is a challenge in the absence of standardized validation metrics. This study reviews and implements available POI validation methods, working towards identifying a set of metrics that is applicable across datasets. Twenty-three validation methods were found and categorized. Most evaluated positional accuracy, while logical consistency and usability were the least represented. A subset of nine methods was implemented to assess four real-world POI datasets extracted for a highly urbanized neighborhood in Singapore. The datasets were found to have poor completeness, with errors of commission and omission, although spatial errors were reasonably low (<60 m). Thematic accuracy in names and place types varied. The move towards standardized validation metrics depends on factors such as data availability for intrinsic or extrinsic methods, varying levels of detail across POI datasets, the influence of matching procedures, and the intended application of POI data.
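As a minimal illustration of the kind of extrinsic metrics this abstract refers to, the sketch below computes positional error and simple completeness figures for POIs matched between a candidate and a reference dataset. The haversine helper, field layout, and matching inputs are assumptions for illustration, not the study's actual implementation.

```python
# Sketch of two extrinsic POI validation metrics: positional accuracy and
# completeness. Assumes POI pairs already matched between a candidate and a
# reference dataset, with WGS84 coordinates; all names are illustrative.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    r = 6371000.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def positional_errors(matched_pairs):
    """matched_pairs: list of ((lat, lon), (lat, lon)) for the same real-world POI."""
    return [haversine_m(a[0], a[1], b[0], b[1]) for a, b in matched_pairs]

def completeness(n_matched, n_reference, n_candidate):
    """Errors of omission and commission relative to a reference dataset."""
    omission = 1 - n_matched / n_reference    # reference POIs missing from the candidate
    commission = 1 - n_matched / n_candidate  # candidate POIs with no reference match
    return omission, commission
```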

2019 ◽  
Vol 11 (24) ◽  
pp. 7041
Author(s):  
Tobias Engelmann ◽  
Daniel Fischer ◽  
Marianne Lörchner ◽  
Jaya Bowry ◽  
Holger Rohn

Sustainability as a guiding idea for societal and economic development creates a growing need for reliable sustainability assessments (SAs). In response, a plethora of increasingly sophisticated, standardized, and specialized approaches has emerged. However, little attention has been paid to how applications of SAs in different contexts navigate the challenges of selecting and customizing SA approaches for their research purposes. This paper explores the context-specific conditions of SA through a case study of three research projects. Each case study examines the different approaches and methodologies, as well as the difficulties and similarities that researchers face in “doing” SA, based on the research question “What are common challenges that researchers are facing in using SA approaches?” Our case study comparison follows a most-different approach to cover a wide range of SA applications and is structured along three key challenges of doing SA: (i) deliberation, learning and assessment; (ii) normative assessment principles; (iii) feasibility, especially regarding data quality/availability. Above all, the comparative case study underlines the role and importance of reflexivity and context: we argue that a more explicit and transparent discussion of these challenges could contribute to greater awareness and thus improve the ability of researchers to transparently modify and customize generic SA methodologies to their research contexts. Our findings can help researchers to more critically appraise the differences between SA approaches, as well as their normative assumptions, and guide them to assemble their SA methodology in a reflexive and case-sensitive way.


2019 ◽  
Vol 28 (6) ◽  
pp. 804-816 ◽  
Author(s):  
Massimo Migliorini ◽  
Jenny Sjåstad Hagen ◽  
Jadranka Mihaljević ◽  
Jaroslav Mysiak ◽  
Jean-Louis Rossi ◽  
...  

Purpose The purpose of this paper is to discuss how, although increasing data availability from a wide range of sources unlocks unprecedented opportunities for disaster risk reduction, data interoperability remains a challenge due to a number of barriers. Since a first step towards enhancing data interoperability for disaster risk reduction is to identify the major barriers, this paper presents a case study on data interoperability in disaster risk reduction in Europe, linking current barriers to the regional initiative of the European Science and Technology Advisory Group. Design/methodology/approach In support of Priority 2 (“Strengthening disaster risk governance to manage disaster risk”) of the Sendai Framework and SDG17 (“Partnerships for the goals”), this paper presents a case study on barriers to data interoperability in Europe based on a series of reviews, surveys and interviews with National Sendai Focal Points and stakeholders in science and research, governmental agencies, non-governmental organizations and industry. Findings For a number of European countries, there remains a clear imbalance between long-term disaster risk reduction and short-term preparedness, with emergency relief, response and recovery playing the dominant role; this points to the potential of investments in ex ante measures with better inclusion and exploitation of data. Originality/value Modern society is facing a digital revolution. As highlighted by the International Council of Science and the Committee on Data for Science and Technology, digital technology offers profound opportunities for science to discover unsuspected patterns and relationships in nature and society, on scales from the molecular to the cosmic, from local health systems to global sustainability. It has created the potential for disciplines of science to synergize into a holistic understanding of the complex challenges currently confronting humanity; the Sustainable Development Goals are a direct reflection of this. Interdisciplinarity requires the integration of data across relevant disciplines. However, a barrier to realizing and exploiting this potential arises from the incompatible data standards and nomenclatures used in different disciplines. Although the problem has been addressed by several initiatives, the challenge of making online data integration routine still remains.


2018 ◽  
Vol 10 (11) ◽  
pp. 4320 ◽  
Author(s):  
Riccardo D’Alberto ◽  
Matteo Zavalloni ◽  
Meri Raggi ◽  
Davide Viaggi

A large share of the Common Agricultural Policy (CAP) is allocated to agri-environmental schemes (AESs), whose goal is to foster the provision of a wide range of environmental public goods. Despite this effort, little is known about the actual environmental and economic impact of AESs, due to the non-experimental conditions of the assessment exercise and several data availability issues. The main objective of the paper is to explore the feasibility of combining the non-parametric statistical matching (SM) method with the propensity score matching (PSM) counterfactual approach, and to test its usefulness and practicability on a case study of selected impacts of AESs in Emilia-Romagna. The work hints at the potential of the combined use of SM and PSM, as well as of the systematic collection of additional information to be included in EU-financed project surveys, in order to enrich and complete the data collected in official statistics. The results show that the combination of the two methods enables us to enlarge and deepen the scope of counterfactual analysis applied to AESs. In the specific case study, AESs seem to reduce the amount of rented-in land and decrease crop mix diversity.
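The counterfactual step named here, propensity score matching, can be sketched in a few lines. The snippet below is a generic nearest-neighbour PSM estimator of the effect on treated units, assuming a pandas DataFrame with a binary AES-participation flag, an outcome column, and farm-level covariates; all column names are hypothetical and this is not the authors' Emilia-Romagna pipeline.

```python
# Minimal propensity score matching sketch: 1-nearest-neighbour matching on
# the estimated score, then the average treated-minus-control difference.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def psm_att(df: pd.DataFrame, treatment_col: str, outcome_col: str, covariates: list):
    """Average treatment effect on the treated via 1-NN matching on the propensity score."""
    X, t, y = df[covariates].values, df[treatment_col].values, df[outcome_col].values
    score = LogisticRegression(max_iter=1000).fit(X, t).predict_proba(X)[:, 1]
    treated, control = np.where(t == 1)[0], np.where(t == 0)[0]
    effects = []
    for i in treated:
        j = control[np.argmin(np.abs(score[control] - score[i]))]  # closest control unit
        effects.append(y[i] - y[j])
    return float(np.mean(effects))
```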


Water ◽  
2021 ◽  
Vol 13 (20) ◽  
pp. 2820
Author(s):  
Gimoon Jeong ◽  
Do-Guen Yoo ◽  
Tae-Woong Kim ◽  
Jin-Young Lee ◽  
Joon-Woo Noh ◽  
...  

In our intelligent society, water resources are managed using vast amounts of hydrological data collected through telemetric devices. Recently, advanced data quality control technologies for data refinement based on hydrological observation history, such as big data and artificial intelligence, have been studied. However, these remain impractical due to insufficient verification and implementation periods. In this study, a process to accurately identify missing and false-reading data was developed to efficiently validate hydrological data by combining various conventional validation methods. Here, false-reading data were reclassified into suspected and confirmed groups by combining the results of individual validation methods. Furthermore, an integrated quality control process that links data validation and reconstruction was developed. In particular, an iterative quality control feedback process was proposed to achieve highly reliable data quality, and it was applied to precipitation and water level stations in the Daecheong Dam Basin, South Korea. The case study revealed that the proposed approach can improve the quality control procedures of hydrological databases and could feasibly be implemented in practice.
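To make the idea of combining conventional checks concrete, the sketch below flags a water-level series with a range check and a step check, treating records caught by one rule as suspected and by both rules as confirmed false readings. The thresholds, column names, and rule combination are assumptions for illustration, not the study's calibrated procedure.

```python
# Sketch of combining simple validation rules to separate missing, suspected,
# and confirmed false readings in a telemetered water-level series.
# Thresholds are illustrative assumptions.
import pandas as pd

def flag_series(level: pd.Series, lo: float = 0.0, hi: float = 20.0, max_step: float = 1.5):
    missing = level.isna()
    out_of_range = (level < lo) | (level > hi)   # physical-range check
    jump = level.diff().abs() > max_step         # spike / step check between readings
    confirmed = out_of_range & jump              # caught by both rules
    suspected = (out_of_range | jump) & ~confirmed & ~missing
    return pd.DataFrame({"missing": missing, "suspected": suspected, "confirmed": confirmed})
```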


2013 ◽  
Vol 16 (1) ◽  
pp. 59-67

The Soil Science Institute of Thessaloniki produces new digitized Soil Maps that provide a useful electronic database for the spatial representation of the soil variation within a region, based on in situ soil sampling, laboratory analyses, GIS techniques and plant nutrition mathematical models, coupled with the local land cadastre. The novelty of these studies is that local agronomists have immediate access to a wide range of soil information by clicking on a field parcel shown in this digital interface and can therefore suggest an appropriate treatment (e.g. liming, manure incorporation, desalination, application of the proper type and quantity of fertilizer) depending on the field conditions and cultivated crops. A specific case study is presented in the current work regarding the construction of the digitized Soil Map of the regional unit of Kastoria. The potential of this map can easily be realized by the fact that the mapping of the physicochemical properties of the soils in this region provided delineation zones for differential fertilization management. An experiment was also conducted using remote sensing techniques for the enhancement of the fertilization advisory software database, which is a component of the digitized map, and the optimization of nitrogen management in agricultural areas.
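The parcel lookup such an interface performs is essentially a point-in-polygon query against cadastral geometries joined to the laboratory results. A minimal sketch is given below; the geopandas workflow, file name, and soil attribute columns are assumptions, not the Institute's actual software.

```python
# Sketch of retrieving soil properties for a clicked field parcel via a
# point-in-polygon query. File and column names are illustrative.
import geopandas as gpd
from shapely.geometry import Point

# Cadastral parcels with soil-analysis attributes joined beforehand (assumed layout).
parcels = gpd.read_file("parcels_with_soil_data.gpkg")

def soil_info_at(x: float, y: float) -> dict:
    """Return the soil attributes of the parcel containing the clicked point."""
    hit = parcels[parcels.contains(Point(x, y))]
    if hit.empty:
        return {}
    return hit.iloc[0][["pH", "organic_matter", "P", "K"]].to_dict()
```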


2020 ◽  
pp. 107699862095666
Author(s):  
Alina A. von Davier

In this commentary, I share my perspective on the goals of assessments in general and on linking assessments that were developed according to different specifications and for different purposes, and I propose several considerations for the authors and the readers. This brief commentary is structured around three perspectives: (1) the context of this research, (2) the methodology proposed here, and (3) the consequences for applied research.


Oxford Studies in Ancient Philosophy provides, twice each year, a collection of the best current work in the field of ancient philosophy. Each volume features original essays that contribute to an understanding of a wide range of themes and problems in all periods of ancient Greek and Roman philosophy, from the beginnings to the threshold of the Middle Ages. From its first volume in 1983, OSAP has been a highly influential venue for work in the field, and has often featured essays of substantial length as well as critical essays on books of distinctive importance. Volume LV contains: a methodological examination of how the evidence for Presocratic thought is shaped through its reception by later thinkers, using discussions of a world soul as a case study; an article on Plato’s conception of flux and the way in which sensible particulars maintain a kind of continuity while undergoing constant change; a discussion of J. L. Austin’s unpublished lecture notes on Aristotle’s Nicomachean Ethics and his treatment of loss of control (akrasia); an article on the Stoics’ theory of time and in particular Chrysippus’ conception of the present and of events; and two contributions on Plotinus: one that identifies a distinct argument to show that there is a single, ultimate metaphysical principle, and a review essay discussing E. K. Emilsson’s recent book, Plotinus.


2021 ◽  
Vol 13 (12) ◽  
pp. 6981
Author(s):  
Marcela Bindzarova Gergelova ◽  
Slavomir Labant ◽  
Jozef Mizak ◽  
Pavel Sustek ◽  
Lubomir Leicher

The concept of further sustainable development in the administration of the register of old and recent mining works in Slovakia requires precise determination of the locations of the objects that constitute it. The objects in this register are unique, linked to the history of mining in Slovakia. The positional accuracy of the registered objects in its current form is unsatisfactory: the different database sources containing the locations of the old mining works are insufficient and show significant locational deviations. For this reason, it is necessary to precisely locate old mining works using modern measuring technologies. The most effective approach to solving this problem is the use of LiDAR data, which allow determining both the position and the above-ground shape of old mining works. Two localities with a significant mining history were selected for this case study. Positional deviations in the locations of old mining works were determined among the selected data sources: the register of old mining works in Slovakia, global navigation satellite system (GNSS) measurements, multidirectional hill-shading derived from LiDAR, and accessible data from OpenStreetMap. To compare the positions of identical old mining works across the selected database sources, we established the differences in coordinates (ΔX, ΔY) and calculated the positional deviations of the same objects. The average positional deviation across the nineteen objects compared between documents, LiDAR data, and the register was 33.6 m. Comparing the locations of twelve old mining works between the LiDAR data and OpenStreetMap, the average positional deviation was 16.3 m. Between the GNSS data and the register of old mining works, the average positional deviation of four selected objects was 39.17 m.
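The comparison described reduces to Euclidean distances between the coordinates of the same object in two sources, averaged over all compared objects. A minimal sketch (assuming planar, projected coordinates; names are hypothetical) is:

```python
# Sketch of the positional-deviation comparison: Euclidean distance between
# the same object's (X, Y) coordinates in two data sources, then the average.
import math

def positional_deviation(p_a: tuple, p_b: tuple) -> float:
    """p_a, p_b: (X, Y) of the same old mining work in two sources (projected CRS)."""
    return math.hypot(p_a[0] - p_b[0], p_a[1] - p_b[1])

def average_deviation(pairs: list) -> float:
    """pairs: list of ((X, Y), (X, Y)) coordinate pairs for matched objects."""
    return sum(positional_deviation(a, b) for a, b in pairs) / len(pairs)
```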


Energies ◽  
2021 ◽  
Vol 14 (5) ◽  
pp. 1377
Author(s):  
Musaab I. Magzoub ◽  
Raj Kiran ◽  
Saeed Salehi ◽  
Ibnelwaleed A. Hussein ◽  
Mustafa S. Nasser

The traditional way to mitigate loss circulation in drilling operations is to use preventative and curative materials. However, it is difficult to quantify how much of each material, out of every possible combination, is needed to produce customized rheological properties. In this study, machine learning (ML) is used to develop a framework that identifies material compositions for loss circulation applications based on the desired rheological characteristics. The relation between the rheological properties and the mud components for polyacrylamide/polyethyleneimine (PAM/PEI)-based mud is assessed experimentally. Four different ML algorithms were implemented to model the rheological data for various mud components at different concentrations and testing conditions: (a) k-Nearest Neighbor, (b) Random Forest, (c) Gradient Boosting, and (d) AdaBoost. The Gradient Boosting model showed the highest accuracy (91% and 74% for plastic and apparent viscosity, respectively), which can be further used for hydraulic calculations. Overall, the experimental study presented in this paper, together with the proposed ML-based framework, adds valuable information to the design of PAM/PEI-based mud. The ML models allowed a wide range of rheology assessments for various drilling fluid formulations with a mean accuracy of up to 91%. The case study showed that, with the appropriate combination of materials, reasonable rheological properties could be achieved to prevent loss circulation by managing the equivalent circulating density (ECD).
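As a hedged sketch of the best-performing step named here, the snippet below fits a gradient-boosting regressor mapping mud composition and test conditions to a rheological target. The scikit-learn setup, feature names, and train/test split are assumptions for illustration, not the authors' actual pipeline or data.

```python
# Sketch of fitting a gradient-boosting model that maps mud composition and
# test conditions to a rheological property. Column names are illustrative.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

def fit_rheology_model(df: pd.DataFrame, target: str = "plastic_viscosity"):
    """Train on assumed features and report hold-out R^2 for the chosen target."""
    features = ["PAM_conc", "PEI_conc", "temperature", "shear_rate"]
    X_train, X_test, y_train, y_test = train_test_split(
        df[features], df[target], test_size=0.2, random_state=0)
    model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)
    return model, r2_score(y_test, model.predict(X_test))
```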

