Spatial Analysis Management Using Inconsistent Data Sources

Author(s):  
Stanislav Belyakov ◽  
Alexander Bozhenyuk ◽  
Andrey Glushkov ◽  
Igor Rozenberg


2021 ◽
Author(s):  
Cristian Lussana ◽  
Thomas N. Nipen ◽  
Ivar A. Seierstad ◽  
Christoffer A. Elo

Hourly precipitation is often simultaneously simulated by numerical models and observed by multiple data sources. Accurate precipitation fields based on all available information are valuable input for numerous applications and a critical aspect of climate monitoring. Inverse problem theory offers an ideal framework for the combination of observations with a numerical model background. In particular, we have considered a modified ensemble optimal interpolation scheme. The deviations between background and observations are used to adjust for deficiencies in the ensemble. A data transformation based on Gaussian anamorphosis has been used to optimally exploit the potential of the spatial analysis, given that precipitation is approximated with a gamma distribution and the spatial analysis requires normally distributed variables. For each point, the spatial analysis returns the shape and rate parameters of its gamma distribution. The ensemble-based statistical interpolation scheme with Gaussian anamorphosis for precipitation (EnSI-GAP) is implemented in a way that the covariance matrices are locally stationary, and the background error covariance matrix undergoes a localization process. Concepts and methods that are usually found in data assimilation are here applied to spatial analysis, where they have been adapted in an original way to represent precipitation at finer spatial scales than those resolved by the background, at least where the observational network is dense enough. The EnSI-GAP setup requires the specification of a restricted number of parameters, and specifically, the explicit values of the error variances are not needed, since they are inferred from the available data. The examples of applications presented over Norway provide a better understanding of EnSI-GAP. The data sources considered are those typically used at national meteorological services, such as local area models, weather radars, and in situ observations. For this last data source, measurements from both traditional and opportunistic sensors have been considered.
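A minimal sketch of the Gaussian anamorphosis step described above, assuming illustrative gamma shape and rate values rather than parameters estimated by EnSI-GAP: precipitation is mapped into standard normal space through the gamma CDF and the inverse normal CDF, and mapped back after the spatial analysis.

```python
# Gaussian anamorphosis sketch: gamma-distributed precipitation -> Gaussian space -> back.
# The shape/rate values are illustrative assumptions, not taken from the paper.
import numpy as np
from scipy import stats

shape, rate = 0.9, 1.2                       # hypothetical gamma parameters (scipy uses scale = 1/rate)
precip = np.array([0.1, 0.5, 2.3, 7.8])      # hourly precipitation totals (mm)

# Forward transform: gamma CDF followed by the standard-normal quantile function
u = stats.gamma.cdf(precip, a=shape, scale=1.0 / rate)
z = stats.norm.ppf(u)                        # approximately Gaussian variables for the analysis

# ... the spatial analysis operates on z ...

# Back transform: normal CDF followed by the gamma quantile function
precip_analysed = stats.gamma.ppf(stats.norm.cdf(z), a=shape, scale=1.0 / rate)
print(np.allclose(precip, precip_analysed))  # True: the transform is invertible
```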


Author(s):  
Lihua Lu ◽  
Hengzhen Zhang ◽  
Xiao-Zhi Gao

Purpose – Data integration combines data residing at different sources and provides users with a unified interface to these data. An important issue in data integration is the existence of conflicts among the different data sources. Data sources may conflict with each other at the data level, which is defined as data inconsistency. The purpose of this paper is to address this problem and propose a solution for data inconsistency in data integration. Design/methodology/approach – A relational data model extended with data source quality criteria is first defined. Then, based on the proposed data model, a data inconsistency solution strategy is provided. To accomplish the strategy, a fuzzy multi-attribute decision-making (MADM) approach based on data source quality criteria is applied to obtain the results. Finally, user feedback strategies are proposed to optimize the result of the fuzzy MADM approach into the final data inconsistency solution. Findings – To evaluate the proposed method, data obtained from sensors are extracted. Experiments are designed and performed to demonstrate the effectiveness of the proposed strategy. The results substantiate that the solution performs better than the other methods on the correctness, time cost and stability indicators. Practical implications – Since inconsistent data collected from sensors are pervasive, the proposed method can mitigate this problem and correct wrong choices to some extent. Originality/value – In this paper, the authors study for the first time the effect of user feedback on integration results for inconsistent data.
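As a rough illustration of the strategy, the sketch below ranks conflicting sensor readings by a weighted multi-attribute score built from data source quality criteria and keeps the value from the best-scoring source. The criteria, weights and the crisp weighted-sum aggregation are illustrative assumptions; the paper itself uses a fuzzy MADM method further refined by user feedback.

```python
# Resolving one data inconsistency by scoring the conflicting sources on quality criteria.
# Criteria, weights and values are hypothetical; this is not the paper's fuzzy formulation.
import numpy as np

# Candidate readings for one sensor attribute, one per conflicting source
candidates = {"source_A": 21.4, "source_B": 19.8, "source_C": 21.1}

# Quality criteria per source (accuracy, freshness, reliability), scaled to [0, 1]
criteria = np.array([
    [0.9, 0.6, 0.8],   # source_A
    [0.5, 0.9, 0.6],   # source_B
    [0.8, 0.7, 0.9],   # source_C
])
weights = np.array([0.5, 0.2, 0.3])      # relative importance of the criteria

scores = criteria @ weights               # weighted-sum score per source
best = list(candidates)[int(np.argmax(scores))]
print(best, candidates[best])             # value chosen to resolve the inconsistency
```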


2022 ◽  
Vol 8 (1) ◽  
pp. 105-123
Author(s):  
Heba K. Khayyal ◽  
Zaki M. Zeidan ◽  
Ashraf A. A. Beshr

The 3D city model is one of the crucial topics that are still under analysis by many engineers and programmers because of the great advancements in data acquisition technologies and 3D computer graphics programming. It is one of the best visualization methods for representing reality. This paper presents different techniques for the creation and spatial analysis of 3D city modeling based on Geographical Information System (GIS) technology using free data sources. To achieve that goal, the Mansoura University campus, located in Mansoura city, Egypt, was chosen as a case study. The minimum data requirements to generate a 3D city model are the terrain and 2D spatial features such as buildings, landscape areas and street networks. Moreover, building height is an important attribute in the 3D extrusion process. The main challenges during the creation process are the dearth of accurate free datasets and the time-consuming editing. Therefore, different data sources are used in this study to evaluate their accuracy and find suitable applications which can use the generated 3D model. Meanwhile, an accurate data source obtained using traditional survey methods is used for validation purposes. First, the terrain was obtained from a digital elevation model (DEM) and compared with grid leveling measurements. Second, 2D data were obtained from manual digitization of 30 cm high-resolution imagery and from deep learning algorithms that detect the 2D features automatically using an object instance segmentation model; the results were compared with total station survey observations. Different techniques are used to investigate and evaluate the accuracy of these data sources. The procedural modeling technique is applied to generate the 3D city model. TensorFlow and Keras frameworks (Python APIs) were used in this paper, along with Global Mapper, ArcGIS Pro, QGIS and CityEngine software. The precision metrics from the trained deep learning model were 0.78 for buildings, 0.62 for streets and 0.89 for landscape areas. Although the manual digitizing results are better than those from deep learning, the accuracy of the extracted features is acceptable and they can be used in the creation process in cases that do not require a highly accurate 3D model. A flood impact scenario is simulated as an application of spatial analysis on the generated 3D city model. DOI: 10.28991/CEJ-2022-08-01-08
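The extrusion step mentioned above can be sketched as follows: each 2D building footprint is lifted into a simple prism using its height attribute. The footprint coordinates and heights are hypothetical, and a production workflow would perform this step in CityEngine or ArcGIS Pro rather than by hand.

```python
# Lift 2D building footprints into 3D prisms using a height attribute.
# Footprints and heights are illustrative assumptions, not data from the study.
footprints = [
    {"id": "bldg_1", "xy": [(0, 0), (20, 0), (20, 12), (0, 12)], "height_m": 15.0},
    {"id": "bldg_2", "xy": [(30, 5), (42, 5), (42, 20), (30, 20)], "height_m": 9.0},
]

def extrude(footprint):
    """Return the floor ring, roof ring and one quad per wall of a simple prism."""
    base = [(x, y, 0.0) for x, y in footprint["xy"]]
    roof = [(x, y, footprint["height_m"]) for x, y in footprint["xy"]]
    walls = [
        [base[i], base[(i + 1) % len(base)], roof[(i + 1) % len(roof)], roof[i]]
        for i in range(len(base))
    ]
    return {"floor": base, "roof": roof, "walls": walls}

for fp in footprints:
    solid = extrude(fp)
    print(fp["id"], "walls:", len(solid["walls"]), "roof z:", solid["roof"][0][2])
```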


2012 ◽  
Vol 2012 ◽  
pp. 1-9 ◽  
Author(s):  
Wei Yang ◽  
Karen Spears ◽  
Fan Zhang ◽  
Wai Lee ◽  
Heidi L. Himler

Background. Studies have documented that built environment factors potentially promote or impede leisure time physical activity (LTPA). This study explored the relationship between multiple built environment factors and individual characteristics on LTPA. Methods. Multiple data sources were utilized, including individual-level data for health behaviors and health status from the Nevada Behavioral Risk Factor Surveillance System (BRFSS) and community-level data from different data sources including indicators for recreation facilities, safety, air quality, commute time, urbanization, population density, and land mix level. Mixed model logistic regression and geographic information system (GIS) spatial analysis were conducted. Results. Among 6,311 respondents, 24.4% reported no LTPA engagement during the past 30 days. No engagement in LTPA was significantly associated with (1) individual factors: older age, less education, lower income, obesity, and low life satisfaction; and (2) community factors: longer commute time, higher crime rate, urban residence, and higher population density, but not density of and distance to recreation facilities, air quality, or land mix. Conclusions. Multiple data systems, including complex population surveys and spatial analysis, are valuable tools for studies of health and the built environment.
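A hedged sketch of the modelling step, assuming a simulated data frame with hypothetical variable names rather than the actual BRFSS fields: a mixed-model logistic regression relates individual and community-level predictors to the odds of reporting no LTPA, with a random intercept for the community.

```python
# Mixed-model logistic regression sketch on simulated data (not the BRFSS variables).
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "age": rng.integers(18, 85, n),
    "income": rng.normal(50, 15, n),        # thousands of dollars
    "commute_min": rng.normal(25, 10, n),   # community-level commute time
    "county": rng.integers(0, 10, n),       # community identifier
})
logit = (-2.0 + 0.03 * (df["age"] - 50)
         - 0.02 * (df["income"] - 50) + 0.02 * df["commute_min"])
df["no_ltpa"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

# Fixed effects for the covariates, random intercept per county (variance component)
model = BinomialBayesMixedGLM.from_formula(
    "no_ltpa ~ age + income + commute_min",
    {"county": "0 + C(county)"},
    df,
)
result = model.fit_vb()   # variational Bayes fit
print(result.summary())
```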


2018 ◽  
Vol 34 (3) ◽  
pp. 317-329 ◽  
Author(s):  
Paulina Pankowska ◽  
Bart Bakker ◽  
Daniel L. Oberski ◽  
Dimitris Pavlopoulos

Author(s):  
Munesh Chandra Trivedi ◽  
Virendra Kumar Yadav ◽  
Avadhesh Kumar Gupta

A data warehouse generally contains both types of data, i.e. historical and current data from various data sources. In the world of computing, a data warehouse can be defined as a system created for the analysis and reporting of both types of data. These analysis reports are then used by an organization to make decisions that help its growth. Construction of a data warehouse appears to be simple: collection of data from data sources into one place (after extraction, transformation and loading). But construction involves several issues such as inconsistent data, logic conflicts, user acceptance, cost, quality, security, stakeholders' contradictions, REST alignment, etc. These issues need to be overcome, otherwise they will lead to unfortunate consequences affecting the organization's growth. The proposed model tries to solve these issues, such as REST alignment and stakeholders' contradictions, by involving experts from various domains (technical, analytical, decision makers, management representatives, etc.) during the initialization phase to better understand the requirements, and by mapping these requirements to data sources during the design phase of the data warehouse.
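The construction process described above can be sketched as a minimal extract-transform-load (ETL) pass with a trivial rule for reconciling inconsistent records; the source records, field names and the "most recent value wins" rule are illustrative assumptions, not the proposed model itself.

```python
# Minimal ETL sketch: extract from two hypothetical sources, reconcile conflicts, load.
from datetime import date

# Extract: pull customer records from two hypothetical operational sources
source_crm = [{"id": 1, "city": "Oslo",   "updated": date(2024, 3, 1)}]
source_erp = [{"id": 1, "city": "Bergen", "updated": date(2024, 5, 7)}]

# Transform: merge by key and resolve inconsistencies (here: keep the newest record)
merged = {}
for record in source_crm + source_erp:
    key = record["id"]
    if key not in merged or record["updated"] > merged[key]["updated"]:
        merged[key] = record

# Load: append the reconciled rows to the warehouse table (a plain list stands in here)
warehouse_customers = list(merged.values())
print(warehouse_customers)   # [{'id': 1, 'city': 'Bergen', 'updated': ...}]
```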


2020 ◽  
Author(s):  
Cristian Lussana ◽  
Thomas N. Nipen ◽  
Ivar A. Seierstad ◽  
Christoffer A. Elo

Abstract. Hourly precipitation over a region is often simultaneously simulated by numerical models and observed by multiple data sources. An accurate precipitation representation based on all available information is a valuable result for numerous applications and a critical aspect of climate monitoring. Inverse problem theory offers an ideal framework for the combination of observations with a numerical model background. In particular, we have considered a modified ensemble optimal interpolation scheme that takes into account deficiencies of the background. An additional source of uncertainty for the ensemble background has been included. A data transformation based on Gaussian anamorphosis has been used to optimally exploit the potential of the spatial analysis, given that precipitation is approximated with a gamma distribution and the spatial analysis requires normally distributed variables. For each point, the spatial analysis returns the shape and rate parameters of its gamma distribution. The Ensemble-based Statistical Interpolation scheme with Gaussian AnamorPhosis (EnSI-GAP) is implemented in a way that the covariance matrices are locally stationary and the background error covariance matrix undergoes a localization process. Concepts and methods that are usually found in data assimilation are here applied to spatial analysis, where they have been adapted in an original way to represent precipitation at finer spatial scales than those resolved by the background, at least where the observational network is dense enough. The EnSI-GAP setup requires the specification of a restricted number of parameters, and specifically, the explicit values of the error variances are not needed, since they are inferred from the available data. The examples of applications presented provide a better understanding of the characteristics of EnSI-GAP. The data sources considered are those typically used at national meteorological services, such as local area models, weather radars and in situ observations. For this last data source, measurements from both traditional and opportunistic sensors have been considered.
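A small worked example of the last point above: per-point gamma shape and rate parameters can be recovered from an ensemble of analysed precipitation values via the method of moments (shape = mean²/variance, rate = mean/variance). The ensemble values are illustrative; EnSI-GAP derives these parameters within the analysis itself.

```python
# Method-of-moments recovery of gamma shape and rate at one grid point.
# The ensemble values are illustrative assumptions.
import numpy as np

ensemble = np.array([0.4, 0.9, 1.3, 0.7, 2.1, 0.5])   # analysed values at one point (mm)
mean, var = ensemble.mean(), ensemble.var(ddof=1)

shape = mean ** 2 / var
rate = mean / var
print(f"shape={shape:.2f}, rate={rate:.2f}, implied mean={shape / rate:.2f} mm")
```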


2021 ◽  
Vol 28 (1) ◽  
pp. 61-91
Author(s):  
Cristian Lussana ◽  
Thomas N. Nipen ◽  
Ivar A. Seierstad ◽  
Christoffer A. Elo

Abstract. Hourly precipitation over a region is often simultaneously simulated by numerical models and observed by multiple data sources. An accurate precipitation representation based on all available information is a valuable result for numerous applications and a critical aspect of climate monitoring. The inverse problem theory offers an ideal framework for the combination of observations with a numerical model background. In particular, we have considered a modified ensemble optimal interpolation scheme. The deviations between background and observations are used to adjust for deficiencies in the ensemble. A data transformation based on Gaussian anamorphosis has been used to optimally exploit the potential of the spatial analysis, given that precipitation is approximated with a gamma distribution and the spatial analysis requires normally distributed variables. For each point, the spatial analysis returns the shape and rate parameters of its gamma distribution. The ensemble-based statistical interpolation scheme with Gaussian anamorphosis for precipitation (EnSI-GAP) is implemented in a way that the covariance matrices are locally stationary, and the background error covariance matrix undergoes a localization process. Concepts and methods that are usually found in data assimilation are here applied to spatial analysis, where they have been adapted in an original way to represent precipitation at finer spatial scales than those resolved by the background, at least where the observational network is dense enough. The EnSI-GAP setup requires the specification of a restricted number of parameters, and specifically, the explicit values of the error variances are not needed, since they are inferred from the available data. The examples of applications presented over Norway provide a better understanding of EnSI-GAP. The data sources considered are those typically used at national meteorological services, such as local area models, weather radars, and in situ observations. For this last data source, measurements from both traditional and opportunistic sensors have been considered.
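For readers unfamiliar with the method family, the sketch below shows a generic ensemble optimal interpolation update with covariance localization on a one-dimensional grid; the grid, ensemble, observations and Gaussian localization taper are illustrative assumptions and do not reproduce the EnSI-GAP formulation.

```python
# Generic ensemble optimal interpolation update with covariance localization (1-D toy grid).
# All numbers are illustrative; this is not the EnSI-GAP implementation.
import numpy as np

rng = np.random.default_rng(1)
n_grid, n_ens = 50, 20
x_grid = np.linspace(0.0, 100.0, n_grid)           # 1-D grid coordinates (km)

ens = rng.normal(1.0, 0.5, size=(n_grid, n_ens))    # background ensemble (transformed space)
x_b = ens.mean(axis=1)
P_b = np.cov(ens)                                   # raw ensemble background covariance

# Localization: damp spurious long-range covariances with a Gaussian taper
L_scale = 20.0
dist = np.abs(x_grid[:, None] - x_grid[None, :])
P_loc = P_b * np.exp(-0.5 * (dist / L_scale) ** 2)

# Two observations at grid points 10 and 35, with observation error variance 0.1
obs_idx = np.array([10, 35])
y = np.array([1.8, 0.6])
H = np.zeros((2, n_grid))
H[np.arange(2), obs_idx] = 1.0
R = 0.1 * np.eye(2)

# Analysis update: x_a = x_b + K (y - H x_b), with K built from the localized covariance
K = P_loc @ H.T @ np.linalg.inv(H @ P_loc @ H.T + R)
x_a = x_b + K @ (y - H @ x_b)
print(x_a[obs_idx])   # analysis drawn toward the observations at the observed points
```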


2010 ◽  
Vol 69 (8) ◽  
pp. 779-799 ◽  
Author(s):  
Shichao Zhang ◽  
Qingfeng Chen ◽  
Qiang Yang
