Integrating cellular automata and discrete global grid systems: a case study into wildfire modelling

2020 ◽  
Vol 1 ◽  
pp. 1-23
Author(s):  
Majid Hojati ◽  
Colin Robertson

Abstract. With new forms of digital spatial data driving new applications for monitoring and understanding environmental change, there are growing demands on traditional GIS tools for spatial data storage, management and processing. Discrete Global Grid Systems (DGGS) are methods to tessellate the globe into multiresolution grids, which represent a global spatial fabric capable of storing heterogeneous spatial data and offering improved performance in data access, retrieval, and analysis. While DGGS-based GIS may hold potential for next-generation big-data GIS platforms, few studies have tried to implement them as a framework for operational spatial analysis. Cellular Automata (CA) is a classic dynamic modelling framework which has been used with the traditional raster data model for environmental applications such as wildfire modelling and urban expansion modelling. The main objectives of this paper are to (i) investigate the possibility of using a DGGS for running dynamic spatial analysis, (ii) evaluate CA as a generic data model for modelling dynamic phenomena within a DGGS data model and (iii) evaluate an in-database approach to CA modelling. To do so, a case study into wildfire spread modelling is developed. Results demonstrate that a DGGS data model not only provides the ability to integrate different data sources, but also provides a framework for spatial analysis without geometry-based computation. This results in a simplified architecture and a common spatial fabric to support development of a wide array of spatial algorithms. While considerable work remains to be done, CA modelling within a DGGS-based GIS is a robust and flexible modelling framework for big-data GIS analysis in an environmental monitoring context.
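The CA approach the abstract describes can be illustrated with a minimal fire-spread update rule. This is a hedged sketch on a square raster with a 4-neighbourhood (the paper's DGGS fabric would use hexagonal cells with 6 neighbours); the cell states and the spread probability `p_spread` are illustrative assumptions, not the paper's calibrated model.

```python
import random

# Cell states for a minimal CA fire-spread sketch (illustrative, not the
# paper's actual model): 0 = unburnable, 1 = fuel, 2 = burning, 3 = burnt.
UNBURNABLE, FUEL, BURNING, BURNT = 0, 1, 2, 3

def step(grid, p_spread=0.6, rng=random.Random(42)):
    """One synchronous CA update: burning cells burn out and may ignite
    fuel neighbours with probability p_spread."""
    rows, cols = len(grid), len(grid[0])
    new = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == BURNING:
                new[r][c] = BURNT
                # 4-neighbourhood; a hexagonal DGGS cell would have 6 neighbours.
                for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    nr, nc = r + dr, c + dc
                    if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == FUEL:
                        if rng.random() < p_spread:
                            new[nr][nc] = BURNING
    return new

grid = [[FUEL] * 5 for _ in range(5)]
grid[2][2] = BURNING          # single ignition point
grid = step(grid)             # advance the automaton one time step
```

In a DGGS-based, in-database setting the same update would be expressed over cell codes and their neighbour relations rather than array indices.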

2018 ◽  
Vol 7 (10) ◽  
pp. 399 ◽  
Author(s):  
Junghee Jo ◽  
Kang-Woo Lee

With the rapid development of Internet of Things (IoT) technologies, the increasing volume and diversity of sources of geospatial big data have created challenges in storing, managing, and processing data. In addition to the general characteristics of big data, the unique properties of spatial data make the handling of geospatial big data even more complicated. To help users implement geospatial big data applications in a MapReduce framework, several big data processing systems have extended the original Hadoop to support spatial properties. Most of those platforms, however, have added spatial functionality by embedding it as a plug-in. Although plug-ins offer a convenient way to add new features to an existing system, they have several limitations. In particular, executing spatial and nonspatial operations by alternating between the existing system and the plug-in adds read and write overheads to the workflow, significantly reducing performance. To address this issue, we have developed Marmot, a high-performance geospatial big data processing system based on MapReduce. Marmot extends Hadoop at a low level to support seamless integration of spatial and nonspatial operations within a single framework, improving the performance of geoprocessing workflows. This paper explains the overall architecture and data model of Marmot as well as the main algorithm for automatic construction of MapReduce jobs from a given spatial analysis task. To illustrate how Marmot transforms a sequence of spatial analysis operators into map and reduce functions that achieve better performance, this paper presents an example of spatial analysis retrieving the number of subway stations per city in Korea.
This paper also experimentally demonstrates that Marmot generally outperforms SpatialHadoop, one of the leading plug-in-based spatial big data frameworks, particularly in dealing with complex and time-intensive queries involving spatial indexes.
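The "subway stations per city" example can be sketched as plain map and reduce functions. This is a hypothetical illustration of the query's logic, not Marmot's actual operators: city extents are reduced to approximate bounding boxes, and a real system would use polygon geometries and a spatial index.

```python
from collections import defaultdict

# Approximate (min_lon, min_lat, max_lon, max_lat) extents; illustrative only.
cities = {
    "Seoul": (126.76, 37.43, 127.18, 37.70),
    "Busan": (128.93, 35.05, 129.29, 35.39),
}
stations = [(126.97, 37.55), (127.03, 37.50), (129.04, 35.11)]  # (lon, lat)

def map_station(point):
    """Map: emit (city, 1) for each station falling inside a city's extent."""
    lon, lat = point
    for city, (x0, y0, x1, y1) in cities.items():
        if x0 <= lon <= x1 and y0 <= lat <= y1:
            yield (city, 1)

def reduce_counts(pairs):
    """Reduce: sum the emitted 1s per city key."""
    counts = defaultdict(int)
    for city, n in pairs:
        counts[city] += n
    return dict(counts)

result = reduce_counts(kv for p in stations for kv in map_station(p))
```

Marmot's contribution is constructing such job chains automatically and avoiding the read/write round-trips a plug-in architecture would introduce between the spatial join and the count.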


2021 ◽  
Vol 27 (1) ◽  
Author(s):  
Mehmet Alkan ◽  
Elif Taş Arslan

Abstract: The processes starting with the identification and registration of treasury properties have an essential place in cadastral systems. Spatial data modelling studies were conducted in 2002 to establish a common standard structure based on the fundamental similarities of land administration systems. These studies began under the name Core Cadastral Domain Model (CCDM); since 2006, the work has continued under the name Land Administration Domain Model (LADM). The model was accepted in 2012 as a standard for land administration by the International Organization for Standardization (ISO). In this study, an external model class is proposed for LADM transactions related to the Treasury's real estate properties, which relate to the National Property Automation Project (MEOP). To determine the deficiencies of the current external model, databases containing records related to spatial data and property rights were examined, and the gaps concerning transactions on treasury properties were identified. The created external class is associated with the LADM master classes LA_Party, LA_RRR, LA_SpatialUnit and LA_BAUnit, thereby ensuring standardization of the external data model. If the external model is implemented by the responsible authority, the standardization of archiving processes will make registration easier and faster.
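The association between an external class and the named LADM master classes can be sketched with simple data classes. The four LADM class names come from the abstract; every attribute name and the `ExtTreasuryProperty` class itself are illustrative assumptions, not the model actually proposed in the paper.

```python
from dataclasses import dataclass, field
from typing import List

# Minimal sketch of the LADM master classes named in the abstract;
# attributes are illustrative, not the standard's full definitions.

@dataclass
class LA_Party:
    party_id: str
    name: str

@dataclass
class LA_SpatialUnit:
    su_id: str
    area_m2: float

@dataclass
class LA_BAUnit:
    bau_id: str
    spatial_units: List[LA_SpatialUnit] = field(default_factory=list)

@dataclass
class LA_RRR:
    rrr_id: str
    party: LA_Party
    baunit: LA_BAUnit
    rrr_type: str  # e.g. "right", "restriction", "responsibility"

@dataclass
class ExtTreasuryProperty:
    """Hypothetical external class linking a MEOP record to the LADM core."""
    meop_record_no: str
    rrr: LA_RRR

owner = LA_Party("P1", "Treasury")
su = LA_SpatialUnit("SU1", 1250.0)
bau = LA_BAUnit("BAU1", [su])
rrr = LA_RRR("R1", owner, bau, "right")
prop = ExtTreasuryProperty("MEOP-0001", rrr)
```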


2020 ◽  
Author(s):  
Chiranjib Chaudhuri ◽  
Annie Gray ◽  
Colin Robertson

Abstract. Despite the high historical losses attributed to flood events, Canadian flood mitigation efforts have been hindered by a dearth of current, accessible flood extent/risk models and maps. Such resources often entail large datasets and high computational requirements. This study presents a novel, computationally efficient flood inundation modeling framework (InundatEd) using a Height Above Nearest Drainage (HAND)-based solution for Manning's equation, implemented in a big-data, discrete global grid system (DGGS)-based architecture with a web-GIS platform. Specifically, this study aimed to develop, present, and validate InundatEd through binary classification comparisons to known flood extents. The framework is divided into multiple swappable modules, including GIS pre-processing, regional regression, inundation modelling, and web-GIS visualization. Extent testing and processing speed results indicate the value of a DGGS-based architecture alongside a simple conceptual inundation model and a dynamic user interface.
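The two ingredients named in the abstract can be written out directly. Manning's equation for channel discharge is Q = (1/n) A R^(2/3) S^(1/2), and a HAND-based inundation test wets a cell when the water stage exceeds its height above the nearest drainage. The sketch below shows only these formulas with assumed example values (roughness n = 0.035, 1% slope); it is not InundatEd's actual module chain.

```python
def manning_discharge(area_m2, hydraulic_radius_m, slope, n):
    """Manning's equation: Q = (1/n) * A * R^(2/3) * S^(1/2) (SI units)."""
    return (1.0 / n) * area_m2 * hydraulic_radius_m ** (2.0 / 3.0) * slope ** 0.5

def hand_inundation(hand_values, stage_m):
    """HAND-style flood test: a cell is wet when the water stage exceeds
    its height above the nearest drainage. A simplified sketch."""
    return [h <= stage_m for h in hand_values]

# Assumed example values: 20 m^2 cross-section, R = 1.5 m, 1% slope, n = 0.035.
q = manning_discharge(area_m2=20.0, hydraulic_radius_m=1.5, slope=0.01, n=0.035)
wet = hand_inundation([0.2, 0.8, 1.5, 3.0], stage_m=1.0)
```

In the HAND approach, inverting this discharge-stage relation per catchment is what lets a simple conceptual model stand in for full hydrodynamic simulation.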


Author(s):  
R. Wang ◽  
J. Ben ◽  
Y. Li ◽  
L. Du

A discrete global grid system (DGGS) is a data model that supports fused processing of multi-source geospatial data. In a DGGS, all cell operations can in theory be completed through codes alone, but most current spatial data are expressed as geographic or projected coordinates. It is therefore necessary to study the transformation between geographic coordinates and grid codes, which supports data entering and leaving the system. This paper takes the icosahedral hexagonal discrete global grid system as a base and builds the mapping relationships between the sphere and the icosahedron. An encoding scheme for a planar aperture-4 hexagonal grid system is then designed and applied to the icosahedron. Based on this, a new algorithm for transforming between geographic coordinates and grid codes is designed. Finally, experiments test the accuracy and efficiency of the algorithm. Code addition in the proposed HLQT scheme is about five times as efficient as code addition in HQBS.
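The coordinate-to-code idea can be illustrated with a much simpler hierarchical encoding. The sketch below uses an aperture-4 subdivision of an equirectangular lat/lon domain, where each level appends one digit 0-3; it is emphatically not the paper's icosahedral hexagonal HLQT scheme, only a demonstration of how a hierarchical code and its inverse transform work.

```python
def latlon_to_code(lat, lon, level):
    """Illustrative aperture-4 encoding: each level halves the cell in
    latitude and longitude, appending a digit 0-3 (bit 2: north half,
    bit 1: east half). Not the paper's HLQT scheme."""
    y0, y1, x0, x1 = -90.0, 90.0, -180.0, 180.0
    digits = []
    for _ in range(level):
        ym, xm = (y0 + y1) / 2, (x0 + x1) / 2
        d = 0
        if lat >= ym:
            y0 = ym
            d += 2
        else:
            y1 = ym
        if lon >= xm:
            x0 = xm
            d += 1
        else:
            x1 = xm
        digits.append(str(d))
    return "".join(digits)

def code_to_latlon(code):
    """Inverse transform: return the centre of the coded cell."""
    y0, y1, x0, x1 = -90.0, 90.0, -180.0, 180.0
    for ch in code:
        d = int(ch)
        ym, xm = (y0 + y1) / 2, (x0 + x1) / 2
        y0, y1 = (ym, y1) if d & 2 else (y0, ym)
        x0, x1 = (xm, x1) if d & 1 else (x0, xm)
    return ((y0 + y1) / 2, (x0 + x1) / 2)

code = latlon_to_code(45.0, 90.0, 4)
lat_c, lon_c = code_to_latlon(code)
```

Hexagonal grids complicate both steps because hexagons do not nest exactly under aperture-4 refinement, which is what motivates dedicated encodings such as the one the paper designs.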


2017 ◽  
Vol 6 (2) ◽  
pp. 1-21 ◽  
Author(s):  
Ian D. Bishop ◽  
Serryn Eagleson ◽  
Christopher J. Pettit ◽  
Abbas Rajabifard ◽  
Hannah Badland ◽  
...  

This paper introduces an online spatial data portal with advanced data access, analytical and visualisation capabilities which can be used for evidence-based city planning and supporting data-driven research. Through a case study approach, focused on the city of Melbourne, the authors show how the Australian Urban Infrastructure Network (AURIN) portal can be used to investigate a multi-faceted approach to understanding the various spatial dimensions of livability. While the tools explore separate facets of livability (employment, housing, health services and walkability), their outputs flow through to the other tools, showing the benefits of integrated systems.


Author(s):  
Dmitriy Zastavnoy

Data security is a key feature of information systems, but geoinformation systems and spatial databases, as well as their applications, show some drawbacks in this regard. Most proposals for geodata confidentiality remain stuck in attempts to link access rules to the geometric properties of spatial data. In this paper we suggest a different approach to building a data access model and a complete data security system for the WinMAP system, including account control, data access control based on an extended DAC model, and audit features. The data model of WinMAP is also described, because its specialized features allow the data security system to be developed rationally and implemented effectively. An implementation of the extended DAC model is briefly sketched.
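The discretionary access control (DAC) idea, with one extension for spatial objects, can be sketched as follows. This is a hypothetical illustration, not WinMAP's actual model: the object names, the layer/feature hierarchy, and the override rule are all assumptions chosen to show how access rules can follow the data model rather than the geometry.

```python
# Hypothetical ACLs: per object, the permissions each account was granted
# by the object's owner (classic DAC).
acl = {
    "layer:parcels":      {"alice": {"read", "write"}, "bob": {"read"}},
    "feature:parcels/42": {"bob": set()},  # explicit per-feature revocation
}

def can_access(account, obj, perm):
    """DAC check with one extension: a feature-level ACL entry, if present
    for the account, overrides permissions inherited from its parent layer."""
    if obj in acl and account in acl[obj]:
        return perm in acl[obj][account]
    # No explicit entry: features fall back to their parent layer's ACL.
    if obj.startswith("feature:"):
        layer = "layer:" + obj.split(":", 1)[1].rsplit("/", 1)[0]
        return can_access(account, layer, perm)
    return False

# bob can read the layer in general, but not the revoked feature 42;
# alice inherits her layer permissions on every feature.
bob_layer = can_access("bob", "layer:parcels", "read")
bob_feature = can_access("bob", "feature:parcels/42", "read")
```

Tying the rules to named objects in the data model, rather than to geometric predicates, is the design choice the paper argues for.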




Author(s):  
Alexey Noskov ◽  
A. Yair Grinberger ◽  
Nikolaos Papapesios ◽  
Adam Rousell ◽  
Rafael Troilo ◽  
...  

Many methods for intrinsic quality assessment of spatial data are based on the OpenStreetMap full-history dump. Typically, high-level analysis is conducted; few approaches take into account the low-level properties of the data files. In this chapter, a low-level data-type analysis is introduced. It offers a novel framework for surveying big data files and assessing full-history data provenance (lineage). The developed tools generate tables and charts which facilitate the comparison and analysis of datasets. The resulting data also helped to develop a universal data model for optimally storing OpenStreetMap full-history data in the form of a relational database. Databases for several pilot sites were evaluated through two use cases. First, a number of intrinsic data quality indicators and related metrics were implemented. Second, a framework for inventorying the spatial distribution of massive data uploads is discussed. Both use cases confirm the effectiveness of the proposed data-type analysis and the derived relational data model.
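A relational layout for full-history OSM data makes lineage queries plain SQL. The sketch below stores one row per element version in an in-memory SQLite database; the column set is an assumption for illustration, not the chapter's derived data model, and the inserted rows are fabricated example values.

```python
import sqlite3

# One row per version of an OSM element, so provenance (edit count,
# last editor, deletion status) becomes an ordinary SQL query.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE element_version (
    element_id   INTEGER NOT NULL,
    element_type TEXT    NOT NULL,   -- 'node' | 'way' | 'relation'
    version      INTEGER NOT NULL,
    changeset    INTEGER,
    user_id      INTEGER,
    ts           TEXT,               -- ISO 8601 timestamp
    visible      INTEGER DEFAULT 1,  -- 0 when this version is a deletion
    PRIMARY KEY (element_id, element_type, version)
);
""")
rows = [  # fabricated example history of one node
    (1001, "node", 1, 11, 7, "2012-01-01T00:00:00Z", 1),
    (1001, "node", 2, 25, 9, "2015-06-01T00:00:00Z", 1),
    (1001, "node", 3, 40, 9, "2020-03-01T00:00:00Z", 0),
]
conn.executemany("INSERT INTO element_version VALUES (?,?,?,?,?,?,?)", rows)

# Lineage query: number of versions and the last editor of element 1001.
n_versions, last_user = conn.execute("""
    SELECT COUNT(*), (SELECT user_id FROM element_version
                      WHERE element_id = 1001 ORDER BY version DESC LIMIT 1)
    FROM element_version WHERE element_id = 1001
""").fetchone()
```

Intrinsic quality indicators (version churn, editor diversity, deletion rates) then reduce to aggregations over this single table.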

