Research on the Spatial Query Technology of Geospatial Database

2014 ◽  
Vol 571-572 ◽  
pp. 600-605
Author(s):  
Lei Gang Sun ◽  
Jian Feng Liu ◽  
Quan Hong Xu

As geoscience research advances, geospatial data are growing rapidly in both volume and application complexity. Based on an in-depth study of Oracle Spatial's storage management mechanism, this paper proposes a method that applies graph theory to optimizing spatial queries over massive geographical data, and establishes a geospatial data query model to address the low spatial query efficiency of geospatial databases. Drawing on practical applications, the paper runs both a conventional spatial query test and a spatial query based on the proposed model. The results show that queries based on the geospatial data query model are more efficient than the conventional method; the model substantially improves spatial query performance, and the improvement becomes increasingly apparent as data volume grows.
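The abstract does not spell out how graph theory enters the query model. As a hedged illustration only, the sketch below shows one way a graph traversal over a spatial partition can restrict a range query to reachable cells instead of scanning every feature; the grid partition, cell size, and class names are assumptions for illustration, not the paper's design.

```python
from collections import defaultdict

CELL = 10.0  # grid cell size; the partitioning scheme is an assumption, not the paper's


def cell_of(x, y):
    return (int(x // CELL), int(y // CELL))


class GridGraph:
    """Index point features into grid cells; neighbouring cells act as graph edges."""

    def __init__(self):
        self.cells = defaultdict(list)  # cell -> list of (feature_id, x, y)

    def insert(self, fid, x, y):
        self.cells[cell_of(x, y)].append((fid, x, y))

    def query(self, x, y, radius):
        """Visit only cells reachable from the query cell (graph traversal with pruning)."""
        hits, seen = [], set()
        stack = [cell_of(x, y)]
        while stack:
            c = stack.pop()
            if c in seen:
                continue
            seen.add(c)
            cx, cy = c
            # prune cells whose nearest point is outside the query radius
            nx = min(max(x, cx * CELL), (cx + 1) * CELL)
            ny = min(max(y, cy * CELL), (cy + 1) * CELL)
            if (nx - x) ** 2 + (ny - y) ** 2 > radius ** 2:
                continue
            for fid, fx, fy in self.cells.get(c, []):
                if (fx - x) ** 2 + (fy - y) ** 2 <= radius ** 2:
                    hits.append(fid)
            # expand to the 8 neighbouring cells (the graph's edges)
            stack.extend((cx + dx, cy + dy)
                         for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                         if (dx, dy) != (0, 0))
        return sorted(hits)
```

Because pruned cells are never expanded, the traversal touches only the neighbourhood of the query point, which is where the claimed efficiency gain over a full scan would come from.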

Author(s):  
N. N. Nasorudin ◽  
M. I. Hassan ◽  
N. A. Zulkifli ◽  
A. Abdul Rahman

Recently in our country, building construction has become more complex, and a strata-objects database has become more important for registering the real world, as people now own and use multilevel spaces. Strata titles are therefore increasingly important and need to be well managed. LADM, also known as ISO 19152, is a standard model for land administration that allows integrated 2D and 3D representation of spatial units. The aim of this paper is to develop a strata-objects database using LADM. The paper discusses the current 2D geospatial database and the future need for a 3D geospatial database, then develops a strata-objects database using the LADM standard data model and analyzes the result. The current cadastre system in Malaysia, including strata titles, is discussed; the problems of the 2D geospatial database are listed, and the future need for a 3D geospatial database is also considered. The database is designed in three stages: conceptual, logical, and physical design. The strata-objects database allows users to retrieve both spatial and non-spatial strata-title information, and thus shows the location of each strata unit. This development may help in handling strata titles and their information.
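Following the conceptual-logical-physical progression the abstract describes, a minimal physical-design sketch in SQLite is shown below. The class names (LA_Party, LA_SpatialUnit, LA_BAUnit, LA_RRR) come from LADM (ISO 19152), but the attributes chosen are illustrative assumptions, not the full standard or the paper's actual schema.

```python
import sqlite3

# Minimal strata-objects schema sketch using LADM class names;
# attributes are illustrative, not the standard's full definitions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE LA_Party (                -- person or organisation holding a right
    pid INTEGER PRIMARY KEY,
    name TEXT NOT NULL
);
CREATE TABLE LA_SpatialUnit (          -- one strata unit in space
    suid INTEGER PRIMARY KEY,
    label TEXT,
    level INTEGER,                     -- storey: supports multilevel ownership
    area_m2 REAL CHECK (area_m2 > 0)
);
CREATE TABLE LA_BAUnit (               -- basic administrative unit (the title)
    buid INTEGER PRIMARY KEY,
    title_no TEXT UNIQUE,
    suid INTEGER REFERENCES LA_SpatialUnit(suid)
);
CREATE TABLE LA_RRR (                  -- rights, restrictions, responsibilities
    rid INTEGER PRIMARY KEY,
    pid INTEGER REFERENCES LA_Party(pid),
    buid INTEGER REFERENCES LA_BAUnit(buid),
    rrr_type TEXT CHECK (rrr_type IN ('right','restriction','responsibility'))
);
""")

# Linking a party to a title and its spatial unit lets one query return both
# the non-spatial title information and the unit's location attributes.
conn.execute("INSERT INTO LA_Party VALUES (1, 'Ali')")
conn.execute("INSERT INTO LA_SpatialUnit VALUES (10, 'Unit A-3-2', 3, 84.5)")
conn.execute("INSERT INTO LA_BAUnit VALUES (100, 'STRATA/001', 10)")
conn.execute("INSERT INTO LA_RRR VALUES (1000, 1, 100, 'right')")

row = conn.execute("""
    SELECT p.name, s.level, s.area_m2
    FROM LA_RRR r
    JOIN LA_Party p ON p.pid = r.pid
    JOIN LA_BAUnit b ON b.buid = r.buid
    JOIN LA_SpatialUnit s ON s.suid = b.suid
    WHERE b.title_no = 'STRATA/001'
""").fetchone()
```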


Author(s):  
Tian Zhao ◽  
Chuanrong Zhang ◽  
Mingzhen Wei ◽  
Zhong-Ren Peng
2020 ◽  
Vol 25 (4) ◽  
pp. 1376-1391
Author(s):  
Liangfu Lu ◽  
Wenbo Wang ◽  
Zhiyuan Tan

Abstract: The Parallel Coordinates Plot (PCP) is a popular technique for exploring high-dimensional data, and researchers often apply it as an effective method to analyze and mine data. As data volumes grow, however, visual clutter and data clarity become two of the main challenges for parallel coordinates plots. Although the Arc Coordinates Plot (ACP) is a popular approach to these challenges, little optimization or improvement has been made to it. This paper makes three main contributions to state-of-the-art PCP methods: one improves the visual method itself, and the other two improve perceptual scalability when the scale or dimensionality of the data becomes large, as in mobile and wireless applications. 1) We present an improved visualization method based on ACP, termed the double arc coordinates plot (DACP). It not only reduces the visual clutter of ACP, but also uses a dimension-based bundling method with further optimization to deal with the issues of the conventional PCP. 2) To reduce the clutter caused by the order of the axes and to reveal patterns hidden in the data, we propose our first dimension-reordering method, a contribution-based method for DACP built on the singular value decomposition (SVD) algorithm. It computes an importance score for each attribute (dimension) using SVD and lays the dimensions out from left to right in DACP according to that score. 3) We also propose a similarity-based method that combines a nonlinear correlation coefficient with the SVD algorithm. To measure the correlation between two dimensions and explain how they interact, we propose a reordering method based on nonlinear correlation information measurements; mutual information is used to compute the partial similarity of dimensions in high-dimensional data visualization, while SVD measures the data globally. Finally, we use five case studies to evaluate the effectiveness of DACP, and the results show that our approaches not only visualize multivariate datasets well, but also effectively alleviate the visual clutter of the conventional PCP, giving users a better visual experience.
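The contribution-based reordering can be illustrated with a short sketch: score each dimension by its weight in the singular vectors, then lay the axes out in descending score order. The exact scoring formula below (singular-value-weighted loadings) is an assumption for illustration; the paper's formula may differ.

```python
import numpy as np


def svd_importance(data):
    """Score each dimension by its weight in the singular vectors.

    `data` is (n_samples, n_dims). Importance of dimension j is
    sum_k s_k * |V[j, k]| -- one common choice, not necessarily the
    paper's exact formula.
    """
    X = data - data.mean(axis=0)                 # centre each dimension
    _, s, vt = np.linalg.svd(X, full_matrices=False)
    return (s[:, None] * np.abs(vt)).sum(axis=0)


def reorder(data, names):
    """Return dimension names sorted left-to-right by descending importance."""
    order = np.argsort(-svd_importance(data))
    return [names[i] for i in order]
```

A dimension that dominates the data's variance loads heavily on the leading singular vector, so it receives the highest score and is drawn as the leftmost axis.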


2014 ◽  
Vol 12 (5) ◽  
pp. 383
Author(s):  
Nuala M. Cowan, DSc, MA, BA

Objective: An effective emergency response effort is contingent upon the quality and timeliness of the information provided to both the decision-making and coordinating functions, conditions that are hard to guarantee in the urgent climate of a response effort. The purpose of this paper is to present a validated Humanitarian Data Model (HDM) that can assist in the rapid assessment of disaster needs and subsequent decision making. Substandard, inconsistent information can lead to poorly informed decisions and, subsequently, to inappropriate response activities. Here we present a novel, organized, and fluid information-management workflow to be applied during the rapid-assessment phase of an emergency response. A comprehensive, peer-reviewed geospatial data model not only directs the design of data-collection tools but also allows more systematic data collection and management, leading to improved analysis and response outcomes.
Design: This research involved the development of a comprehensive geospatial data model to guide the collection, management, and analysis of geographically referenced assessment information during the rapid-response phase of a disaster, implemented through a mobile data-collection app based on key outcome parameters. A systematic review of the literature and best practices was used to identify and prioritize the minimum essential data variables.
Subjects: The data model was critiqued for variable content, structure, and usability by a group of subject-matter experts in humanitarian information management and geographic information systems.
Conclusions: Consensus found that the adoption of a standardized system of data collection, management, and processing, such as the data model presented here, could facilitate the collection and sharing of information between agencies with similar goals and better coordinate their efforts by unleashing the power of geographic information for humanitarian decision support.


Author(s):  
Yu-Jin Zhang

Along with progress in imaging modalities and the wide use of digital images (including video) in various fields, many potential content producers have emerged, and many image databases have been built. Because images require large amounts of storage space and processing time, quickly and efficiently accessing and managing these databases, large in both information content and data volume, has become an urgent problem. The research solution to this problem, content-based image retrieval (CBIR), was initiated in the last decade (Kato, 1992). An international standard for multimedia content description, MPEG-7, was formed in 2001 (MPEG). With its comprehensive description of image content and its consistency with human visual perception, research in this direction is considered one of the hottest research topics of the new century (Castelli, 2002; Zhang, 2003; Deb, 2004). Many practical retrieval systems have been developed; a survey of nearly 40 systems can be found in Veltkamp (2000). Most of them mainly use low-level image features, such as color, texture, and shape, to represent image content. However, there is a considerable difference between what users are actually interested in and the image content described by these low-level features alone. In other words, there is a wide gap between image content description based on low-level features and that of human understanding. As a result, these low-level-feature-based systems often produce unsatisfying query results in practical applications. To cope with this challenging task, many approaches have been proposed to represent and describe image content at a higher level, one more closely related to human understanding. Three broad categories can be identified: synthetic, semantic, and semiotic (Bimbo, 1999; Djeraba, 2002). From the understanding point of view, the semantic approach is the natural one.
Human beings often describe image content in terms of objects, which can be defined at different abstraction levels. In this article, objects are considered not only as carriers of semantic information in images but also as suitable building blocks for further image understanding. The rest of the article is organized as follows: in "Background," early object-based techniques are briefly reviewed and current research on object-based techniques is surveyed. In "Main Techniques," a general paradigm for object-based image retrieval is described, and different object-based techniques are discussed: techniques for extracting meaningful regions, identifying objects, matching semantics, and conducting feedback. In "Future Trends," some potential directions for further research are pointed out. In "Conclusion," several final remarks are presented.
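As a concrete instance of the low-level feature matching such systems rely on, the sketch below computes quantized RGB color histograms and compares them with histogram intersection, a classic CBIR similarity measure. The bin count and the choice of similarity function are illustrative assumptions, not tied to any particular system surveyed.

```python
from collections import Counter


def color_histogram(pixels, bins=4):
    """Quantise (r, g, b) pixels (values 0-255) into bins**3 buckets.

    Returns a normalised histogram: bucket -> fraction of pixels.
    """
    step = 256 // bins
    counts = Counter((r // step, g // step, b // step) for r, g, b in pixels)
    n = len(pixels)
    return {bucket: c / n for bucket, c in counts.items()}


def intersection(h1, h2):
    """Histogram intersection similarity in [0, 1]; 1 means identical histograms."""
    return sum(min(h1.get(bucket, 0.0), v) for bucket, v in h2.items())
```

The gap the text describes is visible even in this toy: two semantically different images with similar color distributions would score as near-identical, which is exactly why higher-level, object-based description is needed.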


2013 ◽  
pp. 1773-1793
Author(s):  
Hugo Martins ◽  
Jorge G. Rocha

Since the authors were able to design all the supporting software, syntactic interoperability was guaranteed by the use of Open Geospatial Consortium (OGC) standards. Semantic interoperability was assured by design, through the development of a single data model. Data invariants are guaranteed either by the interface, with validation routines written in JavaScript, or by the data constraints included in the database. Integration and interoperability with other BT programs might require some additional effort, but all the necessary semantic translation could be encapsulated in the WFS component.
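The two-layer invariant enforcement described here can be sketched briefly. The schema and invariant below are purely illustrative (the chapter's actual data model is not given); the point is that a database-side constraint still holds even if the interface-level JavaScript validation is bypassed.

```python
import sqlite3

# Illustrative schema: a geographic observation whose coordinates must be
# valid. The CHECK constraints play the role of the database-side invariants.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE observation (
        id INTEGER PRIMARY KEY,
        lat REAL NOT NULL CHECK (lat BETWEEN -90 AND 90),
        lon REAL NOT NULL CHECK (lon BETWEEN -180 AND 180)
    )
""")


def insert_observation(lat, lon):
    """Return True if the row satisfied the invariants, False otherwise."""
    try:
        conn.execute("INSERT INTO observation (lat, lon) VALUES (?, ?)",
                     (lat, lon))
        return True
    except sqlite3.IntegrityError:  # CHECK constraint violated
        return False
```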


Author(s):  
Isam Mashhour Aljawarneh ◽  
Paolo Bellavista ◽  
Antonio Corradi ◽  
Rebecca Montanari ◽  
Luca Foschini ◽  
...  

2020 ◽  
Author(s):  
Anton Gladkov ◽  
Lunina Oxana

<p>Studying and mapping faults in the Earth’s crust is one of the priority objectives of structural geology and tectonophysics. Faults are generally associated with mineral deposits, thermal springs, and earthquakes, and fault zones are areas of the most dangerous geological processes and of various geophysical anomalies. Databases of faults are therefore in high demand for both science and practical applications. In this work, we present an online geospatial database of faults that were active in the Pliocene‐Quaternary within the territory between 96–124°E and 49–58°N. The fault locations were mapped using MapInfo GIS, based on an extensive analysis of cartographic and published materials as well as our own structural data. The data for each fault were entered via the ActiveTectonics Information System that we developed. An interactive version of the database is openly available (http://www.activetectonics.ru/) in Russian and English, and anyone can retrieve the available information about a fault with a click. The geoportal is constantly developing and forms the basis for an automated system for modeling geological hazards (seismic soil liquefaction, secondary rupturing, subsidence, and slope processes) in the Baikal region.</p><p>Currently, as part of the modernization of the ActiveTectonics geographic information product, we are developing data and metadata models and schemas to create a detailed geospatial database of seismogenic ruptures in the Baikal region. A modern, user-friendly interface is being developed to automate the data-collection process.</p><p>The creation of such a publicly accessible catalog of seismogenic ruptures will be useful for applied and fundamental research.</p><p>The reported study was partly funded by RFBR and the Government of the Irkutsk Region, project number 20-45-385001.</p>

