Access Methods and Query Processing in Temporal and Spatio-Temporal Databases (Μέθοδοι προσπέλασης και επεξεργασίας ερωτήσεων σε χρονικές και χωροχρονικές βάσεις δεδομένων)

2002 ◽  
Author(s):  
Θεόδωρος Τζουραμάνης

Time is an important concept that relates to almost all phenomena of the real world. Information and data correspond to specific time-points and usually change over time. One of the roles of databases is to support the time-evolving nature of the phenomena they model. This ability is of fundamental importance in many applications, such as accounting, banking, law, medicine, commerce, econometrics, and land and cartographic applications. Temporal and spatio-temporal databases are two categories of databases that both deal with the concept of time but are related to different types of applications. Conventional databases have been designed to maintain only the most recently stored information, that is, the current information. As this information is updated, the database content is modified and the previously stored information is removed from the database; therefore, the only retained version of the database is the current one. Temporal databases, on the other hand, support the maintenance of time-evolving data and the answering of specialised queries related to three notions of time for these data: the past, the present and the future. Traditional spatial databases are restricted to representing, storing and manipulating only static spatial data, such as points, lines, surfaces, volumes and hyper-volumes in multi-dimensional space. However, there are many applications that demand the storage and retrieval of continuously changing spatial information. Geographical information systems, image and multimedia databases, urban planning, transportation, mobile communications, computer-aided design and medical databases are only some of the applications that would benefit from the management of this type of dynamically changing spatial information. Spatio-temporal databases manipulate spatial data whose geometry changes dynamically. They provide the chronological framework for the efficient storage and retrieval of all the states of a spatial database over time: the current and past states, together with support for spatial queries that refer to present and past time-points.

In this doctoral dissertation, the research on temporal and spatio-temporal databases focuses on data that are indexed according to transaction time. More specifically, with regard to spatio-temporal databases, the research focuses on time-evolving regional data. Real-world examples of such applications include the storage and manipulation of data on meteorological phenomena (e.g. atmospheric pressure zones; icebergs as they change and move over time), faunal phenomena (e.g. movements of populations of animals, birds and fishes), urban phenomena (e.g. traffic jams or traffic networks in big cities; city-planning events such as building and demolition), natural catastrophes (e.g. fires, hurricanes, oil slicks, floods, pollution clouds), etc. In particular, the focus of the dissertation is on designing efficient access methods and query processing algorithms for transaction-time databases and for databases of time-evolving regional data. This contribution is of particular importance because access methods play a central role in the development of efficient database management systems.

One access method for transaction-time data and four access methods for time-evolving regional data are designed and implemented. Efficient algorithms are also implemented for processing three queries for temporal databases and five new queries for spatio-temporal databases. These queries exploit the properties of the new access methods. The first generator of synthetic time-evolving regional data in the literature is also introduced. Finally, an extensive experimental performance evaluation and comparison of the four new access methods for time-evolving regional data is presented. Because of the lack of real benchmark data, the regional data sets used in the experiments were synthetic raster images with real-world semantics, generated by the new synthetic data generator. The comparison is made in a common and flexible benchmarking environment so that the best technique can be chosen depending on the application and the characteristics of the images being manipulated.
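
As background for the transaction-time notion used throughout the dissertation, the sketch below is a deliberately simplified, hypothetical structure (not one of the access methods designed in the work): it retains every committed version of a small regional raster, keyed by transaction time, so that both the current state and any past state remain retrievable.

```python
import bisect
from typing import Any, List, Optional, Tuple


class TransactionTimeStore:
    """Toy transaction-time store: every update is appended with its
    commit timestamp and nothing is ever physically deleted, so past
    database states stay queryable (a simplified illustration only)."""

    def __init__(self) -> None:
        # Versions sorted by ascending transaction (commit) time.
        self._versions: List[Tuple[int, Any]] = []

    def commit(self, timestamp: int, snapshot: Any) -> None:
        """Store a new version; timestamps must be non-decreasing."""
        if self._versions and timestamp < self._versions[-1][0]:
            raise ValueError("transaction time must not move backwards")
        self._versions.append((timestamp, snapshot))

    def as_of(self, timestamp: int) -> Optional[Any]:
        """Return the version current at the given transaction time,
        i.e. the latest snapshot committed at or before it."""
        times = [t for t, _ in self._versions]
        idx = bisect.bisect_right(times, timestamp)
        return self._versions[idx - 1][1] if idx else None

    def current(self) -> Optional[Any]:
        """Return the most recently committed version."""
        return self._versions[-1][1] if self._versions else None


# Usage: each snapshot is a tiny binary raster of a region (0/1 cells).
store = TransactionTimeStore()
store.commit(1, [[0, 1], [1, 1]])   # region at transaction time 1
store.commit(5, [[1, 1], [1, 0]])   # region has moved by time 5
print(store.as_of(3))               # -> [[0, 1], [1, 1]] (a past state)
print(store.current())              # -> [[1, 1], [1, 0]]
```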

Author(s):  
Paolo Corti ◽  
Benjamin G Lewis ◽  
Athanasios Tom Kralidis ◽  
Ntabathia Jude Mwenda

A Spatial Data Infrastructure (SDI) is a framework of geospatial data, metadata, users and tools intended to provide an efficient and flexible way to use spatial information. One of the key software components of an SDI is the catalogue service which is needed to discover, query, and manage the metadata. Catalogue services in an SDI are typically based on the Open Geospatial Consortium (OGC) Catalogue Service for the Web (CSW) standard which defines common interfaces for accessing the metadata information. A search engine is a software system capable of supporting fast and reliable search, which may use “any means necessary” to get users to the resources they need quickly and efficiently. These techniques may include features such as full text search, natural language processing, weighted results, fuzzy tolerance results, faceting, hit highlighting, recommendations, feedback mechanisms based on log mining, usage statistic gathering, and many others. In this paper we focus on improving geospatial search with a search engine platform that uses Lucene, a Java-based search library, at its core. In work funded by the National Endowment for the Humanities, the Centre for Geographic Analysis (CGA) at Harvard University is in the process of re-engineering the search component of its public domain SDI (WorldMap, http://worldmap.harvard.edu) which is based on the GeoNode platform. In the process the CGA has developed Harvard Hypermap (HHypermap), a map services registry and search platform independent from WorldMap. The goal of HHypermap is to provide a framework for building and maintaining a comprehensive registry of web map services, and because such a registry is expected to be large, the system supports the development of clients with modern search capabilities such as spatial and temporal faceting and instant previews via an open API. Behind the scenes HHypermap scalably harvests OGC and Esri service metadata from distributed servers, organizes that information, and pushes it to a search engine. The system monitors services for reliability and uses that information to improve search. End users will be able to search the SDI metadata using standard interfaces provided by the internal CSW catalogue, and will benefit from the enhanced search possibilities provided by an advanced search engine. HHypermap is built on an open source software stack.
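
As an illustration of the kind of query such a Lucene/Solr-backed index can answer, the sketch below sends a request with a spatial filter and a temporal range facet to a Solr core. The endpoint, the core name `hypermap`, and the field names `bbox` and `layer_date` are assumptions for illustration only; the actual HHypermap schema may differ.

```python
import requests

# Hypothetical Solr endpoint and field names, for illustration only.
SOLR_SELECT = "http://localhost:8983/solr/hypermap/select"

params = {
    "q": "title:landcover",                       # full text search on titles
    # Spatial filter: keep only layers whose footprint intersects a
    # bounding box over Western Europe (ENVELOPE(minX, maxX, maxY, minY)).
    "fq": "{!field f=bbox}Intersects(ENVELOPE(-10.0, 20.0, 55.0, 35.0))",
    # Temporal faceting: bucket the matching layers by year.
    "facet": "true",
    "facet.range": "layer_date",
    "facet.range.start": "2000-01-01T00:00:00Z",
    "facet.range.end": "2020-01-01T00:00:00Z",
    "facet.range.gap": "+1YEAR",
    "rows": 10,
    "wt": "json",
}

response = requests.get(SOLR_SELECT, params=params, timeout=30)
response.raise_for_status()
data = response.json()

print("matching layers:", data["response"]["numFound"])
print("per-year counts:", data["facet_counts"]["facet_ranges"]["layer_date"]["counts"])
```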


2016 ◽  
Author(s):  
Paolo Corti ◽  
Benjamin G Lewis ◽  
Tom Kralidis ◽  
Jude Mwenda

A Spatial Data Infrastructure (SDI) is a framework of geospatial data, metadata, users and tools intended to provide the most efficient and flexible way to use spatial information. One of the key software components of an SDI is the catalogue service, which is needed to discover, query and manage the metadata. Catalogue services in an SDI are typically based on the Open Geospatial Consortium (OGC) Catalogue Service for the Web (CSW) standard, which defines common interfaces for accessing the metadata information. A search engine is a software system capable of very fast and reliable search, with features such as full text search, natural language processing, weighted results, fuzzy tolerance results, faceting, hit highlighting and many others. The Centre for Geographic Analysis (CGA) at Harvard University is working to integrate the benefits of both worlds (OGC catalogs and search engines) within its public domain SDI, named WorldMap. Harvard Hypermap (HHypermap) is a component that will be part of WorldMap, built entirely on an open source stack: it implements an OGC catalog, based on pycsw, to provide access to metadata in a standard way, and a search engine, based on Solr/Lucene, to provide the advanced search features typically found in search engines.
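
A minimal sketch of the standards-based access path described above, using OWSLib (a Python client for OGC services) to issue a CSW GetRecords request against a pycsw endpoint; the endpoint URL, search term and bounding box are placeholders rather than details of the actual WorldMap catalogue.

```python
from owslib.csw import CatalogueServiceWeb
from owslib.fes import BBox, PropertyIsLike

# Placeholder pycsw endpoint; substitute the catalogue you want to query.
CSW_URL = "http://localhost:8000/csw"

csw = CatalogueServiceWeb(CSW_URL)

# Full-text constraint on any metadata field, plus a bounding box over the
# eastern USA (minx, miny, maxx, maxy); nesting both filters in one inner
# list ANDs them in OWSLib.
anytext = PropertyIsLike("csw:AnyText", "%flood%")
bbox = BBox([-80.0, 35.0, -70.0, 45.0])

csw.getrecords2(constraints=[[anytext, bbox]], maxrecords=10)

print("records matched:", csw.results["matches"])
for identifier, record in csw.records.items():
    print(identifier, "->", record.title)
```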


2016 ◽  
Author(s):  
Paolo Corti ◽  
Benjamin G Lewis ◽  
Tom Kralidis ◽  
Jude Mwenda

A Spatial Data Infrastructure (SDI) is a framework of geospatial data, metadata, users and tools intended to provide the most efficient and flexible way to use spatial information. One of the key software components of an SDI is the catalogue service, which is needed to discover, query and manage the metadata. Catalogue services in an SDI are typically based on the Open Geospatial Consortium (OGC) Catalogue Service for the Web (CSW) standard, which defines common interfaces for accessing the metadata information. A search engine is a software system capable of very fast and reliable search, with features such as full text search, natural language processing, weighted results, fuzzy tolerance results, faceting, hit highlighting and many others. The Centre for Geographic Analysis (CGA) at Harvard University is working to integrate the benefits of both worlds (OGC catalogues and search engines) within its public domain SDI, named WorldMap. Harvard Hypermap (HHypermap) is a component that will be part of WorldMap, built entirely on an open source stack: it implements an OGC catalogue, based on pycsw, to provide access to metadata in a standard way, and a search engine, based on Solr/Lucene, to provide the advanced search features typically found in search engines.


Author(s):  
M. Yu. Kataev ◽  
M. O. Krylov ◽  
P. P. Geiko ◽  
...  

At present, supporting many types of human activity requires the use of a spatial data infrastructure. Such an infrastructure integrates spatio-temporal data sets from many sources of information and provides the user with various methods of processing, analysis and visualization. This article describes the architecture of a software system and the processes for managing sets of spatio-temporal data to solve agricultural problems. Measurements from multispectral satellite systems and unmanned aerial vehicles (UAVs), as well as a priori information (meteorology, agrochemical information, etc.), are taken as input. The user of the software system is given the ability to manage the spatial information of the territory of agricultural fields and the sets of temporal data derived from various spatial data. An important achievement of the work is the combination of satellite and UAV imagery results according to the controlled parameters, which makes it possible to expand the range of UAV applications and to verify them. The results of processing real data are presented.
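
As an illustration of comparing satellite and UAV observations over a controlled parameter, the sketch below computes NDVI, a standard vegetation index (its use here is an assumption, since the article does not name the specific parameters), for two toy co-registered rasters of the same field and compares their field-level means.

```python
import numpy as np


def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - RED) / (NIR + RED)."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    # Avoid division by zero over water / no-data pixels.
    return np.where(denom > 0, (nir - red) / denom, 0.0)


# Toy co-registered reflectance rasters for one field (values in [0, 1]).
rng = np.random.default_rng(seed=42)
satellite_nir, satellite_red = rng.uniform(0.3, 0.6, (2, 50, 50))
uav_nir, uav_red = rng.uniform(0.3, 0.6, (2, 50, 50))

ndvi_satellite = ndvi(satellite_nir, satellite_red)
ndvi_uav = ndvi(uav_nir, uav_red)

# Field-level comparison: a small bias suggests the UAV product can stand in
# for, and be verified against, the satellite observation.
print("satellite mean NDVI:", round(float(ndvi_satellite.mean()), 3))
print("UAV mean NDVI:      ", round(float(ndvi_uav.mean()), 3))
print("bias (UAV - sat):   ", round(float((ndvi_uav - ndvi_satellite).mean()), 3))
```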


Author(s):  
Wynne Hsu ◽  
Mong Li Lee ◽  
Junmei Wang

Association rule mining in spatial databases and temporal databases has been studied extensively in data mining research. Most studies have found interesting patterns in either spatial information or temporal information; however, few have handled both efficiently. Meanwhile, developments in spatio-temporal databases and spatio-temporal applications have prompted data analysts to turn their focus to spatio-temporal patterns that explore both spatial and temporal information.
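
A minimal sketch of the general idea, not the authors' method: events are discretized into (grid cell, time window) transactions and frequent co-occurring event types are counted, which is the simplest way to expose patterns that involve both spatial and temporal information.

```python
from collections import Counter, defaultdict
from itertools import combinations

# Toy events: (event_type, x, y, t). In a real setting these would come
# from a spatio-temporal database.
events = [
    ("rain", 1.2, 3.1, 10), ("flood", 1.4, 3.3, 12), ("rain", 5.0, 5.0, 11),
    ("rain", 1.1, 3.4, 30), ("flood", 1.3, 3.2, 31), ("jam", 1.2, 3.3, 32),
    ("rain", 1.5, 3.0, 50), ("flood", 1.6, 3.1, 53),
]

CELL_SIZE, WINDOW = 1.0, 10     # spatial and temporal discretization steps

# Group events that fall into the same (grid cell, time window).
transactions = defaultdict(set)
for etype, x, y, t in events:
    key = (int(x // CELL_SIZE), int(y // CELL_SIZE), int(t // WINDOW))
    transactions[key].add(etype)

# Count how often each pair of event types co-occurs in a cell/window.
pair_counts = Counter()
for itemset in transactions.values():
    for pair in combinations(sorted(itemset), 2):
        pair_counts[pair] += 1

MIN_SUPPORT = 2
frequent = {p: c for p, c in pair_counts.items() if c >= MIN_SUPPORT}
print(frequent)   # e.g. {('flood', 'rain'): 3} -> rain and flood co-occur
```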


Author(s):  
Hassina Bounif

In the field of computer science, we currently face a major problem in designing models that evolve over time. This holds in particular for databases: their data models need to evolve, but their evolution is difficult. User requirements now change much faster than before for several reasons, among them the changing perception of the real world and the development of new technologies. Databases show little flexibility in supporting changes to the organization of their schemas and data. Database evolution approaches preserve the currently populated data and the functionality of software applications when the database schema changes. Data model versioning is one such approach, used to manage the evolution of conventional and non-conventional databases such as temporal and spatio-temporal databases. This article provides background on the fields of database evolution and versioning techniques, and presents the issues that remain unresolved.
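
A minimal sketch of the versioning idea under simplifying assumptions (records carry an explicit schema-version tag, and upgrades are pure functions): existing data remain usable because each record is lifted to the current schema on access rather than being rewritten when the schema changes. This is an illustration of the general technique, not of a specific system described in the article.

```python
from typing import Callable, Dict

# Registry of upgrade functions: from_version -> converter to from_version + 1.
UPGRADES: Dict[int, Callable[[dict], dict]] = {}


def upgrade(from_version: int):
    """Register a converter that lifts a record one schema version."""
    def register(func: Callable[[dict], dict]) -> Callable[[dict], dict]:
        UPGRADES[from_version] = func
        return func
    return register


@upgrade(1)
def v1_to_v2(record: dict) -> dict:
    # Version 2 split a single "name" field into "first_name"/"last_name".
    first, _, last = record.pop("name").partition(" ")
    return {**record, "first_name": first, "last_name": last, "_v": 2}


@upgrade(2)
def v2_to_v3(record: dict) -> dict:
    # Version 3 added a mandatory "country" field with a default value.
    return {**record, "country": "unknown", "_v": 3}


def to_current(record: dict, current_version: int = 3) -> dict:
    """Apply registered upgrades until the record matches the current schema."""
    while record["_v"] < current_version:
        record = UPGRADES[record["_v"]](record)
    return record


legacy = {"_v": 1, "name": "Ada Lovelace"}
print(to_current(legacy))
# {'_v': 3, 'first_name': 'Ada', 'last_name': 'Lovelace', 'country': 'unknown'}
```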


2016 ◽  
Author(s):  
Paolo Corti ◽  
Benjamin Lewis ◽  
Tom Kralidis ◽  
Jude Mwenda

A Spatial Data Infrastructure (SDI) is a framework of geospatial data, metadata, users and tools intended to provide the most efficient and flexible way to use spatial information. One of the key software components of an SDI is the catalogue service, which is needed to discover, query and manage the metadata. Catalogue services in an SDI are typically based on the Open Geospatial Consortium (OGC) Catalogue Service for the Web (CSW) standard, which defines common interfaces for accessing the metadata information. A search engine is a software system capable of very fast and reliable search, with features such as full text search, natural language processing, weighted results, fuzzy tolerance results, faceting, hit highlighting and many others. The Centre for Geographic Analysis (CGA) at Harvard University is working to integrate the benefits of both worlds (OGC catalogs and search engines) within its public domain SDI, named WorldMap. Harvard Hypermap (HHypermap) is a component that will be part of WorldMap, built entirely on an open source stack: it implements an OGC catalog, based on pycsw, to provide access to metadata in a standard way, and a search engine, based on Solr/Lucene, to provide the advanced search features typically found in search engines.


Author(s):  
Michael Vassilakopoulos ◽  
Antonio Corral

Time and space are ubiquitous aspects of reality. Temporal and spatial information appear together in many everyday activities, and many information systems of modern life should be able to handle such information. For example, information systems for traffic control, fleet management, environmental management, military applications, local and public administration, and academic institutions need to manage information with spatial characteristics that change over time, or in other words, spatio-temporal information. The need for spatio-temporal applications has been strengthened by recent developments in mobile telephony technology, mobile computing, positioning technology, and the evolution of the World Wide Web.

