Overview of the OGC CDB Standard for 3D Synthetic Environment Simulation & Modeling

Author(s):  
Sara Saeedi ◽  
Steve Liang ◽  
David Graham ◽  
Michael F. Lokuta ◽  
Mir Abolfazl Mostafavi

Recent advances in sensor and platform technologies such as satellite systems, unmanned aerial vehicles (UAV), manned aerial platforms, and ground-based sensor networks have resulted in massive volumes of data being produced and collected about the Earth. Processing, managing, and analyzing these data are among the main challenges in the 3D synthetic representations used in modeling and simulation (M&S) of the natural environment. M&S devices, such as flight simulators, traditionally require a variety of different databases to provide a synthetic representation of the world. M&S often requires integration of data from a variety of sources stored in different formats. Thus, for simulation of a complex synthetic environment, such as a 3D terrain model, tackling interoperability among its components (geospatial data, natural and man-made objects, dynamic and static models) is a critical challenge. Conventional approaches used local proprietary data models and formats. These approaches often lacked interoperability and created silos of content within the simulation community. Therefore, open geospatial standards are increasingly perceived as a means to promote interoperability and reusability for 3D M&S. In this paper, the Open Geospatial Consortium (OGC) CDB Standard is introduced. “CDB” originally referred to Common DataBase, but is now treated in the OGC community as a name rather than an abbreviation. The OGC CDB is an international standard for structuring, modeling, and storing geospatial information required in high-performance modeling and simulation applications. CDB defines the core conceptual models, use cases, requirements, and specifications for employing geospatial data in 3D M&S. The main features of the OGC CDB Standard described here are run-time performance, a fully plug-and-play interoperable geospatial data store, suitability for 3D and dynamic simulation environments, and the ability to integrate proprietary and open-source data formats. Furthermore, compatibility with the OGC standards baseline reduces the complexity of discovering, transforming, and streaming geospatial data into the synthetic environment and makes the standard more widely acceptable to major geospatial data and software producers. This paper includes an overview of OGC CDB version 1.0, which defines a conceptual model and file structure for the storage, access, and modification of a multi-resolution 3D synthetic environment data store. Finally, this paper presents a perspective on future versions of the OGC CDB and the steps for harmonizing the OGC CDB Standard with the rest of the OGC/ISO standards baseline.
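To make the idea of a multi-resolution data store more concrete, the following is a minimal sketch of enumerating tiles organized by level of detail (LOD). The directory layout, geocell naming, and LOD numbering here are illustrative assumptions, not the normative OGC CDB structure.

```python
# Illustrative sketch only: walks a hypothetical multi-resolution tile
# hierarchy similar in spirit to a CDB-style data store. The folder names,
# file extensions, and LOD scheme are assumptions for demonstration.
from pathlib import Path

def list_tiles(root: str, max_lod: int = 3):
    """Yield (lod, tile_path) pairs for every tile file under the data store root."""
    for lod in range(max_lod + 1):
        lod_dir = Path(root) / f"LOD{lod:02d}"      # hypothetical per-LOD folder
        if not lod_dir.is_dir():
            continue
        for tile in sorted(lod_dir.glob("*.tif")):  # e.g., elevation rasters
            yield lod, tile

if __name__ == "__main__":
    for lod, tile in list_tiles("/data/synthetic_env"):
        print(f"LOD {lod}: {tile.name}")
```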

Author(s):  
W. W. Song ◽  
B. X. Jin ◽  
S. H. Li ◽  
X. Y. Wei ◽  
D. Li ◽  
...  

Traditional geospatial information platforms are built, managed, and maintained by geoinformation agencies. They integrate various geospatial data (such as DLG, DOM, DEM, gazetteers, and thematic data) to provide data analysis services that support government decision making. In the era of big data, it is challenging for traditional platforms to address data- and computing-intensive issues. In this research, we propose to build a spatiotemporal cloud platform that uses HDFS to manage image data, MapReduce-based computing services and workflows for high-performance geospatial analysis, and optimized auto-scaling algorithms for fast access and visualization by Web clients. Finally, we demonstrate the feasibility of the platform through several GIS application cases.
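As a minimal sketch of the MapReduce pattern used for such analysis, the following combines a mapper and reducer in one script in the style of Hadoop Streaming. The input format (one "region_id,value" record per line) and the per-region mean statistic are assumptions for illustration, not the platform's actual schema or workflow.

```python
# Sketch of the MapReduce pattern: map emits (key, value) pairs, reduce
# aggregates per key. Here the two stages run locally in one process for
# demonstration; in Hadoop Streaming they would be separate scripts.
import sys
from itertools import groupby

def mapper(lines):
    # Emit (region_id, value) pairs, one per input record.
    for line in lines:
        region_id, value = line.strip().split(",")
        yield region_id, float(value)

def reducer(pairs):
    # Aggregate values per region (e.g., mean pixel value per administrative unit).
    for region_id, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        values = [v for _, v in group]
        yield region_id, sum(values) / len(values)

if __name__ == "__main__":
    for region, mean_value in reducer(mapper(sys.stdin)):
        print(f"{region}\t{mean_value:.3f}")
```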


2019 ◽  
pp. 191-227
Author(s):  
Zhenlong Li ◽  
Zhipeng Gui ◽  
Barbara Hofer ◽  
Yan Li ◽  
Simon Scheider ◽  
...  

Abstract The increasing availability of geospatial data offers great opportunities for advancing scientific discovery and practices in society. Effective and efficient processing of geospatial data is essential for a wide range of Digital Earth applications such as climate change, natural hazard prediction and mitigation, and public health. However, the massive volume and the heterogeneous, distributed nature of global geospatial data pose challenges for geospatial information processing and computing. This chapter introduces three technologies for geospatial data processing: high-performance computing, online geoprocessing, and distributed geoprocessing, with each technology addressing one aspect of these challenges. The fundamental concepts, principles, and key techniques of the three technologies are elaborated in detail, followed by examples of applications and research directions in the context of Digital Earth. Lastly, a Digital Earth reference framework called the discrete global grid system (DGGS) is discussed.
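To illustrate the high-performance computing idea in this chapter, the sketch below splits a large geospatial task into chunks and processes them in parallel. The task itself (a per-chunk mean of synthetic elevation values) is a toy stand-in for a real geoprocessing operation.

```python
# Minimal data-parallel geoprocessing sketch: divide the data into chunks and
# process each chunk on a separate worker process.
from multiprocessing import Pool
import random

def process_chunk(chunk):
    # Placeholder geoprocessing step; a real workflow might reproject,
    # resample, or run a terrain analysis on each chunk.
    return sum(chunk) / len(chunk)

if __name__ == "__main__":
    elevations = [random.uniform(0, 3000) for _ in range(1_000_000)]
    chunks = [elevations[i:i + 100_000] for i in range(0, len(elevations), 100_000)]
    with Pool(processes=4) as pool:
        chunk_means = pool.map(process_chunk, chunks)
    print(f"Mean elevation across {len(chunks)} chunks: "
          f"{sum(chunk_means) / len(chunk_means):.1f} m")
```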


Aerospace ◽  
2020 ◽  
Vol 7 (11) ◽  
pp. 158
Author(s):  
Andrew Weinert

As unmanned aerial systems (UASs) increasingly integrate into the US national airspace system, there is a growing need to characterize how commercial and recreational UASs may encounter each other. To inform the development and evaluation of safety-critical technologies, we demonstrate a methodology to analytically calculate all potential relative geometries between different UAS operations performing inspection missions. This method is based on a previously demonstrated technique that leverages open source geospatial information to generate representative unmanned aircraft trajectories. Using open source data and parallel processing techniques, we performed trillions of calculations to estimate the relative horizontal distance between geospatial points across sixteen locations.
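The core per-pair computation described above is a horizontal distance between geospatial points. The sketch below computes great-circle distances with the haversine formula for a few sample coordinates; the formula choice and the sample waypoints are assumptions for demonstration, and the paper's actual pipeline and data are not reproduced here.

```python
# Pairwise horizontal (great-circle) distances between geospatial points.
from math import radians, sin, cos, asin, sqrt
from itertools import combinations

EARTH_RADIUS_M = 6_371_000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points in degrees."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

if __name__ == "__main__":
    points = [(42.36, -71.06), (42.37, -71.05), (42.35, -71.08)]  # sample waypoints
    for p1, p2 in combinations(points, 2):
        print(f"{p1} -> {p2}: {haversine_m(*p1, *p2):.0f} m")
```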


2019 ◽  
Vol 214 ◽  
pp. 07016 ◽  
Author(s):  
Tian Yan ◽  
Shan Zeng ◽  
Mengyao Qi ◽  
Qingbao Hu ◽  
Fazhi Qi

To improve hardware utilization and save manpower in system maintenance, most of the web services at IHEP have been migrated to a private cloud built upon OpenStack. However, cyber security attacks have progressively become a serious threat to the cloud. Therefore, a cyber security detection and monitoring system has been deployed for this cloud platform. This system collects various security-related logs as data sources and processes them in a framework composed of open source data storage, analysis, and visualization tools. With this system, security incidents and events can be handled in a timely manner and rapid responses can be taken to protect the cloud platform against cyber security threats.
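As a minimal sketch of the log-screening step such a system performs, the code below scans SSH authentication log lines and flags source addresses with repeated failed logins. The log file name, the threshold, and the brute-force heuristic are hypothetical; the deployed system relies on open source storage, analysis, and visualization tools rather than this standalone script.

```python
# Toy log-screening step: count failed SSH logins per source IP and flag
# addresses that exceed a threshold.
import re
from collections import Counter

FAILED_LOGIN = re.compile(r"Failed password for .* from (\d+\.\d+\.\d+\.\d+)")

def flag_brute_force(log_lines, threshold=5):
    """Return source IPs with more failed logins than the threshold."""
    counts = Counter()
    for line in log_lines:
        match = FAILED_LOGIN.search(line)
        if match:
            counts[match.group(1)] += 1
    return {ip: n for ip, n in counts.items() if n > threshold}

if __name__ == "__main__":
    with open("auth.log") as f:          # hypothetical log source
        for ip, n in flag_brute_force(f).items():
            print(f"ALERT: {n} failed logins from {ip}")
```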


2021 ◽  
Author(s):  
Robert Haehnel ◽  
Scott Christensen ◽  
J. Whitlow ◽  
Andrew Bauer ◽  
Ari Meyer ◽  
...  

Computational Prototyping Environment (CPE) is a web-based portal designed to simplify running Department of Defense (DoD) modeling and simulation tools on the DoD Supercomputing Resource Center’s (DSRC) High Performance Computing (HPC) systems. The first of these tools to be deployed in the CPE is an application (app) for conducting parametric studies and viewing results using the CREATE-AV Helios CFD software. Initial capability includes hover (collective sweep) and forward flight (speed sweep) performance calculations. The CPE Helios app allows for job submission to a DSRC HPC system and for viewing the results created by Helios, i.e., time series and volumetric data. Examples of data input and results viewing are presented. Planned future functionality is also outlined.
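The sketch below illustrates the parametric-study idea in the spirit of a collective or speed sweep: generate one run configuration per swept value, ready to be submitted as HPC jobs. Parameter names, the file layout, and the submission step are placeholders, not the actual CPE or CREATE-AV Helios interfaces.

```python
# Hedged sketch: build one run configuration per point in a parameter sweep.
import json
from pathlib import Path

def build_sweep(param_name, values, base_config):
    """Return a list of run configurations, one per swept value."""
    runs = []
    for i, value in enumerate(values):
        config = dict(base_config, **{param_name: value,
                                      "run_id": f"{param_name}_{i:03d}"})
        runs.append(config)
    return runs

if __name__ == "__main__":
    base = {"case": "hover", "timesteps": 2000}          # hypothetical settings
    sweep = build_sweep("collective_deg", [4.0, 6.0, 8.0, 10.0], base)
    out_dir = Path("sweep_inputs")
    out_dir.mkdir(exist_ok=True)
    for run in sweep:
        (out_dir / f"{run['run_id']}.json").write_text(json.dumps(run, indent=2))
        # A separate step would submit each run to the HPC scheduler (e.g., PBS/Slurm).
    print(f"Wrote {len(sweep)} run configurations to {out_dir}/")
```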


2019 ◽  
pp. 342-352
Author(s):  
David Foster ◽  
Christopher Mayfield

The U.S. Department of Defense (DoD) has faced numerous challenges within the realm of Geospatial Information Systems and Science in fostering a Common Operational Picture suitable for homeland defense and security. This paper details the challenges and successes since September 11, 2001 in building common ground for all federal, state, and local governments and non-government organizations that depend on geospatial data to provide for the safety and security of the Nation. The protracted integration of commercial GIS technologies within the DoD, and the speed, openness, and scale this expertise can bring, are analyzed as an issue for the Federal response to disasters. Finally, distinct successes in collaboration and in the integration of common standards and data currently in use at military commands are discussed as a robust path to improving future geospatial efforts.


2019 ◽  
pp. 254-277 ◽  
Author(s):  
Ying Zhang ◽  
Chaopeng Li ◽  
Na Chen ◽  
Shaowen Liu ◽  
Liming Du ◽  
...  

Since large amounts of geospatial data are produced by various sources, geospatial data integration is difficult because of the shortage of semantics. Although standardised data formats and data access protocols, such as Web Feature Service (WFS), enable end-users to access heterogeneous data stored in different formats from various sources, integration is still time-consuming and ineffective due to the lack of semantics. To solve this problem, a prototype for geospatial data integration is proposed that addresses four problems: geospatial data retrieving, modeling, linking, and integrating. We adopt four kinds of geospatial data sources to evaluate the performance of the proposed approach. The experimental results illustrate that the proposed linking method achieves high performance in generating the matched candidate record pairs in terms of Reduction Ratio (RR), Pairs Completeness (PC), Pairs Quality (PQ), and F-score. The integration results show that each data source gains considerable Complementary Completeness (CC) and Increased Completeness (IC).
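For reference, the sketch below computes the record-linkage quality metrics named above from simple counts, using their standard definitions from the linkage literature (assumed here to match the paper's usage). The counts in the example are hypothetical.

```python
# Standard record-linkage metrics: Reduction Ratio (RR), Pairs Completeness
# (PC), Pairs Quality (PQ), and the F-score combining PC and PQ.

def linkage_metrics(total_pairs, candidate_pairs, true_matches, candidate_true_matches):
    rr = 1.0 - candidate_pairs / total_pairs        # how much the comparison space shrank
    pc = candidate_true_matches / true_matches      # recall of true matches among candidates
    pq = candidate_true_matches / candidate_pairs   # precision of the candidate pairs
    f_score = 2 * pc * pq / (pc + pq) if (pc + pq) else 0.0
    return {"RR": rr, "PC": pc, "PQ": pq, "F": f_score}

if __name__ == "__main__":
    # Hypothetical counts: 1,000,000 possible pairs, 5,000 candidates after
    # blocking, 4,200 of 4,500 true matches retained among the candidates.
    print(linkage_metrics(1_000_000, 5_000, 4_500, 4_200))
```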

