Geospatial Information Processing Technologies

2019 ◽  
pp. 191-227
Author(s):  
Zhenlong Li ◽  
Zhipeng Gui ◽  
Barbara Hofer ◽  
Yan Li ◽  
Simon Scheider ◽  
...  

Abstract: The increasing availability of geospatial data offers great opportunities for advancing scientific discovery and practice in society. Effective and efficient processing of geospatial data is essential for a wide range of Digital Earth applications such as climate change, natural hazard prediction and mitigation, and public health. However, the massive volume and the heterogeneous, distributed nature of global geospatial data pose challenges for geospatial information processing and computing. This chapter introduces three technologies for geospatial data processing: high-performance computing, online geoprocessing, and distributed geoprocessing, each addressing one aspect of these challenges. The fundamental concepts, principles, and key techniques of the three technologies are elaborated in detail, followed by examples of applications and research directions in the context of Digital Earth. Lastly, a Digital Earth reference framework called the discrete global grid system (DGGS) is discussed.
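To make the high-performance computing idea concrete, here is a minimal sketch of the data-parallel pattern such systems rely on: split a large raster into tiles and process them on a pool of workers. The tile size, the band-ratio computation, and the random input raster are illustrative assumptions, not an example from the chapter.

```python
# A minimal sketch of data-parallel geoprocessing: a large raster is
# split into tiles and processed on a pool of worker processes.
# Tile size, the band math, and the random input are assumptions.
from multiprocessing import Pool
import numpy as np

TILE = 256  # tile edge length in pixels (assumed)

def process_tile(tile):
    """Toy per-tile analysis: normalized difference of two bands."""
    red, nir = tile[0], tile[1]
    return (nir - red) / (nir + red + 1e-9)

def split_tiles(raster, tile=TILE):
    """Yield tile-sized blocks of a (bands, rows, cols) raster."""
    _, rows, cols = raster.shape
    for r in range(0, rows, tile):
        for c in range(0, cols, tile):
            yield raster[:, r:r + tile, c:c + tile]

if __name__ == "__main__":
    raster = np.random.rand(2, 1024, 1024)  # stand-in for a satellite scene
    with Pool() as pool:                    # one worker per CPU core
        results = pool.map(process_tile, split_tiles(raster))
    print(len(results), "tiles processed")
```

The same tile-level decomposition underlies cluster-scale geoprocessing; only the scheduler changes.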

Author(s):  
W. W. Song ◽  
B. X. Jin ◽  
S. H. Li ◽  
X. Y. Wei ◽  
D. Li ◽  
...  

Traditional geospatial information platforms are built, managed, and maintained by geoinformation agencies. They integrate various geospatial data (such as DLG, DOM, DEM, gazetteers, and thematic data) to provide data analysis services that support government decision making. In the era of big data, it is challenging for traditional platforms to address data- and computing-intensive issues. In this research, we propose to build a spatiotemporal cloud platform that uses HDFS to manage image data, MapReduce-based computing services and workflows for high-performance geospatial analysis, and optimized auto-scaling algorithms to give Web clients quick access and visualization. Finally, we demonstrate the feasibility of the platform through several GIS application cases.
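As a rough illustration of the MapReduce pattern such a platform would run over HDFS-resident data, the single-process sketch below maps each point record to a grid-cell key and reduces by key to a per-cell count. The cell size and sample points are assumptions; a real deployment would run the same two functions over HDFS splits via Hadoop or Spark.

```python
# Single-process sketch of MapReduce over geospatial records:
# map each point to a grid cell key, reduce by key to a count.
from collections import defaultdict

CELL = 1.0  # grid cell size in degrees (assumed)

def map_phase(points):
    """Emit (cell_key, 1) pairs, one per input point."""
    for lon, lat in points:
        yield (int(lon // CELL), int(lat // CELL)), 1

def reduce_phase(pairs):
    """Sum counts per cell key."""
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return counts

points = [(103.2, 29.5), (103.7, 29.9), (104.1, 30.6)]  # sample (lon, lat)
print(dict(reduce_phase(map_phase(points))))
# {(103, 29): 2, (104, 30): 1}
```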


Author(s):  
Sara Saeedi ◽  
Steve Liang ◽  
David Graham ◽  
Michael F. Lokuta ◽  
Mir Abolfazl Mostafavi

Recent advances in sensor and platform technologies such as satellite systems, unmanned aerial vehicles (UAV), manned aerial platforms, and ground-based sensor networks have resulted in massive volumes of data being produced and collected about the Earth. Processing, managing, and analyzing these data is one of the main challenges in the 3D synthetic representations used in modeling and simulation (M&S) of the natural environment. M&S devices, such as flight simulators, traditionally require a variety of different databases to provide a synthetic representation of the world. M&S often requires the integration of data from a variety of sources stored in different formats. Thus, for the simulation of a complex synthetic environment, such as a 3D terrain model, tackling interoperability among its components (geospatial data, natural and man-made objects, dynamic and static models) is a critical challenge. Conventional approaches used local proprietary data models and formats. These approaches often lacked interoperability and created silos of content within the simulation community. Therefore, open geospatial standards are increasingly perceived as a means to promote interoperability and reusability for 3D M&S. In this paper, the Open Geospatial Consortium (OGC) CDB Standard is introduced. "CDB" originally stood for Common DataBase, but it is now treated in the OGC community as a name rather than an abbreviation. The OGC CDB is an international standard for structuring, modeling, and storing geospatial information required in high-performance modeling and simulation applications. CDB defines the core conceptual models, use cases, requirements, and specifications for employing geospatial data in 3D M&S. The main features of the OGC CDB Standard include run-time performance, a fully plug-and-play interoperable geospatial data store, suitability for 3D and dynamic simulation environments, and the ability to integrate proprietary and open-source data formats. Furthermore, compatibility with the OGC standards baseline reduces the complexity of discovering, transforming, and streaming geospatial data into the synthetic environment and makes the standard more widely acceptable to major geospatial data and software producers. This paper includes an overview of OGC CDB version 1.0, which defines a conceptual model and file structure for the storage, access, and modification of a multi-resolution 3D synthetic environment data store. Finally, this paper presents a perspective on future versions of the OGC CDB and the steps for harmonizing the OGC CDB standard with the rest of the OGC/ISO standards baseline.
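The determinism of CDB's file structure is central to its run-time performance: a simulator can compute where a tile lives rather than query an index. The sketch below illustrates that idea under simplified, assumed naming conventions; it is not the normative directory layout defined by the OGC CDB standard.

```python
# Simplified sketch of a deterministic, file-structured data store:
# content is addressed by geographic cell and level of detail, so a
# path can be computed instead of looked up. The naming below is an
# illustrative approximation, NOT the normative OGC CDB layout.
import math

def geocell(lat: float, lon: float) -> str:
    """Name the 1-degree geographic cell containing (lat, lon)."""
    ns = "N" if lat >= 0 else "S"
    ew = "E" if lon >= 0 else "W"
    return f"{ns}{abs(math.floor(lat)):02d}{ew}{abs(math.floor(lon)):03d}"

def tile_path(lat: float, lon: float, dataset: str, lod: int) -> str:
    """Deterministic path for a dataset tile at a given level of detail."""
    return f"CDB/Tiles/{geocell(lat, lon)}/{dataset}/L{lod:02d}"

print(tile_path(51.05, -114.07, "Elevation", 4))
# CDB/Tiles/N51W115/Elevation/L04
```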


2012 ◽  
Vol 229-231 ◽  
pp. 1539-1542
Author(s):  
Wei Li ◽  
Jie Wang

Micro-assembly technology is one of the key processing technologies for making electronic equipment miniaturized and lightweight while maintaining high performance and reliability. This paper discusses the key techniques in the micro-assembly of micro-radio modules and analyzes their characteristics. Suggestions for the application of micro-assembly are given at the end of the paper.


2010 ◽  
Vol 108-111 ◽  
pp. 319-323
Author(s):  
Gen Yuan Du ◽  
Fang Miao ◽  
Xi Rong Guo

This paper proposes a novel digital earth platform framework: an application, service, and decision-making support system that unifies geospatial data acquisition, storage, transmission, conversion, processing, analysis, retrieval, expression, and output. Its core is the Geo-Browser/Geospatial Information Server (G/S) mode for geospatial information network services, together with the Hyper Geographic Markup Language (HGML). Based on an in-depth understanding and analysis of this architecture, the paper implements a digital earth platform prototype, U-Star, built on the G/S mode. The platform combines the merits of the C/S mode (full use of client resources and efficient handling of client-side data) with the advantages of the B/S mode (a unified client and convenient network access). It is a new kind of server that can cope with massive geospatial information and a solution that can provide efficient services. It has already been applied in many fields, such as a digital tourism service system, time-sequence analysis of the Wenchuan Earthquake, real-time video monitoring based on the digital earth platform, and the intelligent processing and exhibition of remote sensing data. The results indicate that improving the spatial, temporal, and thematic completeness of geospatial information network access will effectively improve the quality and efficiency of data sharing, which has important theoretical significance and bright application prospects.
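As a rough illustration of the G/S interaction pattern, the sketch below shows a client that fetches geospatial content over a uniform HTTP interface (a B/S-style advantage) while caching and processing it locally (a C/S-style advantage). The endpoint, layer name, and cache policy are assumptions for illustration; the paper's actual protocol is HGML-based.

```python
# Minimal sketch of a G/S-style client: uniform HTTP access to a
# geospatial server plus a local cache that exploits client resources.
# The URL scheme and JSON payload are hypothetical.
import json
import urllib.request

CACHE = {}  # client-side cache, exploiting local resources

def fetch_layer(server: str, layer: str, bbox: tuple) -> dict:
    """Fetch a layer for a bounding box, reusing the local cache."""
    key = (layer, bbox)
    if key not in CACHE:
        url = f"{server}/layers/{layer}?bbox={','.join(map(str, bbox))}"
        with urllib.request.urlopen(url) as resp:
            CACHE[key] = json.load(resp)  # parse the server's JSON payload
    return CACHE[key]

# Example call (hypothetical endpoint):
# data = fetch_layer("http://gs-server.example", "terrain", (103, 29, 104, 30))
```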


Author(s):  
Ardis Hanson ◽  
Susan Jane Heron

To be optimally useful, geospatial resources must be described. This description is referred to as metadata. Metadata tells “who, what, where, when, why, and how” about every facet of a piece of data or service. When properly done, metadata answers a wide range of questions about geospatial resources, such as what geospatial data is available, how to evaluate its quality and suitability for use, and how to access it, transfer it, and process it. To ensure consistency for access and retrieval, metadata can be standardized to provide a common set of terms, definitions, and organization.
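As a concrete illustration, the sketch below models a metadata record answering the "who, what, where, when, why, and how" questions, loosely patterned on the kinds of fields ISO 19115 and FGDC metadata standards define; the field names and values are illustrative rather than normative element names.

```python
# Illustrative geospatial metadata record; fields loosely echo
# ISO 19115 / FGDC concepts but are NOT the normative element names.
record = {
    "title": "County road centerlines",           # what
    "originator": "Example County GIS Office",    # who
    "bounding_box": {"west": -82.7, "east": -82.0,
                     "south": 27.6, "north": 28.2},  # where
    "publication_date": "2019-05-01",             # when
    "purpose": "Routing and emergency dispatch",  # why
    "lineage": "Digitized from 1:12,000 orthophotos",  # how it was made
    "access": "https://data.example.gov/roads",   # how to obtain it
}

# Standardized terms let a catalog answer discovery questions uniformly:
def covers(md: dict, lon: float, lat: float) -> bool:
    """Does the described dataset cover a given point?"""
    b = md["bounding_box"]
    return b["west"] <= lon <= b["east"] and b["south"] <= lat <= b["north"]

print(covers(record, -82.4, 27.9))  # True
```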


2019 ◽  
Author(s):  
Onur Yukselen ◽  
Osman Turkyilmaz ◽  
Ahmet Rasit Ozturk ◽  
Manuel Garber ◽  
Alper Kucukural

Abstract: The emergence of high-throughput technologies that produce vast amounts of genomic data, such as next-generation sequencing (NGS), is transforming biological research. The dramatic increase in the volume of data makes analysis the main bottleneck for scientific discovery. The processing of high-throughput datasets typically involves many different computational programs, each of which performs a specific step in a pipeline. Given the wide range of applications and organizational infrastructures, there is a great need for highly parallel, flexible, portable, and reproducible data processing frameworks. Flexibility ensures that pipelines can support a variety of applications without requiring one-off modifications. Portability ensures that users can leverage available computational resources and work within economic constraints. Reproducibility lends credibility to the results and is particularly challenging given the sheer volume of data and the complexity of processing pipelines, which vary widely between users.

Several platforms currently exist for the design and execution of complex pipelines (e.g., Galaxy, GenePattern, GeneProf). Unfortunately, these platforms lack the necessary combination of parallelism, portability, flexibility, and/or reproducibility required by the current research environment. To address these shortcomings, Nextflow was implemented to simplify portable, scalable, and reproducible scientific pipelines using containers. We have leveraged Nextflow's capabilities to develop a user interface, DolphinNext, for creating, deploying, and executing complex Nextflow pipelines for high-throughput data processing. The guiding principle of DolphinNext is to facilitate the building and deployment of complex pipelines using a modular approach implemented in a graphical interface. DolphinNext provides:
1. A drag-and-drop user interface that abstracts pipelines and allows users to create pipelines without familiarity with the underlying programming languages.
2. A user interface to monitor pipeline execution that allows the re-initiation of pipelines at intermediate steps.
3. Reproducible pipelines with version tracking and stand-alone versions that can be run independently.
4. Seamless portability to distributed computational environments such as high-performance clusters or cloud computing environments.
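Two of these capabilities, modular steps and re-initiation at intermediate steps, can be sketched minimally as follows. The step names, file paths, and checkpoint rule are illustrative assumptions; DolphinNext itself generates and runs Nextflow processes rather than Python functions.

```python
# Minimal sketch of a modular, resumable pipeline: a step is skipped
# when its output file already exists, so a failed run can be
# re-initiated at an intermediate step. Names and paths are assumed.
import os

def run_step(name: str, outfile: str, func, *args):
    """Run a step unless its output already exists (checkpoint/restart)."""
    if os.path.exists(outfile):
        print(f"skip {name}: {outfile} already present")
        return outfile
    result = func(*args)
    with open(outfile, "w") as fh:
        fh.write(result)
    print(f"ran {name} -> {outfile}")
    return outfile

trimmed = run_step("trim", "reads.trimmed.txt",
                   lambda: "trimmed reads\n")
aligned = run_step("align", "reads.aligned.txt",
                   lambda path: f"alignments of {path}\n", trimmed)
```

Running the script twice demonstrates the restart behavior: the second run skips both steps because their outputs are already on disk.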


2019 ◽  
Vol 15 (3) ◽  
pp. 273-279
Author(s):  
Shweta G. Rangari ◽  
Nishikant A. Raut ◽  
Pradip W. Dhore

Background: Unstable and/or toxic degradation products may form as a drug degrades, resulting in loss of therapeutic activity and potentially life-threatening conditions. Hence, it is important to establish the stability characteristics of a drug under various conditions, such as temperature, light, and oxidising agents, and its susceptibility across a wide range of pH values. Introduction: The aim of the proposed study was to develop a simple, sensitive, and economical stability-indicating high-performance thin-layer chromatography (HPTLC) method for the quantification of Amoxapine in the presence of degradation products. Methods: Amoxapine and its degradation products were separated on precoated silica gel 60F254 TLC plates using a mobile phase comprising methanol:toluene:ammonium acetate (6:3:1, v/v/v). Densitometric evaluation was carried out at 320 nm in reflectance/absorbance mode. The degradation products obtained under acidic, basic, and oxidative conditions as per ICH guidelines had Rf values of 0.12, 0.26, and 0.60, indicating good resolution from each other and from the pure drug (Rf 0.47). Amoxapine was found to be stable under neutral, thermal, and photolytic conditions. Results: The method was validated as per ICH Q2(R1) guidelines in terms of accuracy, precision, ruggedness, robustness, and linearity. Regression analysis showed a good linear relationship between concentration and response (peak area and peak height) over the range of 80 ng/spot to 720 ng/spot, with correlation coefficients of 0.991 and 0.994 for area and height, respectively. The limit of detection (LOD) and limit of quantitation (LOQ) were 1.176 ng/mL and 3.565 ng/mL for area, and 50.063 ng/mL and 151.707 ng/mL for height, respectively. Conclusion: The statistical analysis confirmed the accuracy, precision, and selectivity of the proposed method, which can be effectively used for the analysis of Amoxapine in the presence of degradation products.
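For readers unfamiliar with how such figures are derived, the sketch below applies the standard ICH Q2(R1) relations LOD = 3.3σ/S and LOQ = 10σ/S (σ: residual standard deviation of the response; S: slope of the calibration line) to a made-up calibration series; the numbers are illustrative, not the paper's data.

```python
# Sketch of linearity and LOD/LOQ per ICH Q2(R1):
#   LOD = 3.3 * sigma / S,  LOQ = 10 * sigma / S
# sigma: residual std deviation of the response; S: calibration slope.
# The calibration data below are invented for illustration.
import numpy as np

conc = np.array([80, 160, 240, 320, 400, 480, 560, 640, 720])  # ng/spot
area = np.array([410, 805, 1190, 1630, 2010, 2430, 2820, 3260, 3650])

slope, intercept = np.polyfit(conc, area, 1)   # least-squares line
fit = slope * conc + intercept
r = np.corrcoef(conc, area)[0, 1]              # correlation coefficient
sigma = np.std(area - fit, ddof=2)             # residual std deviation

print(f"r = {r:.4f}")
print(f"LOD = {3.3 * sigma / slope:.2f} ng/spot")
print(f"LOQ = {10 * sigma / slope:.2f} ng/spot")
```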


2019 ◽  
Vol 5 (4) ◽  
pp. 270-277 ◽  
Author(s):  
Vijay Kumar ◽  
Simranjeet Singh ◽  
Ragini Bhadouria ◽  
Ravindra Singh ◽  
Om Prakash

Holoptelea integrifolia Roxb. Planch (HI) has been used to treat various ailments, including obesity, osteoarthritis, arthritis, inflammation, anemia, and diabetes. To review the major phytochemicals and medicinal properties of HI, an exhaustive bibliographic search was carried out using various scientific search engines and databases. Only 12 phytochemicals have been reported, including biologically active compounds such as betulin, betulinic acid, epifriedlin, octacosanol, friedlin, Holoptelin-A, and Holoptelin-B. Analytical methods including thin-layer chromatography (TLC), high-performance thin-layer chromatography (HPTLC), high-performance liquid chromatography (HPLC), and liquid chromatography-mass spectrometry (LC-MS) have been used to analyze HI. From a medicinal standpoint, these phytochemicals exhibit a wide range of pharmacological activities, such as antioxidant, antibacterial, anti-inflammatory, and anti-tumor effects. The current review notes that the mechanism of action of HI with biomolecules has not been fully explored, and pharmacological and toxicological studies are very few. This represents a substantial literature gap to be filled through detailed in-vivo and in-vitro studies.

