GEOINFORMATION SUPPORT FOR FORECASTING FLOOD ZONES IN THE SOUTH OF SAKHALIN

Author(s):  
Alexey A. Verhoturov ◽  
Vyacheslav A. Melkiy

Modern hydrometeorological monitoring systems make wide use of web and GIS technologies. The territorial fragmentation of the divisions of the World Meteorological Organization (WMO), Roshydromet, the Russian Academy of Sciences, and other services and departments interested in obtaining data requires the creation of a unified information environment for the exchange of heterogeneous information. Forming a common field of geospatial data has become possible with the availability of industrial design platforms that offer high performance and support standard data exchange formats suitable for building the projected system. The purpose of the study is to develop requirements for the geoinformation support needed for flood forecasting. Methods: GIS mapping, interpretation and analysis of Earth remote sensing data. When developing a system for hydrological monitoring of rivers in Southern Sakhalin, we drew on the experience of operating similar observational networks in the services of several European countries, as well as the geographically distributed GIS created by Roshydromet. Taking into account the extensive experience of predecessors, requirements for the geoinformation support necessary for predicting flood zones on the rivers of Southern Sakhalin have been developed. The initial data for creating a correct flood model are satellite images, large-scale topographic maps, digital terrain models, long-term hydrometeorological observations, and engineering surveys.

Author(s):  
Yassine Sabri ◽  
Aouad Siham

Multi-area and multi-faceted remote sensing (RS) datasets, including SAR, are widely used due to the increasing demand for accurate and up-to-date information on resources and the environment for regional and global monitoring. In general, the processing of RS data involves a complex, multi-step sequence of several independent processing steps, depending on the type of RS application. Processing RS data for regional disaster and environmental monitoring is recognized as computationally and data intensive. By combining cloud computing and HPC technology, we propose a method to solve these problems efficiently with a large-scale RS data processing system suitable for various applications and offering real-time, on-demand service. The ubiquity, elasticity, and high-level transparency of the cloud computing model make it possible to run massive RS data management and processing for dynamic environmental monitoring in any cloud via a web interface. Hilbert-based data indexing methods are used to optimally query and access RS images, RS data products, and intermediate data. The core of the cloud service provides a parallel file system for large RS data and an on-demand RS data access interface to improve data locality and optimize I/O performance. Our experimental analysis demonstrates the effectiveness of the proposed platform.
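The Hilbert-based indexing mentioned above can be pictured as mapping 2-D tile coordinates to a 1-D key so that spatially close tiles receive nearby keys. The sketch below shows the standard Hilbert-curve coordinate-to-index conversion and uses it to order a few hypothetical tiles; the grid size and tile list are assumptions for illustration, not details of the authors' system.

```python
# Sketch: order remote-sensing tiles along a Hilbert curve so that spatially
# adjacent tiles receive nearby 1-D keys (good locality for range queries).
def xy2d(n: int, x: int, y: int) -> int:
    """Map (x, y) on an n x n grid (n a power of two) to its Hilbert-curve index."""
    d = 0
    s = n // 2
    while s > 0:
        rx = 1 if (x & s) > 0 else 0
        ry = 1 if (y & s) > 0 else 0
        d += s * s * ((3 * rx) ^ ry)
        # rotate/flip the quadrant so the curve stays continuous
        if ry == 0:
            if rx == 1:
                x, y = n - 1 - x, n - 1 - y
            x, y = y, x
        s //= 2
    return d

tiles = [(3, 5), (3, 6), (12, 1), (4, 5)]             # hypothetical (col, row) tile coordinates
ordered = sorted(tiles, key=lambda t: xy2d(16, *t))   # assume a 16 x 16 tile grid
print(ordered)
```

Sorting or partitioning image tiles by such a key is one common way to keep spatially neighbouring data close together on disk before building higher-level indexes.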


Author(s):  
Jianhong Sun ◽  
Martin Hardwick

Abstract The rapid expansion of high performance computer networks such as the Internet allows companies to come together electronically to exploit emerging market conditions and produce new products. Such a group of companies has been called a virtual enterprise (VE). One barrier to VE collaboration is the lack of interoperability among the application systems of different companies — product data produced by the systems at one company cannot be read by the systems at another. STEP is a state-of-the-art technology for product data exchange and provides a basis for VE data sharing. In this paper, some challenges and difficulties of building an integrated large-scale database of STEP data are discussed. A prototype database was designed and implemented to test our solutions. Preliminary results show that the database is scalable and able to integrate STEP data. We believe the solutions are a basis for building a practical STEP database that can be used to support VE collaboration.


Sensors ◽  
2021 ◽  
Vol 21 (21) ◽  
pp. 7006
Author(s):  
Mohamed Wassim Baba ◽  
Gregoire Thoumyre ◽  
Erwin W. J. Bergsma ◽  
Christopher J. Daly ◽  
Rafael Almar

Coasts are areas of vital importance because they host numerous activities worldwide. Despite their major importance, knowledge of the main characteristics of most coastal areas (e.g., coastal bathymetry) is still very limited. This is mainly due to the scarcity and lack of accurate measurements or observations and the sparse coverage of coastal waters. Moreover, the high cost of performing observations with conventional methods does not allow the monitoring chain to be extended to different coastal areas. In this study, we suggest that the advent of remote sensing data (e.g., Sentinel-2 A/B) and high performance computing could open a new perspective to overcome the lack of coastal observations. Indeed, previous research has shown that it is possible to derive large-scale coastal bathymetry from S-2 images. The large S-2 coverage, however, leads to a high computational cost when post-processing the images. Thus, we developed a methodology implemented on a high-performance computing (HPC) cluster to derive bathymetry from S-2 over the globe. In this paper, we describe the conceptualization and implementation of this methodology. Moreover, we give a general overview of the generated bathymetry map for NA compared with the reference GEBCO global bathymetric product. Finally, we highlight some hotspots by looking closely at their outputs.
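The abstract does not detail the processing pipeline, but the tile-parallel pattern it implies can be sketched as below, assuming each S-2 tile is processed independently on a cluster node; the derive_depth_from_tile function and the tile IDs are placeholders, not the authors' depth-inversion method.

```python
# Minimal sketch of tile-parallel post-processing of Sentinel-2 scenes.
# derive_depth_from_tile() stands in for the actual depth-inversion step,
# which is not spelled out in the abstract; tile IDs are invented for illustration.
from multiprocessing import Pool

def derive_depth_from_tile(tile_id: str) -> str:
    # Placeholder: load the S-2 tile, estimate wave characteristics,
    # invert for depth, write a bathymetry raster, and return its path.
    return f"/scratch/bathy/{tile_id}.tif"

if __name__ == "__main__":
    tiles = ["31TCJ", "31TDJ", "30STE"]     # hypothetical S-2 tile IDs
    with Pool(processes=4) as pool:         # e.g., one worker per core on a node
        outputs = pool.map(derive_depth_from_tile, tiles)
    print(outputs)
```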


Author(s):  
C.K. Wu ◽  
P. Chang ◽  
N. Godinho

Recently, the use of refractory metal silicides as low resistivity, high temperature and high oxidation resistance gate materials in large scale integrated circuits (LSI) has become an important approach in advanced MOS process development (1). This research is a systematic study on the structure and properties of molybdenum silicide thin film and its applicability to high performance LSI fabrication.


Author(s):  
V.V. Gordeev ◽  
V.E. Khazanov

The choice of the type and size of a milking installation should take into account the maximum planned number of dairy cows, the size of a technological group, the number of milkings per day, the duration of one milking, and the length of the operators' working shift. An analysis of the technical and economic indicators of the currently most common types of milking installations of the same technical level showed that the Carousel installation has the best specific indicators, while the Herringbone installation requires higher labour and cash inputs; the Parallel installation falls in between. In terms of throughput and the required number of operators, the Herringbone is recommended for farms with up to 600 dairy cows, the Parallel for up to 1200 dairy cows, and the Carousel for more than 1200 dairy cows. The Carousel was found to be the most practical, high-performance, easily automated and, therefore, promising milking system for milking parlours, especially on large-scale dairy farms.


Author(s):  
Mark Endrei ◽  
Chao Jin ◽  
Minh Ngoc Dinh ◽  
David Abramson ◽  
Heidi Poxon ◽  
...  

Rising power costs and constraints are driving a growing focus on the energy efficiency of high performance computing systems. The unique characteristics of a particular system and workload, and their effect on performance and energy efficiency, are typically difficult for application users to assess and to control. Settings for optimum performance and energy efficiency can also diverge, so we need to identify trade-off options that guide a suitable balance between energy use and performance. We present statistical and machine learning models that require only a small number of runs to make accurate Pareto-optimal trade-off predictions using parameters that users can control. We study model training and validation using several parallel kernels and more complex workloads, including Algebraic Multigrid (AMG), the Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS), and Livermore Unstructured Lagrangian Explicit Shock Hydrodynamics (LULESH). We demonstrate that we can train the models using as few as 12 runs, with prediction error of less than 10%. Our AMG results identify trade-off options that provide up to 45% improvement in energy efficiency for around 10% performance loss. We reduce the sample measurement time required for AMG by 90%, from 13 h to 74 min.
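As a rough illustration of the kind of Pareto-optimal trade-off analysis described above (not the authors' actual models or data), the sketch below fits simple regressions of runtime and energy against one controllable parameter from a handful of measured runs and then keeps only the non-dominated settings. All measurements and the quadratic model form are invented for illustration.

```python
# Toy sketch: predict runtime/energy from a few measured runs, then keep only the
# non-dominated (Pareto-optimal) settings. All numbers below are made up.
import numpy as np

threads = np.array([1, 2, 4, 8, 16, 32])        # controllable parameter
runtime = np.array([100, 55, 30, 20, 16, 15])   # seconds (invented)
energy  = np.array([120, 90, 75, 78, 95, 130])  # joules (invented)

# Fit simple quadratic models in log2(threads) from the small set of sample runs.
x = np.log2(threads)
rt_fit = np.poly1d(np.polyfit(x, runtime, 2))
en_fit = np.poly1d(np.polyfit(x, energy, 2))

# Predict over candidate settings and keep only settings that no other setting
# beats in both runtime and energy.
candidates = range(1, 65)
preds = [(t, float(rt_fit(np.log2(t))), float(en_fit(np.log2(t)))) for t in candidates]
pareto = [p for p in preds
          if not any(q[1] <= p[1] and q[2] <= p[2] and q != p for q in preds)]
for t, rt, en in pareto[:5]:
    print(f"threads={t:3d}  predicted runtime={rt:6.1f} s  energy={en:6.1f} J")
```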


Radiation ◽  
2021 ◽  
Vol 1 (2) ◽  
pp. 79-94
Author(s):  
Peter K. Rogan ◽  
Eliseos J. Mucaki ◽  
Ben C. Shirley ◽  
Yanxin Li ◽  
Ruth C. Wilkins ◽  
...  

The dicentric chromosome (DC) assay accurately quantifies exposure to radiation; however, manual and semi-automated assignment of DCs has limited its use in a potential large-scale radiation incident. The Automated Dicentric Chromosome Identifier and Dose Estimator (ADCI) software automates unattended DC detection and determines radiation exposures, fulfilling IAEA criteria for triage biodosimetry. This study evaluates the throughput of high-performance ADCI (ADCI-HT) in stratifying exposures of populations in 15 simulated population-scale radiation exposures. ADCI-HT streamlines dose estimation on a supercomputer by optimal hierarchical scheduling of DC detection for varying numbers of samples and metaphase cell images in parallel on multiple processors. We evaluated processing times and the accuracy of estimated exposures across census-defined populations. Image processing of 1744 samples on 16,384 CPUs required 1 h 11 min 23 s, and radiation dose estimation based on DC frequencies required 32 s. Processing of 40,000 samples at 10 exposures from five laboratories required 25 h and met IAEA criteria (dose estimates were within 0.5 Gy; median = 0.07). Geostatistically interpolated radiation exposure contours of simulated nuclear incidents were defined by samples exposed to clinically relevant exposure levels (1 and 2 Gy). Analysis of all exposed individuals with ADCI-HT required 0.6–7.4 days, depending on the population density of the simulation.
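Dose estimation from DC frequencies, as mentioned above, conventionally inverts a linear-quadratic calibration curve Y = c + αD + βD². The sketch below shows that inversion only; the calibration coefficients are invented placeholders, not the values used by ADCI.

```python
# Minimal sketch of dose estimation from an observed dicentric frequency using the
# linear-quadratic calibration form Y = c + alpha*D + beta*D^2 that is standard in
# cytogenetic biodosimetry. Coefficients are invented placeholders, not ADCI defaults.
import math

def estimate_dose(dc_per_cell: float, c=0.001, alpha=0.03, beta=0.06) -> float:
    """Solve beta*D^2 + alpha*D + (c - Y) = 0 for the positive root D (in Gy)."""
    disc = alpha ** 2 - 4 * beta * (c - dc_per_cell)
    if disc < 0:
        raise ValueError("observed frequency below background")
    return (-alpha + math.sqrt(disc)) / (2 * beta)

# e.g., 0.25 dicentrics per metaphase cell
print(f"estimated dose ~ {estimate_dose(0.25):.2f} Gy")
```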


Antioxidants ◽  
2021 ◽  
Vol 10 (6) ◽  
pp. 843
Author(s):  
Tamara Ortiz ◽  
Federico Argüelles-Arias ◽  
Belén Begines ◽  
Josefa-María García-Montes ◽  
Alejandra Pereira ◽  
...  

The best conservation method for native Chilean berries was investigated, together with a large-scale maqui berry extract, rich in total polyphenols and anthocyanins, to be tested in intestinal epithelial and immune cells. The methanolic extract was obtained from lyophilized and analyzed maqui berries; the Folin–Ciocalteu method was used to quantify total polyphenol content, while 2,2-diphenyl-1-picrylhydrazyl (DPPH), ferric reducing antioxidant power (FRAP), and oxygen radical absorbance capacity (ORAC) assays measured antioxidant capacity. The anthocyanin profile of maqui was determined by ultra-high-performance liquid chromatography coupled with tandem mass spectrometry (UHPLC-MS/MS). Viability, cytotoxicity, and percent oxidation were evaluated in colon epithelial cells (HT-29) and macrophage cells (RAW 264.7). Preservation studies confirmed that the properties and composition of maqui are retained under fresh or frozen conditions, and a more efficient and convenient extraction methodology was achieved. In vitro studies in epithelial cells showed that the extract has powerful antioxidant capacity with dose-dependent behavior. When macrophages were activated with lipopolysaccharide (LPS), no cytotoxic effects were observed, and a relationship between oxidative stress and inflammatory response was demonstrated. The maqui extract and 5-aminosalicylic acid (5-ASA) have a synergistic effect. Taken together, the data point to the use of this extract as a potential nutraceutical agent with physiological benefits for the treatment of inflammatory bowel disease (IBD).


Author(s):  
Jianglin Feng ◽  
Nathan C Sheffield

Abstract Summary Databases of large-scale genome projects now contain thousands of genomic interval datasets. These data are a critical resource for understanding the function of DNA. However, our ability to examine and integrate interval data of this scale is limited. Here, we introduce the integrated genome database (IGD), a method and tool for searching genome interval datasets more than three orders of magnitude faster than existing approaches, while using only one hundredth of the memory. IGD uses a novel linear binning method that allows us to scale analysis to billions of genomic regions. Availability https://github.com/databio/IGD
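A minimal sketch of the general idea behind linear binning for interval lookup is given below; it illustrates fixed-width bins only and is not the IGD data structure or file format. The bin size, index layout, and example intervals are assumptions for illustration.

```python
# Toy sketch of fixed-width ("linear") binning for genomic interval lookup.
# General idea only; not the IGD implementation or its on-disk format.
from collections import defaultdict

BIN_SIZE = 16_384  # bin width in base pairs (arbitrary choice for the sketch)

def build_index(intervals):
    """intervals: list of (chrom, start, end). Returns {(chrom, bin): [interval, ...]}."""
    index = defaultdict(list)
    for chrom, start, end in intervals:
        for b in range(start // BIN_SIZE, end // BIN_SIZE + 1):
            index[(chrom, b)].append((chrom, start, end))
    return index

def query(index, chrom, qstart, qend):
    """Return intervals overlapping [qstart, qend) on chrom, scanning only touched bins."""
    hits = set()
    for b in range(qstart // BIN_SIZE, qend // BIN_SIZE + 1):
        for chrom_, start, end in index.get((chrom, b), []):
            if start < qend and end > qstart:
                hits.add((chrom_, start, end))
    return sorted(hits)

idx = build_index([("chr1", 100, 5000), ("chr1", 20000, 30000), ("chr2", 0, 1000)])
print(query(idx, "chr1", 4000, 25000))   # -> both chr1 intervals
```

Because a query touches only the bins it overlaps, lookup cost scales with local interval density rather than with the total size of the database.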

