Model project for establishing an avalanche warning and evacuation system

1993 ◽  
Vol 18 ◽  
pp. 245-250
Author(s):  
Hideki Terada ◽  
Kazunori Fujisawa ◽  
Yoshimitsu Nakamura ◽  
Noriyuki Minami

A project was started in 1990 to rationalize the avalanche warning and evacuation system. The major items in the model project were: (1) zoning methods for dangerous areas; (2) establishment of a surveillance system; and (3) methods of determining a warning and evacuation standard. Since the scale of a snow avalanche cannot be predicted, areas subject to an impact pressure of more than 30 kN/m² are identified. Equipment for automatic surveillance and data processing was developed to telemeter snow and weather data. Issuing warning or evacuation advice is the focus of the discussion. Judgment is based on information on current conditions obtained from the telemeter and from residents, and on forecasts of snowfall and temperature until the following morning. Two methods of judgment are under consideration: one sets a standard value based on observed meteorological data; the other is based on a discriminant of danger level.
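The two candidate judgment methods can be sketched as follows. This is a minimal illustration only: the thresholds, weights, and bias below are hypothetical example values, not the standards developed in the project.

```python
def threshold_judgment(new_snow_cm, temp_c, snow_limit=30.0, temp_limit=0.0):
    """Method 1 (illustrative): advise evacuation when observed new snowfall
    exceeds a standard value; limits here are made-up examples."""
    return new_snow_cm >= snow_limit and temp_c >= temp_limit

def discriminant_judgment(new_snow_cm, temp_c, weights=(0.05, 0.1), bias=-2.0):
    """Method 2 (illustrative): a linear discriminant of danger level;
    the weights and bias are hypothetical, for demonstration only."""
    score = weights[0] * new_snow_cm + weights[1] * temp_c + bias
    return score >= 0.0
```

In practice the discriminant's coefficients would be fitted to historical avalanche and telemeter records rather than chosen by hand.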


Energies ◽  
2021 ◽  
Vol 14 (4) ◽  
pp. 802
Author(s):  
Kristian Skeie ◽  
Arild Gustavsen

In building thermal energy characterisation, proper modelling of the effects of solar radiation, temperature and wind is seen as a critical factor. Open geospatial datasets are growing in diversity, easing access to meteorological data and other relevant information that can be used for building energy modelling. However, the application of geospatial techniques combining multiple open datasets is not yet common in the often scripted workflows of data-driven building thermal performance characterisation. We present a method for processing time series from climate reanalysis and satellite-derived solar irradiance services by implementing land-use and elevation raster maps served in an elevation profile web service. The article describes a methodology to: (1) adapt gridded weather data to four case-building sites in Europe; (2) calculate the incident solar radiation on the building facades; (3) estimate wind- and temperature-dependent infiltration using a single-zone infiltration model; and (4) separate and evaluate the sheltering effect of buildings and trees in the vicinity, based on building footprints. Calculations of solar radiation, surface wind and air infiltration potential are done using validated models published in the scientific literature. We found that using scripting tools to automate geoprocessing tasks is widespread, and implementing such techniques in conjunction with an elevation profile web service made it possible to effectively utilise information from open geospatial data surrounding a building site. We expect that the modelling approach could be further improved, including diffuse-shading methods and evaluating other wind shelter methods for urban settings.
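One common step when adapting gridded weather data to a site, as in step (1) above, is correcting reanalysis air temperature from the grid-cell elevation to the building-site elevation. A minimal sketch, assuming a constant environmental lapse rate (the function name and the lapse-rate simplification are ours, not the paper's method in detail):

```python
LAPSE_RATE_K_PER_M = -0.0065  # standard environmental lapse rate (assumption)

def adapt_temperature(t_grid_c, grid_elev_m, site_elev_m,
                      lapse=LAPSE_RATE_K_PER_M):
    """Shift a gridded reanalysis temperature series (deg C) from the
    grid-cell elevation to the site elevation using a fixed lapse rate."""
    dz = site_elev_m - grid_elev_m
    return [t + lapse * dz for t in t_grid_c]
```

For a site 1000 m above the grid cell, every hourly value is shifted down by 6.5 K; the elevation profile web service described in the article would supply `site_elev_m`.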


2018 ◽  
Vol 47 (2) ◽  
pp. 150-159 ◽  
Author(s):  
Emerta Aragie

By developing a model that describes the Kenyan coffee value chain, this study evaluates opportunities emanating from four scenarios representing productivity gains in the various value chain stages of the coffee sector and three additional scenarios reflecting shifts in market conditions. Results show that productivity-enhancing policies have stronger effects on coffee output and export performance if they target the milling stage of the value chain. Export subsidies and favourable external marketing conditions also have stronger effects, distributed comparably across the various value chain stages. We found, however, that these gains in the coffee sector come at the expense of other cash crops such as cotton, tea, sugar and tobacco. The approach followed in this study is relevant because this trade-off between coffee and the other cash crop sectors may not be visible using standard value chain approaches.


2017 ◽  
Vol 8 (2) ◽  
pp. 88-105 ◽  
Author(s):  
Gunasekaran Manogaran ◽  
Daphne Lopez

Ambient intelligence is an emerging platform that combines advances in sensors and sensor networks, pervasive computing, and artificial intelligence to capture real-time climate data. This continuously generates several exabytes of unstructured sensor data, often called big climate data. Researchers are now trying to use big climate data to monitor and predict climate change and possible diseases. Traditional data processing techniques and tools are not capable of handling such a huge amount of climate data. Hence, there is a need to develop an advanced big data architecture for processing real-time climate data. The purpose of this paper is to propose a big-data-based surveillance system that analyzes spatial climate big data and continuously monitors the correlation between climate change and dengue. The proposed disease surveillance system has been implemented with the help of Apache Hadoop MapReduce and its supporting tools.
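The core of such a correlation analysis fits the MapReduce pattern naturally: mappers emit partial sums per record and a reducer combines them into a Pearson correlation. A pure-Python imitation of that pattern (the Hadoop job configuration, input format, and field names are omitted; this only illustrates the map/reduce decomposition, not the paper's actual pipeline):

```python
from functools import reduce

def mapper(record):
    """Emit partial sums for one (climate_value, case_count) record."""
    rain, cases = record
    return (1, rain, cases, rain * rain, cases * cases, rain * cases)

def reducer(a, b):
    """Combine two partial-sum tuples element-wise (associative, so it
    can run distributed across Hadoop reduce tasks)."""
    return tuple(x + y for x, y in zip(a, b))

def pearson(records):
    """Pearson correlation from the combined sums."""
    n, sx, sy, sxx, syy, sxy = reduce(reducer, map(mapper, records))
    num = n * sxy - sx * sy
    den = ((n * sxx - sx * sx) * (n * syy - sy * sy)) ** 0.5
    return num / den
```

Because `reducer` is associative and commutative, the same functions run unchanged whether the records sit in one list or are partitioned across a cluster.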


2019 ◽  
Vol 111 ◽  
pp. 06056
Author(s):  
Kuo-Tsang Huang ◽  
Yu-Teng Weng ◽  
Ruey-Lung Hwang

Future building energy studies mainly stem from hourly dynamic building simulations driven by future weather data, so the reliability of a future building energy forecast relies heavily on the accuracy of these data. The global circulation models (GCMs) provided by the IPCC are the major sources for constructing future weather data; however, uncertainties exist among them even under the same climate change scenario, so a method is needed for selecting a suitable GCM for local application. This research first adopted the principal component analysis (PCA) method to choose a suitable GCM for application in Taiwan; second, hourly future meteorological data sets for Taiwan were constructed from the selected GCM by the morphing method; third, the future cooling energy consumption of an actual office building in the near (2011-2040), mid (2041-2070), and far future (2071-2100) was analysed. The results show that the NorESM1-M GCM has the lowest root mean square error (RMSE) among the GCMs considered and was identified as the suitable GCM for further future climate generation. The building simulations against the future weather datasets revealed that the average cooling energy use intensity (EUIc) in Taipei will increase by 12%, 17%, and 34% in the 2020s, 2050s, and 2080s, respectively, compared with the current climate.
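For the temperature variable, the morphing method combines a shift by the GCM-projected change in the monthly mean with a stretch around that mean to reflect the projected change in variability. A minimal sketch for one month of hourly data (the `dT` and `alpha` values a real study would take from the selected GCM; the ones in the test below are hypothetical):

```python
def morph_temperature(hourly_t, dT, alpha):
    """Morph present-day hourly temperatures for one month:
    shift by the projected mean change dT and stretch deviations
    from the monthly mean by a scaling factor alpha."""
    mean_t = sum(hourly_t) / len(hourly_t)
    return [t + dT + alpha * (t - mean_t) for t in hourly_t]
```

The shift raises every hour equally, while the stretch widens (alpha > 0) or narrows (alpha < 0) the diurnal swing around the monthly mean, so the morphed series keeps the observed weather sequence but carries the GCM's projected statistics.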


2016 ◽  
Vol 28 (5) ◽  
pp. 517-527
Author(s):  
Adam Stančić ◽  
Ivan Grgurević ◽  
Zvonko Kavran

Integrating the collected information about the road into the image recorded by the surveillance system forms a unified source of transport-relevant data about the supervised situation. The basic assumption is that the integration procedure changes the image to an extent invisible to the human eye, while the integrated data keep identical content. This assumption has been proven by studying the statistical properties of the image and the integrated data using a mathematical model implemented in Python with functions from additional libraries (OpenCV, NumPy, SciPy and Matplotlib). The model has been used to compare meta-data input methods with methods of steganographic integration that modify the discrete cosine transform (DCT) coefficients of a JPEG-compressed image. The F5 steganographic algorithm was used for the steganographic data processing. The review paper analyses the advantages and drawbacks of the integration methods and presents examples of traffic situations in which the resulting unified sources of transport-relevant information could be used.
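The embedding idea behind F5 can be illustrated in a few lines. This is a heavily simplified sketch, not the full algorithm: real F5 also uses matrix encoding and a permuted coefficient order, both omitted here. Each message bit is carried by the parity of a nonzero quantized DCT coefficient; on a mismatch the coefficient's magnitude is decremented, and a coefficient shrunk to zero is skipped so the bit is re-embedded in the next one.

```python
def embed_bits(coeffs, bits):
    """Simplified F5-style embedding into quantized DCT coefficients.
    Zero coefficients never carry data; shrinking a coefficient to zero
    forces the same bit to be re-embedded (F5's "shrinkage")."""
    out, i = list(coeffs), 0
    for k, c in enumerate(out):
        if i >= len(bits):
            break          # whole message embedded
        if c == 0:
            continue       # zeros are skipped, as in F5
        parity = (c if c > 0 else -c) & 1
        if parity == bits[i]:
            i += 1         # coefficient already carries the bit
        else:
            c = c - 1 if c > 0 else c + 1  # decrement magnitude
            out[k] = c
            if c != 0:
                i += 1     # bit carried; if c became 0, re-embed
    return out
```

Extraction is the reverse: read the magnitude parities of the nonzero coefficients in order. Because magnitudes only ever decrease, the histogram-based detectors that break naive LSB embedding are much less effective against this scheme.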

