Natural Hazards and Earth System Sciences Discussions
Latest Publications


TOTAL DOCUMENTS: 691 (FIVE YEARS: 0)
H-INDEX: 11 (FIVE YEARS: 0)
Published by Copernicus GmbH
ISSN: 2195-9269

2015, Vol. 3 (12), pp. 7587–7630
Author(s): J. D'Amato, D. Hantz, A. Guerin, M. Jaboyedoff, L. Baillet, ...

Abstract. The influence of meteorological conditions on rockfall occurrence has often been highlighted, but it remains insufficiently understood owing to the lack of exhaustive and precise rockfall databases. In this study, rockfalls in a limestone cliff were detected by annual terrestrial laser scanning and dated by photographic survey over 2.5 years. A near-continuous survey (one photo every 10 min) with a wide-angle lens allowed 214 rockfalls larger than 0.1 m3 to be dated, and a monthly survey with a telephoto lens allowed dating of 854 rockfalls larger than 0.01 m3. Analysis of the two databases shows that the rockfall frequency can be multiplied by a factor as high as 7 during freeze–thaw episodes and 26 when the mean rainfall intensity (since the beginning of the rainfall episode) is higher than 5 mm h−1. Based on these results, a 4-level scale is proposed for predicting the temporal variations of the hazard. The more precise database and freeze–thaw episode definition make it possible to distinguish different phases within freeze–thaw episodes: negative-temperature cooling periods, negative-temperature warming periods and thawing periods. Rockfalls occur more frequently during warming and thawing periods than during cooling periods. It can be inferred that rockfalls are caused by thermal dilatation of the ice rather than by dilatation due to the phase transition, but they may occur only when the ice melts, because the cohesion of the ice–rock interface can be sufficient to hold the rock compartment that has been cut.
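A minimal sketch of how such a frequency-multiplier scale might be implemented. Only the factor-7 and factor-26 multipliers and the 5 mm h−1 threshold come from the abstract; the function names and the mapping of multipliers onto the 4 levels are illustrative assumptions, not the authors' published scale.

```python
# Hypothetical rockfall hazard-level classifier based on the abstract's findings:
# the baseline frequency is multiplied by ~7 during freeze-thaw episodes and by
# ~26 when the mean rainfall intensity since the start of the episode exceeds
# 5 mm/h. Level boundaries below are illustrative assumptions.

def rockfall_frequency_multiplier(freeze_thaw: bool, mean_rain_mm_per_h: float) -> float:
    """Return an approximate multiplier on the baseline rockfall frequency."""
    multiplier = 1.0
    if mean_rain_mm_per_h > 5.0:   # intense rainfall episode (factor up to 26)
        multiplier = max(multiplier, 26.0)
    if freeze_thaw:                # freeze-thaw episode (factor up to 7)
        multiplier = max(multiplier, 7.0)
    return multiplier

def hazard_level(multiplier: float) -> int:
    """Map a frequency multiplier onto an assumed 4-level hazard scale."""
    if multiplier >= 26.0:
        return 4
    if multiplier >= 7.0:
        return 3
    if multiplier > 1.0:
        return 2
    return 1

print(hazard_level(rockfall_frequency_multiplier(True, 6.2)))  # -> 4
```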


2015, Vol. 3 (12), pp. 7555–7586
Author(s): A. K. Abd el-aal, M. A. El-Eraki, S. I. Mostafa

Abstract. In this contribution, we develop an extended stochastic technique for seismic hazard assessment. The technique builds on the stochastic method of Boore (2003, "Simulation of ground motion using the stochastic method", Pure Appl. Geophys., 160, 635–676). Its essential purpose is to simulate ground motion in order to minimize the consequences of future earthquakes. The first step of the technique is to define the seismic sources that most affect the study area. The maximum expected magnitude is then defined for each of these sources, followed by an estimate of the ground motion using an empirical attenuation relationship. Finally, site amplification is incorporated when calculating the peak ground acceleration (PGA) at each site of interest. We tested and applied the technique at the cities of Cairo, Suez, Port Said, Ismailia, Zagazig and Damietta to predict the ground motion, and at Cairo, Zagazig and Damietta to estimate the maximum peak ground acceleration under actual soil conditions. In addition, median response spectra for 0.5, 1, 5, 10 and 20 % damping are estimated using the extended stochastic simulation technique. The highest calculated acceleration at bedrock conditions is found at Suez, with a value of 44 cm s−2; acceleration values decrease towards the north of the study area, reaching 14.1 cm s−2 at Damietta. These results agree with, and are comparable to, previous seismic hazard studies of northern Egypt. This work can be used for seismic risk mitigation and earthquake engineering purposes.
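A schematic of the pipeline the abstract outlines (sources, maximum magnitude, attenuation, site amplification, PGA). The attenuation coefficients and amplification factor below are placeholders; the abstract does not give their actual functional forms or values.

```python
# Illustrative source-to-site PGA pipeline. The attenuation relationship is a
# generic ln(PGA) = a + b*M - c*ln(R + R0) placeholder with invented
# coefficients, not the empirical relation used in the study.
import math

def attenuated_pga(magnitude: float, distance_km: float) -> float:
    """Hypothetical attenuation relationship: bedrock PGA in cm/s^2."""
    return math.exp(0.5 + 0.9 * magnitude - 1.2 * math.log(distance_km + 10.0))

def site_pga(sources, site_amplification: float) -> float:
    """Maximum expected PGA at one site over all relevant seismic sources."""
    bedrock = max(attenuated_pga(s["max_magnitude"], s["distance_km"])
                  for s in sources)
    return bedrock * site_amplification  # apply local soil amplification

sources = [{"max_magnitude": 6.5, "distance_km": 40.0},
           {"max_magnitude": 5.8, "distance_km": 15.0}]
print(site_pga(sources, site_amplification=1.8))
```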


2015, Vol. 3 (12), pp. 7487–7525
Author(s): K. Goda, K. Abilova

Abstract. This study investigates issues related to underestimation of earthquake source parameters in the context of tsunami early warning and tsunami risk assessment. The magnitude of a very large event may be significantly underestimated during the early stage of the disaster, resulting in the issuance of incorrect tsunami warnings. Tsunamigenic events in the Tohoku region of Japan, where the 2011 tsunami occurred, serve as a case study to illustrate the significance of the problem. The effects of biases in the estimated earthquake magnitude on tsunami loss are investigated using a rigorous probabilistic tsunami loss calculation tool that can be applied across a range of earthquake magnitudes while accounting for uncertainties in earthquake source parameters (e.g. geometry, mean slip and spatial slip distribution). The quantitative tsunami loss results provide valuable insights into the importance of deriving accurate seismic information, as well as into the potential biases of the anticipated tsunami consequences. Finally, the usefulness of rigorous tsunami risk assessment for defining critical hazard scenarios, based on the potential consequences of tsunami disasters, is discussed.
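A toy Monte Carlo loop showing how a magnitude bias propagates into a probabilistic loss estimate. The sampling distribution for mean slip and the loss function are stand-ins; the study's stochastic source model (geometry, spatial slip distribution) is far more detailed.

```python
# Schematic magnitude-bias experiment: losses are averaged over sampled source
# parameters, once for the estimated magnitude and once for the (biased) truth.
# tsunami_loss and the lognormal slip parameters are invented for illustration.
import math
import random

def tsunami_loss(magnitude: float, mean_slip_m: float) -> float:
    """Hypothetical loss (arbitrary units), growing nonlinearly with source size."""
    return (magnitude ** 3) * mean_slip_m * 1e-2

def expected_loss(magnitude: float, n: int = 10_000) -> float:
    """Mean loss over sampled mean-slip realizations for a given magnitude."""
    total = 0.0
    for _ in range(n):
        slip = random.lognormvariate(math.log(5.0), 0.4)  # uncertain mean slip
        total += tsunami_loss(magnitude, slip)
    return total / n

# Underestimating Mw 9.0 as Mw 7.9 in the early warning stage:
print(expected_loss(7.9), "vs true", expected_loss(9.0))
```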


2015, Vol. 3 (12), pp. 7457–7486
Author(s): S. Cusack

Abstract. The clustering of severe European windstorms on annual timescales has substantial impacts on the re/insurance industry. Management of the risk is impaired by large uncertainties in estimates of clustering from historical storm datasets, which typically cover only the past few decades. The uncertainties are unusually large because clustering depends on the variance of storm counts. Eight storm datasets are gathered for analysis in this study in order to reduce these uncertainties: six contain more than 100 years of severe storm information to reduce sampling errors, and the diversity of information sources and analysis methods between datasets samples observational errors. All storm severity measures used in this study reflect damage, to suit re/insurance applications. The shortest storm dataset, 42 years in length, provides estimates of clustering with very large sampling and observational errors. It does provide some useful information: indications of stronger clustering for more severe storms, particularly for southern countries off the main storm track. However, substantially different results are produced by removing one stormy season, 1989/1990, which illustrates the large uncertainties inherent in a 42-year dataset. The extended storm records place 1989/1990 in a much longer historical context and yield more robust estimates of clustering. All the extended storm datasets show a greater degree of clustering with increasing storm severity, suggesting that clustering of severe storms is much more material than that of weaker storms. They also contain signs of stronger clustering in areas off the main storm track, and of weaker clustering for smaller areas, though these signals are smaller than the uncertainties in the actual values. Both the improvement of existing storm records and the development of new historical storm datasets would help to improve management of this risk.
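Since the abstract notes that clustering estimates depend on the variance of storm counts, a common diagnostic is the dispersion (variance-to-mean ratio) of annual counts: 1 for a Poisson (unclustered) process, above 1 when storms cluster. A short sketch on synthetic counts (the paper's datasets are not reproduced here); the exact clustering statistic used in the study may differ.

```python
# Dispersion statistic for annual storm counts: Var(N)/E(N). Values above 1
# indicate year-to-year clustering relative to a Poisson process.
from statistics import mean, pvariance

def dispersion(annual_counts: list) -> float:
    """Variance-to-mean ratio of a series of annual storm counts."""
    m = mean(annual_counts)
    return pvariance(annual_counts) / m

print(dispersion([0, 1, 0, 4, 0, 0, 3, 0, 1, 0]))  # clustered-looking record, > 1
```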


2015, Vol. 3 (12), pp. 7527–7553
Author(s): C.-H. Chan

Abstract. This study provides new insights into earthquake forecasting models for regions with subduction systems, addressing both the depth component of forecasting grids and time-dependent factors. To demonstrate the importance of the depth component, I incorporate three-dimensional grids into the forecasting approaches and compare them with two-dimensional cells. In applications to two subduction regions, Ryukyu and Kanto, the approaches with three-dimensional grids consistently achieve better forecasting ability. I thus confirm the importance of depth dependency for forecasting, especially in applications to a subduction environment or a region with non-vertical seismogenic structures. In addition, I discuss the role of time-dependent factors in forecasting models and conclude that time dependency becomes crucial only when a significant seismicity rate change follows a large earthquake. These insights into the application of forecasting models could provide key information for seismic and tsunami hazard assessments.
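A minimal sketch of the 2-D versus 3-D distinction: the same catalogue binned with and without a depth index. Grid spacings and the event-tuple layout are illustrative assumptions, not the study's actual discretization.

```python
# Binning a catalogue into forecasting cells. With use_depth=True each cell
# gains a third, depth-dependent index, so events on a dipping (non-vertical)
# structure are separated instead of being stacked into one 2-D cell.
import math
from collections import Counter

def grid_counts(events, dlon=0.1, dlat=0.1, ddep=10.0, use_depth=True):
    """Count events per cell; events are (lon, lat, depth_km) tuples."""
    counts = Counter()
    for lon, lat, dep in events:
        key = (math.floor(lon / dlon), math.floor(lat / dlat))
        if use_depth:
            key += (math.floor(dep / ddep),)  # third index resolves depth
        counts[key] += 1
    return counts

catalogue = [(128.1, 26.3, 15.0), (128.1, 26.3, 55.0)]  # same epicentre, two depths
print(len(grid_counts(catalogue, use_depth=False)))  # 1 cell in 2-D
print(len(grid_counts(catalogue, use_depth=True)))   # 2 cells in 3-D
```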


2015, Vol. 3 (12), pp. 7411–7456
Author(s): K. Kobayashi, S. Otsuka, K. Saito, ...

Abstract. This paper presents a study on short-term ensemble flood forecasting, specifically for small dam catchments in Japan. Numerical ensemble simulations of rainfall from the Japan Meteorological Agency Nonhydrostatic Model (JMA-NHM) are used as input to a rainfall–runoff model for predicting river discharge into a dam. The ensemble weather simulations use a conventional 10 km and a high-resolution 2 km spatial resolution. A distributed rainfall–runoff model is constructed for the Kasahori dam catchment (approx. 70 km2) and driven with the ensemble rainfalls. The results show that the hourly maximum and cumulative catchment-average rainfalls of the 2 km-resolution JMA-NHM ensemble simulation are more appropriate than the 10 km-resolution rainfalls. All the simulated inflows based on the 2 and 10 km rainfalls exceed the flood discharge of 140 m3 s−1, a threshold value for flood control. The inflows with the 10 km-resolution ensemble rainfall are all considerably smaller than the observations, while at least one simulated discharge out of the 11 ensemble members with the 2 km-resolution rainfalls reproduces the first peak of the inflow at the Kasahori dam with an amplitude similar to the observations, although there are spatiotemporal lags between simulation and observation. To take positional lags into account in the ensemble discharge simulation, the rainfall distribution in each ensemble member is shifted so that the catchment-averaged cumulative rainfall over the Kasahori dam catchment is maximized. The runoff simulations with the position-shifted rainfalls show much better results than the original ensemble discharge simulations.
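A sketch of the position-shift step: each ensemble rainfall field is shifted in x and y until the cumulative rainfall averaged over the catchment mask is maximized. The arrays, the shift range, and the use of wrap-around shifting are simplifying assumptions; the paper's actual implementation may treat grid edges differently.

```python
# Find the horizontal shift of a cumulative-rainfall field that maximizes the
# catchment-average rainfall. rain_cum is a 2-D grid of cumulative rainfall;
# mask is a boolean array marking the catchment cells.
import numpy as np

def best_shift(rain_cum: np.ndarray, mask: np.ndarray, max_shift: int = 5):
    """Return the (dy, dx) shift maximizing catchment-average cumulative rain."""
    best, best_val = (0, 0), -np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # np.roll wraps around the grid edges; acceptable for small shifts
            shifted = np.roll(np.roll(rain_cum, dy, axis=0), dx, axis=1)
            val = shifted[mask].mean()
            if val > best_val:
                best_val, best = val, (dy, dx)
    return best
```

The shifted field is then fed to the rainfall–runoff model in place of the original member, which is how positional forecast errors are folded into the discharge ensemble.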


2015, Vol. 3 (12), pp. 7379–7409
Author(s): G. Ceccherini, S. Russo, I. Ameztoy, C. P. Romero, C. Carmona-Moreno

Abstract. In recent decades there has been an increase in the magnitude and occurrence of heat waves and a decrease in cold waves, possibly related to anthropogenic influence (Solomon et al., 2007). This study describes the extreme temperature regime of heat waves and cold waves across South America over recent years (1980–2014). Temperature records come from the Global Surface Summary of the Day (GSOD), a climatological dataset produced by the National Climatic Data Center that provides records of daily maximum and minimum temperatures acquired worldwide. The magnitude of heat waves and cold waves at each GSOD station is quantified on an annual basis by means of the Heat Wave Magnitude Index (HWMI, Russo et al., 2014) and the Cold Wave Magnitude Index (CWMI, Forzieri et al., 2015). Results indicate an increase in the intensity and frequency of heat waves, with up to 75 % more events occurring in the last 10 years alone. Conversely, no significant changes are detected for cold waves. In addition, the trend of the annual temperature range (i.e., the yearly mean of Tmax minus the yearly mean of Tmin) is positive, up to 1 °C decade−1, over the extra-tropics and negative, up to 0.5 °C decade−1, over the tropics. This dichotomous behaviour indicates that the annual mean of Tmax is generally increasing more than the annual mean of Tmin in the extra-tropics, and vice versa in the tropics.
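A small sketch of the annual-temperature-range diagnostic defined in the abstract (yearly mean Tmax minus yearly mean Tmin) and its least-squares trend in °C per decade. The input structures are assumed per-station yearly means, not the GSOD file format.

```python
# Trend of the annual temperature range for one station: fit a line to
# (mean Tmax - mean Tmin) against year and convert the slope to degC/decade.
import numpy as np

def annual_range_trend(years, tmax_means, tmin_means) -> float:
    """Linear trend of the annual temperature range in degC per decade."""
    atr = np.asarray(tmax_means) - np.asarray(tmin_means)
    slope_per_year = np.polyfit(np.asarray(years, dtype=float), atr, 1)[0]
    return 10.0 * slope_per_year

years = [1980, 1990, 2000, 2010]
print(annual_range_trend(years, [28.0, 28.4, 28.9, 29.3], [15.0, 15.1, 15.2, 15.2]))
```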


2015, Vol. 3 (12), pp. 7247–7273
Author(s): F. Ferrigno, G. Gigli, R. Fanti, N. Casagli

Abstract. On 10 March 2010, following heavy rainfall during the preceding days, the Montaguto earthflow reactivated, involving the SS 90 "Delle Puglie" road, as had happened previously in May 2005 and September 2009, and reaching the Roma–Bari railway. This drew special attention from the National Civil Protection Department, and a widespread monitoring and analysis program was initiated. Monitoring with a GB-InSAR (Ground-Based Interferometric Synthetic Aperture Radar) system began in order to investigate the landslide kinematics, to plan urgent safety measures for risk mitigation and to design long-term stabilization works. In this paper the results of the GB-InSAR monitoring system and their application within the Observational Method (OM) approach are presented. The paper also highlights how the OM, based on the GB-InSAR technique, can produce savings in cost and time on engineering projects without compromising safety, and how it can benefit the geotechnical community by increasing scientific knowledge. This study focuses on the very active role played by the monitoring activities in both design and plan modifications, with special consideration of the emergency phase.
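For context on the underlying measurement, a hedged sketch of the standard radar-interferometry relation: a phase change between two acquisitions maps to a line-of-sight displacement via d = (λ / 4π) Δφ for the two-way path. The wavelength below assumes a Ku-band instrument and the sign convention is instrument-dependent; neither is stated in the abstract.

```python
# Convert an interferometric phase change (radians) between two GB-InSAR
# acquisitions into a line-of-sight displacement (mm). The 17.4 mm wavelength
# is a typical Ku-band value and is an assumption, not from the paper.
import math

def los_displacement_mm(delta_phase_rad: float, wavelength_mm: float = 17.4) -> float:
    """Line-of-sight displacement for a two-way radar path."""
    return (wavelength_mm / (4.0 * math.pi)) * delta_phase_rad

print(los_displacement_mm(math.pi))  # half a fringe -> ~4.35 mm along the line of sight
```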


2015, Vol. 3 (12), pp. 7275–7309
Author(s): P. Scussolini, J. C. J. H. Aerts, B. Jongman, L. M. Bouwer, H. C. Winsemius, ...

Abstract. With the projected changes in climate, population and socioeconomic activity located in flood-prone areas, a global assessment of flood risk is essential to inform climate change policy and disaster risk management. Whilst global flood risk models exist for this purpose, the accuracy of their results is greatly limited by the lack of information on the current standard of protection against floods, with studies either neglecting this aspect or resorting to crude assumptions. Here we present a first global database of FLOod PROtection Standards, FLOPROS, which comprises information, in the form of the flood return period associated with protection measures, at different spatial scales. FLOPROS combines three layers of information into one consistent database. The Design layer contains empirical information about the actual standard of protection already in place, while the Policy layer and the Model layer are proxies for such protection standards and serve to increase the spatial coverage of the database. The Policy layer contains information on protection standards from policy regulations, and the Model layer uses a validated modelling approach to calculate protection standards. Based on this first version of FLOPROS, we suggest a number of strategies to further extend and increase the resolution of the database. Moreover, as flood protection standards change with new interventions, the database is intended to be continually updated and requires input from the flood risk community. We therefore invite researchers and practitioners to contribute information to this evolving database by corresponding with the authors.
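A minimal sketch of the layer-merging logic the abstract implies: empirical Design information takes precedence where it exists, then Policy, then the modelled estimate. The dictionary representation and the strict precedence rule are assumptions about the database's structure, not its documented schema.

```python
# Resolve one consistent flood-protection standard (a return period in years)
# per spatial unit from the three FLOPROS-style layers, in order of assumed
# reliability: Design (empirical) > Policy (regulations) > Model (computed).
def merged_protection_standard(unit_id, design: dict, policy: dict, model: dict):
    """Return the protection return period for one spatial unit, or None."""
    for layer in (design, policy, model):
        if layer.get(unit_id) is not None:
            return layer[unit_id]
    return None  # no information available in any layer

design = {"NL-ZH": 10000}          # e.g. a dike ring with a known design standard
policy = {"US-LA": 100}            # standard required by regulation
model = {"NL-ZH": 2500, "KH-12": 10}
print(merged_protection_standard("NL-ZH", design, policy, model))  # 10000 (Design wins)
```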


2015, Vol. 3 (12), pp. 7311–7332
Author(s): P. Nicolet, M. Jaboyedoff, C. Cloutier, G. B. Crosta, S. Lévy

Abstract. When calculating the risk that railway or road users will be killed by a natural hazard, one has to calculate a "spatio-temporal probability", i.e. the probability that a vehicle is in the path of the falling mass when the mass falls, or the expected number of affected vehicles in the case of an event. Different methods are used in the literature to calculate this, and most of the time they consider only the dimensions of the falling mass or the dimensions of the vehicles. Some authors, however, consider both dimensions at the same time, and the use of their approach is recommended. Finally, a method that additionally considers an impact on the front of the vehicle is discussed.
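A hedged worked example of the "both dimensions" idea: a vehicle of length L_v sweeps through a danger zone of width W_r, so it is exposed for (L_v + W_r) / v seconds, and with N vehicles per day the expected number of exposed vehicles per event is N (L_v + W_r) / (86400 v). The parameter values and the uniform-traffic assumption are illustrative, and this is not necessarily the exact formulation recommended in the paper.

```python
# Expected number of vehicles in the path of a falling mass during one event,
# combining the vehicle length and the width of the falling mass. Assumes
# traffic spread uniformly over the day at constant speed.
def exposed_vehicles_per_event(n_per_day: float, veh_len_m: float,
                               rock_width_m: float, speed_m_s: float) -> float:
    """Expected number of affected vehicles for a single rockfall event."""
    return n_per_day * (veh_len_m + rock_width_m) / (speed_m_s * 86400.0)

# 2000 vehicles/day, 5 m vehicles, 2 m rock, 80 km/h (~22 m/s):
print(exposed_vehicles_per_event(2000, 5.0, 2.0, 22.0))  # ~0.0074 vehicles/event
```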

