Catastrophe Models
Recently Published Documents

TOTAL DOCUMENTS: 64 (five years: 5)
H-INDEX: 12 (five years: 0)

2021 ◽  
Vol 2021 ◽  
pp. 1-10
Author(s):  
Xinjiang Wei ◽  
Xiao Wang ◽  
Taotao Chen ◽  
Zhi Ding ◽  
Xi Wu

The failure modes of rockburst in catastrophe theory play an essential role in both theoretical analysis and practical applications. Tensile-cracking-and-sliding rockburst is studied by analyzing the stability of a simplified mechanical model based on the fold catastrophe model. The theory of mechanical system stability, together with an engineering example, is introduced to verify the accuracy of the analysis. The results of the fold catastrophe model are then compared with those of the cusp catastrophe model, and the applicability of the two catastrophe models is discussed. The results show that the analytical results of the fold catastrophe model are consistent with the solutions of mechanical system stability theory. The critical loads calculated by the two catastrophe models are both less than the sliding force, which conforms to actual conditions; however, the critical loads obtained from the cusp catastrophe model are larger than those obtained from the fold catastrophe model. In conclusion, a reasonable critical load can be obtained with the fold catastrophe model rather than the cusp catastrophe model, and the fold catastrophe model has much wider applicability. However, when the potential function of the system is a high-order function of the state variable, the fold catastrophe model can only analyze the system locally, and a more complex catastrophe model such as the cusp catastrophe model is recommended.
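For readers unfamiliar with the two models compared in this abstract, it helps to state the canonical potential functions explicitly. The following is a sketch of the standard forms from catastrophe theory, with generic notation (one state variable x and control parameters a, b); it is not taken from the paper itself:

```latex
% Canonical fold catastrophe: one state variable x, one control parameter a.
% Equilibria satisfy V'(x) = x^2 + a = 0: two equilibria for a < 0, none for
% a > 0; the critical (fold) point is where V'(x) = V''(x) = 0, i.e. a = 0.
V_{\text{fold}}(x) = \tfrac{1}{3}x^{3} + a\,x

% Canonical cusp catastrophe: one state variable x, two control parameters a, b.
% Equilibria satisfy x^3 + a x + b = 0; the bifurcation set is
% 4a^3 + 27b^2 = 0, inside which the system is bistable.
V_{\text{cusp}}(x) = \tfrac{1}{4}x^{4} + \tfrac{1}{2}a\,x^{2} + b\,x
```

The abstract's closing remark follows from these forms: a potential that is a higher-order function of the state variable can only be approximated locally by the cubic fold form, whereas the quartic cusp form captures more of the global behaviour.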



Geosciences ◽  
2021 ◽  
Vol 11 (3) ◽  
pp. 143
Author(s):  
Michel Jaboyedoff ◽  
Tiggi Choanji ◽  
Marc-Henri Derron ◽  
Li Fei ◽  
Amalia Gutierrez ◽  
...  

Based on a previous risk calculation study conducted along a road corridor, the risk is recalculated using a stochastic simulation that introduces variability into most of the parameters of the risk equation. This yields an exceedance curve comparable to those produced by catastrophe models. The approach introduces uncertainty into the risk calculation in a simple way and can be used for poorly documented cases to compensate for a lack of data. It tends to yield lower risk estimates, or at least to call deterministic risk calculations into question.
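The paper's code is not reproduced here, but the procedure it describes (sampling the parameters of a standard risk equation and deriving a loss exceedance curve) can be sketched as follows. The risk equation, the distributions, and all parameter values below are illustrative assumptions for one rockfall section, not the authors' calibration:

```python
import numpy as np

rng = np.random.default_rng(42)
n_sim = 100_000

# Illustrative risk equation for one rockfall section of a road corridor:
# annual risk = P(event) * P(spatio-temporal impact) * vulnerability * value.
# Every distribution below is an assumption made for this sketch.
p_event = rng.lognormal(mean=np.log(0.05), sigma=0.5, size=n_sim)   # events/yr
p_impact = rng.uniform(0.01, 0.05, size=n_sim)             # element at risk is hit
vulnerability = rng.triangular(0.1, 0.5, 1.0, size=n_sim)  # fraction of value lost
asset_value = rng.normal(1e6, 2e5, size=n_sim)             # monetary value at risk

annual_loss = p_event * p_impact * vulnerability * asset_value

# Empirical exceedance curve -- the same construct catastrophe models
# report as an EP curve: P(annual loss > x) versus x.
losses_desc = np.sort(annual_loss)[::-1]
for p in (0.1, 0.01, 0.001):
    print(f"exceedance prob {p:>6}: loss ~ {losses_desc[int(p * n_sim)]:,.0f}")
```

A single-valued (deterministic) calculation corresponds to picking one point from each distribution; the simulation makes visible how much of the risk estimate depends on that choice.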



2021 ◽  
Author(s):  
Jose Salinas

This presentation addresses some of the main commonalities between hydrological research and hydrological practice, from the perspective of a Natural Catastrophe (Nat Cat) model developer. Hydrological research, on the one hand, has a strong focus on advancing the understanding of hydrological processes. The hazard component of Nat Cat flood models, on the other hand, tends to focus more on model suitability, accuracy, and precision. It nevertheless relies heavily on a thorough understanding of the main hydro-meteorological drivers of catchment processes across the relevant spatial and temporal scales, which are incorporated to achieve model realism and robustness, in particular when extrapolating outside the range of observed regimes. The latter is important when modelling extremes, which are by definition scarce.

The presentation also goes into detail on the feedbacks between hydrological research and hydrological practice: for example, how the latest generation of Natural Catastrophe models benefits from advances in hydrological research, such as research on large-scale hydroclimatic patterns like ENSO, or climate change research. Incorporating the latest hydrological hazard research into catastrophe models ultimately improves the risk assessment for a set of assets. Conversely, large-scale flood risk models using coupled model chains, which are relatively new in the hydrological research literature, have been part of the standard methodology of Nat Cat models for a couple of decades, and might be seen as an indicator of the societal demand for novel research in these fields.
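One concrete point of contact between the research and practice described here is extreme value statistics: because extremes are scarce, flood hazard modules typically fit a distribution to observed maxima and extrapolate to return periods far beyond the record. A minimal sketch, assuming synthetic annual-maximum discharge data and a GEV fit; this is a standard illustration, not the presenter's actual methodology:

```python
import numpy as np
from scipy.stats import genextreme

# Assumed annual-maximum discharges (m^3/s) for a single gauge; a real
# hazard module would pool many stations and covariates across scales.
rng = np.random.default_rng(0)
annual_maxima = genextreme.rvs(c=-0.1, loc=300, scale=80, size=40,
                               random_state=rng)

# Fit a GEV to the ~40 years of data the abstract says insurers rely on.
shape, loc, scale = genextreme.fit(annual_maxima)

# Extrapolate to return periods well beyond the observed record --
# exactly where model realism and robustness matter most.
for T in (10, 100, 1000):
    q = genextreme.ppf(1 - 1 / T, shape, loc=loc, scale=scale)
    print(f"{T:>5}-year flood: {q:,.0f} m^3/s")
```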



2021 ◽  
Author(s):  
Svetlana Stripajova ◽  
Jan Vodicka ◽  
Peter Pazak ◽  
Goran Trendafiloski

Fire following earthquake (FFE) can pose a considerable threat in densely populated urban areas with significant earthquake hazard and a prevalence of non-fire-resistant building typologies. Severe building damage and consequently broken pipelines can release flammable gases and liquids, which increases the possibility of fire when they come into contact with ignition sources such as short circuits or open flames. Numerous simultaneous ignitions followed by uncontrolled fire spread to adjacent buildings can lead to major fires and conflagrations, whose damage can substantially exceed the earthquake shaking damage. A well-known example of such high financial losses due to FFE is the Mw 7.9 San Francisco earthquake of 1906, where losses from the Great Fire were 10 times higher than those from the earthquake shaking itself. The quantification of FFE losses therefore plays a particularly important role in current underwriting products, and the industry requires its further detailed consideration in catastrophe models and pricing approaches. Impact Forecasting, Aon's catastrophe model development centre of excellence, has been committed to helping (re)insurers on that matter.

This paper quantifies the FFE contribution to mean losses in a case study of the Vancouver region, Canada, for a specific scenario: an Mw 7.5 Strait of Georgia crustal earthquake. The FFE methodology encompasses three phases: ignition, fire spread and suppression, and loss estimation. The number of ignitions (fires that require a fire department response) and their locations were calculated using the HAZUS empirical equation, with earthquake shaking intensity and estimated total building floor area as input variables. Urban fire spread is a complicated phenomenon that involves numerous uncertainties; an advanced cellular automata (CA) engine based on Zhao (2011) is used to simulate fire spread and suppression. The CA engine represents a collection of grid-arranged cells, where each cell changes state as a function of time according to a defined set of rules that includes the states of adjacent cells. The CA simulations involve only matrix operations, which makes it possible to account for building construction types and their damage due to earthquake shaking, meteorological and environmental data, and fire suppression modifiers. Unlike older empirical approaches, the CA engine can simultaneously consider fire spread from the initially ignited building, fire development within a single building, building-to-building fire spread, and fire extinguishing work. The output of the CA engine is a set of building fire-state grades, from which damage functions are created with PGA as the input parameter at the level of 3-digit postal codes. For the chosen scenario, the potential contribution of FFE to mean loss could be up to 75%, depending on the typical building setting within each 3-digit postal code.
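The cellular automata engine described above reduces fire spread to grid-cell state updates driven by neighbouring cells. The toy sketch below illustrates only the general CA mechanics (discrete states, a neighbour-dependent transition rule, pure matrix operations); the actual Impact Forecasting engine, its states, and its spread probabilities are not public, so every value here is an assumption:

```python
import numpy as np

# Cell states: 0 = unburnt, 1 = burning, 2 = burnt out.
# Grid and parameters are illustrative, not the Zhao (2011)-based engine.
rng = np.random.default_rng(1)
grid = np.zeros((50, 50), dtype=int)
grid[25, 25] = 1   # a single post-earthquake ignition

p_spread = 0.35    # per-step chance a burning neighbour ignites a cell
n_steps = 30       # after which suppression is assumed effective

for _ in range(n_steps):
    burning = grid == 1
    # Count burning neighbours via shifted copies (matrix operations only,
    # as in the CA engine described in the abstract; toroidal edges for brevity).
    neighbours = sum(
        np.roll(np.roll(burning, dx, axis=0), dy, axis=1)
        for dx in (-1, 0, 1) for dy in (-1, 0, 1) if (dx, dy) != (0, 0)
    )
    # Independent ignition attempts from each burning neighbour.
    ignite = (grid == 0) & (rng.random(grid.shape) < 1 - (1 - p_spread) ** neighbours)
    grid[burning] = 2   # burning cells burn out after one step
    grid[ignite] = 1    # newly ignited cells start burning

print(f"Cells affected by fire: {(grid > 0).sum()} of {grid.size}")
```

A production engine would condition p_spread on construction type, shaking damage, weather, and suppression, and would map final cell states to the fire-state grades the abstract mentions.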



2020 ◽  
Vol 11 (6) ◽  
pp. 790-806
Author(s):  
Jean-Paul Pinelli ◽  
Josemar Da Cruz ◽  
Kurtis Gurley ◽  
Andres Santiago Paleo-Torres ◽  
Mohammad Baradaranshoraka ◽  
...  

Catastrophe models estimate risk at the intersection of hazard, exposure, and vulnerability. Each of these areas requires diverse sources of data, which are very often incomplete, inconsistent, or missing altogether. The poor quality of the data is a source of epistemic uncertainty, which affects the vulnerability models as well as the output of the catastrophe models. This article identifies the different sources of epistemic uncertainty in the data and elaborates on strategies to reduce this uncertainty, in particular through identification, augmentation, and integration of the different types of data. The challenges are illustrated through the Florida Public Hurricane Loss Model (FPHLM), which estimates insured losses on residential buildings caused by hurricane events in Florida. To define the input exposure, and for model development, calibration, and validation purposes, the FPHLM teams accessed three main sources of data: county tax appraiser databases, National Flood Insurance Program (NFIP) portfolios, and wind insurance portfolios. The data from these different sources were reformatted and processed, and the insurance databases were separately cross-referenced at the county level with the tax appraiser databases. The FPHLM hazard teams assigned estimates of natural hazard intensity measures to each insurance claim. These efforts produced an integrated and more complete set of building descriptors for each policy in the NFIP and wind portfolios. The article describes the impact of these uncertainty reductions on the development and validation of the vulnerability models, and suggests avenues for data improvement. The lessons learned should be of interest to professionals involved in disaster risk assessment and management.
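The cross-referencing step described above, matching insurance policies against county tax appraiser records to fill in missing building descriptors, is in essence a record-linkage and merge problem. A minimal sketch with pandas, using hypothetical column names rather than the FPHLM's actual schema:

```python
import pandas as pd

# Hypothetical extracts; the real FPHLM inputs are far richer and messier.
policies = pd.DataFrame({
    "parcel_id": ["P001", "P002", "P003"],
    "county": ["Miami-Dade", "Broward", "Broward"],
    "roof_type": [None, "gable", None],    # missing building descriptors
})
tax_appraiser = pd.DataFrame({
    "parcel_id": ["P001", "P003"],
    "county": ["Miami-Dade", "Broward"],
    "roof_type": ["hip", "hip"],
    "year_built": [1987, 2004],
})

# Cross-reference at the county/parcel level and backfill missing
# descriptors from the tax appraiser database, as the abstract describes.
merged = policies.merge(
    tax_appraiser, on=["county", "parcel_id"], how="left", suffixes=("", "_tax")
)
merged["roof_type"] = merged["roof_type"].fillna(merged["roof_type_tax"])
print(merged[["parcel_id", "county", "roof_type", "year_built"]])
```

In practice the hard part is upstream of the merge: reconciling identifiers, formats, and vocabularies across sources, which is exactly the epistemic-uncertainty reduction the article addresses.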



2020 ◽  
Author(s):  
Svetlana Stripajova ◽  
Peter Pazak ◽  
Jan Vodicka ◽  
Goran Trendafiloski

The presence of thick, soft, alluvial sediment-filled basins, as in river deltas, can significantly amplify and prolong earthquake ground motion. Moreover, the high water saturation of such soft sediments under cyclic earthquake loading can lead to liquefaction. Basin and liquefaction effects can substantially modify the seismic motion and increase the potential losses at a particular location. Well-known examples of high financial losses during earthquakes are the Mw 8.1 Mexico City earthquake of 1985 for the basin effect, and the Darfield and Christchurch earthquake series of 2010-2011 for liquefaction. The quantification of these effects is therefore particularly important for current underwriting products, and the industry requires their further detailed consideration in catastrophe models and pricing approaches. Impact Forecasting, Aon's catastrophe model development center of excellence, has been committed to helping (re)insurers on that matter.

This paper presents a case study quantifying the basin effect and liquefaction for the Vancouver region, Canada, for a specific scenario: an Mw 7.5 Strait of Georgia crustal earthquake. The southern part of the Vancouver region is located on a deep sedimentary basin created in the Fraser River delta. For such a deep basin, a Vs30-dependent site term that considers amplification only from the shallow site response is not sufficient. We therefore derived (de)amplification functions for different periods to quantify the basin effect, using the NGA-West2 ground motion prediction equations (GMPEs) for crustal events, which include a basin depth term. The amplification function was derived with respect to standard GMPEs for crustal events in western Canada. The amplification from the site response, including the Vs30 and basin depth terms, can reach values as high as 3 at a period of 0.5 s over the softest and deepest sediments. The liquefaction potential was assessed using the HAZUS and Zhu et al. (2017) methodologies, calibrated to better reflect local geological conditions and liquefaction observations (Monahan et al. 2010; Clague 2002). We used USGS Vs30 data, enhanced by local seismic and geological measurements, to characterize soil conditions, and topographic data together with Impact Forecasting's proprietary flow accumulation data to characterize water saturation. The liquefaction hazard is calculated in terms of the probability of liquefaction occurrence and the permanent ground deformation. For the chosen scenario, the potential contribution to mean loss could be in the range of 15-30% due to the basin effect and 35-75% due to liquefaction, depending on the structural types of the buildings.
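To make the liquefaction step concrete: geospatial models in the style of Zhu et al. (2017) express the probability of liquefaction as a logistic function of a ground motion measure and proxy variables such as Vs30 and water saturation. The sketch below shows that functional form only; the coefficients are placeholders, not the published values and not Impact Forecasting's calibration:

```python
import numpy as np

def liquefaction_probability(pgv, vs30, water_table_depth,
                             coeffs=(4.0, 0.5, -1.0, -0.05)):
    """Logistic (Zhu et al. 2017-style) liquefaction probability.

    pgv: peak ground velocity (cm/s); vs30: shear-wave velocity (m/s);
    water_table_depth: metres. Coefficients are illustrative placeholders.
    """
    b0, b_pgv, b_vs30, b_wt = coeffs
    x = (b0 + b_pgv * np.log(pgv) + b_vs30 * np.log(vs30)
         + b_wt * water_table_depth)
    return 1.0 / (1.0 + np.exp(-x))

# Softer, wetter sites come out more liquefiable, as in the Fraser delta case.
print(liquefaction_probability(pgv=30.0, vs30=180.0, water_table_depth=1.0))
print(liquefaction_probability(pgv=30.0, vs30=500.0, water_table_depth=10.0))
```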



2020 ◽  
Author(s):  
Paul Dunning ◽  
Kirsty Styles ◽  
Daniel Evans ◽  
Stephen Hutchings

Catastrophe models are well-established tools, traditionally used by the re/insurance industry to assess the financial risk to insured property ("exposure") associated with natural perils. Catastrophe modelling is challenging, particularly for flood perils over large geographical scales, for a number of reasons. To adequately capture the fine spatial variability of flood depth, a flood catastrophe model must have high spatial resolution. To validly compare estimates of risk obtained from catastrophe models for different geographical regions, those models must be built from geographically consistent data. And to compare estimates of risk between any given collection of geographical regions globally, global coverage is required.

Traditional catastrophe models struggle to meet these requirements; compromises are made, often for performance reasons. In addition, traditional models are typically static datasets, built in a discrete process prior to their use in exposure risk assessment. Scientific assumptions are therefore deeply embedded, and there is little scope for the end user to adjust the model based on their own scientific knowledge.

This research presents a new approach to catastrophe modelling that addresses these challenges and, in doing so, has allowed the creation of the world's first global flood catastrophe model.

JBA's Global Flood Model is facilitated by a technological breakthrough in the form of JBA's FLY Technology. The innovations encoded in FLY have enabled JBA to create a model capable of consistent global probabilistic flood risk assessment, operating at 30 m resolution and supported by a catalogue of 15 million distinct flood events (both river and surface water). FLY brings a model to life dynamically, from raw flood hazard data, simultaneously addressing the user-configurability and performance challenges.

The Global Flood Model and FLY Technology will be of interest to those involved in financial, economic, or humanitarian risk assessment, particularly in and between countries and regions not covered by flood catastrophe models to date. The detail of how they work is covered here, and their power in facilitating consistent global flood risk assessment is demonstrated.
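FLY's internals are proprietary, but the core computation any probabilistic flood model of this kind performs is turning a simulated event catalogue into loss exceedance metrics for a portfolio. A schematic sketch, where the catalogue is random toy data rather than JBA's 15 million events:

```python
import numpy as np

rng = np.random.default_rng(7)
n_years, events_per_year = 10_000, 3   # toy catalogue, not JBA's

# Simulated event losses grouped by catalogue year (heavy-tailed toy losses).
year_ids = np.repeat(np.arange(n_years), events_per_year)
event_losses = rng.pareto(2.0, size=n_years * events_per_year) * 1e5

# Aggregate exceedance probability (AEP): distribution of total annual loss.
annual_loss = np.bincount(year_ids, weights=event_losses, minlength=n_years)
aep_sorted = np.sort(annual_loss)[::-1]
print("100-year aggregate loss  ~", f"{aep_sorted[n_years // 100]:,.0f}")

# Occurrence exceedance probability (OEP): largest single event per year.
max_event = np.zeros(n_years)
np.maximum.at(max_event, year_ids, event_losses)
oep_sorted = np.sort(max_event)[::-1]
print("100-year occurrence loss ~", f"{oep_sorted[n_years // 100]:,.0f}")
```

The engineering challenge the abstract describes is doing this consistently at 30 m resolution and global scale, with the hazard-to-loss pipeline evaluated dynamically rather than from a precomputed static dataset.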



2020 ◽  
Author(s):  
Shane Latchman ◽  
Alastair Clarke ◽  
Boyd Zapatka ◽  
Peter Sousounis ◽  
Scott Stransky

In 2019, the Bank of England, through the Prudential Regulation Authority (PRA), became the first regulator to ask insurers how financial losses could change under prescribed climate scenarios. Insurers readily use catastrophe models to quantify the likelihood and severity of financial losses based on at least 40 years of past climate data. However, they cannot readily use these models to answer the climate scenario questions posed by the PRA.

We present four novel methods for using existing catastrophe models to answer what-if climate scenario questions. The methods make use of sampling algorithms, quantile mapping, and adjustments to model parameters to represent different climate scenarios.

Using AIR's Hurricane model for the United States (US), Inland Flood model for Great Britain, and Coastal Flood model for Great Britain, we quantify the sensitivity of the Average Annual Loss (AAL) and the 100-year exceedance probability aggregate loss (100-year loss) to four environmental variables under three climate scenarios. The environmental variables are (i) the frequency and (ii) the severity of major US landfalling hurricanes; (iii) the mean sea level along the coasts of the US and Great Britain; and (iv) the surface run-off from extreme precipitation events in Great Britain. Each of these variables is increased in turn by low, medium, and high amounts, as prescribed by the PRA.

We compare each variable and rank their influence on loss. We find that the AAL and the 100-year loss are more sensitive to changes in the severity of major US hurricanes than to changes in their frequency. We show whether sea level rise has a greater influence on coastal flooding losses in the US or in Great Britain, and how sensitive inland flooding losses are to surface run-off.

The methods yield approximate results but are quicker and easier to implement than running general circulation models. The methods and results will interest those in insurance, the public sector, and academia who are working to understand how society can best adapt to climate change.
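Of the three techniques the abstract names (sampling algorithms, quantile mapping, and parameter adjustments), quantile mapping is the most self-contained to illustrate: the empirical distribution of a hazard variable in the existing catalogue is mapped, rank by rank, onto a shifted target distribution representing a climate scenario. A hedged sketch with made-up values, not AIR's implementation:

```python
import numpy as np

rng = np.random.default_rng(3)

# Baseline catalogue: e.g. hurricane landfall wind speeds (toy values, m/s).
baseline = rng.normal(50, 8, size=5_000)

# Climate scenario target: a PRA-style "medium" severity uplift is
# represented here as an assumed shifted and widened distribution.
target = rng.normal(54, 9, size=5_000)

def quantile_map(x, source, target):
    """Map values x from the source distribution onto the target one
    by matching empirical quantiles (rank -> inverse CDF of target)."""
    ranks = np.searchsorted(np.sort(source), x) / len(source)
    ranks = np.clip(ranks, 0.0, 1.0 - 1e-9)
    return np.quantile(target, ranks)

adjusted = quantile_map(baseline, baseline, target)
print(f"mean wind: {baseline.mean():.1f} -> {adjusted.mean():.1f} m/s")
print(f"99th pct : {np.percentile(baseline, 99):.1f} -> "
      f"{np.percentile(adjusted, 99):.1f} m/s")
```

Rerunning the catastrophe model on the adjusted catalogue then gives the scenario AAL and 100-year loss without rebuilding the model or running a general circulation model.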



2019 ◽  
Vol 79 ◽  
pp. 152-168 ◽  
Author(s):  
Stephen J. Guastello ◽  
Anthony N. Correro ◽  
David E. Marra

