A probabilistic approach to estimating residential losses from different flood types

Author(s):  
Dominik Paprotny ◽  
Heidi Kreibich ◽  
Oswaldo Morales-Nápoles ◽  
Dennis Wagenaar ◽  
Attilio Castellarin ◽  
...  

Abstract. Residential assets, comprising buildings and household contents, are a major source of direct flood losses. Existing damage models are mostly deterministic and limited to particular countries or flood types. Here, we compile building-level losses from Germany, Italy and the Netherlands covering a wide range of fluvial and pluvial flood events. Utilizing a Bayesian network (BN) for continuous variables, we find that relative losses (i.e. loss relative to exposure) to the building structure and its contents can be estimated with five variables: water depth, flow velocity, event return period, building usable floor space area and regional disposable income per capita. The model’s ability to predict flood losses is validated for the 11 flood events contained in the sample. Predictions for the German and Italian fluvial floods were better than for pluvial floods or the 1993 Meuse river flood. Further, a case study of a 2010 coastal flood in France is used to test the BN model’s performance for a type of flood not included in the survey dataset. Overall, the BN model achieved better results than any of 10 alternative damage models at reproducing average losses for the 2010 flood. An additional case study of a 2013 fluvial flood also showed good performance of the model. The study shows that data from many flood events can be combined to derive the most important factors driving flood losses across regions and time, and that the resulting damage models can be applied in an open data framework.
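A continuous-variable Bayesian network of the kind described above can be approximated with a Gaussian copula: each margin is rank-transformed to normal scores, a correlation matrix is fitted, and the loss variable is predicted conditionally on the hazard variables. The sketch below is an illustration only, not the authors' model; the synthetic data, coefficients and the restriction to three of the five variables are assumptions.

```python
# Illustrative sketch (not the authors' model): a continuous Bayesian network
# approximated by a Gaussian copula over water depth, flow velocity and
# relative loss. All numbers below are synthetic, for illustration only.
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(0)
nd = NormalDist()

# Synthetic "survey" sample: water depth (m), flow velocity (m/s) and the
# resulting relative loss (share of exposure, clipped to [0, 1]).
n = 500
depth = rng.gamma(2.0, 0.6, n)
velocity = rng.gamma(1.5, 0.5, n)
rel_loss = np.clip(0.10 * depth + 0.05 * velocity + rng.normal(0, 0.05, n), 0, 1)
data = np.column_stack([depth, velocity, rel_loss])

# Rank-transform each margin to standard-normal scores (the copula space).
ranks = data.argsort(axis=0).argsort(axis=0) + 1
z = np.vectorize(nd.inv_cdf)(ranks / (n + 1))
R = np.corrcoef(z, rowvar=False)               # copula correlation matrix

def conditional_loss_quantile(u_depth, u_velocity):
    """Quantile of relative loss given hazard quantiles, via the copula."""
    z_obs = np.array([nd.inv_cdf(u_depth), nd.inv_cdf(u_velocity)])
    r12 = R[:2, 2]                              # hazard-loss correlations
    z_loss = r12 @ np.linalg.solve(R[:2, :2], z_obs)
    return nd.cdf(z_loss)

q = conditional_loss_quantile(0.8, 0.9)         # a fairly severe flood
print(round(q, 3))
```

Because the synthetic losses rise with depth and velocity, conditioning on high hazard quantiles pushes the predicted loss quantile well above the median.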

Author(s):  
Vanessa Simonite

In a module designed to develop skills in presenting and evaluating statistics, students of mathematics and statistics were given an assignment asking them to research and write a piece of data-driven journalism. Data-driven journalism is a new phenomenon that has expanded rapidly due to the growth in open data, new visualisation tools and online reporting in newspapers, periodicals and blogs. The assignment gave students a writing task that was individual, small-scale, research-based and embedded within their discipline. The students were asked to formulate a research question that could be investigated using survey data available from an electronic data archive. The result of the investigation was to be written up as a piece of data-driven journalism for online publication, including a data visualisation. In addition to using discipline-based skills and written communication, the assignment required students to use research skills and digital literacy. An assignment set in the context of writing for the public extends students’ writing experience beyond the domains of discipline-based professional reports and academic writing. Data-driven journalism provides opportunities to develop students’ writing alongside other skills for employment and can be used to design assessments for a wide range of disciplines.


2021 ◽  
Author(s):  
Axelle Doppagne ◽  
Pierre Archambeau ◽  
Jacques Teller ◽  
Anna Rita Scorzini ◽  
Daniela Molinari ◽  
...  

Flood damage modelling is a key component of flood risk modelling, assessment and management. Reliable empirical data on flood damage are essential to support the development and validation of flood damage models. However, such datasets remain scarce and incomplete, particularly those combining large spatial coverage (e.g., regional, national) over a long time period (e.g., several decades) with a detailed resolution (e.g., address-level data).

In this research, we analysed a database of 27,000 compensation claims submitted to a Belgian state agency (Disaster Fund). It covers 104 natural disasters of various types (incl. floods, storms, rockslides …) which occurred in the Walloon region of Belgium between 1993 and 2019. The region extends over parts of the Meuse and Scheldt river basins. The amounts of damage registered at the building level were estimated by state-designated experts and are classified in six categories. While roughly half of the registered disasters are pluvial flooding events, they account for less than a quarter of the total claimed damage. In contrast, riverine floods correspond to about one third of the registered events but lead to one half of the claimed damage.

A detailed analysis of the data was undertaken for a limited number of major riverine flood events (1993, 1995, 2002), which caused a very large portion of the total damage. By geo-referencing the postal address of each individual building, it was possible to assign each claim to a specific river reach. This enabled identifying the most flood-prone river stretches in an objective way. Then, using cadastral data, each type and amount of damage could be attributed to a specific building.

To explore the value of the database for elaborating and validating damage models, the claimed damage data at the building level were related to estimates of hydraulic variables for the corresponding flood events. To do so, we used an existing database of results of 2D hydrodynamic modelling, covering 1,200+ km of river reaches and providing raster files at spatial resolutions ranging from 2 m to 5 m for computed flow depth and velocity in the floodplains. The attribution of flow depth to individual buildings was performed either by averaging the computed flow depths around the building footprint or by taking the maximum value.

The correlation between claimed damage at the building level and attributed flow depth is relatively low, irrespective of the flow depth attribution method. This may result from the high uncertainty affecting each of these variables. It also hints at the need for multivariable damage models that account for a broader range of explanatory variables than flow depth alone (flow velocity, characteristics of building materials and equipment, building age, etc.). This will be discussed in the presentation and further explored in the next steps of this research.

Data for this analysis were provided by the Belgian regional agency SPW-IAS in July 2020. For privacy reasons, address-level data may not be disseminated in the scientific community, but results of data processing may be shared at an aggregated level.
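The flow-depth attribution step described above (mean vs. maximum of the raster cells under a building footprint) can be sketched in a few lines. The raster values and footprint mask below are toy assumptions, not the project's data.

```python
# Illustrative sketch (assumption, not the project's code): attribute a single
# flow depth to a building either by averaging the raster cells under its
# footprint or by taking their maximum.
import numpy as np

# Toy 2 m-resolution depth raster (values in metres).
depth_raster = np.array([
    [0.0, 0.2, 0.4, 0.1],
    [0.1, 0.8, 1.2, 0.3],
    [0.0, 0.6, 0.9, 0.2],
])

# Boolean footprint mask for one building, on the same grid as the raster.
footprint = np.zeros_like(depth_raster, dtype=bool)
footprint[1:3, 1:3] = True                 # cells covered by the building

def attribute_depth(raster, mask, method="mean"):
    """Assign one flow depth to a building from the cells it covers."""
    cells = raster[mask]
    return cells.mean() if method == "mean" else cells.max()

mean_depth = attribute_depth(depth_raster, footprint, "mean")
max_depth = attribute_depth(depth_raster, footprint, "max")
print(mean_depth, max_depth)
```

The two methods can differ noticeably for buildings at the edge of a floodplain, which is one reason the paper tests both.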


2020 ◽  
Vol 20 (11) ◽  
pp. 2997-3017 ◽  
Author(s):  
Daniela Molinari ◽  
Anna Rita Scorzini ◽  
Chiara Arrighi ◽  
Francesca Carisi ◽  
Fabio Castelli ◽  
...  

Abstract. Effective flood risk management requires a realistic estimation of flood losses. However, available flood damage estimates are still characterized by significant levels of uncertainty, questioning the capacity of flood damage models to depict real damages. With a joint effort of eight international research groups, the objective of this study was to compare, in a blind-validation test, the performances of different models for the assessment of the direct flood damage to the residential sector at the building level (i.e. microscale). The test consisted of a common flood case study characterized by high availability of hazard and building data but with undisclosed information on observed losses in the implementation stage of the models. The nine selected models were chosen in order to guarantee a good mastery of the models by the research teams, variety of the modelling approaches, and heterogeneity of the original calibration context in relation to both hazard and vulnerability features. By avoiding possible biases in model implementation, this blind comparison provided more objective insights on the transferability of the models and on the reliability of their estimations, especially regarding the potential of local and multivariable models. From another perspective, the exercise allowed us to increase awareness of the strengths and limits of flood damage modelling, which are summarized in the paper in the form of take-home messages from a modeller's perspective.


2016 ◽  
Vol 16 (1) ◽  
pp. 1-14 ◽  
Author(s):  
D. J. Wagenaar ◽  
K. M. de Bruijn ◽  
L. M. Bouwer ◽  
H. de Moel

Abstract. This paper addresses the large differences that are found between damage estimates of different flood damage models. It explains how implicit assumptions in flood damage functions and maximum damages can have large effects on flood damage estimates. This explanation is then used to quantify the uncertainty in the damage estimates with a Monte Carlo analysis. The Monte Carlo analysis uses a damage function library with 272 functions from seven different flood damage models. The paper shows that the resulting uncertainties in estimated damages are on the order of a factor of 2 to 5. The uncertainty is typically larger for flood events with small water depths and for smaller flood events. The implications of the uncertainty in damage estimates for flood risk management are illustrated by a case study in which the economically optimal investment strategy for a dike segment in the Netherlands is determined. The case study shows that the uncertainty in flood damage estimates can lead to significant over- or under-investments.
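The Monte Carlo approach described above can be sketched by repeatedly drawing a depth-damage function and a maximum damage value and propagating them to a loss estimate. The three toy functions and the damage range below are assumptions standing in for the paper's 272-function library.

```python
# Illustrative sketch (assumption, not the paper's code): Monte Carlo
# propagation of damage-model uncertainty. Each realisation draws one
# depth-damage function and one maximum damage value at random.
import random

# Each entry maps water depth (m) to a damage fraction in [0, 1].
damage_functions = [
    lambda d: min(1.0, 0.30 * d),
    lambda d: min(1.0, 0.50 * d),
    lambda d: min(1.0, d / (d + 1.0)),
]
max_damage_range = (100_000, 250_000)     # EUR per building, uniform draw

def mc_damage(depth, n=10_000, seed=42):
    rng = random.Random(seed)
    draws = []
    for _ in range(n):
        f = rng.choice(damage_functions)      # model/function uncertainty
        dmax = rng.uniform(*max_damage_range) # maximum-damage uncertainty
        draws.append(f(depth) * dmax)
    return draws

draws = sorted(mc_damage(depth=1.0))
p5, p95 = draws[len(draws) // 20], draws[-(len(draws) // 20)]
print(round(p95 / p5, 2))
```

Even this toy setup yields a 5th-to-95th percentile spread of roughly a factor of 2 to 5, consistent with the order of magnitude reported in the paper.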




2013 ◽  
Vol 16 (1) ◽  
pp. 59-67

The Soil Science Institute of Thessaloniki produces new digitized Soil Maps that provide a useful electronic database for the spatial representation of the soil variation within a region, based on in situ soil sampling, laboratory analyses, GIS techniques and plant nutrition mathematical models, coupled with the local land cadastre. The novelty of these studies is that local agronomists have immediate access to a wide range of soil information by clicking on a field parcel shown in this digital interface and, therefore, can suggest an appropriate treatment (e.g. liming, manure incorporation, desalination, application of proper type and quantity of fertilizer) depending on the field conditions and cultivated crops. A specific case study is presented in the current work with regards to the construction of the digitized Soil Map of the regional unit of Kastoria. The potential of this map can easily be realized by the fact that the mapping of the physicochemical properties of the soils in this region provided delineation zones for differential fertilization management. An experiment was also conducted using remote sensing techniques for the enhancement of the fertilization advisory software database, which is a component of the digitized map, and the optimization of nitrogen management in agricultural areas.


Author(s):  
Edgar Meij ◽  
Marc Bron ◽  
Laura Hollink ◽  
Bouke Huurnink ◽  
Maarten de Rijke

Oxford Studies in Ancient Philosophy provides, twice each year, a collection of the best current work in the field of ancient philosophy. Each volume features original essays that contribute to an understanding of a wide range of themes and problems in all periods of ancient Greek and Roman philosophy, from the beginnings to the threshold of the Middle Ages. From its first volume in 1983, OSAP has been a highly influential venue for work in the field, and has often featured essays of substantial length as well as critical essays on books of distinctive importance. Volume LV contains: a methodological examination of how the evidence for Presocratic thought is shaped through its reception by later thinkers, using discussions of a world soul as a case study; an article on Plato’s conception of flux and the way in which sensible particulars maintain a kind of continuity while undergoing constant change; a discussion of J. L. Austin’s unpublished lecture notes on Aristotle’s Nicomachean Ethics and his treatment of loss of control (akrasia); an article on the Stoics’ theory of time and in particular Chrysippus’ conception of the present and of events; and two pieces on Plotinus: one that identifies a distinct argument to show that there is a single, ultimate metaphysical principle, and a review essay discussing E. K. Emilsson’s recent book, Plotinus.


Computers ◽  
2021 ◽  
Vol 10 (6) ◽  
pp. 82
Author(s):  
Ahmad O. Aseeri

Deep learning-based methods have emerged as one of the most effective and practical solutions to a wide range of medical problems, including the diagnosis of cardiac arrhythmias. A critical step towards early diagnosis of many heart dysfunction diseases is the accurate detection and classification of cardiac arrhythmias, which can be achieved via electrocardiograms (ECGs). Motivated by the desire to enhance conventional clinical methods for diagnosing cardiac arrhythmias, we introduce an uncertainty-aware deep learning-based predictive model for accurate large-scale classification of cardiac arrhythmias, successfully trained and evaluated using three benchmark medical datasets. In addition, considering that the quantification of uncertainty estimates is vital for clinical decision-making, our method incorporates a probabilistic approach to capture the model’s uncertainty using a Bayesian-based approximation method without introducing additional parameters or significant changes to the network’s architecture. Although many arrhythmia classification solutions with various ECG feature engineering techniques have been reported in the literature, the AI-based probabilistic method introduced in this paper outperforms existing methods, achieving multiclass F1 scores of 98.62% and 96.73% on the MIT-BIH dataset (20 annotations), 99.23% and 96.94% on the INCART dataset (eight annotations), and 97.25% and 96.73% on the BIDMC dataset (six annotations), for the deep ensemble and probabilistic modes, respectively. We also demonstrate the method’s high performance and statistical reliability in numerical experiments on language modelling using the gating mechanism of recurrent neural networks.
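One common Bayesian-based approximation that adds no parameters is Monte Carlo dropout: dropout is kept active at prediction time and many stochastic forward passes are averaged, with their spread serving as an uncertainty estimate. The sketch below is an assumption for illustration (the paper does not disclose its exact mechanism here); the tiny network, random weights and feature vector are all stand-ins.

```python
# Illustrative sketch (assumption, not the paper's model): Monte Carlo
# dropout as a Bayesian approximation. Dropout stays ON at test time and
# the spread across stochastic passes quantifies predictive uncertainty.
import numpy as np

rng = np.random.default_rng(0)

# Toy one-hidden-layer classifier; weights are random stand-ins for a
# trained network (16 input features -> 8 hidden units -> 3 classes).
W1 = rng.normal(0, 1, (16, 8))
W2 = rng.normal(0, 1, (8, 3))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def forward(x, p_drop=0.5):
    h = np.maximum(0, x @ W1)                # ReLU hidden layer
    mask = rng.random(h.shape) > p_drop      # dropout active at inference
    h = h * mask / (1 - p_drop)              # inverted-dropout scaling
    return softmax(h @ W2)

x = rng.normal(0, 1, 16)                     # one "ECG" feature vector
passes = np.stack([forward(x) for _ in range(100)])
mean_prob = passes.mean(axis=0)              # predictive distribution
uncertainty = passes.std(axis=0)             # per-class spread
print(mean_prob.argmax(), uncertainty)
```

Averaging the passes gives the prediction, while a large per-class standard deviation flags inputs on which the model should not be trusted, which is the clinically relevant signal.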

