Partitioning the uncertainty contributions of dependent offshore forcing conditions in the probabilistic assessment of future coastal flooding at a macrotidal site

2021 ◽  
Author(s):  
Jeremy Rohmer ◽  
Deborah Idier ◽  
Remi Thieblemont ◽  
Goneri Le Cozannet ◽  
François Bachoc

Abstract. Understanding the role of coastal flooding drivers is of high interest for planning adaptation strategies under future climate conditions. Using global sensitivity analysis, we aim to measure the contributions of the offshore forcing conditions (wave/wind characteristics, still water level and sea level rise (SLR) projected up to 2200) to the occurrence of a flooding event (defined as the inland water volume exceeding a given threshold YC) at the town of Gâvres, a macrotidal environment on the French Atlantic coast. This procedure faces, however, two major difficulties, namely (1) the high computational cost of the hydrodynamic numerical simulations and (2) the statistical dependence between the forcing conditions. By applying a Monte Carlo approach combined with multivariate extreme value analysis, our study proposes a procedure that overcomes both difficulties through the computation of sensitivity measures dedicated to dependent input variables (named Shapley effects), with the help of Gaussian process (GP) metamodels. On this basis, our results outline the key influence of SLR over time. Its contribution increases rapidly until 2100, by which time it almost exceeds the contributions of all other uncertainties (Shapley effect > 40 % under the representative concentration pathway RCP4.5 scenario). After 2100, it continues to increase linearly to > 50 %. The SLR influence depends, however, on our modelling assumptions. Before 2100, it is strongly influenced by the digital elevation model (DEM): with a DEM of lower topographic elevation (before the raising of dykes in some sectors), the SLR effect is smaller by ~40 %. This reduction is paralleled by an increase in the importance of wave/wind characteristics, indicating how strongly the relative effects of the flooding drivers change when protective measures are adopted. By 2100, the joint role of the RCP and of YC affects the SLR influence, which is reduced by 20–30 % when the mode of the SLR probability distribution is high (for RCP8.5 in particular) and when YC is low (50 m³). Finally, by showing that these results are robust to the main uncertainties in the estimation procedure (Monte Carlo sampling and GP error), the combined GP-Shapley effect approach proves to be a valuable tool for exploring and characterizing uncertainties related to compound coastal flooding under SLR.
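As a minimal illustration of the Shapley effects used in this study (not the authors' GP-metamodel workflow), the sketch below computes exact Shapley effects for a hypothetical two-input linear model with correlated Gaussian inputs, a case where the conditional variances have closed forms; the coefficients and correlation are invented:

```python
from itertools import permutations

# Toy model Y = a1*X1 + a2*X2 with (X1, X2) standard bivariate normal with
# correlation rho. For this linear-Gaussian case the "value" of a subset u,
# val(u) = Var(E[Y | X_u]), has a closed form, so Shapley effects are exact.
a1, a2, rho = 1.0, 2.0, 0.5  # hypothetical values, for illustration only

def val(u):
    """Explained variance Var(E[Y | X_u]) for a subset u of {1, 2}."""
    if u == set():
        return 0.0
    if u == {1}:
        return (a1 + a2 * rho) ** 2   # since E[Y | X1] = (a1 + a2*rho) * X1
    if u == {2}:
        return (a2 + a1 * rho) ** 2
    return a1**2 + a2**2 + 2 * a1 * a2 * rho   # full variance Var(Y)

def shapley(i):
    """Average marginal variance contribution of X_i over all input orderings."""
    total = 0.0
    for order in permutations([1, 2]):
        before = set()
        for j in order:
            if j == i:
                total += val(before | {i}) - val(before)
                break
            before.add(j)
    return total / 2  # two orderings for two inputs

sh1, sh2 = shapley(1), shapley(2)
var_y = val({1, 2})
print(sh1 / var_y, sh2 / var_y)  # normalized Shapley effects
```

With dependent inputs, the classical Sobol' indices no longer sum to the total variance, whereas the Shapley effects do by construction; this is what makes them suitable for statistically dependent forcing conditions.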

2020 ◽  
Vol 16 (7) ◽  
pp. 20200096
Author(s):  
James M. Smoliga

Gut capacity and plasticity have been examined across multiple species but are not typically explored in the context of extreme human performance. Here, I estimate the theoretical maximal active consumption rate (ACR) in humans, using 39 years of historical data from the annual Nathan's Famous Hot Dog Eating Contest. Through nonlinear modelling and generalized extreme value analysis, I show that humans are theoretically capable of achieving an ACR of approximately 832 g min⁻¹ fresh matter over a 10 min duration. Modelling individual performances across 5 years reveals that maximal ACR significantly increases over time in ‘elite’ competitive eaters, likely owing to training effects. Extreme digestive plasticity suggests that eating competition records are quite biologically impressive, especially in the context of carnivorous species and other human athletic competitions.


2008 ◽  
Vol 4 (4) ◽  
pp. 96-103
Author(s):  
S. Chandrasekhar

Motor vehicle insurance claims form a substantial component of non-life insurance claims, and their volume is growing with the increasing number of vehicles on roads. It is also desirable to have an idea of the likely claim amount for the coming period (monthly, quarterly or yearly) based on past claim data. Looking at the claim amounts, one can see that there will be a few large claims alongside a large number of average and below-average claims. The distribution of claims therefore does not follow a symmetric pattern, which makes standard statistical analysis difficult. The methodology used to analyse such data is known as extreme value analysis, a general name covering (i) the generalized extreme value (GEV) distribution and (ii) the generalized Pareto distribution (GPD). These techniques can deal with the non-symmetric shape of the distribution, which is close to reality. Normally, one fits a GEV or GPD and, using the parameters of the fitted distribution, forecasts of likely future losses can be made. A second method of analysing such data is simulation: a Poisson distribution is fitted for the arrival of claims and a Weibull, Pareto or lognormal distribution for the claim amounts, and Monte Carlo simulation combines both distributions to predict future claim totals. This paper compares the above techniques on motor vehicle claims data.
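The frequency-severity simulation described above can be sketched as follows; the Poisson rate and lognormal parameters are invented for illustration, not fitted to any real claims data:

```python
import math
import random
import statistics

random.seed(42)
# Hypothetical parameters: monthly claim counts are Poisson(lam); individual
# claim amounts are lognormal with log-mean mu and log-sd sigma (a heavy
# right tail, as motor claim data typically show).
lam, mu, sigma = 20.0, 8.0, 1.2

def poisson(lam):
    """Knuth's algorithm: multiply uniforms until the product drops below exp(-lam)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def simulate_month():
    """Aggregate claim amount for one month: random count x random severities."""
    n = poisson(lam)
    return sum(random.lognormvariate(mu, sigma) for _ in range(n))

totals = [simulate_month() for _ in range(10_000)]
print(f"mean monthly claim total: {statistics.mean(totals):,.0f}")
print(f"95th percentile:          {sorted(totals)[9499]:,.0f}")
```

The empirical distribution of `totals` can then be used directly for reserving-style questions such as high quantiles of the aggregate loss.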


2011 ◽  
Vol 50 (1) ◽  
pp. 255-266 ◽  
Author(s):  
Nicholas Cook

Abstract This comment addresses the role of sampling error in extreme value analysis. A note published in this journal claimed that Weibull's 1939 estimator for sample probability has a unique status that invalidates all other estimators and renders invalid all developments of unbiased distribution-dependent estimators made since 1939. The note concluded that the use of distribution-dependent estimators should be abandoned, that many estimates of weather-related risks should be reevaluated, and that the related building codes and other regulations should be updated. This comment uses rigorous statistical proofs to make the diametrically opposite case: namely, that the development of distribution-dependent estimators has resulted in an improvement in accuracy over the past half century and that no changes are required to the basis of weather-related building codes and regulations. These proofs are supplemented by sampling experiments that demonstrate their validity. The comment also provides an introduction to the basic statistical concepts of modelling extremes, including unbiased estimators for the model parameters.


2019 ◽  
Vol 276 ◽  
pp. 04006
Author(s):  
Md Ashraful Alam ◽  
Craig Farnham ◽  
Kazuo Emura

In Bangladesh, major floods are frequent due to its unique geographic location. About one-fourth to one-third of the country is inundated by overflowing rivers during the monsoon season almost every year. Calculating the risk level of river discharge is important for making plans to protect the ecosystem and to increase crop and fish production. In recent years, several Bayesian Markov chain Monte Carlo (MCMC) methods have been proposed in extreme value analysis (EVA) for assessing the flood risk at a given location. Here, the Hamiltonian Monte Carlo (HMC) method was employed to approximate the posterior marginal distributions of a generalized extreme value (GEV) model fitted to annual maximum discharges in two major river basins in Bangladesh. The discharge records of the two largest branches of the Ganges-Brahmaputra-Meghna river system in Bangladesh over the past 42 years were analysed. To estimate flood risk, return levels with 95% confidence intervals (CI) were also calculated. Results show that the shape parameter at each station was greater than zero, indicating heavy-tailed Fréchet cases. At Bahadurabad station, in the Brahmaputra river basin, the estimated 100-year return level was 141,387 m³ s⁻¹ with a 95% CI of [112,636, 170,138], and the 1000-year return level was 195,018 m³ s⁻¹ with a 95% CI of [122,493, 267,544]. At Hardinge Bridge station, in the Ganges basin, the estimated 100-year return level was 124,134 m³ s⁻¹ with a 95% CI of [108,726, 139,543], and the 1000-year return level was 170,537 m³ s⁻¹ with a 95% CI of [133,784, 207,289]. As Bangladesh is a flood-prone country, the Bayesian approach with HMC in EVA can help policy-makers plan initiatives that could prevent damage to both lives and assets.
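Once GEV parameters have been estimated (in the paper, via HMC), the T-year return level follows from the GEV quantile function. The sketch below uses made-up parameter values, not the posterior estimates reported above:

```python
import math

# Illustrative GEV return-level calculation (parameter values are invented,
# not the fitted values for Bahadurabad or Hardinge Bridge).
mu, sigma, xi = 65_000.0, 12_000.0, 0.15   # location, scale, shape (xi > 0: Frechet)

def return_level(T, mu, sigma, xi):
    """Discharge exceeded on average once every T years under a GEV fit:
    z_T = mu + (sigma/xi) * ((-ln(1 - 1/T))**(-xi) - 1)."""
    y = -math.log(1.0 - 1.0 / T)           # Gumbel reduced variate term
    return mu + (sigma / xi) * (y ** (-xi) - 1.0)

for T in (100, 1000):
    print(f"{T:>5}-year return level: {return_level(T, mu, sigma, xi):,.0f} m3/s")
```

A Bayesian workflow would evaluate this quantile for each posterior draw of (mu, sigma, xi), giving the credible intervals quoted in the abstract.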


2020 ◽  
Author(s):  
Torben Schmith ◽  
Peter Thejll ◽  
Fredrik Boberg ◽  
Peter Berg ◽  
Ole Bøssing Christensen ◽  
...  

<p>Severe precipitation events occur rarely and are often localized in space and of short duration, but they are important for the societal management of infrastructure such as sewage systems, metros, etc. There is therefore a demand for estimates of expected future changes in the statistics of these rare events. These are usually projected using RCM scenario runs combined with extreme value analysis to obtain selected return levels of precipitation intensity. However, due to RCM imperfections, the modelled present-day climate usually has errors relative to observations. The RCM results are therefore ‘error corrected’ to match observations more closely, in order to increase the reliability of the results.</p><p>In the present work we evaluate different error correction techniques and compare them with non-corrected projections. This is done in an inter-model cross-validation setup, in which each model in turn plays the role of observations, against which the remaining error-corrected models are validated. The study uses hourly data (historical & RCP8.5, late 21<sup>st</sup> century) from 13 models covering the EURO-CORDEX ensemble at 0.11 degree resolution (about 12.5 km), from which fields of selected return levels are extracted for 1 h and 24 h durations. The error correction techniques applied to the return levels are based on extreme value analysis and include analytical quantile-quantile matching together with a simpler climate factor approach.</p><p>The study identifies regions where the error correction techniques perform differently, and thereby contributes to guidelines on how and where to apply calibration techniques when projecting extreme return levels.</p>
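The climate factor approach mentioned above can be sketched in a few lines: the observed present-day return level is scaled by the model's own future-to-present ratio, so the model's absolute bias cancels. All numbers here are hypothetical:

```python
# Climate-factor correction of a return level (illustrative numbers only).
# The RCM's relative change between periods is assumed transferable to the
# observed climate, bypassing the model's absolute bias.
obs_present   = 22.0   # observed 1 h, 10-year return level, mm/h (hypothetical)
model_present = 18.5   # same return level in the RCM's historical run
model_future  = 24.1   # same return level in the RCM's RCP8.5 run

climate_factor = model_future / model_present
corrected_future = obs_present * climate_factor
print(f"climate factor: {climate_factor:.3f}")
print(f"corrected future return level: {corrected_future:.1f} mm/h")
```

Quantile-quantile matching generalizes this idea by correcting the full fitted extreme value distribution rather than a single return level.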


2011 ◽  
Vol 63 (5) ◽  
pp. 948-955 ◽  
Author(s):  
A. Jiménez ◽  
A. Pérez-Foguet

This paper presents an analysis of the relationships between water point technology, management-related practices and functionality over time, based on an extensive water point mapping study of 15 rural districts of Tanzania covering 15% of the country's total rural population. Results show irregular functionality rates by technology at the district level, but reveal statistical dependence between functionality and technology at the regional level. Among management-related questions, reported expenditure is the indicator most related to functionality. All categories of water points show very low performance over time: in the first five years of operation, about 30% of water points become non-functional, and only between 35% and 47% are still working 15 years after installation, depending on the technology. By category, hand pumps are the least durable of the technologies studied. We suggest that more emphasis be placed on building community capacity to manage the services during and after the installation of water points. At the same time, the role of decentralised government has to be strengthened to provide long-term support to community services.


2013 ◽  
Vol 40 (9) ◽  
pp. 927-929 ◽  
Author(s):  
Lasse Makkonen ◽  
Matti Pajari ◽  
Maria Tikanmäki

Plotting positions are used in extreme value analysis for many engineering applications. The authors of the discussed paper concluded, based on their simulations, that distribution-dependent plotting position formulae provide a better fit to the underlying cumulative distribution than the distribution-free Weibull formula. We show here by Monte Carlo simulations, following the theory of probability, that the opposite is true, and we point out that the criteria used in the comparisons made by the authors of the discussed paper are inappropriate. Accordingly, the Weibull formula should be used as the unique plotting position.
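For reference, the Weibull formula assigns the i-th smallest of n observations the probability p_i = i/(n+1), which equals E[F(X_(i))] for any continuous parent distribution. A quick Monte Carlo check of this distribution-free property (sample size and index chosen arbitrarily):

```python
import random

# The Weibull plotting position gives the i-th order statistic (of n) the
# probability p_i = i / (n + 1). This equals E[F(X_(i))] for ANY continuous
# parent distribution, which is the distribution-free property at the heart
# of the debate. Checking it by simulation for the median of a sample of 9:
random.seed(1)
n, trials, i = 9, 50_000, 5
acc = 0.0
for _ in range(trials):
    sample = sorted(random.random() for _ in range(n))
    acc += sample[i - 1]               # F(X_(i)) = X_(i) for Uniform(0, 1)
mean_F = acc / trials
print(f"E[F(X_(5))] ~ {mean_F:.3f}  (Weibull formula: {i / (n + 1):.3f})")
```

Because F(X) is Uniform(0, 1) for any continuous F, simulating uniforms suffices to verify the property for all continuous distributions at once.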


2015 ◽  
Vol 12 (6) ◽  
pp. 2783-2805
Author(s):  
Y. Luo ◽  
D. Sui ◽  
H. Shi ◽  
Z. Zhou ◽  
D. Wang

Abstract. We use a novel statistical approach, the multivariate generalized Pareto distribution (MGPD), to analyse the joint probability distribution of storm surge events at two sites and present a warning method for storm surges at two adjacent positions in the Beibu Gulf, using sufficiently long field records of surge levels at the two sites. The methodology also develops the procedure for applying the MGPD, including joint threshold selection and Monte Carlo simulation, to handle multivariate extreme value analysis. Comparison of the simulation results with the analytic solution shows that the relative error of the Monte Carlo simulation is less than 8.6 %. By running the MGPD model on long records at Beihai and Dongfang, the simulated potential surge results can be employed in storm surge warnings for Beihai and in joint extreme water level predictions for the two sites.
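A univariate sketch of the ingredients used here: generalized Pareto excesses are simulated by inverse transform, and a Monte Carlo exceedance probability is checked against the analytic survival function. The parameters are illustrative, not fitted to Beibu Gulf data:

```python
import math
import random

# Simulate GPD excesses over a threshold by inverse transform and compare a
# Monte Carlo exceedance probability with its analytic value.
random.seed(7)
sigma, xi = 0.4, 0.2     # hypothetical GPD scale and shape for surge excesses
z = 1.5                   # excess level of interest, in metres

def gpd_sample():
    """Inverse CDF: x = (sigma/xi) * ((1 - u)**(-xi) - 1)."""
    u = random.random()
    return (sigma / xi) * ((1.0 - u) ** (-xi) - 1.0)

n = 200_000
mc = sum(gpd_sample() > z for _ in range(n)) / n
analytic = (1.0 + xi * z / sigma) ** (-1.0 / xi)      # GPD survival function
rel_err = abs(mc - analytic) / analytic
print(f"MC {mc:.4f} vs analytic {analytic:.4f} (relative error {rel_err:.1%})")
```

The bivariate MGPD adds a dependence structure between the two sites on top of such marginal models, which is what enables the joint warning levels.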

