EXTREME VALUE ANALYSIS FOR MODELING HIGH PM10 LEVEL IN JOHOR BAHRU

2015 ◽  
Vol 76 (1) ◽  
Author(s):  
Nor Azrita Mohd Amin ◽  
Mohd Bakri Adam ◽  
Ahmad Zaharin Aris

Extreme value theory is a well-established statistical framework for modeling extreme data in environmental management. The main focus is to compare the generalized extreme value (GEV) distribution and the generalized Pareto distribution (GPD) for modeling extreme data in terms of estimated parameters and return levels. The daily maximum PM10 data for the Johor Bahru monitoring station, based on a 14-year database (1997-2010), were analyzed. The estimated parameters are found to be more comparable when the numbers of extremes extracted for the two models are similar. The 10-year return level for the GEV is  while for the GPD it is . Based on the threshold choice plot, the threshold  is chosen and the corresponding 10-year return level is . According to the air pollution index in Malaysia, this value is categorized as hazardous.
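The GEV/GPD comparison described above can be sketched in a few lines; the snippet below uses synthetic data standing in for the daily PM10 series (all parameter values are illustrative, not the study's), fits a GEV to annual block maxima and a GPD to threshold exceedances, and computes the two 10-year return levels.

```python
# A minimal sketch, assuming synthetic data in place of the PM10 series.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
daily = rng.gumbel(loc=50, scale=15, size=14 * 365)  # 14 "years" of daily maxima

# GEV approach: fit to annual block maxima.
annual_max = daily.reshape(14, 365).max(axis=1)
c, loc, scale = stats.genextreme.fit(annual_max)
gev_rl10 = stats.genextreme.ppf(1 - 1 / 10, c, loc, scale)  # 10-year return level

# GPD approach: fit to exceedances over a high threshold.
u = np.quantile(daily, 0.95)
exc = daily[daily > u] - u
xi, _, sigma = stats.genpareto.fit(exc, floc=0)
lam = exc.size / 14  # mean number of exceedances per year
# Level exceeded on average once per 10 years:
gpd_rl10 = u + stats.genpareto.ppf(1 - 1 / (10 * lam), xi, 0, sigma)
```

Note that scipy's `genextreme` uses a shape parameter `c` equal to the negative of the usual GEV shape ξ.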

2007 ◽  
Vol 46 (10) ◽  
pp. 1501-1522 ◽  
Author(s):  
Allan W. MacAfee ◽  
Samuel W. K. Wong

Abstract. Many of the extreme ocean wave events generated by tropical cyclones (TCs) can be explained by examining one component of the spectral wave field, the trapped-fetch wave (TFW). Using a Lagrangian TFW model, a parametric model representation of the local TC wind fields, and the National Hurricane Center’s hurricane database archive, a dataset of TFWs was created from all TCs in the Atlantic Ocean, Gulf of Mexico, and Caribbean Sea from 1851 to 2005. The wave height at each hourly position along a TFW trajectory was sorted into 2° × 2° latitude–longitude grid squares. Five grid squares (north of Hispaniola, Gulf of Mexico, Carolina coast, south of Nova Scotia, and south of Newfoundland) were used to determine whether extreme value theory could be applied to the extremes in the TFW dataset. The statistical results justify accepting that a generalized Pareto distribution (GPD) model with a threshold of 6 m could be fitted to the data: the datasets were mostly modeled adequately, and much of the output information was useful. Additional tests were performed by sorting the TFW data into the marine areas in Atlantic Canada, which are of particular interest to the Meteorological Service of Canada because of the high ocean traffic, offshore drilling activities, and commercial fishery. GPD models were fitted, and return periods and the 95% confidence intervals (CIs) for 10-, 15-, and 20-m return levels were computed. The results further justified the use of the GPD model; hence, extension to the remaining grid squares was warranted. Of the 607 grid squares successfully modeled, the percentages of grid squares with finite lower (upper) values for the 10-, 15-, and 20-m return level CIs were 100 (80), 94 (53), and 90 (16), respectively. The lower success rate of 20-m TFW CIs was expected, given the rarity of 20-m TFWs: of the 5 713 625 hourly TFW points, only 13 958, or 0.24%, were 20 m or higher. 
Overall, the distribution of the successfully modeled grid squares in the data domain agreed with TFW theory and TC climatology. As a direct result of this study, the summary datasets and return level plots were integrated into application software for use by risk managers. A description of the applications illustrates their use in addressing various questions on extreme TFWs.
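The return-period-with-CI computation described above can be illustrated as follows; the GPD parameters, sample size, and record length below are assumptions for the sketch, not values from the paper.

```python
# A minimal sketch: return period of a 10 m TFW under a GPD fitted to
# exceedances over a 6 m threshold, with a bootstrap 95% CI.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
u = 6.0  # GPD threshold (m), as in the study
# Synthetic exceedances standing in for TFW heights above the threshold.
exc = stats.genpareto.rvs(0.1, loc=0, scale=0.8, size=500, random_state=rng)
lam = exc.size / 155.0  # assumed mean exceedances per year over a 155-yr record

def return_period(sample, level):
    """Years between TFWs exceeding `level`, under a GPD fit to `sample`."""
    xi, _, sigma = stats.genpareto.fit(sample, floc=0)
    p = stats.genpareto.sf(level - u, xi, 0, sigma)  # P(H > level | H > u)
    return 1.0 / (lam * p)

rp = return_period(exc, 10.0)  # point estimate for the 10 m return period
boot = [return_period(rng.choice(exc, exc.size, replace=True), 10.0)
        for _ in range(200)]
lo, hi = np.percentile(boot, [2.5, 97.5])  # bootstrap 95% CI
```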


Author(s):  
Lu Deng ◽  
Zhengjun Zhang

Extreme smog can have potentially harmful effects on human health, the economy and daily life. However, average (mean) values do not provide strategically useful information for the hazard analysis and control of extreme smog. This article investigates China's smog extremes by applying extreme value analysis to hourly PM2.5 data from 2014 to 2016 obtained from monitoring stations across China. By fitting a generalized extreme value (GEV) distribution to exceedances over a station-specific extreme smog level at each monitoring location, all study stations are grouped into eight categories based on the estimated mean and shape parameter values of the fitted GEV distributions. The extreme features characterized by the mean of the fitted extreme value distribution, the maximum frequency and the tail index of extreme smog at each location are analysed. These features can provide useful information for central and local governments to apply differentiated treatment to cities in different categories and to pursue similar prevention goals and control strategies among cities in the same category across a range of areas. Furthermore, hazardous hours, breaking probability and the 1-year return level of each station are presented by category, based on which future control and reduction targets for extreme smog are proposed for Beijing, Tianjin and Hebei as an example.
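The station-categorization step can be sketched as below; the station data, thresholds, and category labels are all illustrative assumptions, not the authors' scheme, and serve only to show grouping by fitted GEV mean and tail behaviour.

```python
# A minimal sketch: fit a GEV per station and group stations by the
# estimated mean level and the sign of the tail (shape) parameter.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic "extreme PM2.5" samples for six hypothetical stations.
stations = {f"S{i}": stats.genextreme.rvs(c=rng.uniform(-0.3, 0.3),
                                          loc=150, scale=40, size=300,
                                          random_state=rng)
            for i in range(6)}

categories = {}
for name, x in stations.items():
    c, loc, scale = stats.genextreme.fit(x)
    xi = -c  # scipy's c is the negative of the usual GEV shape xi
    tail = "heavy" if xi > 0 else "bounded"      # tail index feature
    level = "high" if x.mean() > 150 else "moderate"  # mean feature
    categories[name] = (level, tail)
```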


2021 ◽  
Author(s):  
Anne Dutfoy ◽  
Gloria Senfaute

Abstract. Probabilistic Seismic Hazard Analysis (PSHA) procedures require that at least the mean activity rate be known, as well as the distribution of magnitudes. Under the Gutenberg-Richter assumption, that distribution is an exponential distribution truncated above at a maximum possible magnitude denoted $m_{max}$. This parameter is often fixed by expert judgement based on tectonic considerations, in the absence of a universal method. In this paper, we propose two innovative alternatives to the Gutenberg-Richter model, based on extreme value theory, that do not require fixing the value of $m_{max}$ a priori: the first models the tail of the magnitude distribution with a generalized Pareto distribution; the second is a variation on the usual Gutenberg-Richter model in which $m_{max}$ is a random variable following a distribution defined from an extreme value analysis. We use maximum likelihood estimators that take into account the unequal observation spans depending on magnitude, the incompleteness threshold of the catalog and the uncertainty in the magnitude values themselves. We apply these new recurrence models to data observed in the Alps region, in the south of France, and integrate them into a probabilistic seismic hazard calculation to evaluate their impact on seismic hazard levels. The proposed recurrence models yield a reduction of the seismic hazard level compared with the common Gutenberg-Richter model conventionally used for PSHA calculations. This decrease is significant for all frequencies below 10 Hz, mainly at the lowest frequencies and for very long return periods. To our knowledge, neither model has previously been used in a probabilistic seismic hazard calculation, and they constitute a promising new generation of recurrence models.
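As background for the two alternatives, the standard forms of the truncated-exponential Gutenberg-Richter magnitude distribution and of a GPD tail model can be sketched as follows (the notation here is ours, not quoted from the paper; $m_0$ denotes the completeness magnitude and $u$ a high threshold):

```latex
% Truncated-exponential (Gutenberg-Richter) magnitude CDF:
F(m) = \frac{1 - e^{-\beta (m - m_{0})}}{1 - e^{-\beta (m_{max} - m_{0})}},
\qquad m_{0} \le m \le m_{max}

% GPD model for the tail of the magnitude distribution:
P(M > m \mid M > u) = \left(1 + \xi\,\frac{m - u}{\sigma}\right)^{-1/\xi},
\qquad m > u
```

When $\xi < 0$, the GPD has a finite upper endpoint $u - \sigma/\xi$, which is why the tail model does not require fixing $m_{max}$ in advance.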


2021 ◽  
Author(s):  
Katharina Klehmet ◽  
Peter Berg ◽  
Denica Bozhinova ◽  
Louise Crochemore ◽  
Ilias Pechlivanidis ◽  
...  

<p>Robust information on hydrometeorological extremes is important for effective risk management, mitigation and adaptation measures by public authorities and civil engineers dealing, for example, with water management. Typically, return values of certain variables, such as extreme precipitation and river discharge, are of particular interest and are modelled statistically using Extreme Value Theory (EVT). However, the estimation of these rare events based on extreme value analysis is affected by short observational records, leading to large uncertainties.</p><p>In order to overcome this limitation, we propose to use the latest seasonal meteorological prediction system of the European Centre for Medium-Range Weather Forecasts (ECMWF SEAS5) and seasonal hydrological forecasts generated with the pan-European E-HYPE model for the original period 1993-2015, and to extend the dataset to longer synthetic time series by pooling single forecast months into surrogate years. To ensure an independent dataset, the seasonal forecast skill is assessed in advance and months (and lead months) with positive skill are excluded. In this study, we simplify the method and work with samples of 6- and 4-month forecasts (instead of the full 7-month forecasts), depending on the statistical independence of the variables. This enables the record to be extended from the original 23 years to 3450 and 2300 surrogate years for the 6- and 4-month forecasts, respectively.</p><p>Furthermore, we investigate the robustness of estimated 50- and 100-year return values for extreme precipitation and river discharge using 1-year block maxima fitted to the Generalized Extreme Value distribution. Surrogate sets of pooled years are randomly constructed using a Monte Carlo approach, and different sample sizes are chosen. 
This analysis reveals a considerable reduction in the uncertainty of the return level estimates for both variables at selected locations across Europe when using a sample size of 500 years. This highlights the potential of using ensembles of meteorological and hydrological seasonal forecasts to obtain time series of sufficient length and minimize the uncertainty in the extreme value analysis.</p>
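The surrogate-year pooling idea can be illustrated as follows; the pool of forecast months is synthetic and the sample sizes are the abstract's (23 vs. 500 years), but everything else is an assumption for the sketch.

```python
# A minimal sketch: pool independent forecast "months" into surrogate
# years, fit a GEV to the block maxima, and compare the spread of the
# 100-year return level for two surrogate sample sizes.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Stand-in pool of independent forecast monthly maxima (illustrative).
pool = rng.gumbel(loc=20.0, scale=5.0, size=50_000)

def rl100_spread(n_years, n_rep=60):
    """Std. dev. of the 100-year return level over Monte Carlo resamples."""
    rls = []
    for _ in range(n_rep):
        # Pool 12 random forecast months into each surrogate year.
        surrogate = rng.choice(pool, size=(n_years, 12), replace=True)
        ann_max = surrogate.max(axis=1)
        c, loc, scale = stats.genextreme.fit(ann_max)
        rls.append(stats.genextreme.ppf(1 - 1 / 100, c, loc, scale))
    return float(np.std(rls))

spread_small = rl100_spread(23)    # original record length
spread_large = rl100_spread(500)   # surrogate sample size
```

The larger surrogate sample should give a markedly smaller return-level spread, mirroring the uncertainty reduction reported above.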


2018 ◽  
Vol 150 ◽  
pp. 05025 ◽  
Author(s):  
Nor Azrita Mohd Amin ◽  
Siti Aisyah Zakaria

The main concern in environmental issues is extreme (catastrophic) phenomena rather than common events. However, most statistical approaches are concerned primarily with the centre of a distribution, or the average value, rather than the tail of the distribution, which contains the extreme observations. Extreme value theory directs attention to the tails of a distribution, where standard models prove unreliable for analysing extreme series. A high level of particulate matter (PM10) is a common environmental problem that causes various impacts on human health and material damage. If the main concern is extreme events, then extreme value analysis provides the best result with significant evidence. The monthly average and monthly maximum PM10 data for Perlis from 2003 to 2014 were analysed. Forecasting for the average data is made by the Holt-Winters method, while the return level determines the predicted value of extreme events that occur on average once in a certain period. The forecast from January 2015 to December 2016 for the average data found that the highest forecasted value is 58.18 (standard deviation 18.45) in February 2016, while the return level reaches 253.76 units for a 24-month (2015-2016) return period.
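The Holt-Winters step can be sketched with a minimal additive implementation; the smoothing constants and the synthetic monthly series below are illustrative assumptions, not the study's data or settings.

```python
# A minimal additive Holt-Winters forecaster (numpy only), standing in
# for the method applied to the monthly-average PM10 series.
import numpy as np

def holt_winters_additive(y, m=12, alpha=0.3, beta=0.05, gamma=0.2, horizon=24):
    """Forecast `horizon` steps ahead with additive level/trend/seasonality."""
    y = np.asarray(y, float)
    level = y[:m].mean()                        # initial level: first year
    trend = (y[m:2 * m].mean() - y[:m].mean()) / m  # initial trend
    season = y[:m] - level                      # initial seasonal indices
    for t in range(len(y)):
        s = season[t % m]
        new_level = alpha * (y[t] - s) + (1 - alpha) * (level + trend)
        trend = beta * (new_level - level) + (1 - beta) * trend
        season[t % m] = gamma * (y[t] - new_level) + (1 - gamma) * s
        level = new_level
    h = np.arange(1, horizon + 1)
    return level + h * trend + season[(len(y) + h - 1) % m]

rng = np.random.default_rng(3)
t = np.arange(144)  # 12 years of synthetic monthly averages
y = 50 + 0.02 * t + 8 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 2, 144)
forecast = holt_winters_additive(y)  # 24-month-ahead forecast
```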


2010 ◽  
Vol 14 (12) ◽  
pp. 2527-2544 ◽  
Author(s):  
J. Blanchet ◽  
M. Lehning

Abstract. For adequate risk management in mountainous countries, hazard maps for extreme snow events are needed. This requires the computation of spatial estimates of return levels. In this article we use recent developments in extreme value theory and compare two main approaches for mapping snow depth return levels from in situ measurements. The first one is based on the spatial interpolation of pointwise extremal distributions (the so-called Generalized Extreme Value distribution, GEV henceforth) computed at station locations. The second one is new and based on the direct estimation of a spatially smooth GEV distribution with the joint use of all stations. We compare and validate the different approaches for modeling annual maximum snow depth measured at 100 sites in Switzerland during winters 1965–1966 to 2007–2008. The results show a better performance of the smooth GEV distribution fitting, in particular where the station network is sparser. Smooth return level maps can be computed from the fitted model without any further interpolation. Their regional variability can be revealed by removing the altitude-dependent covariates in the model. We show how return levels and their regional variability are linked to the main climatological patterns of Switzerland.
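A smooth-GEV fit of this kind can be sketched as a joint maximum-likelihood estimation in which the location parameter is a function of a covariate; the linear altitude dependence, station counts, and all parameter values below are illustrative assumptions, not the authors' model specification.

```python
# A minimal sketch: GEV with a "smooth" location surface (linear in
# altitude), fitted jointly over all stations by maximum likelihood.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(5)
# Synthetic stand-in: 30 stations, 43 winters of annual-max snow depth (m),
# with the true GEV location increasing linearly with altitude (km).
alt = rng.uniform(0.3, 2.5, size=30)
data = np.array([stats.genextreme.rvs(c=-0.1, loc=0.5 + 1.2 * a,
                                      scale=0.4, size=43, random_state=rng)
                 for a in alt])

def nll(theta):
    """Joint negative log-likelihood across all stations."""
    b0, b1, log_scale, c = theta
    mu = b0 + b1 * alt[:, None]  # location as a linear function of altitude
    return -stats.genextreme.logpdf(data, c, loc=mu,
                                    scale=np.exp(log_scale)).sum()

fit = optimize.minimize(nll, x0=[0.0, 1.0, np.log(0.5), -0.1],
                        method="Nelder-Mead")
b0, b1, log_scale, c = fit.x  # b1 recovers the altitude gradient
```

Because every station contributes to the same few parameters, the fit borrows strength across the network, which is why the smooth approach helps most where stations are sparse.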


2010 ◽  
Vol 7 (4) ◽  
pp. 6129-6177 ◽  
Author(s):  
J. Blanchet ◽  
M. Lehning

Abstract. For adequate risk management in mountainous countries, hazard maps for extreme snow events are needed. This requires the computation of spatial estimates of return levels. In this article we use recent developments in extreme value theory and compare two main approaches for mapping snow depth return levels from in situ measurements. The first one is based on the spatial interpolation of pointwise extremal distributions (the so-called Generalized Extreme Value distribution, GEV henceforth) computed at station locations. The second one is new and based on the direct estimation of a spatially smooth GEV distribution with the joint use of all stations. We compare and validate the different approaches for modeling annual maximum snow depth measured at 100 sites in Switzerland during winters 1965–1966 to 2007–2008. The results show a better performance of the smooth GEV distribution fitting, in particular where the station network is sparser. Smooth return level maps can be computed from the fitted model without any further interpolation. Their regional variability can be revealed by removing the altitudinal dependent covariates in the model. We show how return levels and their regional variability are linked to the main climatological patterns of Switzerland.


2019 ◽  
Vol 8 (4) ◽  
pp. 1
Author(s):  
Queensley C. Chukwudum

Reinsurance is of utmost importance to insurers because it enables insurance companies to cover risks that they would not, under normal circumstances, be able to cover on their own. An insurer needs to be able to evaluate its solvency probability and, consequently, adjust its retention levels appropriately, because the insurer’s retention level plays a vital role in determining the premiums it will pay to the reinsurer. To illustrate how extreme value theory can be applied, this study models the probabilistic behaviour of the frequency and severity of large motor claims from the Nigerian insurance sector (2013-2016) using the Negative Binomial-Generalized Pareto distribution (NB-GPD). The annual loss distribution is simulated using the Monte Carlo method and used to predict the expected annual total claims and to estimate the capital requirement for a year. Pricing of excess-of-loss (XL) reinsurance is also examined to aid insurers in optimizing their risk management decisions with regard to the choice of their risk transfer position.
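The NB-GPD Monte Carlo step can be sketched as below; all frequency and severity parameters, the layer terms, and the capital proxy are illustrative assumptions, not the study's calibrated values.

```python
# A minimal sketch: annual aggregate loss under a Negative Binomial
# frequency / GPD severity model, plus an expected-loss price for an
# excess-of-loss (XL) layer.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
n_sim = 5_000
# Annual claim counts: Negative Binomial (mean 20 claims/year here).
n_claims = stats.nbinom.rvs(n=5, p=0.2, size=n_sim, random_state=rng)
retention, limit = 5.0, 20.0  # hypothetical XL layer: 20 xs 5

agg = np.zeros(n_sim)    # gross annual aggregate loss
layer = np.zeros(n_sim)  # annual loss ceded to the XL layer
for i, k in enumerate(n_claims):
    sev = stats.genpareto.rvs(0.3, loc=0, scale=1.0, size=k, random_state=rng)
    agg[i] = sev.sum()
    layer[i] = np.clip(sev - retention, 0, limit).sum()

expected_annual_loss = agg.mean()
var_995 = np.quantile(agg, 0.995)  # a simple 99.5% VaR capital proxy
xl_pure_premium = layer.mean()     # expected loss to the reinsurer
```

Raising the retention shrinks `xl_pure_premium` but leaves more tail risk with the insurer, which is the trade-off the retention-level analysis quantifies.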


2014 ◽  
Vol 34 (5) ◽  
pp. 992-1000 ◽  
Author(s):  
Gabriel C. Blain

The application of Extreme Value Theory (EVT) to model the probability of occurrence of extremely low Standardized Precipitation Index (SPI) values increases our knowledge of the occurrence of extreme dry months. This sort of analysis can be carried out by means of two approaches: block maxima (BM; associated with the Generalized Extreme Value distribution) and peaks-over-threshold (POT; associated with the Generalized Pareto distribution). Each of these procedures has its own advantages and drawbacks. Thus, the main goal of this study is to compare the performance of BM and POT in characterizing the probability of occurrence of extreme dry SPI values obtained from the weather station of Ribeirão Preto-SP (1937-2012). According to the goodness-of-fit tests, both BM and POT can be used to assess the probability of occurrence of the aforementioned extreme dry monthly SPI values. However, the scalar measures of accuracy and the return level plots indicate that POT provides the best-fitting distribution. The study also indicates that the uncertainties in the parameter estimates of a probabilistic model should be taken into account when the probability associated with a severe/extreme dry event is under analysis.
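For reference, the standard N-year return-level expressions underlying the two approaches can be written as follows (standard EVT notation, assumed here rather than quoted from the paper; $n_y$ is the number of observations per year and $\zeta_u = P(X > u)$ the threshold exceedance rate):

```latex
% Block maxima (GEV) approach:
z_N = \mu + \frac{\sigma}{\xi}\left[\left(-\log\!\left(1 - \tfrac{1}{N}\right)\right)^{-\xi} - 1\right]

% Peaks-over-threshold (GPD) approach, threshold u:
z_N = u + \frac{\tilde{\sigma}}{\xi}\left[\left(N\, n_y\, \zeta_u\right)^{\xi} - 1\right]
```

Because POT uses all exceedances rather than one maximum per block, it typically yields tighter parameter estimates from the same record, consistent with the study's finding that POT fits best.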


2021 ◽  
Author(s):  
Jeremy Rohmer ◽  
Rodrigo Pedreros ◽  
Yann Krien

<p>To estimate return levels of wave heights (Hs) induced by tropical cyclones at the coast, a commonly used approach is to (1) randomly generate a large number of synthetic cyclone events (typically >1,000); (2) numerically simulate the corresponding Hs over the whole domain of interest; (3) extract the Hs values at the desired location at the coast; and (4) perform a local extreme value analysis (EVA) to derive the corresponding return level. Step 2 is, however, very constraining because it often involves a numerical hydrodynamic simulator that can be prohibitive to run: this may limit the number of results available for the local EVA (typically to several hundred). In this communication, we propose a spatial stochastic simulation procedure to increase the size of the database of numerical results with stochastically generated synthetic maps of Hs. To do so, we rely on a data-driven dimensionality-reduction method, either unsupervised (Principal Component Analysis) or supervised (Partial Least Squares Regression), trained with a limited number of pre-existing numerically simulated Hs maps. The procedure is applied to the Guadeloupe island, and the results are compared to the commonly used approach applied to a large database of Hs values computed for nearly 2,000 synthetic cyclones (representative of 3,200 years – Krien et al., NHESS, 2015). When using only a hundred cyclones, we show that the 100-year return levels can be estimated with a mean absolute percentage error (derived from a bootstrap-based procedure) ranging between 5 and 15% around the coasts, while keeping the width of the 95% confidence interval of the same order of magnitude as that obtained using the full database. Without synthetic Hs map augmentation, the error and the confidence interval width both increase by nearly 100%. 
Careful attention is paid to the tuning of the approach by testing the sensitivity to the spatial domain size, the information loss due to data compression, and the number of cyclones. This study has been carried out within the Carib-Coast INTERREG project (https://www.interreg-caraibes.fr/carib-coast).</p>
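The unsupervised (PCA) variant of the map-augmentation idea can be sketched as below; the grid size, number of maps, and the Gaussian model for the PCA scores are all assumptions for the sketch, not the authors' implementation.

```python
# A minimal sketch: PCA-based stochastic augmentation of Hs maps.
# Fit PCA on a small set of simulated maps, then sample new coefficient
# vectors to generate synthetic maps.
import numpy as np

rng = np.random.default_rng(2)
n_maps, n_grid = 100, 400  # e.g. 100 simulated cyclones on a 20x20 grid
base = rng.normal(size=(5, n_grid))
maps = rng.normal(size=(n_maps, 5)) @ base + 4.0  # synthetic Hs fields (m)

mean = maps.mean(axis=0)
X = maps - mean
# PCA via SVD; keep the leading components.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 5
scores = U[:, :k] * s[:k]  # per-map coefficients in the reduced space

# Sample new coefficients from an independent Gaussian fitted to the
# scores, then reconstruct synthetic maps (a strong model assumption).
mu, sd = scores.mean(axis=0), scores.std(axis=0)
new_scores = rng.normal(mu, sd, size=(1000, k))
synthetic = new_scores @ Vt[:k] + mean  # 1000 synthetic Hs maps
```

The synthetic maps then feed the local EVA at each coastal grid point exactly as the numerically simulated maps would.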

