Threshold models for the analysis of environmental extremes

1984 ◽  
Vol 16 (1) ◽  
pp. 20-20
Author(s):  
Richard L. Smith

Statistical methods for analysing the extreme values of a time series may be based on the observed exceedances of the series above a high threshold level. Todorovic (1979) has developed this approach in detail; other relevant references are North (1980) and the English Flood Studies Report (1975). One way of motivating these models is by reference to the theory of extremes in stationary sequences, due to Leadbetter and others.
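The exceedance approach described above can be sketched in a few lines; a minimal illustration (the series values and the threshold are invented, not taken from any of the cited studies):

```python
# Extract exceedances of a series above a high threshold u
# (the peaks-over-threshold idea behind these models).
def exceedances(series, u):
    """Return (index, excess) pairs for observations above u."""
    return [(i, x - u) for i, x in enumerate(series) if x > u]

flows = [12.0, 45.5, 30.2, 60.1, 18.7, 52.3]
u = 40.0
peaks = exceedances(flows, u)
# observations at indices 1, 3 and 5 exceed the threshold
```

In practice the excesses would then be fitted with a distribution such as the generalized Pareto, but the extraction step itself is this simple.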

Author(s):  
Diaz Juan Navia ◽  
Bolaños Nancy Villegas ◽  
Igor Malikov ◽  
...  

Sea Surface Temperature Anomalies (SSTA) at four coastal hydrographic stations of the Colombian Pacific Ocean were analyzed. The selected hydrographic stations were: Tumaco (1°48'N-78°45'W), Gorgona island (2°58'N-78°11'W), Solano Bay (6°13'N-77°24'W) and Malpelo island (4°0'N-81°36'W). SSTA time series for 1960-2015 were calculated from monthly Sea Surface Temperature obtained from the International Comprehensive Ocean-Atmosphere Data Set (ICOADS). The SSTA time series were compared with the Oceanic Niño Index (ONI), the Pacific Decadal Oscillation index (PDO), the Arctic Oscillation index (AO) and the sunspot number (associated with solar activity). The absolute SSTA minimum occurred in Tumaco (-3.93°C) in March 2009, in Gorgona (-3.71°C) in October 2007, in Solano Bay (-4.23°C) in April 2014 and in Malpelo (-4.21°C) in December 2005. The absolute SSTA maximum was observed in Tumaco (3.45°C) in January 2002, in Gorgona (5.01°C) in July 1978, in Solano Bay (5.27°C) in March 1998 and in Malpelo (3.64°C) in July 2015. A high correlation between SSTA and ONI over much of the study period, followed by a good correlation with the PDO, was identified. The AO and SSTA showed an inverse relationship in some periods. The solar cycle was found to modulate the behavior of the SSTA at the selected stations. It was determined that extreme SST values are related to the analyzed large-scale oscillations.
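In the usual convention, an anomaly series of this kind is the departure of each monthly value from that calendar month's long-term mean; a minimal sketch with toy values (not ICOADS data):

```python
# Monthly anomalies: subtract each calendar month's long-term mean,
# the standard way an SSTA series is formed from monthly SST values.
def monthly_anomalies(sst):
    """sst: list of monthly values starting in January."""
    months = [[] for _ in range(12)]
    for i, v in enumerate(sst):
        months[i % 12].append(v)
    clim = [sum(m) / len(m) for m in months]   # monthly climatology
    return [v - clim[i % 12] for i, v in enumerate(sst)]

# Two toy years: a uniformly cooler year followed by a warmer one
sst = [25.0] * 12 + [27.0] * 12
anoms = monthly_anomalies(sst)
# each month's climatology is 26.0, so anomalies are -1.0 then +1.0
```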


2018 ◽  
Author(s):  
Christine Masson ◽  
Stephane Mazzotti ◽  
Philippe Vernant

Abstract. We use statistical analyses of synthetic position time series to estimate the potential precision of GPS velocities. The synthetic series represent the standard range of noise, seasonal, and position-offset characteristics, leaving aside extreme values. This analysis is combined with a new, simple method for automatic offset detection that allows automatic treatment of the massive dataset. Colored noise and the presence of offsets are the primary contributors to velocity variability. However, regression tree analyses show that the main factor controlling velocity precision is the duration of the series, followed by the presence of offsets and the noise characteristics (dispersion and spectral index). Our analysis allows us to propose guidelines, applicable to actual GPS data, that constrain velocity accuracies (expressed as 95 % confidence limits) based on simple parameters: (1) series durations over 8.0 years yield high velocity accuracies in the horizontal (0.2 mm yr−1) and vertical (0.5 mm yr−1); (2) series durations of less than 4.5 years cannot be used for high-precision studies, since the horizontal accuracy is insufficient (over 1.0 mm yr−1); (3) series of intermediate duration (4.5–8.0 years) are associated with intermediate horizontal accuracy (0.6 mm yr−1) and poor vertical accuracy (1.3 mm yr−1), unless they contain no offsets. Our results suggest that very long series (over 15–20 years) do not ensure better accuracy than series of 8–10 years, because the noise amplitude follows a power-law dependence on frequency. Thus, better characterization of long-period GPS noise and pluri-annual environmental loads is critical to further improving GPS velocity precision.
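The velocities under discussion are trend estimates from position time series; a bare least-squares sketch on a synthetic, noise-free series (real GPS analyses also model offsets, seasonal terms, and colored noise, which this toy omits):

```python
# Least-squares linear trend (velocity) of a position time series.
def fit_velocity(t, y):
    """Return (intercept, slope) of the ordinary least-squares line."""
    n = len(t)
    tm = sum(t) / n
    ym = sum(y) / n
    slope = (sum((ti - tm) * (yi - ym) for ti, yi in zip(t, y))
             / sum((ti - tm) ** 2 for ti in t))
    return ym - slope * tm, slope

# Synthetic series: a 2 mm/yr trend sampled yearly over 8 years
t = [float(i) for i in range(9)]        # years
y = [1.0 + 2.0 * ti for ti in t]        # position, mm
b0, v = fit_velocity(t, y)
# recovers v = 2.0 mm/yr and b0 = 1.0 mm exactly (no noise)
```

With colored noise added, the scatter of `v` across realizations is what the abstract's accuracy figures quantify.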


2017 ◽  
Vol 21 (5) ◽  
pp. 2579-2594 ◽  
Author(s):  
Hidayat Hidayat ◽  
Adriaan J. Teuling ◽  
Bart Vermeulen ◽  
Muh Taufik ◽  
Karl Kastner ◽  
...  

Abstract. Wetlands are important reservoirs of water, carbon and biodiversity. They are typical landscapes of lowland regions and have high potential for water retention. However, the hydrology of these wetlands in tropical regions is often studied in isolation from the processes taking place at the catchment scale. Our main objective is to study the hydrological dynamics of one of the largest tropical rainforest regions on an island using a combination of satellite remote sensing and novel observations from dedicated field campaigns. This contribution offers a comprehensive analysis of the hydrological dynamics of two neighbouring, poorly gauged tropical basins: the Kapuas basin (98 700 km2) in West Kalimantan and the Mahakam basin (77 100 km2) in East Kalimantan, Indonesia. Both basins are characterised by vast areas of inland lowlands. We put specific emphasis on key hydrological variables and indicators such as discharge and flood extent. The hydroclimatological data described herein were obtained during fieldwork campaigns carried out in the Kapuas over the period 2013–2015 and in the Mahakam over the period 2008–2010. Additionally, we used the Tropical Rainfall Measuring Mission (TRMM) rainfall estimates over the period 1998–2015 to analyse the distribution of rainfall and the influence of the El Niño–Southern Oscillation. Flood occurrence maps were obtained from the analysis of Phased Array type L-band Synthetic Aperture Radar (PALSAR) images from 2007 to 2010. Drought events were derived from time series of simulated groundwater recharge using time series of TRMM rainfall estimates, potential evapotranspiration estimates and the threshold level approach. The Kapuas and the Mahakam lake regions are vast reservoirs of water of about 1000 and 1500 km2 that can store as much as 3 and 6.5 billion m3 of water, respectively. These storage capacity values can be doubled considering the area of flooding under vegetation cover.
Discharge time series show that backwater effects are highly influential in the wetland regions, which can be partly explained by inundation dynamics shown by flood occurrence maps obtained from PALSAR images. In contrast to their nature as wetlands, both lowland areas have frequent periods with low soil moisture conditions and low groundwater recharge. The Mahakam wetland area regularly exhibits low groundwater recharge, which may lead to prolonged drought events that can last up to 13 months. It appears that the Mahakam lowland is more vulnerable to hydrological drought, leading to more frequent fire occurrences than in the Kapuas basin.
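The threshold level approach used to derive drought events treats a drought as a maximal run of time steps with recharge below a chosen threshold; a minimal sketch (the recharge values and threshold are invented):

```python
# Threshold level approach: a drought event is a maximal run of
# consecutive time steps with the variable below a threshold.
def drought_events(series, threshold):
    """Return (start_index, duration) for each below-threshold run."""
    events, start = [], None
    for i, v in enumerate(series):
        if v < threshold and start is None:
            start = i
        elif v >= threshold and start is not None:
            events.append((start, i - start))
            start = None
    if start is not None:
        events.append((start, len(series) - start))
    return events

recharge = [5, 1, 0, 0, 6, 2, 1, 7]   # e.g. monthly recharge, mm
events = drought_events(recharge, 3)
# two events: one starting at month 1 lasting 3 months,
# one starting at month 5 lasting 2 months
```

The "up to 13 months" figure in the text corresponds to the longest such duration found in the simulated recharge series.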


2016 ◽  
Author(s):  
Fernando Arizmendi ◽  
Marcelo Barreiro ◽  
Cristina Masoller

Abstract. By comparing time series of surface air temperature (SAT; monthly reanalysis data from NCEP CDAS1 and ERA-Interim) with the top-of-atmosphere incoming solar radiation (the insolation), we perform a detailed analysis of the SAT response to solar forcing. By computing the entropy of the SAT time series, we also quantify their degree of stochasticity. We find spatially coherent structures which are characterized by high stochasticity and a nearly linear response to solar forcing (the shape of the SAT time series closely follows that of the insolation), or vice versa. The entropy analysis also allows us to identify geographical regions in which there are significant differences between the NCEP CDAS1 and ERA-Interim datasets, due to the presence of extreme values in one dataset but not in the other. Therefore, entropy maps are a valuable tool for anomaly detection and model inter-comparisons.
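As one concrete (and assumed) form of such an entropy measure, the Shannon entropy of a binned time series can be computed as follows; the paper's actual estimator may differ (e.g. an ordinal-pattern or permutation entropy):

```python
import math
from collections import Counter

# Shannon entropy (in bits) of a time series after binning its values.
# Illustrative only: a constant series gives zero entropy, a series
# spread uniformly across the bins gives the maximum log2(nbins).
def binned_entropy(series, nbins=4):
    lo, hi = min(series), max(series)
    width = (hi - lo) / nbins or 1.0          # guard against hi == lo
    bins = [min(int((x - lo) / width), nbins - 1) for x in series]
    counts = Counter(bins)
    n = len(series)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

h = binned_entropy([0.0, 1.0, 2.0, 3.0] * 16)
# uniform spread over 4 bins -> 2.0 bits, the maximum for nbins=4
```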


2019 ◽  
Vol 3 (2) ◽  
pp. 274-306 ◽  
Author(s):  
Ruben Sanchez-Romero ◽  
Joseph D. Ramsey ◽  
Kun Zhang ◽  
Madelyn R. K. Glymour ◽  
Biwei Huang ◽  
...  

We test the adequacy of several previously proposed and two new statistical methods for recovering the causal structure of systems with feedback from synthetic BOLD time series. We compare an adaptation of the first correct method for recovering cyclic linear systems; Granger causal regression; a multivariate autoregressive model with a permutation test; the Group Iterative Multiple Model Estimation (GIMME) algorithm; the Ramsey et al. non-Gaussian methods; two non-Gaussian methods by Hyvärinen and Smith; a method due to Patel et al.; and the GlobalMIT algorithm. We also introduce and compare two new methods, Fast Adjacency Skewness (FASK) and Two-Step, both of which exploit non-Gaussian features of the BOLD signal, and we give theoretical justifications for these two algorithms. Our test models include feedback structures with and without direct feedback (2-cycles), excitatory and inhibitory feedback, models using experimentally determined structural connectivities of macaques, and empirical human resting-state and task data. We find that, averaged over all of our simulations, including those with 2-cycles, several of these methods have better than 80 % orientation precision (i.e., the probability that a directed edge is in the true structure given that a procedure estimates it to be so), and the two new methods also have better than 80 % recall (the probability of recovering an orientation in the true structure).


2005 ◽  
Vol 2 ◽  
pp. 255-257 ◽  
Author(s):  
L. Cavaleri

Abstract. Within the WW-Medatlas project, sponsored by the Italian, French and Greek Navies, an extensive atlas of the wind and wave conditions in the Mediterranean Sea has been completed. The atlas is based on information derived from the archive of the European Centre for Medium-Range Weather Forecasts, UK, calibrated on the basis of data available from the ERS1-2 and Topex satellites. The calibration is required because wind, and hence wave, data are normally strongly underestimated in enclosed seas. The calibration has been done by deriving the model values at each satellite position, typically at 7 km intervals. The co-located values have then been assigned to the closest grid point. This provides a substantial number of data pairs at each point, which are then used to derive, by a best-fit technique, the required correction. The correction turns out to vary widely throughout the basin, according to the local geometry and orography. The calibration coefficients, different for wind and waves, have been used to correct the original fields and the time series at the single points. Using the calibrated data, extensive statistics have been derived, both as fields and at each point, including extreme values.
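As an illustration of the co-location step, a single multiplicative correction factor per grid point could be obtained as a best-fit slope through the origin; this functional form is an assumption for illustration, as the actual atlas calibration may be more elaborate:

```python
# Calibration factor from co-located (model, satellite) pairs at one
# grid point: best-fit slope through the origin, satellite ~ k * model.
def calibration_factor(model, satellite):
    num = sum(m * s for m, s in zip(model, satellite))
    den = sum(m * m for m in model)
    return num / den

model_ws = [4.0, 6.0, 8.0]      # model wind speed, m/s (toy values)
sat_ws = [5.0, 7.5, 10.0]       # co-located satellite values
k = calibration_factor(model_ws, sat_ws)
# k = 1.25: the model underestimates by 25 % at this grid point
```

The corrected field at that point is then simply `k` times the original model values.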


2018 ◽  
Vol 2018 (8) ◽  
pp. 67-75
Author(s):  
Yuriy Kropotov ◽  
Aleksey Belov ◽  
Aleksandr Proskuryakov ◽  
...  

The purpose of this work is to develop a method for reducing errors in information presentation in telecommunication monitoring systems by filtering noise and level fluctuations in time-series samples. The problem is addressed with wavelet processing. In particular, the impact of time-series fluctuations is reduced by computing the approximation coefficients of the n-th level, which corresponds to multi-level statistical processing of the time-series samples and is equivalent to passing the signal through a low-pass filter. A simulator of wavelet-transform processing of time-series samples was developed, and its statistical parameters were investigated. It is shown that wavelet processing of time series, using the approximation coefficients of the wavelet decomposition, increases the accuracy of data presentation. This is further ensured by suppressing the noise component through thresholding of the detail coefficients of the decomposition. The paper investigates the dependence of the approximation-coefficient correlation time on the wavelet decomposition level, as well as the suppression of noise fluctuations in the time-series samples when processing with wavelet decompositions of different levels. An analysis of different criteria and approaches to smoothing based on threshold processing of the detail coefficients shows that smoothing a time series involves an optimal choice of an adaptive penalty threshold level. The presented smoothing results with an adaptive penalty threshold show that the signal-to-noise ratio increased by more than 2.53 dB compared with the initial one.
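The scheme described above (keep the approximation coefficients, threshold the detail coefficients, reconstruct) can be illustrated with a one-level Haar transform; a toy sketch with a fixed soft threshold, not the paper's adaptive penalty threshold:

```python
import math

# One-level Haar wavelet denoising: keep approximation coefficients,
# soft-threshold the detail coefficients, then reconstruct.
def haar_denoise(x, thr):
    assert len(x) % 2 == 0
    half = len(x) // 2
    approx = [(x[2*i] + x[2*i+1]) / math.sqrt(2) for i in range(half)]
    detail = [(x[2*i] - x[2*i+1]) / math.sqrt(2) for i in range(half)]
    # soft thresholding: shrink small detail coefficients to zero
    detail = [math.copysign(max(abs(d) - thr, 0.0), d) for d in detail]
    out = []
    for a, d in zip(approx, detail):
        out += [(a + d) / math.sqrt(2), (a - d) / math.sqrt(2)]
    return out

noisy = [1.0, 1.1, 4.0, 3.9, 1.0, 0.9, 4.1, 4.0]
clean = haar_denoise(noisy, thr=0.2)
# small sample-to-sample jitter is removed; each pair is replaced
# by its mean, while the large step between levels is preserved
```

A multi-level version would recurse on the approximation coefficients, matching the n-th-level processing described in the abstract.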


2021 ◽  
pp. 157-190
Author(s):  
Sylvie Parey ◽  
Thi-Thu-Huong Hoang
