Extreme Value Prediction on Decreasing Depth by Means of a Nonlinear Wave Transformation Model

Author(s):  
K. A. Belibassakis ◽  
Ch. N. Stefanakos ◽  
Y. G. Georgiou

In the present work, a weakly nonlinear wave model originally developed by Rebaudengo Landò et al. (1996) is applied to the transformation of wave spectra from offshore to nearshore, and is then systematically used to derive long-term time series of spectral wave parameters on decreasing depth from corresponding offshore wave data. The derived long-term series of nearshore parameters serve as input to a new method, recently developed by Stefanakos & Athanassoulis (2006), for calculating return periods of various level values from nonstationary time series data. The latter method is based on a new definition of the return period that uses the MEan Number of Upcrossings of the level x* (MENU method), and it has been shown to yield predictions that are more realistic than those of traditional methods. To examine the effects of bottom topography on the nearshore extreme-value predictions, Roseau (1976) bottom profiles are used, for which analytical expressions for the reflection and transmission coefficients are available. A parametric (JONSWAP) model is used to synthesize offshore spectra from integrated parameters, which are then linearly transformed by means of the transmission coefficient to derive first-order nearshore wave spectra. Second-order random sea states are simulated following the approach of Hudspeth & Chen (1979) (see also Langley 1987, Landò et al. 1996), exploiting the quadratic transfer functions on decreasing depth to calculate the second-order nearshore spectra. Finally, wave parameters are extracted from the nearshore spectra by calculating their first few moments.
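For illustration, the sketch below (not the authors' code) synthesizes a JONSWAP spectrum from the integrated parameters Hs and Tp and recovers wave parameters from its first few spectral moments; the frequency grid, the rescaling to match the target Hs, and the example values are assumptions.

```python
# Minimal sketch: JONSWAP spectrum from (Hs, Tp), then integrated
# parameters from spectral moments, as the abstract describes.
import numpy as np

def jonswap(f, hs, tp, gamma=3.3):
    """JONSWAP spectral density S(f), rescaled so that 4*sqrt(m0) == hs."""
    fp = 1.0 / tp
    sigma = np.where(f <= fp, 0.07, 0.09)
    peak = gamma ** np.exp(-((f - fp) ** 2) / (2 * sigma**2 * fp**2))
    s = f ** -5 * np.exp(-1.25 * (fp / f) ** 4) * peak   # unscaled shape
    m0 = np.trapz(s, f)
    return s * (hs / (4 * np.sqrt(m0))) ** 2             # match target Hs

def spectral_params(f, s):
    """Integrated parameters from the moments m_n = integral of f^n S(f) df."""
    m = [np.trapz(f**n * s, f) for n in range(3)]
    hm0 = 4 * np.sqrt(m[0])          # significant wave height
    tm02 = np.sqrt(m[0] / m[2])      # mean zero-upcrossing period
    return hm0, tm02

f = np.linspace(0.03, 1.0, 2000)     # frequency grid [Hz], assumed
s_off = jonswap(f, hs=3.0, tp=9.0)   # offshore spectrum, example values
# A frequency-dependent transmission coefficient K(f) (e.g. from the Roseau
# profiles) would give the first-order nearshore spectrum as |K|^2 * s_off.
print(spectral_params(f, s_off))
```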

Water ◽  
2021 ◽  
Vol 13 (4) ◽  
pp. 416
Author(s):  
Bwalya Malama ◽  
Devin Pritchard-Peterson ◽  
John J. Jasbinsek ◽  
Christopher Surfleet

We report the results of field and laboratory investigations of stream-aquifer interactions in a watershed along the California coast, undertaken to assess the impact of groundwater pumping for irrigation on stream flows. The methods used include subsurface sediment sampling by direct-push drilling, laboratory permeability and particle-size analyses of sediment, piezometer installation and instrumentation, stream discharge and stage monitoring, pumping tests for aquifer characterization, resistivity surveys, and long-term passive monitoring of stream stage and groundwater levels. Spectral analysis of the long-term water level data was used to assess correlation between the stream and groundwater level time series. The investigations revealed the presence of a thin, low-permeability silt-clay aquitard between the main aquifer and the stream. This suggested a three-layer conceptual model of the subsurface comprising unconfined and confined aquifers separated by an aquitard, which was broadly confirmed by the resistivity surveys and pumping tests, the latter of which indicated leakage across the aquitard. The aquitard was determined to be 2-3 orders of magnitude less permeable than the aquifer, which is indicative of weak stream-aquifer connectivity, a finding confirmed by spectral analysis of the stream-aquifer water level time series. The results illustrate the importance of site-specific investigations and suggest that even in systems where the stream is not in direct hydraulic contact with the producing aquifer, long-term stream depletion can occur due to leakage across low-permeability units. This has implications for the management of stream flows, groundwater abstraction, and water resources during prolonged periods of drought.
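A minimal sketch of the kind of spectral comparison described: magnitude-squared coherence between a stream-stage record and a groundwater-level record, here with synthetic hourly series standing in for the field data.

```python
# Sketch only: synthetic series stand in for the monitored records.
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
t = np.arange(0, 365 * 24)                       # hourly samples, one year
stream = np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.3, t.size)
# Weak connectivity: a heavily damped, lagged copy of the stream signal.
gw = 0.1 * np.roll(stream, 48) + rng.normal(0, 0.3, t.size)

f, cxy = signal.coherence(stream, gw, fs=1.0, nperseg=24 * 30)
# Low coherence across frequencies is consistent with an aquitard that
# hydraulically separates the stream from the pumped aquifer.
print(f[np.argmax(cxy)], cxy.max())
```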


2007 ◽  
pp. 88
Author(s):  
Wataru Suzuki ◽  
Yanfei Zhou

This article represents a first step in filling a large gap in knowledge concerning why Public Assistance (PA) use recently rose so fast in Japan. Specifically, we address this problem not only by performing a Blanchard-Quah decomposition on long-term monthly time series data (1960:04-2006:10), but also by estimating prefecture-level longitudinal data. Two interesting findings emerge from the time series analysis. The first is that the permanent shock imposes a continuously positive impact on the PA rate and is the main driving factor behind the recent increase in welfare use. The second is that the impact of a temporary shock lasts for a long time: the rate of welfare use is quite rigid, and even if the PA rate rises due to temporary shocks, it takes about 8 or 9 years to regain its normal level. On the other hand, estimations on the prefecture-level longitudinal data indicate that the Financial Capability Index (FCI) of the local government and the minimum wage both impose negative effects on the PA rate. We also find that the rapid aging of Japan's population acts, in practice, as a permanent shock, which makes it the most prominent contributor to surging welfare use.
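As a sketch of the identification step, the following applies Blanchard-Quah long-run restrictions to a synthetic bivariate system (a placeholder for the PA rate and an activity variable); the lag order, series, and variable names are assumptions, not the paper's specification.

```python
# Blanchard-Quah long-run identification on placeholder data.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(1)
n = 560                                    # roughly monthly, 1960:04-2006:10
shocks = rng.normal(size=(n, 2))
data = pd.DataFrame(np.cumsum(shocks, axis=0) * [1.0, 0.0] + shocks,
                    columns=["pa_rate", "activity"])  # hypothetical names

res = VAR(data).fit(maxlags=12, ic="aic")
sigma = res.sigma_u.values                # reduced-form residual covariance
a1 = sum(res.coefs)                       # A(1): sum of VAR lag matrices
psi1 = np.linalg.inv(np.eye(2) - a1)      # long-run reduced-form multiplier
# Blanchard-Quah: the long-run structural impact matrix F = psi1 @ B is
# lower triangular, so F is the Cholesky factor of psi1 @ sigma @ psi1.T.
f_lr = np.linalg.cholesky(psi1 @ sigma @ psi1.T)
b = np.linalg.solve(psi1, f_lr)           # contemporaneous impact of shocks
print("permanent-shock impact on pa_rate:", b[0, 0])
```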


2017 ◽  
Author(s):  
Easton R White

Long-term time series are necessary to better understand population dynamics, assess species' conservation status, and make management decisions. However, population data are often expensive to collect, requiring considerable time and resources. When is a population time series long enough to address a question of interest? We determine the minimum time series length required to detect significant increases or decreases in population abundance. To address this question, we use simulation methods and examine 878 populations of vertebrate species. We show that 15-20 years of continuous monitoring are required to achieve a high level of statistical power. For both the simulations and the empirical time series, the minimum time required depends on trend strength, population variability, and temporal autocorrelation. These results point to the importance of sampling populations over long periods of time. We argue that statistical power needs to be considered in monitoring program design and evaluation. Time series shorter than 15-20 years are likely underpowered and potentially misleading.
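The simulation logic can be sketched as follows: generate series of a given length with a known trend and AR(1) noise, test the regression slope, and record the fraction of significant results (the power). Trend size, noise level, and autocorrelation below are illustrative assumptions.

```python
# Power-versus-length simulation for trend detection; values are illustrative.
import numpy as np
from scipy import stats

def power(n_years, trend=0.02, sd=0.2, phi=0.4, reps=1000, alpha=0.05):
    """Fraction of simulated series with a significant regression slope."""
    rng = np.random.default_rng(42)
    t = np.arange(n_years)
    hits = 0
    for _ in range(reps):
        e = np.zeros(n_years)
        for i in range(1, n_years):           # AR(1) noise
            e[i] = phi * e[i - 1] + rng.normal(0, sd)
        y = trend * t + e                     # log-abundance with trend
        slope, _, _, p, _ = stats.linregress(t, y)
        hits += p < alpha
    return hits / reps

for n in (5, 10, 15, 20, 25):
    print(n, "years -> power", power(n))
```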


Media Ekonomi ◽  
2017 ◽  
Vol 20 (1) ◽  
pp. 83
Author(s):  
Jumadin Lapopo

Poverty is a problem in all developing countries, including Indonesia. Among government programs, poverty has become the center of attention in policy at both the regional and national levels. In view of this phenomenon, Islam offers a solution to reduce poverty through zakat. This study aims to analyze the effect of ZIS and Zakat Fitrah on poverty in Indonesia from 1998 to 2010. The data used in this study are secondary time series data; the dependent variable is poverty, and the independent variables are ZIS and Zakat Fitrah. The analysis uses a multiple regression model with classical assumption tests, carried out in EViews 4. The study concludes that the ZIS variable significantly affects the reduction of poverty in Indonesia, although the effect is very small. The Zakat Fitrah variable does not significantly affect poverty reduction in Indonesia, because Zakat Fitrah is intended for consumption rather than long-term needs. The results of this study can be used by zakat managers to develop their management and build a better system for the distribution of zakat, so that the main purpose of zakat, reducing poverty, can be achieved.
Keywords: Poverty, Zakat Fitrah, ZIS.
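A minimal sketch of the study's regression setup, using statsmodels in place of EViews 4 and synthetic placeholder data (the paper's series are not reproduced here):

```python
# Illustrative OLS of poverty on ZIS and Zakat Fitrah; all data synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
years = pd.RangeIndex(1998, 2011, name="year")
df = pd.DataFrame({"zis": np.linspace(0.4, 3.5, len(years)),
                   "fitrah": np.linspace(0.2, 0.6, len(years))}, index=years)
df["poverty"] = 24 - 2.5 * df["zis"] + rng.normal(0, 1.0, len(years))

model = smf.ols("poverty ~ zis + fitrah", data=df).fit()
print(model.summary())   # t-tests mirror the study's significance checks
```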


2012 ◽  
Vol 2012 ◽  
pp. 1-15 ◽  
Author(s):  
Jia Chaolong ◽  
Xu Weixiang ◽  
Wang Futian ◽  
Wang Hanning

The combination of linear and nonlinear methods is widely used in the prediction of time series data. This paper analyzes track-irregularity time series data using grey incidence degree models and data-transformation methods, seeking the underlying relationships within the time series. GM(1,1) is based on a first-order, single-variable linear differential equation; after an adaptive improvement and error correction, it is used to predict the long-term trend of track irregularity at a fixed measuring point, while the stochastic linear AR model, the Kalman filtering model, and an artificial neural network model are applied to predict the short-term trend of track irregularity at the unit-section level. Both the long-term and short-term predictions show that the model is effective and achieves the expected accuracy.
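A minimal sketch of the baseline GM(1,1) model the paper improves upon, fitted by least squares to a placeholder series (the adaptive improvement and error correction are not reproduced):

```python
# GM(1,1): first-order, single-variable grey differential equation.
import numpy as np

def gm11_forecast(x0, steps=3):
    """Fit GM(1,1) to the positive series x0 and forecast `steps` ahead."""
    x1 = np.cumsum(x0)                            # accumulated (AGO) series
    z1 = 0.5 * (x1[1:] + x1[:-1])                 # background values
    B = np.column_stack([-z1, np.ones_like(z1)])  # grey eq.: x0(k) = -a*z1(k) + b
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    k = np.arange(len(x0) + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    return np.diff(x1_hat)[len(x0) - 1:]          # restore by differencing

x0 = np.array([2.8, 3.0, 3.3, 3.7, 4.0, 4.5])     # placeholder measurements
print(gm11_forecast(x0))
```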


2018 ◽  
Vol 7 (4.15) ◽  
pp. 25 ◽  
Author(s):  
Said Jadid Abdulkadir ◽  
Hitham Alhussian ◽  
Muhammad Nazmi ◽  
Asim A Elsheikh

Forecasting time-series data is imperative, especially when planning requires modelling under uncertain knowledge of future events. Recurrent neural network models have been applied in industry and outperform standard artificial neural networks in forecasting, but they fail in long-term time-series forecasting due to the vanishing gradient problem. This study offers a robust solution for long-term forecasting using a special recurrent neural network architecture, the Long Short-Term Memory (LSTM) model, which overcomes the vanishing gradient problem: LSTM is designed to avoid the long-term dependency problem as its default behavior. Empirical analysis is performed using quantitative forecasting metrics and comparative model performance on the forecasted outputs. An evaluation analysis validates that the LSTM model provides better forecasts of the Standard & Poor's 500 Index (S&P 500), in terms of error metrics, than other forecasting models.
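A compact sketch of this kind of LSTM forecasting setup, with sliding windows of past values predicting the next value; the window length, layer sizes, training budget, and the synthetic stand-in for the S&P 500 series are assumptions, not the paper's configuration:

```python
# Sliding-window LSTM forecaster on a synthetic random-walk series.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)
series = np.cumsum(rng.normal(0.05, 1.0, 2000)).astype("float32")
series = (series - series.mean()) / series.std()   # normalize

window = 30
X = np.stack([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]
X = X[..., None]                                   # (samples, timesteps, features)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(window, 1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")        # MSE as the error metric
model.fit(X[:-200], y[:-200], epochs=5, batch_size=64, verbose=0)
print("test MSE:", model.evaluate(X[-200:], y[-200:], verbose=0))
```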


2001 ◽  
Vol 5 (1_suppl) ◽  
pp. 213-236 ◽  
Author(s):  
Emery Schubert

Publications of research concerning continuous emotional responses to music are increasing. The developing interest brings with it a need to understand the problems associated with the analysis of time series data. This article investigates growing concern about the use of conventional Pearson correlations for comparing time series data. Using continuous data collected in response to selected pieces of music, with two emotional dimensions for each piece, two falsification studies were conducted. The first consisted of a factor analysis of the individual responses using the original data set and its first-order differenced transformation. The differenced data aligned with theoretical constraints better than the untransformed data, supporting the use of first-order difference transformations. Using a similar method, the second study specifically investigated the relationship between Pearson correlations, difference transformations, and the critical correlation coefficient above which the conventional correlation analysis remains internally valid. A falsification table was formulated and quantified through a hypothesis index function. The study revealed that correlations of undifferenced data must be greater than 0.75 for a valid interpretation of the relationship between bivariate emotional-response time series. First- and second-order transformations were also investigated and found to be valid for correlation coefficients as low as 0.24. Of the three versions of the data (untransformed, first-order differenced, and second-order differenced), the first-order differenced data produced the fewest problems with serial correlation, whilst remaining a simple and meaningful transformation.
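The core caution can be demonstrated in a few lines: two independent, serially correlated response series can show a spuriously high Pearson correlation that first-order differencing removes (synthetic data; the thresholds in the comments are the article's reported values):

```python
# Spurious correlation of trending series versus their first differences.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n = 300                                      # e.g. 1-second emotion samples
a = np.cumsum(rng.normal(size=n))            # two independent random walks
b = np.cumsum(rng.normal(size=n))

r_raw = stats.pearsonr(a, b)[0]
r_diff = stats.pearsonr(np.diff(a), np.diff(b))[0]
print(f"raw r = {r_raw:.2f}, differenced r = {r_diff:.2f}")
# Per the article, only |r| above ~0.75 for undifferenced data (or ~0.24
# after differencing) should be interpreted as a meaningful relationship.
```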


Author(s):  
Stephen F. Barstow ◽  
Harald E. Krogstad ◽  
Lasse Lønseth ◽  
Jan Petter Mathisen ◽  
Gunnar Mørk ◽  
...  

During the WACSIS field experiment, wave elevation time series were collected from December 1997 to May 1998 on and near the Meetpost Noordwijk platform off the coast of the Netherlands, using an EMI laser, a Saab radar, a Baylor Wave Staff, a Vlissingen step gauge, a Marex radar, and a Directional Waverider. This paper reports and interprets, with the help of simultaneous dual video recordings of the ocean surface, an intercomparison of both single-wave and sea-state parameters.
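As a toy illustration of such an intercomparison, the sketch below computes the sea-state parameter Hm0 = 4*std(elevation) from two synthetic "instruments" observing the same surface with different noise; the noise levels are assumptions, not the instruments' actual characteristics:

```python
# Two noisy views of the same synthetic surface, compared via Hm0.
import numpy as np

rng = np.random.default_rng(11)
t = np.arange(0, 1800, 0.5)                       # 30-min record at 2 Hz
eta = sum(a * np.sin(2 * np.pi * f * t + rng.uniform(0, 2 * np.pi))
          for a, f in [(0.8, 0.08), (0.5, 0.12), (0.3, 0.2)])
laser = eta + rng.normal(0, 0.05, t.size)         # stand-in for one sensor
radar = eta + rng.normal(0, 0.10, t.size)         # stand-in for another

for name, x in [("laser", laser), ("radar", radar)]:
    print(name, "Hm0 =", round(4 * np.std(x), 2), "m")
```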


Author(s):  
Christos N. Stefanakos

In the present work, return periods of various level values of significant wave height in the Gulf of Mexico are given. The predictions are based on a recently published method for nonstationary extreme-value calculations. This enhanced method efficiently exploits the nonstationary modeling of wind or wave time series and a new definition of the return period based on the MEan Number of Upcrossings of the level value x* (MENU method). The procedure is applied to long-term measurements of wave height in the Gulf of Mexico. Two kinds of data have been used: long-term time series of buoy measurements and satellite altimeter data. The measured time series are incomplete, and a novel procedure for filling in missing values is applied before proceeding with the extreme-value calculations. Results are compared with several variants of traditional methods and give more realistic estimates than the traditional predictions. This is in accordance with the results of other methods that also take into account the dependence structure of the examined time series.
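A stationary simplification of the upcrossing count behind the MENU definition can be sketched as follows; the synthetic Hs series, sampling interval, and levels are assumptions (the published method additionally models nonstationarity and missing values):

```python
# Return period as the duration over which the mean number of
# upcrossings of the level x* equals one; synthetic Hs data.
import numpy as np

rng = np.random.default_rng(3)
dt_hours = 3.0                                           # assumed buoy interval
hs = rng.gamma(shape=2.0, scale=1.2, size=20 * 365 * 8)  # ~20 years of Hs [m]

def menu_return_period(x, level, dt):
    """Return period (years) from the empirical upcrossing rate of `level`."""
    up = np.sum((x[:-1] <= level) & (x[1:] > level))     # upcrossing events
    years = len(x) * dt / (24 * 365)
    return np.inf if up == 0 else years / up             # one crossing per T

for x_star in (6.0, 8.0, 10.0):
    print(f"x* = {x_star}: T = {menu_return_period(hs, x_star, dt_hours):.1f} yr")
```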

