A Method to Characterize Climate, Earth or Environmental Vector Random Processes

Author(s):  
Manuel Cobos Budia ◽  
Pedro Otiñar Morillas ◽  
Pedro Magaña Redondo ◽  
Asunción Baquerizo Azofra

Abstract We propose a methodology to characterize a multivariate non-stationary vector random process that can be used for simulating random realizations that preserve the probabilistic behavior of the original time series. The marginal probability distribution of each component process is assumed to be a piecewise function defined by several weighted parametric probability models. The weights are obtained analytically by ensuring that the probability density function is well defined and continuous at the common endpoints. The probability model is assumed to vary periodically in time over a predefined period by defining the model parameters and the common endpoints as truncated generalized Fourier series. The coefficients of the expansions are obtained with the maximum likelihood method. Three different types of sets of orthogonal functions are tested. The method is applied to three time series with different particularities. First, it is shown to capture well the highly variable freshwater discharges at a dam located in a semiarid zone of Andalucía (Spain), which are influenced not only by climate variability but also by management decisions. Second, for the Wolf sunspot number time series, the Schwabe cycle and time variations close to 7.5 and 17 years are analyzed along a 22-year cycle. Finally, the method is applied to a bivariate (velocity and direction) wind time series observed at a location in the Atlantic Ocean. For this case, the analysis, which was combined with a vector autoregressive model, focuses on assessing how well the methodology replicates the statistical features of the original series. In particular, it is found that it reproduces the marginal and joint distributions, the wind rose, and the duration of sojourns above given thresholds.
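
A minimal sketch of the central idea, under simplifying assumptions: a single lognormal marginal whose log-scale parameter varies periodically as a truncated Fourier series, fitted by maximum likelihood. The authors' piecewise mixture of several parametric models, the analytic weight computation, and the choice of orthogonal basis are not reproduced here; all function and variable names are illustrative.

```python
# Sketch: one marginal parameter varies periodically in time as a truncated Fourier
# series and the expansion coefficients are estimated by maximum likelihood.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import lognorm

def fourier_series(t, coeffs, period=1.0):
    """Truncated Fourier expansion a0 + sum_k (ak*cos + bk*sin), t in [0, period)."""
    a0, rest = coeffs[0], coeffs[1:]
    n = len(rest) // 2
    w = 2.0 * np.pi * np.arange(1, n + 1)[:, None] * t[None, :] / period
    return a0 + rest[:n] @ np.cos(w) + rest[n:] @ np.sin(w)

def neg_log_lik(theta, t, x):
    """theta = [log-sigma, Fourier coefficients of the log-scale parameter]."""
    sigma = np.exp(theta[0])                      # shape parameter, kept constant here
    mu_t = fourier_series(t, theta[1:])           # time-varying log-scale parameter
    return -np.sum(lognorm.logpdf(x, s=sigma, scale=np.exp(mu_t)))

# synthetic example: an annual cycle in the scale parameter
rng = np.random.default_rng(0)
t = rng.uniform(0.0, 1.0, 2000)                   # time of year in [0, 1)
true_mu = 0.5 + 0.8 * np.cos(2 * np.pi * t)
x = lognorm.rvs(s=0.4, scale=np.exp(true_mu), random_state=rng)

theta0 = np.zeros(1 + 1 + 2 * 2)                  # log-sigma + a0 + two harmonics
fit = minimize(neg_log_lik, theta0, args=(t, x), method="Nelder-Mead",
               options={"maxiter": 20000})
print("estimated sigma:", np.exp(fit.x[0]))
print("estimated Fourier coefficients:", fit.x[1:])
```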

Author(s):  
Muhammad Farooq ◽  
Qamar-uz-zaman ◽  
Muhammad Ijaz

The Covid-19 outbreak is spreading day by day and the mortality rate is increasing exponentially in both underdeveloped and developed countries. It has become inevitable for mathematicians to develop models that can describe the rate of infections and deaths in a population. Although many probability models exist, they fail to model different (non-monotonic) shapes of the hazard rate function and often do not provide an adequate fit to lifetime data. In this paper, a new probability model (FEW) is suggested, designed to evaluate death rates in a population. Various statistical properties of FEW are derived, and its parameters are estimated by the maximum likelihood method (MLE). Furthermore, a simulation study is conducted to delineate the significance of the parameters. Using death data from Pakistan due to the Covid-19 outbreak, the application of the proposed model is studied and compared with other existing probability models such as Ex-W, W, Ex, AIFW, and GAPW. The results show that the proposed FEW model provides a much better fit to these data sets than Ex-W, W, Ex, AIFW, and GAPW.
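
The abstract does not give the FEW density, so the sketch below only illustrates the comparison workflow it describes: fit several candidate lifetime distributions by maximum likelihood and rank them by an information criterion. Stock SciPy distributions (exponential, Weibull, exponentiated Weibull) stand in for the competing models, and the data are synthetic.

```python
# Hedged sketch of MLE fitting and model comparison for lifetime data (AIC ranking).
import numpy as np
from scipy import stats

def fit_and_score(dist, data):
    """MLE fit with the location fixed at zero; return (params, AIC)."""
    params = dist.fit(data, floc=0)
    loglik = np.sum(dist.logpdf(data, *params))
    k = len(params) - 1                    # location was fixed, not estimated
    return params, 2 * k - 2 * loglik

rng = np.random.default_rng(1)
deaths = stats.weibull_min.rvs(1.7, scale=30, size=200, random_state=rng)  # toy data

candidates = {"Ex": stats.expon, "W": stats.weibull_min, "Ex-W": stats.exponweib}
for name, dist in sorted(candidates.items()):
    params, aic = fit_and_score(dist, deaths)
    print(f"{name:5s}  AIC = {aic:8.2f}")
```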


Geochronology ◽  
2020 ◽  
Vol 2 (1) ◽  
pp. 119-131
Author(s):  
Pieter Vermeesch

Abstract. The actinide elements U and Th undergo radioactive decay to three isotopes of Pb, forming the basis of three coupled geochronometers. The 206Pb/238U and 207Pb/235U decay systems are routinely combined to improve accuracy. Joint consideration with the 208Pb/232Th decay system is less common. This paper aims to change this. Co-measured 208Pb/232Th is particularly useful for discordant samples containing variable amounts of non-radiogenic ("common") Pb. The paper presents a maximum likelihood algorithm for joint isochron regression of the 206Pb/238U, 207Pb/235U and 208Pb/232Th chronometers. Given a set of cogenetic samples, this total-Pb/U-Th algorithm estimates the common Pb composition and concordia intercept age. U–Th–Pb data can be visualised on a conventional Wetherill or Tera–Wasserburg concordia diagram, or on a 208Pb/232Th vs. 206Pb/238U plot. Alternatively, the results of the new discordia regression algorithm can also be visualised as a 208Pbc/206Pb vs. 238U/206Pb or 208Pbc/207Pb vs. 235U/207Pb isochron, where 208Pbc represents the common 208Pb component. In its most general form, the total-Pb/U-Th algorithm accounts for the uncertainties of all isotopic ratios involved, including the 232Th/238U ratio, as well as the systematic uncertainties associated with the decay constants and the 238U/235U ratio. However, numerical stability is greatly improved when the dependency on the 232Th/238U-ratio uncertainty is dropped. For detrital minerals, it is generally not safe to assume a shared common Pb composition and concordia intercept age. In this case, the total-Pb/U-Th regression method must be modified by tying it to a terrestrial Pb evolution model. Thus, detrital common Pb correction can also be formulated in a maximum likelihood sense. The new method was applied to three published datasets, including low Th/U carbonates, high Th/U allanites and overdispersed monazites. The carbonate example illustrates how the total-Pb/U-Th method achieves a more precise common Pb correction than a conventional 207Pb-based approach does. The allanite sample shows the significant gain in both precision and accuracy that is made when the Th–Pb decay system is jointly considered with the U–Pb system. Finally, the monazite example is used to illustrate how the total-Pb/U-Th regression algorithm can be modified to include an overdispersion parameter. All the parameters in the discordia regression method (including the age and the overdispersion parameter) are strictly positive quantities that exhibit skewed error distributions near zero. This skewness can be accounted for using the profile log-likelihood method or by recasting the regression algorithm in terms of logarithmic quantities. Both approaches yield realistic asymmetric confidence intervals for the model parameters. The new algorithm is flexible enough to accommodate disequilibrium corrections and intersample error correlations when these are provided by the user. All the methods presented in this paper have been added to the IsoplotR software package. This will hopefully encourage geochronologists to take full advantage of the entire U–Th–Pb decay system.
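
The textbook Th–Pb decay relation below clarifies what the 208Pbc component refers to; the notation is illustrative and the paper's own formulation may differ in detail.

```latex
% Standard Th-Pb decay relation used to isolate the common-Pb component:
\[
  {}^{208}\mathrm{Pb}^{*} = {}^{232}\mathrm{Th}\,\bigl(e^{\lambda_{232} t} - 1\bigr),
  \qquad
  {}^{208}\mathrm{Pb}_{c} = {}^{208}\mathrm{Pb} - {}^{232}\mathrm{Th}\,\bigl(e^{\lambda_{232} t} - 1\bigr),
\]
% so that, for cogenetic samples sharing an age t and a common-Pb composition, the
% 208Pbc/206Pb vs. 238U/206Pb points fall on a straight isochron.
```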


2019 ◽  
Author(s):  
Pieter Vermeesch

Abstract. The actinide elements U and Th undergo radioactive decay to three isotopes of Pb, forming the basis of three coupled geochronometers. The 206Pb/238U and 207Pb/235U decay systems are routinely combined to improve accuracy. Joint consideration with the 208Pb/232Th decay system is less common. This paper aims to change this. Adding 208Pb/232Th to the mix is particularly useful for discordant samples containing variable amounts of non-radiogenic (common) Pb. The paper presents a maximum likelihood algorithm for joint isochron regression of the 206Pb/238U, 207Pb/235U, and 208Pb/232Th chronometers. Given a set of cogenetic samples, the algorithm estimates the common Pb composition and concordia intercept age. U-Th-Pb data can be visualised on a conventional Wetherill or Tera-Wasserburg concordia diagram, or on a 208Pb/232Th vs. 206Pb/238U concordia diagram. Alternatively, the results of the new discordia regression algorithm can also be visualised as a 208Pbc/206Pb vs. 238U/206Pb or 208Pbc/207Pb vs. 235U/207Pb isochron, where 208Pbc represents the common 208Pb component. For detrital minerals, it is generally not possible to assume a shared common Pb composition and concordia intercept age. In this case, the U-Th-Pb discordia regression method must be modified by tying it to a mantle evolution model. Thus, detrital common Pb correction can also be formulated in a maximum likelihood sense. The new method was applied to a published monazite dataset with a Th/U ratio of ~10, resulting in a significant radiogenic 208Pb component. The case study therefore represents a worst-case scenario for the new algorithm. Nevertheless, it manages to fit the data very well. The method should work even better in low-Th phases such as carbonates. The degree to which the dispersion of the data around the isochron line matches the analytical uncertainties can be assessed using the mean square of the weighted deviates (MSWD) statistic. A modified four-parameter version of the regression algorithm quantifies this overdispersion, providing potentially valuable geological insight into the processes that control isotopic closure. All the parameters in the discordia regression method (including the age and the overdispersion parameter) are strictly positive quantities that exhibit skewed error distributions near zero. This skewness can be accounted for using the profile log-likelihood method, or by recasting the regression algorithm in terms of logarithmic quantities. Both approaches yield realistic asymmetric confidence intervals for the model parameters. The new algorithm is flexible enough to accommodate disequilibrium corrections and inter-sample error correlations when these are provided by the user. All the methods presented in this paper have been added to the IsoplotR software package. This will hopefully encourage geochronologists to take full advantage of the entire U-Th-Pb decay system.
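
The MSWD mentioned above is the standard reduced chi-square statistic for a weighted fit; a generic definition is given below for reference (the paper's exact weighting also accounts for error correlations).

```latex
% MSWD for n data points and f fitted parameters:
\[
  \mathrm{MSWD} = \frac{1}{n - f}\sum_{i=1}^{n}\frac{r_i^{\,2}}{\sigma_i^{\,2}},
\]
% where r_i is the misfit of the i-th aliquot from the isochron and sigma_i its
% analytical uncertainty. Values near 1 indicate the scatter is explained by the
% analytical uncertainties alone; values much greater than 1 indicate overdispersion.
```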


2021 ◽  
Author(s):  
Habib Rahimi ◽  
G. Tanircan ◽  
Mohammad Shahvar

Abstract In this study, a stochastic simulation model proposed by Yamamoto and Baker (2013) is applied to the Iranian strong motion database, which comprises more than 3828 recordings from the period 1975–2018. Each ground motion is decomposed into wavelet packets. The wavelet packet amplitudes are divided into two groups, and for each group the model parameters are estimated using the maximum likelihood method. Regression coefficients are then obtained relating the model parameters to seismic characteristics such as earthquake magnitude, distance, and site condition. Inter-event residuals of the coefficients and the correlation of the total residuals of those parameters are also calculated. To reconstruct the amplitudes in the time domain and perform the simulation, the inverse wavelet packet transform is used. Finally, a validation test is performed. The comparison of ground motion intensity measures for recorded and simulated time series shows acceptable agreement. The parameters estimated from the simulated data are in good agreement with those from the real data, indicating the validity of the fitted stochastic simulation model. The obtained regression equations can be used to generate ground motions for future earthquake scenarios in Iran.
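
A minimal sketch of the decompose / modify / reconstruct loop that wavelet-packet-based simulation relies on, using PyWavelets. It is not the Yamamoto and Baker (2013) model itself: here the packet amplitudes are simply perturbed at random, whereas the study regresses their statistics on magnitude, distance, and site condition; the wavelet family and decomposition level are illustrative choices.

```python
import numpy as np
import pywt

rng = np.random.default_rng(0)
record = rng.standard_normal(4096)                 # stand-in for an accelerogram

# forward wavelet packet transform
wp = pywt.WaveletPacket(data=record, wavelet="db4", mode="symmetric", maxlevel=6)
nodes = wp.get_level(6, order="natural")

# perturb packet amplitudes (placeholder for sampling them from a fitted model)
for node in nodes:
    node.data = node.data * rng.lognormal(mean=0.0, sigma=0.1, size=node.data.shape)

# inverse wavelet packet transform back to the time domain
simulated = wp.reconstruct(update=False)[: len(record)]
print(simulated.shape)
```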


1996 ◽  
Vol 6 ◽  
pp. 175-212 ◽  
Author(s):  
Timothy W. Amato

In this article, the mathematical and probabilistic foundations of Gary King's “generalized event count” (GEC) model for dealing with unequally dispersed event count data are explored. It is shown that the GEC model is a probability model that joins together the binomial, negative binomial, and Poisson distributions. Some aspects of the GEC's reparameterization are described and extended and it is shown how different reparameterizations lead to different interpretations of the dispersion parameter. The common mathematical and statistical structure of “unequally dispersed” event count models as models that require estimation of the “number of trials” parameter along with the “probability” component is derived. Some questions pertaining to estimation of this class of models are raised for future discussion.
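
One common way of summarizing the dispersion regimes the article discusses is sketched below; this is schematic, and King's GEC uses a particular reparameterization of the dispersion parameter rather than this exact form.

```latex
\[
  \mathbb{E}(Y_i) = \mu_i, \qquad \operatorname{Var}(Y_i) = \sigma^{2}\mu_i,
  \quad\text{with}\quad
  \begin{cases}
    \sigma^{2} = 1 & \text{Poisson (equidispersion)}\\
    \sigma^{2} > 1 & \text{negative-binomial-like (overdispersion)}\\
    \sigma^{2} < 1 & \text{binomial-like (underdispersion).}
  \end{cases}
\]
```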


2007 ◽  
Vol 135 (3) ◽  
pp. 877-890 ◽  
Author(s):  
Nazario D. Ramirez-Beltran ◽  
William K. M. Lau ◽  
Amos Winter ◽  
Joan M. Castro ◽  
Nazario Ramirez Escalante

Abstract A new algorithm that relies on probability and empirical models is proposed to predict the level of rainfall (above normal, normal, and below normal) in Puerto Rico. The algorithm includes a theoretical probability model whose parameters are expressed as regression equations containing observed meteorological variables. Six rainfall stations were used in this study to implement and assess the reliability of the models. The stations, located throughout Puerto Rico, have monthly records that extend back 101 yr. The maximum likelihood method is used to estimate the parameters of the empirical probability models. A variable selection (VS) algorithm identifies the minimum number of variables that maximize the correlation between predictors and the predictand. The VS algorithm is used to identify the initial point, and the likelihood is then maximized using a sequential quadratic programming algorithm. Ten years of cross validation were applied to the results from the six stations. The proposed method outperforms both climatology and damped persistence models. Results suggest that the methodology implemented here can be used as a potential tool to predict the level of rainfall at any station located on a tropical island, provided that at least 50 yr of monthly rainfall observations are available. Model analyses show that meteorological indices can be used to predict rainfall stages.
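
A hedged sketch of the estimation step only: a simple ordered-logit-style model for the three rainfall levels whose parameters depend on meteorological predictors, fitted by maximum likelihood with SciPy's SLSQP optimizer (a sequential quadratic programming method). The authors' actual probability model, predictor set, and variable selection step are not reproduced; everything below is illustrative.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

def neg_log_lik(theta, X, y):
    """theta = [beta (one per predictor), cut1, log(gap to cut2)]."""
    p = X.shape[1]
    beta, c1 = theta[:p], theta[p]
    c2 = c1 + np.exp(theta[p + 1])                 # enforce ordered cutpoints
    eta = X @ beta
    p_le1 = expit(c1 - eta)                        # P(level <= below normal)
    p_le2 = expit(c2 - eta)                        # P(level <= normal)
    probs = np.column_stack([p_le1, p_le2 - p_le1, 1.0 - p_le2]).clip(1e-12, 1.0)
    return -np.sum(np.log(probs[np.arange(len(y)), y]))

rng = np.random.default_rng(2)
X = rng.standard_normal((600, 3))                  # stand-in meteorological indices
latent = X @ np.array([0.8, -0.5, 0.3]) + rng.logistic(size=600)
y = np.digitize(latent, [-0.5, 0.5])               # 0, 1, 2 = below / normal / above

theta0 = np.zeros(X.shape[1] + 2)
fit = minimize(neg_log_lik, theta0, args=(X, y), method="SLSQP")
print("converged:", fit.success, "coefficients:", np.round(fit.x[:3], 2))
```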


2018 ◽  
Vol 7 (2) ◽  
pp. 139-150 ◽  
Author(s):  
Adekunlé Akim Salami ◽  
Ayité Sénah Akoda Ajavon ◽  
Mawugno Koffi Kodjo ◽  
Seydou Ouedraogo ◽  
Koffi-Sa Bédja

In this article, we introduce a new approach based on the graphical method (GPM), the maximum likelihood method (MLM), the energy pattern factor method (EPFM), the empirical method of Justus (EMJ), the empirical method of Lysen (EML) and the moment method (MOM), using the even or odd classes of the wind speed distribution histogram with a 1 m/s bin size to estimate the Weibull parameters. This new approach is compared on the basis of the resulting mean wind speed and its standard deviation using seven reliable statistical indicators (RPE, RMSE, MAPE, MABE, R2, RRMSE and IA). The results indicate that this new approach is adequate for estimating the Weibull parameters and can outperform GPM, MLM, EPFM, EMJ, EML and MOM applied to the full wind speed time series collected over the same period. The study also found a linear relationship between the Weibull parameters K and C estimated by MLM, EPFM, EMJ, EML and MOM using odd or even class wind speed time series and those obtained by applying these methods to all classes (both even and odd bins) of the wind speed time series. Another interesting feature of this approach is the reduction in data size, which eventually leads to a reduced processing time.
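
For reference, the sketch below applies one of the estimators named above, the empirical method of Justus (k = (sigma/vbar)^(-1.086), c = vbar/Gamma(1+1/k)), to a synthetic wind speed sample and checks the mean and standard deviation implied by the fitted parameters. The odd/even histogram-class selection proposed in the article is not reproduced here.

```python
import numpy as np
from scipy.special import gamma
from scipy.stats import weibull_min

rng = np.random.default_rng(3)
v = weibull_min.rvs(2.0, scale=7.0, size=5000, random_state=rng)  # synthetic wind speeds

vbar, sigma = v.mean(), v.std(ddof=1)

# empirical method of Justus (EMJ)
k = (sigma / vbar) ** (-1.086)
c = vbar / gamma(1.0 + 1.0 / k)

# mean and standard deviation implied by the fitted Weibull(k, c)
mean_w = c * gamma(1.0 + 1.0 / k)
std_w = c * np.sqrt(gamma(1.0 + 2.0 / k) - gamma(1.0 + 1.0 / k) ** 2)

print(f"EMJ estimates: k = {k:.2f}, c = {c:.2f} m/s")
print(f"sample mean/std: {vbar:.2f}/{sigma:.2f}; Weibull-implied: {mean_w:.2f}/{std_w:.2f}")
```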


2021 ◽  
Vol 13 (2) ◽  
pp. 542
Author(s):  
Tarate Suryakant Bajirao ◽  
Pravendra Kumar ◽  
Manish Kumar ◽  
Ahmed Elbeltagi ◽  
Alban Kuriqi

Estimating the sediment flow rate from a drainage area plays an essential role in better watershed planning and management. In this study, the validity of simple and wavelet-coupled Artificial Intelligence (AI) models was analyzed for daily Suspended Sediment Concentration (SSC) estimation in the highly dynamic Koyna River basin of India. Simple AI models such as the Artificial Neural Network (ANN) and Adaptive Neuro-Fuzzy Inference System (ANFIS) were developed by supplying the original time series data as input without pre-processing through a Wavelet (W) transform. The hybrid wavelet-coupled W-ANN and W-ANFIS models were developed by supplying the decomposed time series sub-signals obtained using the Discrete Wavelet Transform (DWT). In total, three mother wavelets, namely Haar, Daubechies, and Coiflets, were employed to decompose the original time series data into multi-frequency sub-signals at an appropriate decomposition level. Quantitative and qualitative performance evaluation criteria were used to select the best model for daily SSC estimation. The reliability of the developed models was also assessed using uncertainty analysis. It was revealed that data pre-processing using the wavelet transform significantly improves the model's predictive efficiency and reliability. In this study, the Coiflet wavelet-coupled ANFIS model was found to be superior to the other models and can be applied for daily SSC estimation of highly dynamic rivers. According to the sensitivity analysis, the previous day's SSC (St-1) is the most crucial input variable for daily SSC estimation in the Koyna River basin.
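
A hedged sketch of the wavelet-coupled setup: the input series is decomposed into sub-signals with a wavelet transform and the sub-signals are used as model inputs. For convenience, a stationary wavelet transform is used here so that the sub-signals stay aligned with the original samples, whereas the study uses DWT decomposition at an appropriate level; scikit-learn's MLPRegressor stands in for the ANN/ANFIS models, and the wavelet family, level, and lag structure are illustrative choices.

```python
import numpy as np
import pywt
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(4)
n = 2000
ssc = np.cumsum(rng.standard_normal(n)) + 50.0        # stand-in daily SSC series

# decomposition into multi-frequency sub-signals (length divisible by 2**level)
level = 3
coeffs = pywt.swt(ssc, wavelet="coif1", level=level)  # [(cA3, cD3), (cA2, cD2), (cA1, cD1)]
subsignals = np.column_stack([c for pair in coeffs for c in pair])

# predict SSC(t) from the sub-signals at t-1 (previous-day information only)
X, y = subsignals[:-1], ssc[1:]
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X[:1500], y[:1500])
print("test R^2:", round(model.score(X[1500:], y[1500:]), 3))
```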


Open Physics ◽  
2020 ◽  
Vol 18 (1) ◽  
pp. 439-447
Author(s):  
Lijie Yan ◽  
Xudong Liu

Abstract To a large extent, the load balancing algorithm determines the performance of a computer cluster. This paper reviews common load balancing algorithms and discusses their advantages and drawbacks. In addition, it proposes a balancing algorithm based on load prediction. Using a dynamic exponential smoothing model, the algorithm derives the appropriate smoothing coefficient from the current load time series of each server node and predicts the node's load at the next moment. The dispatcher then schedules user requests according to the predicted load values. The OPNET network simulation software is used for testing, and the results show that the algorithm achieves higher load balancing efficiency and a better load balancing effect.
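
A minimal sketch of prediction-based dispatching under stated assumptions: each node's load is forecast with exponential smoothing and the next request goes to the node with the lowest predicted load. The paper's dynamic selection of the smoothing coefficient from the recent load series is only mimicked crudely here, by picking the coefficient with the smallest one-step forecast error; node names and load values are synthetic.

```python
import numpy as np

def smooth_forecast(history, alphas=(0.2, 0.5, 0.8)):
    """Pick the alpha with the smallest one-step-ahead error; return the next forecast."""
    best_err, best_level = np.inf, None
    for alpha in alphas:
        level, err = history[0], 0.0
        for x in history[1:]:
            err += (x - level) ** 2                # error of forecasting x with current level
            level = alpha * x + (1 - alpha) * level
        if err < best_err:
            best_err, best_level = err, level
    return best_level                              # forecast of the next load value

rng = np.random.default_rng(5)
loads = {f"node{i}": list(50 + 10 * rng.standard_normal(20)) for i in range(3)}

def dispatch(loads):
    forecasts = {node: smooth_forecast(np.array(h)) for node, h in loads.items()}
    return min(forecasts, key=forecasts.get)

print("next request goes to:", dispatch(loads))
```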


Author(s):  
Andrew Q. Philips

In cross-sectional time-series data with a dichotomous dependent variable, failing to account for duration dependence when it exists can lead to faulty inferences. A common solution is to include duration dummies, polynomials, or splines to proxy for duration dependence. Because creating these is not easy for the common practitioner, I introduce a new command, mkduration, that is a straightforward way to generate a duration variable for binary cross-sectional time-series data in Stata. mkduration can handle various forms of missing data and allows the duration variable to easily be turned into common parametric and nonparametric approximations.
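
The counter logic that such a duration variable encodes is sketched below in pandas rather than Stata; this is not the mkduration command itself, and the column names ('unit', 'year', 'event') are hypothetical. Within each unit, the counter gives the number of periods since the last event (or since the start of the sample), resetting after each event.

```python
import pandas as pd

def add_duration(df, unit="unit", time="year", event="event"):
    """Add a column counting periods since the last event within each unit."""
    df = df.sort_values([unit, time]).copy()
    def spell_counter(s):
        # a new spell starts the period after each event; count within each spell
        spell_id = s.shift(fill_value=0).cumsum()
        return s.groupby(spell_id).cumcount()
    df["duration"] = df.groupby(unit)[event].transform(spell_counter)
    return df

example = pd.DataFrame({
    "unit": ["A"] * 6 + ["B"] * 4,
    "year": list(range(2000, 2006)) + list(range(2000, 2004)),
    "event": [0, 0, 1, 0, 0, 1, 0, 1, 0, 0],
})
print(add_duration(example))
```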

