Stationary Wavelet with Double Generalised Rayleigh Distribution

2021 ◽  
Vol 2021 ◽  
pp. 1-14
Author(s):  
Hassan M. Aljohani

Statistics provides mathematical tools for scientific investigation, such as engineering, medical, and biological analyses, and statistical methods are continually improved as statisticians look for more accurate ways to solve problems. One such problem is parameter estimation, which can be expressed as an inverse problem when the independent variables are highly correlated. The principal goal of this paper is to interpret the parameter estimates of the double generalized Rayleigh distribution in a regression model using a wavelet basis. The standard, likelihood-based version of such regression methods is difficult to use in practice: noise usually makes the estimates unstable, and multicollinearity leads to widely varying estimates, so recovering features of the truth is complicated. It is therefore reasonable to use a mixed method that combines a fully Bayesian approach with a wavelet basis. The usual wavelet procedure is to choose a basis, compute the wavelet coefficients, use those coefficients to remove Gaussian noise, and recover the data by inverting the coefficients. Wavelet bases that provide a shift-invariant wavelet transform are considered, as they simultaneously improve smoothness, recovery, and squared-error performance. The proposed method combines a penalized maximum likelihood approach, a penalty term, and wavelet tools. Real data are modeled using double generalized Rayleigh distributions and used to estimate the wavelet coefficients of the sample numerically. Wavelet approaches are recommended in practical applications because they reduce the noise level; this matters because real data are often corrupted by noise, a significant cause of most numerical estimation problems. A simulation study using MCMC tools is carried out to estimate the underlying features, an essential task in statistics.
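The wavelet workflow the abstract describes (shift-invariant transform, thresholding of coefficients to remove Gaussian noise, inversion to recover the data) can be sketched with a one-level stationary Haar transform. This is a minimal illustration, not the paper's actual basis or estimator; the signal and threshold below are assumed for demonstration.

```python
import numpy as np

def haar_swt_denoise(x, threshold):
    """One-level shift-invariant (stationary) Haar wavelet denoising:
    forward transform, soft-threshold the detail coefficients, invert,
    and average the two shifted reconstructions (cycle spinning)."""
    shifted = np.roll(x, -1)
    approx = (x + shifted) / np.sqrt(2)   # undecimated approximation coefficients
    detail = (x - shifted) / np.sqrt(2)   # undecimated detail coefficients
    # soft-threshold the detail coefficients to suppress Gaussian noise
    detail = np.sign(detail) * np.maximum(np.abs(detail) - threshold, 0.0)
    rec = (approx + detail) / np.sqrt(2)
    rec_shifted = np.roll((approx - detail) / np.sqrt(2), 1)
    return (rec + rec_shifted) / 2.0
```

With a zero threshold the transform reconstructs the signal exactly (circular boundary); with a universal-style threshold it trades a little smoothing bias for a large reduction in noise variance.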


2015 ◽  
Vol 38 (2) ◽  
pp. 453-466 ◽  
Author(s):  
Hugo S. Salinas ◽  
Yuri A. Iriarte ◽  
Heleno Bolfarine

In this paper we introduce a new distribution for modeling positive data with high kurtosis. This distribution can be seen as an extension of the exponentiated Rayleigh distribution. The extension builds on the quotient of two independent random variables: an exponentiated Rayleigh in the numerator and a Beta(q, 1), with q > 0, in the denominator. The result is called the slashed exponentiated Rayleigh random variable. There is evidence that this new distribution is more flexible in modeling kurtosis than the exponentiated Rayleigh distribution. The properties of the distribution are studied, and the parameters are estimated by maximum likelihood. An application with real data reveals the good performance of this new distribution.
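The quotient construction above is easy to simulate. The sketch below assumes the exponentiated Rayleigh CDF F(x) = (1 - exp(-λx²))^α, one common parameterization (the paper's may differ), and uses the fact that a Beta(q, 1) variate can be drawn as U^(1/q) for U uniform.

```python
import numpy as np

def r_exp_rayleigh(n, alpha, lam, rng):
    """Exponentiated Rayleigh draws via the inverse of F(x) = (1 - exp(-lam*x^2))^alpha."""
    u = rng.random(n)
    return np.sqrt(-np.log(1.0 - u ** (1.0 / alpha)) / lam)

def r_slashed_exp_rayleigh(n, alpha, lam, q, rng):
    """Slashed version: exponentiated Rayleigh divided by an independent Beta(q, 1)."""
    x = r_exp_rayleigh(n, alpha, lam, rng)
    b = rng.random(n) ** (1.0 / q)   # Beta(q, 1) via inverse CDF u^(1/q)
    return x / b
```

Dividing by a Beta(q, 1) variate in (0, 1) stretches the right tail, which is exactly the extra kurtosis flexibility the abstract refers to; smaller q gives heavier tails.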



Stats ◽  
2021 ◽  
Vol 4 (1) ◽  
pp. 28-45
Author(s):  
Vasili B.V. Nagarjuna ◽  
R. Vishnu Vardhan ◽  
Christophe Chesneau

In this paper, a new five-parameter distribution is proposed using the functionalities of the Kumaraswamy generalized family of distributions and the features of the power Lomax distribution. It is named the Kumaraswamy generalized power Lomax distribution. In a first approach, we derive its main probability and reliability functions, with a visualization of its modeling behavior under different parameter combinations. As a prime quality, the corresponding hazard rate function is very flexible: it possesses decreasing, increasing, and inverted (upside-down) bathtub shapes, and decreasing-increasing-decreasing shapes are also observed. Some important characteristics of the Kumaraswamy generalized power Lomax distribution are derived, including moments, entropy measures, and order statistics. The second approach is statistical. The maximum likelihood estimates of the parameters are described, and a brief simulation study shows their effectiveness. Two real data sets are used to show how the proposed distribution can be applied concretely; parameter estimates are obtained and fitting comparisons are performed with other well-established Lomax-based distributions. The Kumaraswamy generalized power Lomax distribution turns out to fit best, capturing fine details in the structure of the data considered.
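The Kumaraswamy-G construction composes a baseline CDF G with F(x) = 1 - (1 - G(x)^a)^b. A minimal sketch, assuming one common parameterization of the power Lomax baseline (the paper's may differ):

```python
import numpy as np

def power_lomax_cdf(x, alpha, lam, beta):
    """Power Lomax baseline CDF, G(x) = 1 - (lam / (lam + x^beta))^alpha (assumed form)."""
    return 1.0 - (lam / (lam + x**beta)) ** alpha

def kw_power_lomax_cdf(x, a, b, alpha, lam, beta):
    """Kumaraswamy generalized family applied to the baseline: F(x) = 1 - (1 - G(x)^a)^b."""
    g = power_lomax_cdf(x, alpha, lam, beta)
    return 1.0 - (1.0 - g**a) ** b
```

Because a and b act on G inside and outside the power, they reshape the tails and the hazard rate independently of the three baseline parameters, which is the source of the flexible hazard shapes described above.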



2021 ◽  
Vol 0 (0) ◽  
Author(s):  
Tianyi Wang ◽  
Chengxiang Wang ◽  
Kequan Zhao ◽  
Wei Yu ◽  
Min Huang

The limited-angle computed tomography (CT) reconstruction problem arises in practical applications where the scanning environment or the CT imaging device restricts the angular range. Images reconstructed by conventional analytical algorithms then exhibit artifacts. Although regularization strategies such as total variation (TV) minimization have been proposed to suppress these artifacts, distortion remains in some edge portions of the image. Guided image filtering (GIF) has the advantage of smoothing the image while preserving edges. To further improve image quality and protect edges, we propose a coupling method that combines ℓ0 gradient minimization and GIF. An intermediate result obtained by ℓ0 gradient minimization is taken as the guidance image of the GIF, and GIF is then used to filter the result reconstructed by the simultaneous algebraic reconstruction technique (SART) with a nonnegativity constraint. It should be stressed that the guidance image is dynamically updated as the iteration proceeds, which transfers the edges to the filtered image. Simulation and real-data experiments are used to evaluate the proposed method. Experimental results show that our method has advantages in suppressing the artifacts of limited-angle CT and in preserving image edges.
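The guided-filtering step is the standard local linear model of He et al.: within each window, fit the output as q = a·guide + b and average the coefficients. A minimal numpy sketch follows; the box radius and regularization eps are illustrative choices, and the surrounding SART and ℓ0 gradient minimization stages are not reproduced here.

```python
import numpy as np

def box_mean(a, r):
    """Mean over a (2r+1)x(2r+1) window with edge replication, via 2-D cumulative sums."""
    k = 2 * r + 1
    p = np.pad(a.astype(float), r, mode="edge")
    c = np.cumsum(np.cumsum(p, axis=0), axis=1)
    c = np.pad(c, ((1, 0), (1, 0)))
    return (c[k:, k:] - c[:-k, k:] - c[k:, :-k] + c[:-k, :-k]) / (k * k)

def guided_filter(guide, src, r=2, eps=1e-3):
    """Guided image filter: per-window linear model q = a*guide + b,
    with a = cov(guide, src) / (var(guide) + eps) and coefficients box-averaged."""
    mean_I, mean_p = box_mean(guide, r), box_mean(src, r)
    var_I = box_mean(guide * guide, r) - mean_I ** 2
    cov_Ip = box_mean(guide * src, r) - mean_I * mean_p
    a = cov_Ip / (var_I + eps)
    b = mean_p - a * mean_I
    return box_mean(a, r) * guide + box_mean(b, r)
```

In flat regions of the guide, var(guide) ≈ 0 so a ≈ 0 and the filter smooths; across a strong edge of the guide, a stays large and the edge is transferred to the output, which is why an ℓ0-smoothed intermediate image works well as the guidance.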



Risks ◽  
2021 ◽  
Vol 9 (4) ◽  
pp. 70
Author(s):  
Małgorzata Just ◽  
Krzysztof Echaust

The appropriate choice of a threshold level, which separates the tails of the probability distribution of a random variable from its middle part, is considered to be a very complex and challenging task. This paper provides an empirical study on various methods of the optimal tail selection in risk measurement. The results indicate which method may be useful in practice for investors and financial and regulatory institutions. Some methods that perform well in simulation studies, based on theoretical distributions, may not perform well when real data are in use. We analyze twelve methods with different parameters for forty-eight world indices using returns from the period of 2000–Q1 2020 and four sub-periods. The research objective is to compare the methods and to identify those which can be recognized as useful in risk measurement. The results suggest that only four tail selection methods, i.e., the Path Stability algorithm, the minimization of the Asymptotic Mean Squared Error approach, the automated Eyeball method with carefully selected tuning parameters and the Hall single bootstrap procedure may be useful in practical applications.
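Most of the threshold-selection methods above operate on the path of a tail estimator (typically the Hill estimator) as a function of k, the number of upper order statistics. The sketch below computes that path and picks the k with the locally most stable estimates; this plateau rule is a loose simplification of the published Path Stability algorithm, not a faithful implementation.

```python
import numpy as np

def hill_estimates(losses, k_max):
    """Hill estimates of the extreme value index for k = 1..k_max upper order statistics."""
    logs = np.log(np.sort(losses)[::-1])   # log of order statistics, largest first
    return np.array([logs[:k].mean() - logs[k] for k in range(1, k_max + 1)])

def stable_k(estimates, window=30):
    """Pick the k whose surrounding window of Hill estimates varies least
    (a crude plateau heuristic, assumed here for illustration)."""
    stds = np.array([estimates[i:i + window].std()
                     for i in range(len(estimates) - window)])
    return int(np.argmin(stds)) + window // 2 + 1
```

For exact Pareto data with tail index 2 the Hill path is flat around the true extreme value index 0.5, so any plateau-based rule lands in a sensible region; on real returns the path is far noisier, which is what makes threshold choice hard in practice.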



1994 ◽  
Vol 21 (6) ◽  
pp. 1074-1080 ◽  
Author(s):  
J. Llamas ◽  
C. Diaz Delgado ◽  
M.-L. Lavertu

In this paper, an improved probabilistic method for flood analysis using the probable maximum flood, the beta function, and orthogonal Jacobi polynomials is proposed. The shape of the beta function depends on the sample's characteristics and the bounds of the phenomenon. A series of Jacobi polynomials is then used to improve the beta function, increasing its degree of convergence toward the true flood probability density function. The mathematical model has been tested using a sample of 1000 generated beta random data. Finally, practical applications with real data series from important Quebec rivers have been performed; the model solutions for these rivers show the accuracy of this new method in flood frequency estimation. Key words: probable maximum flood, beta function, orthogonal polynomials, distribution function, flood frequency estimation, data generation, convergence.
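The base step of such a method is fitting a beta density to data rescaled by its known bounds (here, zero and the probable maximum flood). A minimal method-of-moments sketch, assuming known bounds; the Jacobi polynomial correction series is not reproduced:

```python
import numpy as np

def beta_moment_fit(sample, lower, upper):
    """Method-of-moments Beta(p, q) fit after rescaling the sample to [0, 1]
    using known physical bounds (e.g. zero and the probable maximum flood)."""
    z = (np.asarray(sample, dtype=float) - lower) / (upper - lower)
    m, v = z.mean(), z.var()
    common = m * (1.0 - m) / v - 1.0   # from E[Z] = p/(p+q), Var[Z] = pq/((p+q)^2(p+q+1))
    return m * common, (1.0 - m) * common
```

The Jacobi polynomials enter afterwards: they are orthogonal with respect to the beta weight, so correction terms can be added to this base density without refitting it, which is what increases the convergence toward the true flood density.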



2021 ◽  
Vol 8 (1) ◽  
Author(s):  
Paul M. Reeping ◽  
Christopher N. Morrison ◽  
Kara E. Rudolph ◽  
Monika K. Goyal ◽  
Charles C. Branas

Background: Because gun law permissiveness scales were created in different ways, and because of speculation about the politically motivated underpinnings of the various scales, questions have been raised about their reliability.
Methods: We compared seven gun law permissiveness scales, varying by type and source, for an enhanced understanding of the extent to which the choice of scale could affect studies of gun violence outcomes in the United States. Specifically, we evaluated seven different scales: two rankings, two counts, and three scores, arising from a range of sources. We calculated Spearman correlation coefficients for each pair of scales. Cronbach's standardized alpha and Guttman's lambda were calculated to evaluate the relative reliability of the scales, and we re-calculated Cronbach's alpha after systematically omitting each scale to assess whether the omitted scale lowered the internal consistency between scales. Factor analysis was used to determine single-factor loadings and estimates. We also assessed associations between the permissiveness of gun laws and total firearm deaths and suicides in multivariable regression analyses.
Results: All pairs of scales were highly correlated (average Spearman's correlation coefficient r = 0.77) and had high relative reliability (Cronbach's alpha = 0.968, Guttman's lambda = 0.975). All scales load onto a single factor. The choice of scale did not meaningfully change the parameter estimates for the associations between the permissiveness of gun laws and gun deaths and suicides.
Conclusion: Gun law permissiveness scales are highly correlated despite any perceived political agenda, and the choice of scale has little effect on study conclusions related to gun violence outcomes.
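Cronbach's standardized alpha, one of the reliability measures used above, is a simple function of the mean inter-item correlation. A minimal sketch (the study's actual scales and data are not reproduced; the inputs below are illustrative):

```python
import numpy as np

def standardized_alpha(scores):
    """Cronbach's standardized alpha from an (observations x items) score matrix:
    alpha = k * r_bar / (1 + (k - 1) * r_bar), r_bar the mean inter-item correlation."""
    r = np.corrcoef(scores, rowvar=False)
    k = r.shape[0]
    r_bar = r[np.triu_indices(k, 1)].mean()
    return k * r_bar / (1.0 + (k - 1) * r_bar)
```

Re-computing alpha with each item (here, each permissiveness scale) omitted in turn shows whether any single scale drags down internal consistency, which is the check the study performs.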



2015 ◽  
Vol 119 (1218) ◽  
pp. 961-980 ◽  
Author(s):  
P-D. Jameson ◽  
A. K. Cooke

Reduced-order models representing the dynamic behaviour of symmetric aircraft are well known and can easily be derived from the standard equations of motion. In flight testing, accurate measurements of the dependent variables that describe the linearised reduced-order models at a particular flight condition are vital for successful system identification. However, not all of the desired measurements, such as the rate of change of vertical velocity (Ẇ), can be accurately measured in practice. To determine such variables, two possible solutions exist: reconstruction or differentiation. This paper addresses the effect of both methods on the reliability of the parameter estimates. The methods are used to estimate the aerodynamic derivatives of the Aerosonde UAV from a recreated flight-test scenario in Simulink. The methods are then applied and compared using real data obtained from flight tests of the Cranfield University Jetstream 31 (G-NFLA) research aircraft.
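The differentiation route can be as simple as a finite-difference scheme applied to the sampled vertical-velocity channel. A minimal sketch (central differences in the interior, one-sided at the endpoints; the paper's actual processing chain is not reproduced):

```python
import numpy as np

def central_diff(w, dt):
    """Estimate w_dot from uniformly sampled w: central differences in the
    interior, forward/backward differences at the endpoints."""
    wd = np.empty_like(w, dtype=float)
    wd[1:-1] = (w[2:] - w[:-2]) / (2.0 * dt)
    wd[0] = (w[1] - w[0]) / dt
    wd[-1] = (w[-1] - w[-2]) / dt
    return wd
```

The trade-off the paper examines follows directly: differencing amplifies measurement noise at high frequency, whereas reconstruction from the other measured states avoids that amplification at the cost of depending on the assumed model.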



2017 ◽  
Vol 40 (1) ◽  
pp. 165-203 ◽  
Author(s):  
Sanku Dey ◽  
Enayetur Raheem ◽  
Saikat Mukherjee

This article addresses various properties and different frequentist methods of estimation of the unknown parameters of the Transmuted Rayleigh (TR) distribution. Although our main focus is on estimation, various mathematical and statistical properties of the TR distribution are derived, including quantiles, moments, the moment generating function, conditional moments, the hazard rate, mean residual lifetime, mean past lifetime, mean deviation about the mean and median, stochastic ordering, various entropies, the stress-strength parameter, and order statistics. We briefly describe different frequentist estimation approaches, namely maximum likelihood, the method of moments, L-moments, percentile-based estimators, least squares, the method of maximum product of spacings, the method of Cramér-von Mises, and the Anderson-Darling and right-tail Anderson-Darling methods, and compare them using extensive numerical simulations. Monte Carlo simulations are performed to compare the performance of the proposed estimation methods for both small and large samples. Finally, the potential of the model is demonstrated on two real data sets, for which the bias and standard error of the estimates and bootstrap percentile confidence intervals are obtained by bootstrap resampling.
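The transmuted construction is F(x) = G(x)[1 + λ(1 - G(x))] with a Rayleigh baseline G; since F is quadratic in G, exact inverse-CDF sampling only requires solving a quadratic. A sketch, assuming the baseline Rayleigh CDF G(x) = 1 - exp(-x²/(2σ²)) (the article's parameterization may differ):

```python
import numpy as np

def tr_cdf(x, sigma, lam):
    """Transmuted Rayleigh CDF: F(x) = G(x) * (1 + lam*(1 - G(x))),
    with Rayleigh baseline G(x) = 1 - exp(-x^2 / (2*sigma^2)), -1 <= lam <= 1."""
    g = 1.0 - np.exp(-x ** 2 / (2.0 * sigma ** 2))
    return g * (1.0 + lam * (1.0 - g))

def r_tr(n, sigma, lam, rng):
    """Inverse-CDF sampling: solve lam*G^2 - (1+lam)*G + u = 0 for G in [0, 1],
    then invert the Rayleigh baseline."""
    u = rng.random(n)
    if lam == 0.0:
        g = u
    else:
        g = ((1.0 + lam) - np.sqrt((1.0 + lam) ** 2 - 4.0 * lam * u)) / (2.0 * lam)
    return sigma * np.sqrt(-2.0 * np.log(1.0 - g))
```

The same closed-form quantile is what makes the percentile-based and spacings-based estimators in the comparison straightforward to implement for this family.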



2014 ◽  
Vol 11 (2) ◽  
pp. 193-201
Author(s):  
Baghdad Science Journal

This paper considers estimation of the unknown parameters of the generalized Rayleigh distribution model based on singly type-I censored samples. The probability density function of the generalized Rayleigh distribution is defined along with its properties. The maximum likelihood method is used to derive point estimates of all unknown parameters through an iterative procedure, the Newton-Raphson method, and confidence interval estimates are then derived from the Fisher information matrix. Finally, we test whether the current model (GRD) fits a set of real data and compute the survival and hazard functions for these data.
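The Newton-Raphson iteration can be illustrated on a simplified one-parameter version of this problem: estimating the shape α of the generalized Rayleigh CDF F(x) = (1 - exp(-(λx)²))^α under type-I censoring at a fixed time, with the scale λ treated as known. This is an assumption made for the sketch; the paper iterates over all unknown parameters.

```python
import numpy as np

def gr_alpha_newton(x_uncens, n_cens, t_cens, lam, alpha0=1.0, tol=1e-10, max_it=100):
    """Newton-Raphson MLE of the shape alpha of the generalized Rayleigh
    distribution, F(x) = (1 - exp(-(lam*x)^2))^alpha, with n_cens observations
    type-I censored at t_cens and lam known (illustrative simplification)."""
    r = len(x_uncens)
    s = np.log(1.0 - np.exp(-(lam * np.asarray(x_uncens)) ** 2)).sum()
    c = np.log(1.0 - np.exp(-(lam * t_cens) ** 2))   # log G(t_cens), negative
    a = alpha0
    for _ in range(max_it):
        e = np.exp(a * c)                             # G(t_cens)^alpha
        score = r / a + s - n_cens * c * e / (1.0 - e)
        hess = -r / a ** 2 - n_cens * c ** 2 * e / (1.0 - e) ** 2
        step = score / hess
        a -= step
        if abs(step) < tol:
            break
    return a
```

The log-likelihood is strictly concave in α here (the Hessian above is always negative), so the iteration converges quickly from a rough starting value; the observed information -hess at the solution is what the Fisher-information confidence intervals are built from.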



Geophysics ◽  
2014 ◽  
Vol 79 (1) ◽  
pp. V1-V11 ◽  
Author(s):  
Amr Ibrahim ◽  
Mauricio D. Sacchi

We adopted the robust Radon transform to eliminate the erratic incoherent noise that arises in common receiver gathers when simultaneous source data are acquired. The proposed robust Radon transform is posed as an inverse problem with a robust misfit function that is insensitive to erratic noise, which permits the design of Radon algorithms capable of eliminating incoherent noise in common receiver gathers. We also compared nonrobust and robust Radon transforms implemented via a quadratic (ℓ2) or a sparse (ℓ1) penalty term in the cost function. The results demonstrate the importance of incorporating a robust misfit functional in the Radon transform to cope with simultaneous-source interferences. Synthetic and real data examples show that the robust Radon transform produces more accurate data estimates than the least-squares and sparse Radon transforms.
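Robust misfits of this kind are commonly minimized by iteratively reweighted least squares (IRLS), where each equation's weight shrinks with its residual so gross outliers stop influencing the fit. The sketch below applies IRLS to a generic linear inverse problem with an ℓ1 misfit; it is a stand-in for the Radon operator and robust misfit of the paper, not a reproduction of them.

```python
import numpy as np

def irls_l1(A, b, n_iter=30, eps=1e-6):
    """IRLS sketch for min ||A x - b||_1: start from least squares, then
    re-solve weighted normal equations with weights ~ 1/|residual|."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    for _ in range(n_iter):
        r = A @ x - b
        w = 1.0 / np.sqrt(r ** 2 + eps ** 2)   # smooth approximation of 1/|r|
        Aw = A * w[:, None]                    # rows of A scaled by their weights
        x = np.linalg.solve(A.T @ Aw, Aw.T @ b)
    return x
```

Equations carrying erratic interference get tiny weights after the first pass, so the solution is driven by the consistent majority; this is why the robust misfit outperforms a plain ℓ2 misfit when simultaneous-source blending noise is present.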


