ON THE MUTH DISTRIBUTION

2015, Vol 20 (3), pp. 291-310
Author(s): Pedro Jodra, Maria Dolores Jimenez-Gamero, Maria Virtudes Alba-Fernandez

The Muth distribution is a continuous probability distribution introduced in the context of reliability theory. In this paper, some mathematical properties of the model are derived, including analytical expressions for the moment generating function, moments, mode, quantile function and moments of the order statistics. In this regard, the generalized integro-exponential function, the Lambert W function and the golden ratio arise in a natural way. The parameters of the model are estimated by the methods of maximum likelihood, least squares, weighted least squares and moments, which are compared via a Monte Carlo simulation study. A natural extension of the model is considered, as well as an application to a real data set.
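As a concrete illustration of the Lambert W connection mentioned in the abstract, the sketch below inverts the standard one-parameter Muth CDF, F(x) = 1 - exp(a*x - (exp(a*x) - 1)/a) for 0 < a <= 1, to sample by the inverse-transform method. The closed form in the lower branch W_{-1} is derived here from that CDF and is an assumption insofar as the paper's exact parameterization may differ.

```python
# Minimal sketch: inverse-transform sampling for the Muth distribution,
# assuming CDF F(x) = 1 - exp(a*x - (exp(a*x) - 1)/a), x >= 0, 0 < a <= 1.
# Solving F(x) = p gives Q(p) = (1/a) * log(-a * W_{-1}(-(1-p) * exp(-1/a) / a)).
import numpy as np
from scipy.special import lambertw

def muth_quantile(p, a):
    arg = -(1.0 - p) * np.exp(-1.0 / a) / a
    w = lambertw(arg, k=-1).real   # lower branch; real-valued for 0 < a <= 1
    return np.log(-a * w) / a

rng = np.random.default_rng(0)
a = 0.7
u = rng.uniform(size=10_000)
sample = muth_quantile(u, a)       # inverse-transform sample from Muth(a)
print(sample.mean())
```

For 0 < a <= 1 the argument of W stays in [-1/e, 0), so the lower branch is always defined; at a = 1 and p = 0 it touches -1/e exactly.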

2021, Vol 50 (3), pp. 77-105
Author(s): Devendra Kumar, Mazen Nassar, Ahmed Z. Afify, Sanku Dey

A new continuous four-parameter lifetime distribution is introduced by compounding the distribution of the maximum of a sequence of independent, identically distributed exponentiated Lomax random variables with a zero-truncated Poisson random variable; it is called the complementary exponentiated Lomax Poisson (CELP) distribution. The new distribution exhibits decreasing and upside-down bathtub-shaped densities, and it can model lifetime data with decreasing, increasing and upside-down bathtub-shaped failure rates. It includes a number of well-known lifetime distributions as special sub-models, such as the Lomax-zero truncated Poisson, exponentiated Pareto-zero truncated Poisson and Pareto-zero truncated Poisson distributions. A comprehensive account of the mathematical and statistical properties of the new distribution is presented. The model parameters are estimated by the methods of maximum likelihood, least squares, weighted least squares, percentiles, maximum product of spacing and Cramér-von Mises, and the methods are compared via a Monte Carlo simulation study. The performance of the proposed distribution is illustrated by means of two real data sets; for both, the new distribution provides a better fit than the transmuted Lomax, beta exponentiated Lomax, McDonald Lomax, Kumaraswamy Lomax, Weibull Lomax, Burr X Lomax and Lomax distributions.
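The compounding construction described above is easy to simulate directly: draw a zero-truncated Poisson count N, then take the maximum of N exponentiated Lomax draws. The sketch below assumes the common exponentiated Lomax CDF G(x) = (1 - (1 + lam*x)^(-alpha))^theta; the parameter names and the Poisson rate are illustrative, not the paper's notation.

```python
# Hedged sketch of the CELP compounding construction (assumed parameterization).
import numpy as np

rng = np.random.default_rng(1)

def exp_lomax_quantile(u, alpha, theta, lam):
    # Inverse of G(x) = (1 - (1 + lam*x)**(-alpha))**theta
    return ((1.0 - u**(1.0 / theta))**(-1.0 / alpha) - 1.0) / lam

def zero_truncated_poisson(rate, size):
    # Resample zeros; adequate for moderate rates.
    out = rng.poisson(rate, size)
    while np.any(out == 0):
        idx = out == 0
        out[idx] = rng.poisson(rate, idx.sum())
    return out

def celp_sample(size, alpha, theta, lam, rate):
    n = zero_truncated_poisson(rate, size)
    # The max of n i.i.d. draws has CDF G(x)**n, so invert G at U**(1/n).
    u = rng.uniform(size=size) ** (1.0 / n)
    return exp_lomax_quantile(u, alpha, theta, lam)

x = celp_sample(10_000, alpha=2.0, theta=1.5, lam=0.5, rate=3.0)
print(x.mean(), np.quantile(x, [0.5, 0.9]))
```

The one-line trick is that the maximum of n i.i.d. variables with CDF G has CDF G^n, so a single uniform raised to the power 1/n feeds the baseline quantile function.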


Author(s): Parisa Torkaman

The generalized inverted exponential distribution is introduced as a lifetime model with good statistical properties. In this paper, estimation of the probability density function and the cumulative distribution function of this distribution is considered using five estimation methods: the uniformly minimum variance unbiased (UMVU), maximum likelihood (ML), least squares (LS), weighted least squares (WLS) and percentile (PC) estimators. The performance of these estimation procedures is compared by numerical simulations based on the mean squared error (MSE). The simulation studies show that the UMVU estimator performs better than the others, and that when the sample size is large enough the ML and UMVU estimators are almost equivalent and more efficient than the LS, WLS and PC estimators. Finally, the results are illustrated using a real data set.
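A minimal sketch of the ML route in this comparison is given below, assuming the common parameterization of the generalized inverted exponential distribution with CDF F(x) = 1 - (1 - exp(-lam/x))^alpha for x > 0; the plug-in density and distribution estimates whose MSE is simulated in the paper are then obtained by evaluating the model at the fitted parameters.

```python
# Hedged sketch: ML fit of the generalized inverted exponential distribution,
# assuming CDF F(x) = 1 - (1 - exp(-lam/x))**alpha, x > 0.
import numpy as np
from scipy.optimize import minimize

def gied_logpdf(x, alpha, lam):
    z = np.exp(-lam / x)
    return np.log(alpha * lam) - 2 * np.log(x) - lam / x + (alpha - 1) * np.log1p(-z)

def fit_gied_ml(x):
    # Optimize on the log scale to keep both parameters positive.
    nll = lambda p: -np.sum(gied_logpdf(x, np.exp(p[0]), np.exp(p[1])))
    res = minimize(nll, x0=[0.0, 0.0], method="Nelder-Mead")
    return np.exp(res.x)   # (alpha_hat, lam_hat)

rng = np.random.default_rng(2)
alpha0, lam0 = 1.5, 2.0
u = rng.uniform(size=500)
x = -lam0 / np.log(1.0 - (1.0 - u)**(1.0 / alpha0))  # inverse-transform sample
print(fit_gied_ml(x))   # should land near (1.5, 2.0)
```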


2019, Vol 23 (Suppl. 6), pp. 1839-1847
Author(s): Caner Tanis, Bugra Saracoglu

In this paper, the problem of estimating the unknown parameters of the log-Kumaraswamy distribution is considered via Monte Carlo simulations. First, six different estimation methods are described: maximum likelihood, approximate Bayesian, least squares, weighted least squares, percentile, and Cramér-von Mises. Then, a Monte Carlo simulation study is performed to evaluate the performance of these methods in terms of the biases and mean squared errors of the estimators. Furthermore, two real-data applications based on carbon fibers and gauge lengths are presented to compare the fit of the log-Kumaraswamy distribution with those of other fitted statistical distributions.
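Of the six methods listed, the Cramér-von Mises estimator is the least standard; the generic sketch below shows the usual minimum-distance form, which minimizes C(theta) = 1/(12n) + sum_i (F(x_(i); theta) - (2i-1)/(2n))^2 over theta. A Weibull CDF is used as a stand-in target since the log-Kumaraswamy CDF is not reproduced from the paper; any parametric CDF can be plugged in.

```python
# Generic Cramér-von Mises minimum-distance estimator (sketch).
import numpy as np
from scipy.optimize import minimize

def cvm_estimate(x, cdf, theta0):
    xs = np.sort(x)
    n = len(xs)
    grid = (2 * np.arange(1, n + 1) - 1) / (2 * n)
    def crit(theta):
        return 1.0 / (12 * n) + np.sum((cdf(xs, theta) - grid) ** 2)
    return minimize(crit, theta0, method="Nelder-Mead").x

# Stand-in model: Weibull with shape th[0], scale th[1].
weibull_cdf = lambda x, th: 1.0 - np.exp(-(x / np.abs(th[1])) ** np.abs(th[0]))

rng = np.random.default_rng(3)
data = rng.weibull(1.8, size=200) * 2.5          # true shape 1.8, scale 2.5
print(cvm_estimate(data, weibull_cdf, theta0=[1.0, 1.0]))
```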


Entropy, 2021, Vol 23 (4), pp. 446
Author(s): Mahmoud EL-Morshedy, Fahad Sameer Alshammari, Abhishek Tyagi, Iberahim Elbatal, Yasser S. Hamed, et al.

In this article, we propose a new generalization of the odd Weibull-G family obtained by consolidating two notable families of distributions. We derive various mathematical properties of the proposed family, including the quantile function, skewness, kurtosis, moments, incomplete moments, mean deviation, Bonferroni and Lorenz curves, probability weighted moments, moments of the (reversed) residual lifetime, entropy and order statistics. After introducing the general class, two of the corresponding parametric statistical models are outlined. The hazard rate function of the sub-models can take a variety of shapes, such as increasing, decreasing, unimodal and bathtub-shaped, for different values of the parameters. Furthermore, the sub-models of the introduced family are also capable of modelling symmetric and skewed data. Parameter estimation for the special models is discussed by numerous methods, namely maximum likelihood, simple least squares, weighted least squares, Cramér-von Mises, and Bayesian estimation. Under the Bayesian framework, we use informative and non-informative priors to obtain Bayes estimates of the unknown parameters under the squared error and generalized entropy loss functions. An extensive Monte Carlo simulation is conducted to assess the effectiveness of these estimation techniques. The applicability of two sub-models of the proposed family is illustrated by means of two real data sets.
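For readers unfamiliar with odd-G constructions, the sketch below implements one common form from the literature, F(x) = 1 - exp(-a * [G(x)/(1 - G(x))]^b), with an exponential baseline G. This is only the classical odd Weibull-G recipe, not the paper's generalized family, which consolidates two families and may take a different form.

```python
# Illustrative odd Weibull-G generator (one common form; assumed, not the
# paper's generalized family). F(x) = 1 - exp(-a * (G/(1-G))**b).
import numpy as np

def odd_weibull_g_cdf(x, base_cdf, a, b):
    g = base_cdf(x)
    odds = g / (1.0 - g)           # odds transform of the baseline CDF
    return 1.0 - np.exp(-a * odds**b)

def odd_weibull_g_quantile(u, base_quantile, a, b):
    odds = (-np.log1p(-u) / a) ** (1.0 / b)
    return base_quantile(odds / (1.0 + odds))

# Exponential baseline: G(x) = 1 - exp(-x), G^{-1}(p) = -log(1 - p).
base_cdf = lambda x: 1.0 - np.exp(-np.asarray(x, dtype=float))
base_q = lambda p: -np.log1p(-np.asarray(p, dtype=float))

x = np.linspace(0.1, 5.0, 5)
print(odd_weibull_g_cdf(x, base_cdf, a=1.0, b=2.0))
print(odd_weibull_g_quantile(np.array([0.25, 0.5, 0.75]), base_q, a=1.0, b=2.0))
```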


Geophysics, 2006, Vol 71 (5), pp. U67-U76
Author(s): Robert J. Ferguson

The possibility of improving regularization/datuming of seismic data is investigated by treating wavefield extrapolation as an inversion problem. Weighted, damped least squares is then used to produce the regularized/datumed wavefield. Regularization/datuming is extremely costly because of the computation of the Hessian, so an efficient approximation is introduced in which only a limited number of diagonals of the operators involved is computed. Real and synthetic data examples demonstrate the utility of this approach. For synthetic data, regularization/datuming is demonstrated for large extrapolation distances using a highly irregular recording array. Without approximation, regularization/datuming returns a regularized wavefield with reduced operator artifacts when compared to a nonregularizing method such as generalized phase shift plus interpolation (PSPI). Approximate regularization/datuming returns a regularized wavefield at roughly two orders of magnitude lower cost, but it is dip-limited, though in a controllable way, compared to the full method. The Foothills structural data set, a freely available data set from the Rocky Mountains of Canada, demonstrates application to real data. The data are highly irregularly sampled along the shot coordinate and suffer from significant near-surface effects. Approximate regularization/datuming returns common-receiver data that are superior in appearance to conventionally datumed data.
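The weighted, damped least-squares solve at the heart of this approach has the standard normal-equations form m = (G^T W G + eps*I)^(-1) G^T W d. The sketch below shows that solve with a random stand-in operator; the paper's cost saving comes from keeping only a few diagonals of the Hessian G^T W G, which is not reproduced here.

```python
# Minimal numerical sketch of weighted, damped least squares (full Hessian,
# for clarity; the paper approximates it by a limited number of diagonals).
import numpy as np

rng = np.random.default_rng(4)
n_data, n_model = 120, 80
G = rng.standard_normal((n_data, n_model))     # extrapolation operator (stand-in)
m_true = rng.standard_normal(n_model)
d = G @ m_true + 0.1 * rng.standard_normal(n_data)

W = np.diag(rng.uniform(0.5, 1.5, n_data))     # data weights (e.g., trace quality)
eps = 1e-2                                     # damping parameter
H = G.T @ W @ G + eps * np.eye(n_model)        # Hessian of the weighted objective
m_hat = np.linalg.solve(H, G.T @ W @ d)
print(np.linalg.norm(m_hat - m_true) / np.linalg.norm(m_true))  # relative error
```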


2021
Author(s): Lajos Horváth, Zhenya Liu, Gregory Rice, Yuqian Zhao

The problem of detecting change points in the mean of high-dimensional panel data with potentially strong cross-sectional dependence is considered. Under the assumption that the cross-sectional dependence is captured by an unknown number of common factors, a new CUSUM-type statistic is proposed. We derive its asymptotic properties under three scenarios that differ in the extent to which the common factors are asymptotically dominant. With panel data consisting of N cross-sectional time series of length T, the asymptotic results hold under the mild assumption that min{N, T} → ∞, with an otherwise arbitrary relationship between N and T, allowing the results to apply to most panel data examples. Bootstrap procedures are proposed to approximate the sampling distribution of the test statistics. A Monte Carlo simulation study shows that our test outperforms several existing tests in finite samples in a number of cases, particularly when N is much larger than T. The practical application of the proposed results is demonstrated with real-data applications to detecting and estimating change points in the high-dimensional FRED-MD macroeconomic data set.
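To fix ideas, the sketch below computes the textbook CUSUM mean-change statistic for a single series, max_k |S_k - (k/T) S_T| / (sigma * sqrt(T)); the paper's statistic extends this idea to N cross-sections with common factors and is not reproduced here. The i.i.d. variance estimate is a simplifying assumption standing in for a long-run variance estimator.

```python
# Basic single-series CUSUM statistic for a change in mean (sketch).
import numpy as np

def cusum_stat(x):
    T = len(x)
    s = np.cumsum(x)
    k = np.arange(1, T + 1)
    dev = np.abs(s - (k / T) * s[-1])      # partial sums vs. their null expectation
    sigma = np.std(x, ddof=1)              # i.i.d. stand-in for a long-run variance
    return dev.max() / (sigma * np.sqrt(T))

rng = np.random.default_rng(5)
x = np.concatenate([rng.normal(0.0, 1, 100),
                    rng.normal(1.0, 1, 100)])   # mean break at t = 100
print(cusum_stat(x))   # large values indicate a change in mean
```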


2017, Vol 7 (1), pp. 72
Author(s): Lamya A. Baharith

The truncated type I generalized logistic distribution has been used in a variety of applications. In this article, new bivariate truncated type I generalized logistic (BTTGL) distributional models derived from three different copula functions are introduced, and some of their properties are studied. Parametric and semiparametric methods are used to estimate the parameters of the BTTGL models. Maximum likelihood and inference-function-for-margins estimates of the BTTGL parameters are compared with semiparametric estimates using a real data set. Further, a comparison between the BTTGL, bivariate generalized exponential and bivariate exponentiated Weibull models is conducted using the Akaike information criterion and the maximized log-likelihood. An extensive Monte Carlo simulation study is carried out for different parameter values and sample sizes to compare the performance of the parametric and semiparametric estimators based on relative mean squared error.
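The copula route used here follows the standard recipe: sample dependent uniforms from a copula, then push them through the marginal quantile functions. The sketch below uses a Clayton copula (via the conditional-distribution method) and plain logistic margins as stand-ins; the paper's three copulas and its truncated type I generalized logistic margins are not reproduced.

```python
# Hedged sketch: bivariate model = copula + margins (Clayton and logistic
# margins are illustrative stand-ins, not the paper's exact choices).
import numpy as np
from scipy.stats import logistic

rng = np.random.default_rng(6)

def clayton_sample(n, theta):
    # Conditional method: v = [u^-theta * (w^(-theta/(1+theta)) - 1) + 1]^(-1/theta)
    u = rng.uniform(size=n)
    w = rng.uniform(size=n)
    v = (u**(-theta) * (w**(-theta / (1.0 + theta)) - 1.0) + 1.0)**(-1.0 / theta)
    return u, v

u, v = clayton_sample(5_000, theta=2.0)
x = logistic.ppf(u, loc=0.0, scale=1.0)   # margin 1 (stand-in)
y = logistic.ppf(v, loc=1.0, scale=0.5)   # margin 2 (stand-in)
print(np.corrcoef(x, y)[0, 1])            # positive dependence from the copula
```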


2009, Vol 2009, pp. 1-8
Author(s): Janet Myhre, Daniel R. Jeske, Michael Rennie, Yingtao Bi

A heteroscedastic linear regression model is developed from plausible assumptions that describe the time evolution of performance metrics for equipment. The inherent motivation for the associated weighted least squares analysis of the model is an essential and attractive selling point to engineers with an interest in equipment surveillance methodologies. A simple test for the significance of the heteroscedasticity suggested by a data set is derived, and a simulation study is used to evaluate the power of the test and compare it with several other applicable tests designed under different contexts. Tolerance intervals within the context of the model are derived, generalizing well-known tolerance intervals for ordinary least squares regression. Use of the model and its associated analyses is illustrated with an aerospace application in which hundreds of electronic components are continuously monitored by an automated system that flags components suspected of unusual degradation patterns.
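The weighted least-squares step is the standard one: weight each observation by the reciprocal of its error variance and solve the weighted normal equations. The sketch below assumes, purely for illustration, that the noise standard deviation grows linearly with time in service; the paper derives its variance structure from its own degradation assumptions.

```python
# Minimal WLS sketch for a heteroscedastic linear model (assumed variance
# pattern: noise sd proportional to t; illustrative only).
import numpy as np

rng = np.random.default_rng(7)
t = np.linspace(1.0, 10.0, 200)                  # time in service
y = 2.0 + 0.5 * t + rng.normal(0, 0.2 * t)       # error sd grows with t

X = np.column_stack([np.ones_like(t), t])
w = 1.0 / t**2                                   # weights = 1/Var(error), up to scale
WX = X * w[:, None]
beta_wls = np.linalg.solve(X.T @ WX, WX.T @ y)   # (X'WX)^{-1} X'Wy
print(beta_wls)                                  # close to [2.0, 0.5]
```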


Filomat, 2018, Vol 32 (17), pp. 5931-5947
Author(s): Hatami Mojtaba, Alamatsaz Hossein

In this paper, we propose a new transformation of circular random variables based on circular distribution functions, which we call the inverse distribution function (idf) transformation. We show that the Möbius transformation is a special case of our idf transformation. Very general results are provided for the properties of the proposed family of idf transformations, including their trigonometric moments, maximum entropy, random variate generation, finite mixture and modality properties. In particular, we focus our attention on the subfamily obtained when the idf transformation is based on the cardioid circular distribution function, and we investigate its modality and shape properties. In addition, we obtain further statistical properties of the resulting distribution by applying the idf transformation to a random variable following a von Mises distribution. In this way, we introduce the cardioid-von Mises (CvM) distribution and estimate its parameters by the maximum likelihood method. Finally, an application of the CvM family and its inferential methods is illustrated using a real data set containing times of gun crimes in Pittsburgh, Pennsylvania.
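To convey the flavor of a distribution-function-based transformation of angles, the sketch below pushes von Mises angles through the cardioid CDF, F(t) = (t + 2*rho*(sin(t - mu) + sin(mu))) / (2*pi) on [0, 2*pi), and rescales by 2*pi so the result is again an angle. This is an illustrative construction in the spirit of the cardioid-based subfamily; the paper's exact idf definition should be consulted for details.

```python
# Illustrative circular-CDF transformation (cardioid CDF applied to von Mises
# angles); an assumption-labeled sketch, not the paper's exact idf map.
import numpy as np

def cardioid_cdf(t, mu, rho):
    # CDF of density (1 + 2*rho*cos(t - mu)) / (2*pi) on [0, 2*pi), |rho| < 1/2.
    return (t + 2.0 * rho * (np.sin(t - mu) + np.sin(mu))) / (2.0 * np.pi)

rng = np.random.default_rng(8)
theta = rng.vonmises(mu=np.pi, kappa=2.0, size=10_000) % (2.0 * np.pi)
theta_new = 2.0 * np.pi * cardioid_cdf(theta, mu=np.pi, rho=0.2)  # new angles
print(theta_new.min(), theta_new.max())   # stays within [0, 2*pi)
```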


2020, Vol 9 (1), pp. 47-60
Author(s): Samir K. Ashour, Ahmed A. El-Sheikh, Ahmed Elshahhat

In this paper, the Bayesian and non-Bayesian estimation of a two-parameter Weibull lifetime model in the presence of progressive first-failure censored data with binomial random removals is considered. Based on the s-normal approximation to the asymptotic distribution of the maximum likelihood estimators, two-sided approximate confidence intervals for the unknown parameters are constructed. Using gamma conjugate priors, several Bayes estimates and associated credible intervals are obtained relative to the squared error loss function. The proposed estimators cannot be expressed in closed form, but they can be evaluated numerically by suitable iterative procedures. A Bayesian approach is developed using Markov chain Monte Carlo techniques to generate samples from the posterior distributions and, in turn, to compute the Bayes estimates and associated credible intervals. To analyze the performance of the proposed estimators, a Monte Carlo simulation study is conducted. Finally, a real data set is discussed for illustration purposes.
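The MCMC route sketched below uses random-walk Metropolis for a two-parameter Weibull model with independent gamma priors, under complete (uncensored) data for brevity; the progressive first-failure censoring with binomial removals used in the paper changes the likelihood term but not the overall sampling strategy. All tuning constants here are illustrative.

```python
# Hedged sketch: random-walk Metropolis for Weibull(shape, scale) with
# Gamma(a, b) priors, complete data (censoring omitted for brevity).
import numpy as np

rng = np.random.default_rng(9)

def log_post(shape, scale, x, a=1.0, b=1.0):
    if shape <= 0 or scale <= 0:
        return -np.inf
    # Weibull log-likelihood + gamma log-priors, up to an additive constant.
    ll = (len(x) * (np.log(shape) - shape * np.log(scale))
          + (shape - 1) * np.sum(np.log(x)) - np.sum((x / scale)**shape))
    lp = (a - 1) * np.log(shape) - b * shape + (a - 1) * np.log(scale) - b * scale
    return ll + lp

x = rng.weibull(1.5, size=100) * 2.0           # data: shape 1.5, scale 2.0
theta = np.array([1.0, 1.0])                   # initial (shape, scale)
chain = []
for _ in range(5_000):
    prop = theta + rng.normal(0, 0.1, size=2)  # random-walk proposal
    if np.log(rng.uniform()) < log_post(*prop, x) - log_post(*theta, x):
        theta = prop
    chain.append(theta.copy())
post = np.array(chain[1_000:])                 # drop burn-in
print(post.mean(axis=0))  # posterior means = Bayes estimates under squared error loss
```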

