Bayesian inference on reliability in a multicomponent stress-strength bathtub-shaped model based on record values

Author(s):  
Abbas Pak ◽  
Nayereh Bagheri Khoolenjani ◽  
Manoj Kumar Rastogi

In the literature, there are well-developed estimation techniques for reliability assessment in multicomponent stress-strength models when information about all the experimental units is available. However, in real applications, only observations that exceed (or fall below) the current record may be recorded. In this paper, assuming that the components of the system follow a bathtub-shaped distribution, we investigate Bayesian estimation of the reliability of a multicomponent stress-strength system when the available data are reported in terms of record values. Considering the squared error, linex and entropy loss functions, various Bayes estimates of the reliability are derived. Because the Bayes estimates do not have closed forms, we use Lindley's method to compute approximate Bayes estimates. Further, for comparison purposes, the maximum likelihood estimate of the reliability parameter is obtained. Finally, simulation studies are conducted to evaluate the performance of the proposed procedures, and an analysis of real data sets is provided.
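
For context, the Bayes estimators under these three loss functions take well-known general forms, which is what Lindley's method approximates when the posterior expectations lack closed forms. A sketch for a generic parameter \theta (the paper's specific reliability parameter and priors are not reproduced here):

\[
\begin{aligned}
&\text{Squared error: } L(\theta,\hat\theta)=(\hat\theta-\theta)^2, && \hat\theta_{SE}=E(\theta\mid\text{data}),\\
&\text{Linex: } L(\theta,\hat\theta)=e^{a(\hat\theta-\theta)}-a(\hat\theta-\theta)-1, && \hat\theta_{L}=-\tfrac{1}{a}\,\ln E\!\left(e^{-a\theta}\mid\text{data}\right),\\
&\text{Entropy: } L(\theta,\hat\theta)\propto(\hat\theta/\theta)^{p}-p\,\ln(\hat\theta/\theta)-1, && \hat\theta_{E}=\left[E\!\left(\theta^{-p}\mid\text{data}\right)\right]^{-1/p}.
\end{aligned}
\]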

Author(s):  
F. Shahsanaei ◽  
A. Daneshkhah

This paper provides Bayesian and classical inference for the stress–strength reliability parameter R = P(Y < X), where X and Y are independently distributed as 3-parameter generalized linear failure rate (GLFR) random variables with different parameters. Owing to the importance of stress–strength models in various fields of engineering, we address the maximum likelihood estimator (MLE) of R and the corresponding interval estimate using efficient numerical methods. The Bayes estimates of R are derived under the squared error loss function. Because the Bayes estimates cannot be expressed in closed form, we employ a Markov chain Monte Carlo procedure to calculate approximate Bayes estimates. To evaluate the performance of the different estimators, extensive simulations are implemented and real datasets are analyzed.
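
As a rough illustration of the quantity being estimated, the Python sketch below approximates R = P(Y < X) by the proportion of paired draws with Y < X; the two samplers are Weibull stand-ins, not the fitted GLFR models or the paper's MCMC output.

# Hypothetical sketch: Monte Carlo approximation of R = P(Y < X).
# The draw_* functions are placeholders; in the paper's setting they would
# be replaced by samplers for the fitted GLFR strength and stress models.
import numpy as np

rng = np.random.default_rng(0)

def draw_strength(size):            # placeholder strength sampler
    return rng.weibull(2.0, size)

def draw_stress(size):              # placeholder stress sampler
    return 0.8 * rng.weibull(1.5, size)

n = 100_000
x = draw_strength(n)
y = draw_stress(n)
r_hat = np.mean(y < x)              # proportion of draws with Y < X
print(f"Monte Carlo estimate of R = P(Y < X): {r_hat:.3f}")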


2020 ◽  
Vol 26 (1) ◽  
pp. 69-82
Author(s):  
Kahina Bedouhene ◽  
Nabil Zougab

A Bayesian procedure for bandwidth selection in kernel circular density estimation is investigated, in which a Markov chain Monte Carlo (MCMC) sampling algorithm is used to compute the Bayes estimates. Under quadratic and entropy loss functions, the proposed method is evaluated through a simulation study and on real data sets already discussed in the literature. The proposed Bayesian approach is very competitive with the existing classical global methods, namely plug-in and cross-validation techniques.
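
For concreteness, a circular kernel density estimate with the von Mises kernel, the usual kernel in this setting, can be written as in the Python sketch below; the concentration parameter kappa plays the role of the smoothing parameter that the paper selects by a Bayesian MCMC procedure (not reproduced here), and the value used below is purely illustrative.

# Minimal sketch of a von Mises kernel circular density estimator.
import numpy as np
from scipy.special import i0

def vonmises_kde(theta_grid, data, kappa):
    # f_hat(t) = sum_i exp(kappa * cos(t - x_i)) / (n * 2*pi * I0(kappa))
    diffs = theta_grid[:, None] - data[None, :]
    return np.exp(kappa * np.cos(diffs)).sum(axis=1) / (len(data) * 2 * np.pi * i0(kappa))

rng = np.random.default_rng(1)
sample = rng.vonmises(mu=0.0, kappa=3.0, size=200)   # toy circular data
grid = np.linspace(-np.pi, np.pi, 361)
density = vonmises_kde(grid, sample, kappa=8.0)      # kappa chosen only for illustration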


Author(s):  
Rahila Yousaf ◽  
Sajid Ali ◽  
Muhammad Aslam

In this article, we aim to estimate the parameters of the transmuted Weibull distribution (TWD) using a Bayesian approach, as the Weibull distribution plays an important role in reliability engineering and life-testing problems. Informative and non-informative priors under the squared error loss function (SELF), precautionary loss function (PLF) and quadratic loss function (QLF) are assumed to estimate the scale, shape and transmutation parameters of the TWD. In addition, we compute Bayesian credible intervals (BCIs). To estimate the parameters, we adopt a Markov chain Monte Carlo (MCMC) technique under uncensored and censored settings with different sample sizes and censoring rates. The posterior risks associated with each estimator are used to compare the performance of the different estimators. Two real data sets are analyzed to illustrate the flexibility of the proposed distribution.
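
Once MCMC draws from the posterior of a parameter are available, the Bayes estimates under these three loss functions reduce to simple posterior summaries, as in the Python sketch below (standard results; the placeholder draws stand in for the TWD posterior sampler, which is not reproduced here).

# Turning posterior draws into Bayes estimates under SELF, PLF and QLF.
import numpy as np

# placeholder posterior draws of a positive parameter
draws = np.random.default_rng(2).gamma(shape=2.0, scale=1.0, size=10_000)

est_self = draws.mean()                              # squared error loss: posterior mean
est_plf = np.sqrt(np.mean(draws**2))                 # precautionary loss: sqrt(E[theta^2])
est_qlf = np.mean(draws**-1) / np.mean(draws**-2)    # quadratic loss: E[theta^-1] / E[theta^-2]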


Author(s):  
Hiba Zeyada Muhammed ◽  
Essam Abd Elsalam Muhammed

In this paper, Bayesian and non-Bayesian estimation of the shape parameter of the inverted Topp-Leone distribution is studied for complete and randomly censored samples. The maximum likelihood estimator (MLE) and Bayes estimator of the unknown parameter are proposed. The Bayes estimates (BEs) are computed under the squared error loss (SEL) function using Markov chain Monte Carlo (MCMC) techniques; the Metropolis-Hastings algorithm is used to obtain them. The asymptotic, bootstrap (p, t), and highest posterior density intervals are computed. A Monte Carlo simulation is performed to compare the performance of the proposed methods, and one real data set is analyzed for illustrative purposes.
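
As an illustration of the sampling step, a generic random-walk Metropolis-Hastings routine in Python looks like the sketch below; the log-posterior is a placeholder, not the inverted Topp-Leone posterior used in the paper.

# Generic random-walk Metropolis-Hastings sampler for a one-parameter posterior.
import numpy as np

def log_post(theta):
    # placeholder log-posterior kernel (here a Gamma(2, 1) kernel), positive support
    return -np.inf if theta <= 0 else np.log(theta) - theta

def metropolis_hastings(n_iter=20_000, start=1.0, step=0.5, seed=3):
    rng = np.random.default_rng(seed)
    chain = np.empty(n_iter)
    current, current_lp = start, log_post(start)
    for i in range(n_iter):
        proposal = current + step * rng.standard_normal()
        prop_lp = log_post(proposal)
        if np.log(rng.uniform()) < prop_lp - current_lp:   # accept/reject step
            current, current_lp = proposal, prop_lp
        chain[i] = current
    return chain

samples = metropolis_hastings()
bayes_estimate_sel = samples[5_000:].mean()   # posterior mean after burn-in (SEL estimate)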


2018 ◽  
Vol 41 (2) ◽  
pp. 251-267 ◽  
Author(s):  
Abbas Pak ◽  
Arjun Kumar Gupta ◽  
Nayereh Bagheri Khoolenjani

In this paper, we study the reliability of a multicomponent stress-strength model in which the components follow the power Lindley distribution. The maximum likelihood estimate of the reliability parameter and its asymptotic confidence interval are obtained. Applying the parametric bootstrap technique, an interval estimate of the reliability is also presented. In addition, the Bayes estimate and the highest posterior density credible interval of the reliability parameter are derived using suitable priors on the parameters. Because there is no closed form for the Bayes estimate, we use the Markov chain Monte Carlo method to obtain an approximate Bayes estimate of the reliability. To evaluate the performance of the different procedures, simulation studies are conducted and an analysis of real data sets is provided.
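
For reference, the multicomponent (s-out-of-k) stress-strength reliability studied in this line of work has the standard form below, where the k component strengths are i.i.d. with CDF F and the common stress has CDF G (a generic sketch; the power Lindley parametrization is not written out here):

\[
R_{s,k}
=\sum_{i=s}^{k}\binom{k}{i}\int_{0}^{\infty}\bigl[1-F(y)\bigr]^{i}\bigl[F(y)\bigr]^{\,k-i}\,dG(y).
\]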


Symmetry ◽  
2021 ◽  
Vol 13 (3) ◽  
pp. 412 ◽  
Author(s):  
Hadeel S. Klakattawi ◽  
Wedad H. Aljuhani

In this article, a new five-parameter distribution, the alpha power exponentiated Weibull-exponential distribution, is proposed based on a newly developed technique. It is of particular interest because its density can take various symmetric and asymmetric shapes. Moreover, its hazard function is tractable and shows a great diversity of shapes, including increasing, decreasing, near symmetrical, increasing-decreasing-increasing, increasing-constant-increasing, J-shaped, and reversed J-shaped. Some properties of the proposed distribution are provided. The method of maximum likelihood is employed to estimate the model's unknown parameters, and these estimates are evaluated in various simulation studies. Moreover, the usefulness of the model is investigated through its application to three real data sets. The results show that the proposed distribution can, in fact, fit the data better than other competing distributions.
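
As a generic illustration of the estimation step, maximum likelihood fitting by numerical optimization can be sketched in Python as below; a two-parameter Weibull stands in for the proposed five-parameter density, which is not written out here.

# Generic numerical MLE sketch (Weibull stand-in for the proposed density).
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

data = weibull_min.rvs(c=1.7, scale=2.0, size=300, random_state=4)  # toy data

def neg_log_lik(params):
    shape, scale = params
    if shape <= 0 or scale <= 0:          # keep the optimizer in the valid region
        return np.inf
    return -np.sum(weibull_min.logpdf(data, c=shape, scale=scale))

fit = minimize(neg_log_lik, x0=[1.0, 1.0], method="Nelder-Mead")
shape_hat, scale_hat = fit.x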


Author(s):  
Fiaz Ahmad Bhatti ◽  
Gauss M. Cordeiro ◽  
Mustafa Ç. Korkmaz ◽  
G.G. Hamedani

We introduce a four-parameter lifetime model with a flexible hazard rate called the Burr XII gamma (BXIIG) distribution. We derive the BXIIG distribution from (i) the T-X family technique and (ii) the nexus between exponential and gamma variables. The failure rate function of the BXIIG distribution is flexible, as it can accommodate various shapes such as increasing, decreasing, decreasing-increasing, increasing-decreasing-increasing, bathtub and modified bathtub. Its density function can take shapes such as exponential, J, reverse-J, left-skewed, right-skewed and symmetrical. To illustrate the importance of the BXIIG distribution, we establish various mathematical properties, such as a random number generator, ordinary moments, the generating function, conditional moments, density functions of record values, reliability measures and characterizations. We address maximum likelihood estimation of the parameters and assess the adequacy of the estimators via a simulation study. We consider applications to two real data sets to demonstrate empirically the potential of the proposed model.
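
For orientation, the T-X construction referred to above builds a new CDF from a baseline CDF G and a generator with density r and CDF R; with the common choice W(G(x)) = -log[1 - G(x)] and a gamma generator, one obtains a gamma-X family of this kind (a generic sketch under that assumption, not necessarily the paper's exact parametrization):

\[
F(x)=\int_{0}^{-\log[1-G(x)]} r(t)\,dt \;=\; R\bigl(-\log[1-G(x)]\bigr),
\]

with G taken as the Burr XII baseline CDF in the BXIIG case.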


2021 ◽  
pp. 096228022110342
Author(s):  
Denis Talbot ◽  
Awa Diop ◽  
Mathilde Lavigne-Robichaud ◽  
Chantal Brisson

Background: The change-in-estimate approach is a popular method for selecting confounders in epidemiology. It is recommended in epidemiologic textbooks and articles over significance testing of coefficients, but concerns have been raised about its validity. Few simulation studies have been conducted to investigate its performance. Methods: An extensive simulation study was carried out to compare different implementations of the change-in-estimate method. The implementations were also compared when estimating the association of body mass index with diastolic blood pressure in the PROspective Québec Study on Work and Health. Results: All methods were liable to introduce important bias and to produce confidence intervals that included the true effect much less often than expected in at least some scenarios. Overall, mixed results were obtained regarding the accuracy of estimators as measured by the mean squared error. No implementation adequately differentiated confounders from non-confounders. In the real data analysis, none of the implementations decreased the estimated standard error. Conclusion: Based on these results, it is questionable whether change-in-estimate methods are beneficial in general, considering their low ability to improve the precision of estimates without introducing bias and their inability to yield valid confidence intervals or to identify true confounders.
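
For concreteness, the most common implementation of the rule compares the exposure coefficient with and without a candidate confounder and retains the covariate when the coefficient shifts by more than some cutoff, often 10%. A minimal Python sketch with simulated toy data and an illustrative cutoff (not one of the paper's implementations or its data):

# 10% change-in-estimate rule for a single candidate confounder Z.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 500
z = rng.normal(size=n)                      # candidate confounder
x = 0.6 * z + rng.normal(size=n)            # exposure
y = 0.4 * x + 0.5 * z + rng.normal(size=n)  # outcome

beta_full = sm.OLS(y, sm.add_constant(np.column_stack([x, z]))).fit().params[1]
beta_reduced = sm.OLS(y, sm.add_constant(x)).fit().params[1]

relative_change = abs(beta_reduced - beta_full) / abs(beta_full)
keep_z = relative_change > 0.10             # retain Z if the estimate shifts by >10%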


Author(s):  
M. M. E. Abd El-Monsef ◽  
Ghareeb A. Marei ◽  
N. M. Kilany

This paper aims to estimate the stress-strength reliability parameter R = P(Y < X) when X and Y follow the weighted Lomax (WL) distribution. The behavior of the stress-strength parameters and the reliability is studied using maximum likelihood and Bayesian estimators; the Monte Carlo simulation study that was carried out shows the satisfactory performance of the estimators obtained. Finally, two real data sets, representing customer waiting times before service at two banks A and B, are fitted by the WL distribution and used to estimate the stress-strength parameters and the reliability function.
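
As a reminder of the quantity being estimated, the single-component stress-strength reliability can be written as the integral below, where f_X is the strength density and F_Y the stress CDF (the WL-specific closed form derived in the paper is not reproduced here):

\[
R = P(Y<X) = \int_{0}^{\infty} F_Y(x)\, f_X(x)\, dx .
\]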


2014 ◽  
Vol 53 (01) ◽  
pp. 54-61 ◽  
Author(s):  
M. Preuß ◽  
A. Ziegler

Summary. Background: The random-effects (RE) model is the standard choice for meta-analysis in the presence of heterogeneity, and the standard RE method is the DerSimonian and Laird (DSL) approach, in which the degree of heterogeneity is estimated with a moment estimator. The DSL approach does not take into account the variability of the estimated heterogeneity variance in the estimation of Cochran's Q. Biggerstaff and Jackson derived the exact cumulative distribution function (CDF) of Q to account for the variability of τ̂². Objectives: The first objective is to show that explicit numerical computation of the density function of Cochran's Q is not required. The second objective is to develop an R package that allows easy calculation of both the classical RE method and the new exact RE method. Methods: The novel approach was validated in extensive simulation studies. The different approaches used in the simulation studies, including the exact-weights RE meta-analysis and the I² and τ² estimates together with their confidence intervals, were implemented in the R package metaxa. Results: The comparison with the classical DSL method showed that the exact-weights RE meta-analysis kept the nominal type I error level better and that it had greater power in the case of many small studies and a single large study. The Hedges RE approach had inflated type I error levels. Another advantage of the exact-weights RE meta-analysis is that an exact confidence interval for τ² is readily available. The exact-weights RE approach had greater power in the case of few studies, while the restricted maximum likelihood (REML) approach was superior in the case of a large number of studies. Applying the exact-weights RE meta-analysis, REML, and the DSL approach to real data sets showed that the conclusions of these methods differed. Conclusions: The simplification requires only the calculation of the cumulative distribution function of Cochran's Q, whereas the previous approach required the computation of both the density and the cumulative distribution function. It thus reduces computation time, improves numerical stability, and reduces the approximation error in meta-analysis. The different approaches, including the exact-weights RE meta-analysis and the I² and τ² estimates together with their confidence intervals, are available in the R package metaxa, which can be used in applications.
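
For reference, the building blocks that both the DSL and the exact-weights approaches start from are Cochran's Q and the DSL moment estimator of the heterogeneity variance τ²; a minimal Python sketch with toy effect sizes (not the paper's data or the metaxa implementation):

# Cochran's Q and the DerSimonian-Laird moment estimator of tau^2.
import numpy as np

y = np.array([0.30, 0.12, 0.45, 0.05, 0.21])   # study effect estimates (toy values)
v = np.array([0.04, 0.02, 0.09, 0.01, 0.03])   # within-study variances (toy values)

w = 1.0 / v                                    # fixed-effect (inverse-variance) weights
y_bar = np.sum(w * y) / np.sum(w)              # weighted mean effect
Q = np.sum(w * (y - y_bar) ** 2)               # Cochran's Q
k = len(y)
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2_dsl = max(0.0, (Q - (k - 1)) / c)         # DSL estimate of the heterogeneity variance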

