Bayesian procedures
Recently Published Documents

Total documents: 65 (five years: 14)
H-index: 11 (five years: 2)

2021, Vol 2021, pp. 1-15
Author(s): Ali Algarni, Mohammed Elgarhy, Abdullah M Almarashi, Aisha Fayomi, Ahmed R El-Saeed

The challenge of estimating the parameters of the inverse Weibull (IW) distribution under progressive Type-I censoring (PCTI) is addressed in this study using Bayesian and non-Bayesian procedures. To address the issue of censoring-time selection, quantiles of the IW lifetime distribution are used as the censoring time points for PCTI. Focusing on these censoring schemes, maximum likelihood estimators (MLEs) and asymptotic confidence intervals (ACIs) for the unknown parameters are constructed. Under the squared-error (SEr) loss function, Bayes estimates (BEs) and the corresponding highest posterior density credible intervals are also produced. The BEs are computed using two methods: Lindley's approximation (LiA) and the Metropolis-Hastings (MH) algorithm via Markov chain Monte Carlo (MCMC). A simulation study compares the performance of the suggested estimators for specified PCTI schemes, and two real data sets are analyzed for illustration.
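As a schematic illustration of the MCMC component, the sketch below runs a random-walk Metropolis-Hastings sampler for the two IW parameters on a complete (uncensored) sample. The parameterization f(x) = αλx^(-(α+1))exp(-λx^(-α)), the flat priors on the log scale, and all function names are illustrative assumptions, not the authors' exact censored-data setup.

```python
import math
import random

def iw_loglik(alpha, lam, data):
    # Complete-sample log-likelihood of the inverse Weibull with
    # density f(x) = alpha*lam*x^(-(alpha+1))*exp(-lam*x^(-alpha))
    # (one common parameterization; an assumption here).
    if alpha <= 0 or lam <= 0:
        return float("-inf")
    return sum(math.log(alpha * lam) - (alpha + 1) * math.log(x)
               - lam * x ** (-alpha) for x in data)

def mh_sampler(data, n_iter=4000, step=0.1, seed=1):
    """Random-walk Metropolis-Hastings on (log alpha, log lambda).
    With flat priors on the log scale and a symmetric proposal there,
    the acceptance ratio reduces to a likelihood ratio."""
    rng = random.Random(seed)
    alpha, lam = 1.0, 1.0
    cur = iw_loglik(alpha, lam, data)
    chain = []
    for _ in range(n_iter):
        # multiplicative proposal keeps both parameters positive
        a_new = alpha * math.exp(rng.gauss(0.0, step))
        l_new = lam * math.exp(rng.gauss(0.0, step))
        prop = iw_loglik(a_new, l_new, data)
        if math.log(rng.random()) < prop - cur:
            alpha, lam, cur = a_new, l_new, prop
        chain.append((alpha, lam))
    return chain
```

Posterior summaries (e.g., the BE under squared-error loss) are then sample means over the post-burn-in chain.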


2021, Vol 9 (4), pp. 789-808
Author(s): Amal Helu, Hani Samawi

In this article, we consider statistical inference about the unknown parameters of the Lomax distribution based on the adaptive Type-II progressive hybrid censoring scheme, which can save both the total test time and the cost induced by the failure of units while increasing the efficiency of the statistical analysis. The parameters are estimated using maximum likelihood (MLE) and Bayesian procedures. The Bayesian estimators are obtained under symmetric and asymmetric loss functions. Because these estimators have no explicit forms, we use Lindley's approximation method to compute them. The estimators are compared through an extensive simulation, and a real-life data example illustrates the proposed estimators.
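Lindley's approximation replaces an intractable posterior mean with a Taylor expansion around the MLE. A minimal one-parameter sketch, assuming the Lomax scale λ is known, a Gamma(a, b) prior on the shape α, and squared-error loss (all simplifications relative to the paper's two-parameter censored setting):

```python
import math

def lomax_alpha_mle(data, lam):
    # MLE of the Lomax shape alpha with known scale lam, under
    # f(x) = (alpha/lam) * (1 + x/lam)^(-(alpha+1)):
    # alpha_hat = n / sum(log(1 + x_i/lam))
    return len(data) / sum(math.log(1.0 + x / lam) for x in data)

def lindley_bayes_alpha(data, lam, a=2.0, b=1.0):
    """One-parameter Lindley approximation to the posterior mean of
    alpha under a Gamma(a, b) prior and squared-error loss:
    E[alpha] ~ ahat + rho'(ahat)*sigma2 + 0.5*L'''(ahat)*sigma2^2,
    where sigma2 = -1/L''(ahat)."""
    n = len(data)
    ahat = lomax_alpha_mle(data, lam)
    sigma2 = ahat ** 2 / n                 # -1 / L''(ahat), L'' = -n/alpha^2
    rho_prime = (a - 1.0) / ahat - b       # derivative of the log prior
    L3 = 2.0 * n / ahat ** 3               # third log-likelihood derivative
    return ahat + rho_prime * sigma2 + 0.5 * L3 * sigma2 ** 2
```

The correction terms shrink at rate 1/n, so the Bayes estimate stays close to the MLE for large samples.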


2021, Vol 20, pp. 288-299
Author(s): Refah Mohammed Alotaibi, Yogesh Mani Tripathi, Sanku Dey, Hoda Ragab Rezk

In this paper, inference on stress-strength reliability is considered for unit-Weibull distributions with a common parameter, under the assumption that data are observed using progressive Type-II censoring. We obtain different estimators of system reliability using classical and Bayesian procedures. An asymptotic interval is constructed from the Fisher information matrix; boot-p and boot-t intervals are also obtained. We evaluate Bayes estimates using Lindley's technique and the Metropolis-Hastings (MH) algorithm, and the Bayes credible interval using the MH method. An unbiased estimator of the reliability is also obtained for the case where the common parameter is known. Numerical simulations compare the estimation methods, and a data set is studied for illustration.
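The boot-p (percentile bootstrap) interval has a generic recipe that can be sketched directly. The nonparametric estimate of R = P(X < Y) below is a simplification of the paper's parametric unit-Weibull approach, with illustrative names throughout:

```python
import random

def reliability_estimate(xs, ys):
    # Empirical estimate of the stress-strength reliability
    # R = P(X < Y), averaging the indicator over all pairs.
    return sum(x < y for x in xs for y in ys) / (len(xs) * len(ys))

def boot_p_interval(xs, ys, B=500, alpha=0.05, seed=0):
    """Percentile (boot-p) bootstrap interval for R = P(X < Y):
    resample each sample with replacement, re-estimate R, and take
    the alpha/2 and 1-alpha/2 empirical quantiles."""
    rng = random.Random(seed)
    stats = []
    for _ in range(B):
        bx = [rng.choice(xs) for _ in xs]
        by = [rng.choice(ys) for _ in ys]
        stats.append(reliability_estimate(bx, by))
    stats.sort()
    lo = stats[int((alpha / 2) * B)]
    hi = stats[int((1 - alpha / 2) * B) - 1]
    return lo, hi
```

The boot-t variant additionally studentizes each replicate by a bootstrap standard error before taking quantiles.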


2021, Vol 228, pp. 111522
Author(s): Yan Zhang, Luc E. Chouinard, David Conciatori, Gabriel J. Power

Author(s): Johnny van Doorn, Don van den Bergh, Udo Böhm, Fabian Dablander, Koen Derks, ...

Abstract: Despite the increasing popularity of Bayesian inference in empirical research, few practical guidelines provide detailed recommendations for how to apply Bayesian procedures and interpret the results. Here we offer specific guidelines for four different stages of Bayesian statistical reasoning in a research setting: planning the analysis, executing the analysis, interpreting the results, and reporting the results. The guidelines for each stage are illustrated with a running example. Although the guidelines are geared towards analyses performed with the open-source statistical software JASP, most guidelines extend to Bayesian inference in general.
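As a minimal, self-contained example of the kind of quantity such guidelines ask researchers to interpret and report, the sketch below computes a Savage-Dickey Bayes factor for a binomial test. The Beta(1, 1) prior and the test value θ₀ = 0.5 are illustrative assumptions; this is not JASP's implementation.

```python
import math

def beta_pdf(x, a, b):
    # Density of the Beta(a, b) distribution at x, via log-gammas.
    logc = math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
    return math.exp(logc + (a - 1) * math.log(x) + (b - 1) * math.log(1 - x))

def savage_dickey_bf01(k, n, a=1.0, b=1.0, theta0=0.5):
    """Savage-Dickey Bayes factor BF01 for H0: theta = theta0 in a
    binomial model with k successes in n trials and a Beta(a, b)
    prior: the ratio of posterior to prior density at theta0.
    BF01 > 1 favors H0; BF01 < 1 favors the alternative."""
    return beta_pdf(theta0, a + k, b + n - k) / beta_pdf(theta0, a, b)
```

For instance, 5 successes in 10 trials yields BF01 ≈ 2.7 (mild evidence for θ = 0.5), while 9 in 10 yields BF01 < 1 (evidence against it).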


2020, Vol 58 (3), pp. 644-719
Author(s): Mark F. J. Steel

The method of model averaging has become an important tool for dealing with model uncertainty, for example in situations where a large number of competing theories exist, as is common in economics. Model averaging is a natural and formal response to model uncertainty in a Bayesian framework, and most of the paper deals with Bayesian model averaging. The important role of the prior assumptions in these Bayesian procedures is highlighted. In addition, frequentist model averaging methods are also discussed. Numerical techniques to implement these methods are explained, and I point the reader to some freely available computational resources. The main focus is on uncertainty regarding the choice of covariates in normal linear regression models, but the paper also covers other, more challenging, settings, with particular emphasis on sampling models commonly used in economics. Applications of model averaging in economics are reviewed and discussed in a wide range of areas, including growth economics, production modeling, finance, and forecasting of macroeconomic quantities. (JEL C11, C15, C20, C52, O47).
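The core mechanics of Bayesian model averaging over covariate subsets can be sketched with BIC-approximated marginal likelihoods, a common large-sample surrogate for the exact prior-dependent marginal likelihoods discussed in the paper. Everything here (uniform model prior, BIC weights) is an illustrative simplification:

```python
import itertools
import math
import numpy as np

def bma_inclusion_probs(X, y):
    """BIC-approximated Bayesian model averaging over all subsets of
    the columns of X (intercept always included). Model weights are
    proportional to exp(-BIC/2) under a uniform model prior; returns
    the posterior inclusion probability of each covariate."""
    n, p = X.shape
    bics, models = [], []
    for k in range(p + 1):
        for subset in itertools.combinations(range(p), k):
            Z = np.column_stack([np.ones(n)] + [X[:, j] for j in subset])
            beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
            rss = float(np.sum((y - Z @ beta) ** 2))
            bics.append(n * math.log(rss / n) + Z.shape[1] * math.log(n))
            models.append(set(subset))
    best = min(bics)  # subtract the best BIC before exponentiating, for stability
    weights = [math.exp(-0.5 * (b - best)) for b in bics]
    total = sum(weights)
    return [sum(w for w, m in zip(weights, models) if j in m) / total
            for j in range(p)]
```

Averaged predictions or coefficient estimates follow the same pattern: weight each model's output by its normalized posterior weight.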


2020, pp. 2391-2400
Author(s): Valter Harry Bumbieris Junior, Robson Marcelo Rossi, Egon Henrique Horst, Murilo Dolfini Paranzini, Vinicius André de Pietro Guimarães, ...

The objective of this study was to evaluate the ruminal degradability of the dry matter and crude protein of high-moisture triticale silage ensiled with different chemical and biological additives. Urea, sodium benzoate, and an enzyme-bacterial inoculant were used as treatments. Four samples from each treatment were incubated in the rumen of four sheep. Effective degradability was estimated for ruminal passage rates of 2%, 5%, and 8% h⁻¹. Bayesian procedures were used to estimate the potential in situ degradation parameters. The high-moisture triticale silage with urea showed the highest value for the soluble fraction (70.46%) and the best effective dry-matter degradability at a passage rate of 2% h⁻¹ (90.63%), relative to the control silage at the other passage rates. Compared with the control silage, the addition of sodium benzoate or the enzyme-bacterial inoculant decreased the effective degradability of dry matter, regardless of the passage rate evaluated. Owing to the high solubility of urea, the urea-treated silage had the highest soluble fraction of crude protein (76.42%). The addition of the enzyme-bacterial inoculant accelerated the ruminal passage rates of dry matter and protein to 0.26 and 0.20% h⁻¹, respectively, yielding lower potential degradability of both relative to the other silages. Because enzyme-bacterial inoculation reduces the ruminal degradability of crude protein, it tends to increase the availability of amino acids for intestinal absorption. The addition of urea to high-moisture triticale silage may be recommended for sheep feeding at a low level of consumption, as it improves the effective dry-matter degradability.
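The quantities in this abstract are conventionally computed from an exponential in situ degradation curve. Assuming the standard Ørskov-McDonald form p(t) = a + b(1 − e^(−ct)) and its effective-degradability formula (neither is stated explicitly in the abstract, so this is a sketch of the usual convention, not necessarily the authors' exact model):

```python
import math

def degradation(t, a, b, c):
    # Orskov-McDonald in situ degradation curve: p(t) = a + b*(1 - exp(-c*t)),
    # where a is the soluble fraction (%), b the insoluble-but-degradable
    # fraction (%), and c the fractional degradation rate (per hour).
    return a + b * (1.0 - math.exp(-c * t))

def effective_degradability(a, b, c, k):
    # Effective degradability ED = a + b*c/(c + k),
    # where k is the ruminal passage rate (per hour). Faster passage
    # (larger k) leaves less time for degradation, so ED decreases in k.
    return a + b * c / (c + k)
```

This is why the abstract reports one ED per passage rate (2%, 5%, 8% h⁻¹): each k plugs into the same fitted (a, b, c).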


Biometrika, 2020, Vol 107 (4), pp. 891-906
Author(s): Antonio Lijoi, Igor Prünster, Tommaso Rigon

Summary: Discrete nonparametric priors play a central role in a variety of Bayesian procedures, most notably when used to model latent features, such as in clustering, mixtures and curve fitting. They are effective and well-developed tools, though their infinite dimensionality is unsuited to some applications. If one restricts to a finite-dimensional simplex, very little is known beyond the traditional Dirichlet multinomial process, which is mainly motivated by conjugacy. This paper introduces an alternative based on the Pitman–Yor process, which provides greater flexibility while preserving analytical tractability. Urn schemes and posterior characterizations are obtained in closed form, leading to exact sampling methods. In addition, the proposed approach can be used to accurately approximate the infinite-dimensional Pitman–Yor process, yielding improvements over existing truncation-based approaches. An application to convex mixture regression for quantitative risk assessment illustrates the theoretical results and compares our approach with existing methods.
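The urn schemes mentioned above generalize a well-known sequential form of the (infinite-dimensional) Pitman–Yor process. As background, here is a sketch of that generalized Chinese-restaurant urn, with discount d and strength θ; it illustrates the process the paper's finite-dimensional construction approximates, not the paper's own scheme:

```python
import random

def pitman_yor_crp(n, theta, d, seed=0):
    """Sample a partition of n items from the Pitman-Yor urn scheme:
    after i items, item i+1 joins existing cluster k with probability
    (n_k - d)/(theta + i), and starts a new cluster with probability
    (theta + d*K)/(theta + i), where K is the current cluster count."""
    assert 0.0 <= d < 1.0 and theta > -d
    rng = random.Random(seed)
    counts = []            # current cluster sizes (sum to i)
    labels = []            # cluster label of each item
    for i in range(n):
        K = len(counts)
        u = rng.random() * (theta + i)   # total unnormalized mass
        acc, choice = 0.0, K             # default: open a new cluster
        for k, nk in enumerate(counts):
            acc += nk - d
            if u < acc:
                choice = k
                break
        if choice == K:
            counts.append(0)
        counts[choice] += 1
        labels.append(choice)
    return labels, counts
```

With d = 0 this reduces to the Dirichlet-process urn (clusters grow like θ·log n); with d > 0 the cluster count grows like n^d, the heavier-tailed behavior the Pitman–Yor process is valued for.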


Biostatistics, 2020
Author(s): Samiran Ghosh, Erina Paul, Shrabanti Chowdhury, Ram C. Tiwari

Summary: With the availability of limited resources, innovation in statistical methods for the design and analysis of randomized controlled trials (RCTs) is of paramount importance for discovering newer and better treatments in any therapeutic area. Although clinical efficacy is almost always the primary criterion for measuring any beneficial effect of a treatment, several other important factors (e.g., side effects, cost burden, being less debilitating or less intensive) can make a less efficacious treatment option favorable to a subgroup of patients. This leads to non-inferiority (NI) testing. The objective of an NI trial is to show that an experimental treatment is not worse than an active reference treatment by more than a pre-specified margin. Traditional NI trials do not include a placebo arm for ethical reasons; however, this necessitates stringent and often unverifiable assumptions. On the other hand, three-arm NI trials, consisting of placebo, reference, and experimental treatments, can simultaneously test the superiority of the reference over placebo and the NI of the experimental treatment relative to the reference. In this article, we propose both novel frequentist and Bayesian procedures for testing NI in the three-arm trial with a Poisson-distributed count outcome. RCTs with count data as the primary outcome are quite common in various disease areas, such as lesion counts in cancer trials, relapses in multiple sclerosis, dermatology, neurology, cardiovascular research, and adverse event counts. We first propose an improved frequentist approach, which is then followed by its Bayesian version. Bayesian methods have a natural advantage in any active-control trial, including NI trials, when substantial historical information is available for the placebo and the established reference treatment. In addition, we discuss sample size calculation and draw an interesting connection between the two paradigms.
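The basic frequentist NI logic for Poisson rates can be illustrated with a fixed-margin Wald test. This schematic two-arm version, with a hypothetical margin and smaller rates taken as better, is far simpler than the paper's three-arm retention-based procedure:

```python
import math

def wald_ni_test(mean_e, n_e, mean_r, n_r, margin, z_crit=1.96):
    """Fixed-margin Wald test of non-inferiority for Poisson rates
    (smaller rate = better). H0: lam_e >= lam_r + margin.
    For a Poisson sample mean, var = mean/n, so the standard error of
    the difference is sqrt(mean_e/n_e + mean_r/n_r). NI is concluded
    when Z = (mean_e - mean_r - margin)/se falls below -z_crit."""
    se = math.sqrt(mean_e / n_e + mean_r / n_r)
    z = (mean_e - mean_r - margin) / se
    return z, z < -z_crit
```

In the three-arm setting, a superiority test of reference over placebo is run alongside this comparison, and both must succeed.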


Author(s): Leonardo Martin Nieto, Luiz Otávio Campos da Silva, Antônio do Nascimento Ferreira Rosa

Abstract: The objective of this work was to evaluate the potential of different threshold models to determine the genetic variability in Nellore cattle, based on the heritability estimates for the traits stayability (STA) and probability of first calving at 36 months of age (CP36). Data came from the Nellore herds participating in the animal breeding program of the Embrapa-Geneplus partnership. Binomial and multi-threshold models were defined for the STA and CP36 traits. Heritability estimates were obtained by Bayesian procedures in the Multiple-trait Gibbs Sampler for Animal Models (MTGSAM) software, using a sire-maternal grandsire model. The heritability estimates provided by the binary and alternative models were, respectively, 0.08 and 0.12 for STA, and 0.17 and 0.12 for CP36. The multi-threshold model can efficiently detect the genetic variability for stayability, but not for the probability of calving in 36-month-old cows.
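Heritability estimates like those above are functions of the posterior variance components. As a simplified illustration, the plain sire-model identity h² = 4σ²ₛ/(σ²ₛ + σ²ₑ) (the sire variance captures one quarter of the additive genetic variance) can be sketched; the paper's sire-maternal grandsire model uses a more elaborate relation, so this is background, not the authors' formula:

```python
def sire_model_heritability(var_sire, var_residual):
    """Heritability from a simple sire model: the sire variance is
    1/4 of the additive genetic variance, so on the (liability)
    scale h^2 = 4*var_s / (var_s + var_e)."""
    return 4.0 * var_sire / (var_sire + var_residual)
```

In a Gibbs-sampling implementation, this ratio is evaluated at each posterior draw of the variance components, giving a posterior distribution for h².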

