Bayesian Paradigm
Recently Published Documents


TOTAL DOCUMENTS: 101 (five years: 33)
H-INDEX: 14 (five years: 1)

Author(s): Osval Antonio Montesinos López, Abelardo Montesinos López, Jose Crossa

Abstract: The Bayesian paradigm for parameter estimation is introduced and linked to the central problem of genomic-enabled prediction: predicting the trait of interest for non-phenotyped individuals from genotypic information, environmental variables, or other covariates. In this setting, a convenient practice is to include the individuals to be predicted in the posterior distribution to be sampled. We explain how the Bayesian Ridge regression method is derived and illustrate it with data from plant breeding genomic selection. Other Bayesian methods (Bayes A, Bayes B, Bayes C, and the Bayesian Lasso) are also described and exemplified for genome-based prediction. The chapter presents several examples implemented in the Bayesian generalized linear regression (BGLR) library for continuous response variables. Under all these Bayesian methods, the predictor includes main effects (of environments and genotypes) as well as genotype × environment interaction terms.
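
The chapter's worked examples use the BGLR package in R. As a language-neutral illustration of the core idea only, not the BGLR implementation, the sketch below fits a Bayesian ridge regression by Gibbs sampling on simulated marker data and predicts non-phenotyped individuals from their genotypes; all data, dimensions, and hyperparameters are invented.

```python
# Minimal sketch (not BGLR): Bayesian ridge regression for genomic prediction,
# fitted by a Gibbs sampler on simulated marker data. Everything here is invented
# for illustration.
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 50                                        # phenotyped individuals, markers
X = rng.choice([0.0, 1.0, 2.0], size=(n, p))          # marker genotypes
beta_true = rng.normal(0.0, 0.1, size=p)
y = X @ beta_true + rng.normal(0.0, 1.0, size=n)      # simulated phenotypes
X_new = rng.choice([0.0, 1.0, 2.0], size=(10, p))     # non-phenotyped individuals

n_iter, burn = 1000, 200
a, b = 1.0, 1.0                                       # weak inverse-gamma hyperparameters
beta = np.zeros(p)
sigma2_e, sigma2_b = 1.0, 0.1
beta_draws = []

for it in range(n_iter):
    # beta | rest ~ N(Sigma X'y / sigma2_e, Sigma), Sigma = (X'X/sigma2_e + I/sigma2_b)^-1
    prec = X.T @ X / sigma2_e + np.eye(p) / sigma2_b
    cov = np.linalg.inv(prec)
    mean = cov @ (X.T @ y) / sigma2_e
    beta = rng.multivariate_normal(mean, cov)
    # residual and marker-effect variances | rest ~ inverse-gamma
    resid = y - X @ beta
    sigma2_e = 1.0 / rng.gamma(a + n / 2, 1.0 / (b + resid @ resid / 2))
    sigma2_b = 1.0 / rng.gamma(a + p / 2, 1.0 / (b + beta @ beta / 2))
    if it >= burn:
        beta_draws.append(beta)

beta_hat = np.mean(beta_draws, axis=0)
y_pred = X_new @ beta_hat              # predicted genetic values for the new individuals
print(y_pred[:5])
```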


2021
Author(s): Chong Zhong, Zhihua Ma, Junshan Shen, Catherine Liu

Compared with the frequentist paradigm, the Bayesian paradigm accommodates richly specified survival models with feasible computation because it handles complex censoring schemes more naturally. In this chapter, we aim to illustrate the latest trend in Bayesian computing, namely automated posterior sampling, through a Bayesian analysis of multivariate survival outcomes with a complicated data structure. Motivated by the need to relax the strong proportionality assumption and the restriction to a common baseline population, we propose a generalized shared frailty model that includes both parametric and nonparametric frailty random effects to capture treatment-wise and temporal variation across multiple events. We develop a survival-function version of the ANOVA dependent Dirichlet process to model the dependence among the baseline survival functions. Posterior sampling is carried out automatically by the No-U-Turn Sampler in Stan, a contemporary Bayesian computing tool. The proposed model is validated on the bladder cancer recurrence data, and the estimates are consistent with existing results. Our model and its Bayesian inference provide evidence that the Bayesian paradigm supports complex modeling and feasible computing in survival analysis, and that Stan eases posterior inference.
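
For orientation only, the sketch below evaluates the marginal log-likelihood of a much simpler shared gamma-frailty model with a Weibull baseline hazard, with the frailty integrated out analytically. It is not the authors' generalized ANOVA-DDP model, and the toy data are invented; in the paper itself, posterior sampling is done by NUTS in Stan.

```python
# Much-simplified illustration, not the authors' model: shared gamma frailty
# (mean 1, variance theta) with a Weibull baseline hazard; the frailty is
# marginalised in closed form, cluster by cluster.
import numpy as np
from scipy.special import gammaln

def log_lik(params, times, events, cluster):
    """times/events: per-subject arrays; cluster: integer cluster labels."""
    log_k, log_lam, log_theta = params
    k, lam, theta = np.exp(log_k), np.exp(log_lam), np.exp(log_theta)
    h0 = (k / lam) * (times / lam) ** (k - 1.0)      # baseline hazard
    H0 = (times / lam) ** k                          # cumulative baseline hazard
    ll = 0.0
    for c in np.unique(cluster):
        idx = cluster == c
        d = events[idx].sum()                        # events in this cluster
        H = H0[idx].sum()
        ll += np.sum(events[idx] * np.log(h0[idx]))
        # gamma frailty integrated out analytically
        ll += gammaln(1.0 / theta + d) - gammaln(1.0 / theta)
        ll += (1.0 / theta) * np.log(1.0 / theta)
        ll -= (1.0 / theta + d) * np.log(1.0 / theta + H)
    return ll

# toy data: 3 clusters, some censored observations (event indicator 0)
times = np.array([2.1, 3.5, 1.2, 4.8, 0.9, 2.7])
events = np.array([1, 0, 1, 1, 0, 1])
cluster = np.array([0, 0, 1, 1, 2, 2])
print(log_lik(np.log([1.0, 3.0, 0.5]), times, events, cluster))
```

This log-likelihood (plus priors) is the kind of target that any MCMC sampler, including NUTS, can be pointed at.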


Author(s): Georgios P. Karagiannis

Abstract: We present basic concepts of Bayesian statistical inference. We briefly introduce the Bayesian paradigm and present conjugate priors, a computationally convenient way to quantify prior information for tractable Bayesian analysis. We present tools for parametric and predictive inference, in particular the design of point estimators, credible sets, and hypothesis tests. These concepts are illustrated in running examples. Supplementary material is available from GitHub.
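
As a minimal sketch of these ideas (the chapter's own examples and supplementary code are on GitHub), the Beta-Binomial model below shows a conjugate update, a point estimate, a credible set, and a posterior probability for a simple hypothesis; the numbers are invented.

```python
# Conjugacy in one line: Beta prior + Binomial likelihood -> Beta posterior.
from scipy import stats

a0, b0 = 2.0, 2.0                                  # Beta(2, 2) prior on a success probability
successes, trials = 37, 100                        # invented data

a_post, b_post = a0 + successes, b0 + (trials - successes)
posterior = stats.beta(a_post, b_post)

point_estimate = posterior.mean()                  # Bayes estimator under squared-error loss
credible_set = posterior.interval(0.95)            # equal-tailed 95% credible interval
prob_h0 = posterior.cdf(0.5)                       # posterior probability that p <= 0.5
print(point_estimate, credible_set, prob_h0)
```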


2021
Author(s): Javaid Ahmad Reshi, Bilal Ahmad Para, Shahzad Ahmad Bhat

This paper deals with estimation of the parameters of the Weighted Maxwell-Boltzmann distribution under both the classical and the Bayesian paradigm. Under the classical approach, we estimate the rate parameter by maximum likelihood. In the Bayesian paradigm, we primarily study Bayes' estimators of the parameter of the Weighted Maxwell-Boltzmann distribution under the extended Jeffrey's prior and under gamma and exponential prior distributions, assuming different loss functions. The extended Jeffrey's prior covers a wide spectrum of priors for obtaining Bayes' estimates of the parameter, with Jeffrey's prior and Hartigan's prior as particular cases. A comparative study is carried out between the MLE and the Bayes' estimates under different loss functions (SELF, Al-Bayyati's new loss function, Stein's loss, and the precautionary loss function), with the estimators compared in terms of mean squared error (MSE) computed in the programming language R. We observe that in most cases the Bayesian estimator under the new loss function (Al-Bayyati's loss function) has the smallest MSE for both priors, i.e., Jeffrey's prior and the extension of Jeffrey's prior; moreover, the MSE decreases markedly as the sample size increases. Two real-life data sets are also considered to compare special cases of the Weighted Maxwell-Boltzmann distribution in terms of fit.
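
For reference, given posterior draws of the parameter from any sampler, the Bayes estimators under the usual forms of these loss functions reduce to simple posterior expectations, as in the sketch below; the gamma draws are only placeholders, not the paper's Weighted Maxwell-Boltzmann posterior.

```python
# Bayes estimators under common loss functions, computed from posterior draws.
import numpy as np

rng = np.random.default_rng(1)
theta = rng.gamma(shape=5.0, scale=0.4, size=20_000)   # stand-in posterior draws
c = 1.0                                                # Al-Bayyati loss exponent (illustrative)

est_self = theta.mean()                                # squared-error loss: posterior mean
est_precaution = np.sqrt(np.mean(theta ** 2))          # precautionary loss: sqrt(E[theta^2])
est_stein = 1.0 / np.mean(1.0 / theta)                 # Stein (entropy) loss: 1 / E[1/theta]
est_albayyati = np.mean(theta ** (c + 1)) / np.mean(theta ** c)  # E[theta^(c+1)] / E[theta^c]

print(est_self, est_precaution, est_stein, est_albayyati)
```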


2021, Vol 2021, pp. 1-20
Author(s): Refah Alotaibi, Mervat Khalifa, Ehab M. Almetwally, Indranil Ghosh, Rezk H.

The exponentiated exponential (EE) model has been used effectively in reliability, engineering, biomedical, social science, and other applications. In this study, we introduce a new bivariate mixture EE model with two parameters, constructed from two EE models under two cases: independent and dependent random variables. We study useful statistical properties of this distribution, such as the marginal and conditional distributions and the product and conditional moments. For the dependent case, we also study a new bivariate mixture model based on two-parameter EE marginals coupled through a bivariate Gaussian copula. Different methods of estimation for the model parameters are used under both the classical and the Bayesian paradigm. Simulation studies are presented to verify the performance of the estimation methods for the proposed model, and a real data set is reanalyzed to illustrate its flexibility.
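
A minimal sketch of the dependent construction, assuming the standard EE parameterization F(x) = (1 - exp(-lam*x))**alpha and invented parameter values: two EE marginals are coupled through a Gaussian copula by transforming correlated normal draws to uniforms and then applying the EE quantile function.

```python
# Simulating a bivariate distribution with EE marginals via a Gaussian copula.
import numpy as np
from scipy.stats import norm

def ee_quantile(u, alpha, lam):
    """Inverse CDF of the EE distribution, F(x) = (1 - exp(-lam*x))**alpha."""
    return -np.log(1.0 - u ** (1.0 / alpha)) / lam

rng = np.random.default_rng(2)
rho = 0.6                                        # Gaussian copula correlation (illustrative)
alpha1, lam1 = 2.0, 1.0                          # first EE marginal (invented values)
alpha2, lam2 = 1.5, 0.8                          # second EE marginal (invented values)

cov = [[1.0, rho], [rho, 1.0]]
z = rng.multivariate_normal([0.0, 0.0], cov, size=10_000)
u = norm.cdf(z)                                  # uniform marginals with Gaussian dependence
x1 = ee_quantile(u[:, 0], alpha1, lam1)          # EE(alpha1, lam1) marginal
x2 = ee_quantile(u[:, 1], alpha2, lam2)          # EE(alpha2, lam2) marginal
print(np.corrcoef(x1, x2)[0, 1])                 # induced dependence between the marginals
```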


Author(s): Arvind Pandey, David D. Hanagal, Shikhar Tyagi, Pragya Gupta

Because complete data are often unavailable in biological, epidemiological, and medical studies, the analysis of censored data is very common among practitioners. The analysis of bivariate censored data is less routine, however, because the observations need not be independent. Both observed and unobserved covariates affect the variables under study, so heterogeneity is present in the data, and ignoring it may have objectionable consequences. Yet it is not easy to determine whether an unobserved covariate has any effect. Shared frailty models are a viable choice in such scenarios, but finding a suitable frailty distribution can be difficult owing to restrictions such as the identifiability condition and the requirement that its Laplace transform exist. In this paper, we therefore introduce the generalized Lindley (GL) distribution as a new frailty distribution for the reversed hazard rate (RHR) setup, which outperforms the gamma frailty distribution; our main aim is to establish a new frailty distribution under the RHR setup. Assuming exponential Gumbel (EG) and generalized inverted exponential (GIE) baseline distributions, we propose a new class of shared frailty models based on the RHR. We estimate the parameters of these frailty models within the Bayesian paradigm using Markov chain Monte Carlo (MCMC) techniques. Model selection criteria are used to compare the models. We analyze Australian twin data and suggest a better-fitting model.
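
As a small illustration of the RHR setup only (assuming the standard GIE parameterization F(x) = 1 - (1 - exp(-lam/x))**alpha, with invented parameter values; none of the paper's frailty estimation is reproduced), the sketch below computes the baseline reversed hazard rate r(t) = f(t)/F(t) numerically.

```python
# Reversed hazard rate of a GIE baseline, via a finite difference of the CDF.
import numpy as np

def gie_cdf(x, alpha, lam):
    return 1.0 - (1.0 - np.exp(-lam / x)) ** alpha

def reversed_hazard(x, alpha, lam, eps=1e-6):
    # r(t) = f(t) / F(t) = d/dt log F(t); f approximated by a central difference
    f = (gie_cdf(x + eps, alpha, lam) - gie_cdf(x - eps, alpha, lam)) / (2 * eps)
    return f / gie_cdf(x, alpha, lam)

t = np.linspace(0.5, 5.0, 10)
print(reversed_hazard(t, alpha=2.0, lam=1.5))
```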


2021, pp. 1-34
Author(s): Jean-François Bégin

Abstract: This article proposes a complex economic scenario generator that nests versions of well-known actuarial frameworks. The generator's estimation relies on the Bayesian paradigm and accounts for both model and parameter uncertainty via Markov chain Monte Carlo methods. So, to the question "is less more?", we answer: maybe, but it depends on your criteria. From an in-sample fit perspective, on the one hand, a complex economic scenario generator seems better. From the conservatism, forecasting, and coverage perspectives, on the other hand, the situation is less clear: having more complex models for the short rate, the term structure, and stock index returns is clearly beneficial, but that is not the case for inflation and the dividend yield.


2021
Author(s): Jared C. Allen

Background: Bayesian approaches to police decision support offer an improvement upon more commonly used statistical approaches. Common approaches to case decision support often involve using frequencies from cases similar to the case under consideration to come to an isolated likelihood that a given suspect either a) committed the crime or b) has a given characteristic or set of characteristics. The Bayesian approach, in contrast, offers formally contextualized estimates and utilizes the formal logic desired by investigators. Findings: Bayes’ theorem incorporates the isolated likelihood as one element of a three-part equation, the other parts being 1) what was known generally about the variables in the case prior to the case occurring (the scientific-theoretical priors) and 2) the relevant base rate information that contextualizes the evidence obtained (the event context). These elements are precisely the domain of decision support specialists (investigative advisers), and the Bayesian paradigm is uniquely apt for combining them into contextualized estimates for decision support. Conclusions: By formally combining the relevant knowledge, context, and likelihood, Bayes’ theorem can improve the logic, accuracy, and relevance of decision support statements.
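
A worked toy example of these three ingredients, with purely hypothetical numbers: a prior, the likelihood of the case evidence, and the base-rate context that normalizes them.

```python
# Bayes' theorem with hypothetical numbers: prior, likelihood, and base rate.
prior_suspect = 0.10            # prior probability the suspect is the offender
p_evidence_if_guilty = 0.80     # likelihood of the observed evidence if guilty
p_evidence_if_not = 0.05        # base rate of that evidence among non-offenders

evidence = (p_evidence_if_guilty * prior_suspect
            + p_evidence_if_not * (1.0 - prior_suspect))   # total probability of the evidence
posterior = p_evidence_if_guilty * prior_suspect / evidence
print(round(posterior, 3))      # 0.64: the contextualized estimate
```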

