Reducing MCMC Computational Cost with a Two Layered Bayesian Approach

Author(s):  
Ramin Madarshahian ◽  
Juan M. Caicedo

2005 ◽  
Vol 277-279 ◽  
pp. 183-188
Author(s):  
Myung Hee Jung ◽  
Eui Jung Yun ◽  
Sy Woo Byun

Markov Random Field (MRF) models have been successfully applied to many digital image processing problems, such as texture modeling and region labeling. Although MRF provides a well-defined statistical approach to the analysis of images, one disadvantage is the high computational cost of processing and sampling large images, since global features are assumed to be specified through local descriptions. In this study, a methodology is explored that reduces the computational burden and increases the speed of image analysis for large images, especially airborne and space-based remotely sensed data. The Bayesian approach is suggested as a reasonable alternative for parameter estimation of MRF models; the use of a multiresolution framework is also investigated, as it provides a convenient and efficient structure for the transition between local and global features. The suggested approach is applied to the simulation of spatial patterns using MRF.
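The local-description structure that drives the cost is easy to see in a toy Gibbs sampler for a binary MRF (Ising-type model): each pixel's conditional distribution depends only on its neighbours, so a full sweep touches every pixel and many sweeps are needed before the global pattern stabilises. The coupling strength, grid size, and sweep count below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def gibbs_sweep(field, beta, rng):
    """One Gibbs sweep over a binary (+/-1) MRF with 4-neighbour coupling."""
    h, w = field.shape
    for i in range(h):
        for j in range(w):
            # The local conditional depends only on the 4 neighbours (toroidal grid).
            s = (field[(i - 1) % h, j] + field[(i + 1) % h, j]
                 + field[i, (j - 1) % w] + field[i, (j + 1) % w])
            p_plus = 1.0 / (1.0 + np.exp(-2.0 * beta * s))  # P(pixel = +1 | neighbours)
            field[i, j] = 1 if rng.random() < p_plus else -1
    return field

rng = np.random.default_rng(0)
field = rng.choice([-1, 1], size=(64, 64))  # random initial labelling
for _ in range(10):                          # cost scales with image size per sweep
    gibbs_sweep(field, beta=0.4, rng=rng)
```

A multiresolution scheme attacks exactly this loop: coarse grids mix quickly and seed the finer levels, so fewer full-resolution sweeps are required.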


PLoS ONE ◽  
2021 ◽  
Vol 16 (2) ◽  
pp. e0247338
Author(s):  
Davood Roshan ◽  
John Ferguson ◽  
Charles R. Pedlar ◽  
Andrew Simpkin ◽  
William Wyns ◽  
...  

In a clinical setting, biomarkers are typically measured and evaluated as biological indicators of a physiological state. Population-based reference ranges, known as ‘static’ or ‘normal’ reference ranges, are often used as a tool to classify a biomarker value for an individual as typical or atypical. However, these ranges may not be informative for a particular individual when considering changes in a biomarker over time, since each observation is assessed in isolation and against the same reference limits. To allow early detection of unusual physiological changes, static reference ranges must be adapted to incorporate the within-individual variability of biomarkers arising from longitudinal monitoring, in addition to between-individual variability. To address this, methods for generating individualised reference ranges are proposed within a Bayesian framework which adapts successively whenever a new measurement is recorded for the individual. This Bayesian approach also allows the within-individual variability to differ for each individual, unlike other, less flexible approaches. However, the Bayesian approach usually comes with a high computational cost, especially for individuals with a large number of observations, which diminishes its applicability. This difficulty suggests that a computational approximation may be required. Thus, methods for generating individualised adaptive ranges by means of a time-efficient approximate Expectation-Maximisation (EM) algorithm are presented, which rely on only a few sufficient statistics at the individual level.
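As one illustration of the adaptive idea, assuming a simple normal-normal conjugate model with a known within-individual variance (the paper's actual model is more flexible and estimates this per individual), an individual's range can be updated recursively each time a new measurement arrives, carrying only a mean and a variance between updates:

```python
import numpy as np

def update_range(prior_mean, prior_var, obs, within_var, z=1.96):
    """Conjugate normal-normal update of an individualised reference range.

    prior_mean / prior_var: current belief about the individual's true level
    within_var:             within-individual measurement variance (assumed known)
    Returns the posterior mean/variance and an adaptive (lower, upper) range
    for the next observation.
    """
    post_var = 1.0 / (1.0 / prior_var + 1.0 / within_var)
    post_mean = post_var * (prior_mean / prior_var + obs / within_var)
    pred_sd = np.sqrt(post_var + within_var)  # predictive spread for a new measurement
    return post_mean, post_var, (post_mean - z * pred_sd, post_mean + z * pred_sd)

# Start from a population-level ('static') range and adapt per measurement.
mean, var = 50.0, 25.0                # hypothetical population mean and variance
for y in [52.0, 54.0, 53.0]:          # successive measurements for one individual
    mean, var, adaptive_range = update_range(mean, var, y, within_var=9.0)
```

Each update shrinks the posterior variance and re-centres the range on the individual, which is the behaviour the static range cannot provide.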


Author(s):  
T. Wu ◽  
B. Rosić ◽  
L. De Lorenzis ◽  
H. G. Matthies

Phase-field modeling of fracture has gained popularity within the last decade due to the flexibility of the related computational framework in simulating three-dimensional, arbitrarily complicated fracture processes. However, the numerical predictions are greatly affected by the presence of uncertainties in the mechanical properties of the material originating from unresolved heterogeneities and the use of noisy experimental data. The objective of this work is to apply the Bayesian approach to estimate the bulk and shear moduli, tensile strength, and fracture toughness of the phase-field model, thus improving the accuracy of the simulations with the help of experimental data. Conventional approaches for estimating the Bayesian posterior probability density function adopt sampling schemes, which often require a large number of model evaluations to achieve the desired convergence, thus resulting in a high computational cost. In order to alleviate this problem, we employ a more efficient approach called sampling-free linear Bayesian update, which relies on the evaluation of the conditional expectation of parameters given experimental data. We identify the mechanical properties of cement mortar by conditioning on the experimental data of the three-point bending test (observations) in an online and offline manner. In the online approach the parameter values are sequentially updated on the fly as new experimental information comes in. In contrast, the offline approach is used only when the whole history of experimental data is available once the experiment is performed. Both versions of estimation are discussed and compared by validating the phase-field fracture model on an unused set of experimental data.
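A sampling-free linear Bayesian update of this kind can be sketched with an ensemble approximation of the conditional-expectation map, closely related to the ensemble Kalman filter; the dimensions, toy linear model, and noise level below are illustrative assumptions, not the authors' cement-mortar setup:

```python
import numpy as np

def linear_bayesian_update(params, predictions, obs, obs_cov, rng):
    """Sampling-free (Kalman-type) linear Bayesian update.

    params:      (n_ens, n_p) prior parameter ensemble
    predictions: (n_ens, n_y) model outputs for each ensemble member
    Each member is moved by the linear conditional-expectation map,
    so no MCMC sampling is needed.
    """
    dp = params - params.mean(axis=0)
    dy = predictions - predictions.mean(axis=0)
    n = params.shape[0]
    c_py = dp.T @ dy / (n - 1)               # parameter-output covariance
    c_yy = dy.T @ dy / (n - 1) + obs_cov     # output covariance plus noise
    gain = c_py @ np.linalg.inv(c_yy)        # Kalman-type gain
    noisy_obs = obs + rng.multivariate_normal(np.zeros(len(obs)), obs_cov, n)
    return params + (noisy_obs - predictions) @ gain.T

# Toy example: one parameter theta, linear 'model' y = 2*theta, observation y = 4.
rng = np.random.default_rng(1)
prior = rng.normal(0.0, 1.0, size=(500, 1))
preds = 2.0 * prior
post = linear_bayesian_update(prior, preds, np.array([4.0]),
                              np.array([[0.1]]), rng)
```

For this linear-Gaussian toy case the update is exact in the large-ensemble limit; for a nonlinear forward model such as phase-field fracture it is a linear approximation of the conditional expectation.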


2011 ◽  
Vol 1 (6) ◽  
pp. 909-921 ◽  
Author(s):  
Ioanna Manolopoulou ◽  
Lorenza Legarreta ◽  
Brent C. Emerson ◽  
Steve Brooks ◽  
Simon Tavaré

Phylogeographic methods have attracted a lot of attention in recent years, stressing the need to provide a solid statistical framework for many existing methodologies so as to draw statistically reliable inferences. Here, we take a flexible, fully Bayesian approach by reducing the problem to a clustering framework, whereby the population distribution can be explained by a set of migrations forming geographically stable population clusters. These clusters are such that they are consistent with a fixed number of migrations on the corresponding (unknown) subdivided coalescent tree. Our methods rely upon a clustered population distribution and allow for the inclusion of various covariates (such as phenotype or climate information) at little additional computational cost. We illustrate our methods with an example from weevil mitochondrial DNA sequences from the Iberian Peninsula.


2020 ◽  
Author(s):  
Igor Kolesnikov ◽  
Celso Mendes ◽  
Reinaldo De Carvalho ◽  
Reinaldo Rosa

Parametric computational modeling of galaxies is a process with a high computational cost. The statistical component of the modeling, which may involve model refinements with respect to the source brightness distribution, achieves more satisfactory results when the Bayesian approach is employed. In our research, we use GALaxy PHotometric ATtributes (GALPHAT) as our primary tool for data processing. In the current cosmological landscape, this type of modeling must be performed on thousands of galaxies to be scientifically relevant. In this article, we present the study and optimization of solutions based on modern HPC platforms, including a many-core processor, that enable effective processing of that number of galaxies obtained from the Sloan Digital Sky Survey.


2012 ◽  
Author(s):  
Todd Wareham ◽  
Robert Robere ◽  
Iris van Rooij

2020 ◽  
Vol 2020 (14) ◽  
pp. 378-1-378-7
Author(s):  
Tyler Nuanes ◽  
Matt Elsey ◽  
Radek Grzeszczuk ◽  
John Paul Shen

We present a high-quality sky segmentation model for depth refinement and investigate residual architecture performance to inform optimally shrinking the network. We describe a model that runs in near real-time on a mobile device, present a new high-quality dataset, and detail a unique weighting to trade off false positives and false negatives in binary classifiers. We show how the optimizations improve bokeh rendering by correcting stereo depth mispredictions in sky regions. We detail techniques used to preserve edges, reject false positives, and ensure generalization to the diversity of sky scenes. Finally, we present a compact model and compare the performance of four popular residual architectures (ShuffleNet, MobileNetV2, ResNet-101, and ResNet-34-like) at constant computational cost.
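The abstract does not give the exact weighting scheme, but trading off false positives against false negatives in a binary classifier is commonly expressed as a class-weighted cross-entropy; the weights below are illustrative assumptions only:

```python
import numpy as np

def weighted_bce(y_true, y_pred, w_pos=1.0, w_neg=4.0, eps=1e-7):
    """Per-pixel weighted binary cross-entropy for a sky/non-sky classifier.

    With w_neg > w_pos, confidently calling a non-sky pixel 'sky' (a false
    positive) costs more than the symmetric false negative. The 4:1 ratio
    here is a hypothetical choice, not the paper's.
    """
    y_pred = np.clip(y_pred, eps, 1.0 - eps)  # guard log(0)
    loss = -(w_pos * y_true * np.log(y_pred)
             + w_neg * (1.0 - y_true) * np.log(1.0 - y_pred))
    return loss.mean()

# A confident false positive is penalised more than a confident false negative.
fp_loss = weighted_bce(np.array([0.0]), np.array([0.9]))
fn_loss = weighted_bce(np.array([1.0]), np.array([0.1]))
```

Tuning the weight ratio shifts the operating point of the classifier, which is one standard way to bias a segmentation model against the error type that damages bokeh rendering most.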

