abctools: An R Package for Tuning Approximate Bayesian Computation Analyses

The R Journal ◽  
2015 ◽  
Vol 7 (2) ◽  
pp. 189 ◽  
Author(s):  
Matthew A. Nunes ◽  
Dennis Prangle

2012 ◽  
Vol 3 (3) ◽  
pp. 475-479 ◽  
Author(s):  
Katalin Csilléry ◽  
Olivier François ◽  
Michael G. B. Blum

2020 ◽  
Author(s):  
Marcelo Gehara ◽  
Guilherme G. Mazzochinni ◽  
Frank Burbrink

Abstract: Understanding population divergence involves testing diversification scenarios and estimating historical parameters, such as divergence time, population size and migration rate. There is, however, an immense space of possible, highly parameterized scenarios that are difficult or impossible to solve analytically. To overcome this problem, researchers have used alternative simulation-based approaches, such as approximate Bayesian computation (ABC) and supervised machine learning (SML), to approximate posterior probabilities of hypotheses. In this study we demonstrate the utility of our newly developed R package for simulating summary statistics to perform ABC and SML inferences. We compare the power of the ABC and SML methods and the influence of the number of loci on the accuracy of inferences, and we present three empirical examples: (i) genomic data from Muller's termite frog from South America; (ii) Sanger data from the cottonmouth and (iii) the copperhead snakes from North America. We found that SML is more efficient than ABC: it is generally more accurate and needs fewer simulations to perform an inference. We found support for a divergence model without migration, with a recent bottleneck for one of the populations of the South American frog. For the cottonmouth we found support for divergence with migration and recent expansion, and for the copperhead we found support for a model of divergence with migration and a recent bottleneck. Interestingly, by using an SML method it was possible to achieve high accuracy in model selection even when several models were compared in a single inference. We also found higher accuracy when inferring parameters with SML.
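As a rough, self-contained illustration of the two inference strategies contrasted above (not the authors' package or data), the R sketch below performs rejection-ABC model choice and an SML-style classification on toy simulated summary statistics; the toy simulator, the 1% acceptance rate and the use of the randomForest package are assumptions made for this example only.

    ## Toy comparison of rejection ABC and a supervised classifier for model
    ## choice; the simulator and summary statistics are purely illustrative.
    set.seed(1)
    n_sim <- 5000
    simulate_stats <- function(model) {           # two "scenarios" differing in mean
      mu <- if (model == 1) c(0, 0, 0) else c(1, 0.5, 0)
      rnorm(3, mean = mu, sd = 1)
    }
    models <- sample(1:2, n_sim, replace = TRUE)  # equal prior model probabilities
    stats  <- t(sapply(models, simulate_stats))
    colnames(stats) <- paste0("s", 1:3)
    obs <- c(0.8, 0.4, 0.1)                       # pseudo-observed statistics

    ## (a) Rejection ABC: retain the simulations closest to the observed
    ## statistics and read posterior model probabilities off their frequencies.
    d        <- sqrt(rowSums(sweep(stats, 2, obs)^2))
    retained <- models[d <= quantile(d, 0.01)]    # 1% acceptance rate
    prop.table(table(retained))

    ## (b) SML: train a classifier to predict the scenario from the statistics,
    ## then classify the observed data (requires the randomForest package).
    library(randomForest)
    rf   <- randomForest(x = stats, y = factor(models))
    newd <- matrix(obs, nrow = 1, dimnames = list(NULL, colnames(stats)))
    predict(rf, newdata = newd, type = "prob")

Both strategies reuse the same bank of simulations; as the abstract reports, the SML route typically reaches a given accuracy with fewer of them.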


Author(s):  
Cecilia Viscardi ◽  
Michele Boreale ◽  
Fabio Corradi

Abstract: We consider the problem of sample degeneracy in Approximate Bayesian Computation. It arises when proposed values of the parameters, once given as input to the generative model, rarely lead to simulations resembling the observed data and are hence discarded. Such "poor" parameter proposals do not contribute at all to the representation of the parameter's posterior distribution. This leads to a very large number of required simulations and/or a waste of computational resources, as well as to distortions in the computed posterior distribution. To mitigate this problem, we propose an algorithm, referred to as the Large Deviations Weighted Approximate Bayesian Computation algorithm, where, via Sanov's Theorem, strictly positive weights are computed for all proposed parameters, thus avoiding the rejection step altogether. In order to derive a computable asymptotic approximation from Sanov's result, we adopt the information theoretic "method of types" formulation of the method of Large Deviations, thus restricting our attention to models for i.i.d. discrete random variables. Finally, we experimentally evaluate our method through a proof-of-concept implementation.
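For reference, the method-of-types form of Sanov's Theorem underlying this weighting states that, for the empirical distribution (type) $\hat{P}_n$ of $n$ i.i.d. draws from a discrete distribution $Q$ and a suitable set of distributions $A$,

$$\Pr\big(\hat{P}_n \in A\big) \;\doteq\; \exp\!\Big(-n \min_{P \in A} D(P \,\|\, Q)\Big),$$

where $D(\cdot\|\cdot)$ is the Kullback-Leibler divergence. Read against the abstract, a proposed parameter $\theta$ could then plausibly receive a strictly positive weight decaying as $\exp\{-n\,D(\hat{P}_{\mathrm{obs}} \,\|\, Q_\theta)\}$, with $\hat{P}_{\mathrm{obs}}$ the type of the observed data and $Q_\theta$ the model distribution under $\theta$; the exact weight and asymptotic approximation used in the paper may differ.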


2021 ◽  
Vol 62 (2) ◽  
Author(s):  
Jason D. Christopher ◽  
Olga A. Doronina ◽  
Dan Petrykowski ◽  
Torrey R. S. Hayden ◽  
Caelan Lapointe ◽  
...  

Entropy ◽  
2021 ◽  
Vol 23 (3) ◽  
pp. 312
Author(s):  
Ilze A. Auzina ◽  
Jakub M. Tomczak

Many real-life processes are black-box problems, i.e., the internal workings are inaccessible or a closed-form mathematical expression of the likelihood function cannot be defined. For continuous random variables, likelihood-free inference problems can be solved via Approximate Bayesian Computation (ABC). However, an optimal alternative for discrete random variables is yet to be formulated. Here, we aim to fill this research gap. We propose an adjusted population-based MCMC ABC method by re-defining the standard ABC parameters to discrete ones and by introducing a novel Markov kernel that is inspired by differential evolution. We first assess the proposed Markov kernel on a likelihood-based inference problem, namely discovering the underlying diseases based on a QMR-DT network and, subsequently, the entire method on three likelihood-free inference problems: (i) the QMR-DT network with the unknown likelihood function, (ii) learning a binary neural network, and (iii) neural architecture search. The obtained results indicate the high potential of the proposed framework and the superiority of the new Markov kernel.
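As an illustration only (the paper's actual kernel, simulator and acceptance rule are not reproduced here), a differential-evolution-inspired proposal for binary parameter vectors within a population ABC-MCMC loop can flip a member's bits at the positions where two other, randomly chosen members disagree; the toy simulator, distance and tolerance in the R sketch below are assumptions made for the example.

    ## Toy population ABC-MCMC with a DE-style binary kernel: proposals flip
    ## bits where two other population members differ.  With a uniform prior
    ## and this symmetric kernel, acceptance reduces to the ABC distance check.
    set.seed(2)
    D <- 10; N <- 20; eps <- 1; n_iter <- 200
    simulate_data <- function(theta) rbinom(D, size = 1, prob = 0.8 * theta + 0.1)
    distance      <- function(x, y) sum(x != y)
    y_obs <- simulate_data(rbinom(D, 1, 0.5))              # pseudo-observed data
    pop   <- matrix(rbinom(N * D, 1, 0.5), nrow = N)       # population of binary vectors
    for (it in seq_len(n_iter)) {
      for (i in seq_len(N)) {
        jk   <- sample(setdiff(seq_len(N), i), 2)          # two other members
        flip <- (pop[jk[1], ] != pop[jk[2], ]) & (runif(D) < 0.9)
        prop <- pop[i, ]
        prop[flip] <- 1 - prop[flip]                       # DE-style bit flips
        if (distance(simulate_data(prop), y_obs) <= eps) pop[i, ] <- prop
      }
    }
    colMeans(pop)    # approximate posterior marginal probability of each bit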


Author(s):  
Cesar A. Fortes‐Lima ◽  
Romain Laurent ◽  
Valentin Thouzeau ◽  
Bruno Toupance ◽  
Paul Verdu

2014 ◽  
Vol 64 (3) ◽  
pp. 416-431 ◽  
Author(s):  
C. Baudet ◽  
B. Donati ◽  
B. Sinaimeri ◽  
P. Crescenzi ◽  
C. Gautier ◽  
...  
