Model Based Inference of Large Scale Brain Networks with Approximate Bayesian Computation

2019 ◽  
Author(s):  
Timothy O. West ◽  
Luc Berthouze ◽  
Simon F. Farmer ◽  
Hayriye Cagnan ◽  
Vladimir Litvak

Abstract
Brain networks and the neural dynamics that unfold upon them are of great interest across the many scales of systems neuroscience. The tools of inverse modelling provide a way of both constraining and selecting models of large-scale brain networks from empirical data. Such models have the potential to yield broad theoretical insights into the physiological processes behind the integration and segregation of activity in the brain. To make inverse modelling computationally tractable, simplifying model assumptions have often been adopted that appeal to steady-state approximations to neural dynamics and thus preclude the investigation of stochastic or intermittent dynamics such as gamma or beta burst activity. In this work we describe a framework that uses the Approximate Bayesian Computation (ABC) algorithm to invert neural models that can flexibly represent any statistical feature of empirically recorded data, eschewing the need to assume a locally linearized system. Further, we demonstrate how Bayesian model comparison can be applied to fitted models to select between competing hypotheses regarding the causes of neural data. We validate the procedures by testing both face validity (i.e. the ability to identify the original model that generated the observed data) and predictive validity (i.e. the consistency of parameter estimation across multiple realizations of the same data). From the validation and example applications presented here, we conclude that the proposed framework offers researchers a novel means of explaining how complex brain dynamics emerge from neural circuits.
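The core ABC loop that such a framework builds on can be sketched in a few lines: draw parameters from a prior, simulate the model, and keep only draws whose summary statistics land close to those of the observed data. The toy below is purely illustrative (an AR(1) process whose lag-1 autocorrelation serves as both the parameter and the summary statistic), not the neural-mass models or data features used in the paper:

```python
import random
import statistics

random.seed(0)

def simulate(theta, n=200):
    # Toy stand-in for a neural circuit simulation: an AR(1) process
    # whose lag-1 autocorrelation is governed by theta.
    x = [0.0]
    for _ in range(n - 1):
        x.append(theta * x[-1] + random.gauss(0, 1))
    return x

def summary(x):
    # Summary statistic: empirical lag-1 autocorrelation.
    m = statistics.fmean(x)
    num = sum((a - m) * (b - m) for a, b in zip(x[:-1], x[1:]))
    den = sum((a - m) ** 2 for a in x)
    return num / den

# "Observed" data generated with a known ground-truth parameter.
theta_true = 0.6
s_obs = summary(simulate(theta_true))

# ABC rejection sampling: keep prior draws whose simulated summary
# lies within tolerance eps of the observed summary.
eps, accepted = 0.1, []
for _ in range(5000):
    theta = random.uniform(-0.9, 0.9)   # prior draw
    if abs(summary(simulate(theta)) - s_obs) < eps:
        accepted.append(theta)

posterior_mean = statistics.fmean(accepted)
```

The accepted draws approximate the ABC posterior; their mean should sit near the true autocorrelation, with the tolerance `eps` trading off accuracy against acceptance rate.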

2017 ◽  
Vol 469 (3) ◽  
pp. 2791-2805 ◽  
Author(s):  
ChangHoon Hahn ◽  
Mohammadjavad Vakili ◽  
Kilian Walsh ◽  
Andrew P. Hearin ◽  
David W. Hogg ◽  
...  

Author(s):  
Cecilia Viscardi ◽  
Michele Boreale ◽  
Fabio Corradi

Abstract
We consider the problem of sample degeneracy in Approximate Bayesian Computation. It arises when proposed values of the parameters, once given as input to the generative model, rarely lead to simulations resembling the observed data and are hence discarded. Such “poor” parameter proposals do not contribute at all to the representation of the parameter’s posterior distribution. This leads to a very large number of required simulations and/or a waste of computational resources, as well as to distortions in the computed posterior distribution. To mitigate this problem, we propose an algorithm, referred to as the Large Deviations Weighted Approximate Bayesian Computation algorithm, where, via Sanov’s Theorem, strictly positive weights are computed for all proposed parameters, thus avoiding the rejection step altogether. In order to derive a computable asymptotic approximation from Sanov’s result, we adopt the information-theoretic “method of types” formulation of the method of Large Deviations, thus restricting our attention to models for i.i.d. discrete random variables. Finally, we experimentally evaluate our method through a proof-of-concept implementation.
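The weighting idea can be made concrete in the special case of i.i.d. Bernoulli data, where the Sanov rate function reduces to a closed-form KL divergence between the observed type (empirical frequency) and the model distribution, so each proposal receives the weight exp(-n D(t_obs || p_theta)) rather than a hard accept/reject. The sketch below is a deliberately simplified illustration of that weighting principle, not the authors' algorithm; in practice one would work with log-weights to avoid floating-point underflow:

```python
import math
import random

random.seed(1)

# Observed i.i.d. discrete data from a Bernoulli(p_true) model.
p_true, n = 0.3, 400
data = [1 if random.random() < p_true else 0 for _ in range(n)]
t_obs = sum(data) / n   # observed "type": empirical frequency of 1s

def kl_bernoulli(q, p):
    # KL divergence D(q || p) between two Bernoulli distributions,
    # clipped away from 0 and 1 for numerical safety.
    eps = 1e-12
    p = min(max(p, eps), 1 - eps)
    q = min(max(q, eps), 1 - eps)
    return q * math.log(q / p) + (1 - q) * math.log((1 - q) / (1 - p))

# Weighted ABC: every prior draw receives a strictly positive weight
# exp(-n * D(t_obs || p_theta)), the large-deviations rate from
# Sanov's theorem, instead of being accepted or rejected.
thetas = [random.uniform(0.01, 0.99) for _ in range(2000)]
weights = [math.exp(-n * kl_bernoulli(t_obs, th)) for th in thetas]

w_sum = sum(weights)
posterior_mean = sum(th * w / w_sum for th, w in zip(thetas, weights))
```

Because the weights decay exponentially in the divergence, proposals far from the observed type contribute negligibly, yet no simulation is discarded outright.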


2021 ◽  
Vol 62 (2) ◽  
Author(s):  
Jason D. Christopher ◽  
Olga A. Doronina ◽  
Dan Petrykowski ◽  
Torrey R. S. Hayden ◽  
Caelan Lapointe ◽  
...  

Entropy ◽  
2021 ◽  
Vol 23 (3) ◽  
pp. 312
Author(s):  
Ilze A. Auzina ◽  
Jakub M. Tomczak

Many real-life processes are black-box problems, i.e., the internal workings are inaccessible or a closed-form mathematical expression of the likelihood function cannot be defined. For continuous random variables, likelihood-free inference problems can be solved via Approximate Bayesian Computation (ABC). However, an optimal alternative for discrete random variables is yet to be formulated. Here, we aim to fill this research gap. We propose an adjusted population-based MCMC ABC method by re-defining the standard ABC parameters as discrete ones and by introducing a novel Markov kernel that is inspired by differential evolution. We first assess the proposed Markov kernel on a likelihood-based inference problem, namely discovering the underlying diseases in a QMR-DT network, and subsequently evaluate the entire method on three likelihood-free inference problems: (i) the QMR-DT network with an unknown likelihood function, (ii) learning a binary neural network, and (iii) neural architecture search. The obtained results indicate the high potential of the proposed framework and the superiority of the new Markov kernel.
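One plausible reading of a differential-evolution-inspired kernel on binary parameters is: propose flipping the bits at which two other population members disagree, add a small independent bit-flip "jitter" for ergodicity, and accept with a Metropolis step (the kernel is symmetric given the chosen pair). The sketch below applies such a kernel to a toy factorized target over binary vectors; the target, population size, and jitter rate are illustrative assumptions, not the method of the paper:

```python
import math
import random

random.seed(2)

D, POP, STEPS = 12, 20, 400

# Toy factorized target over binary vectors: bit d is 1 with
# probability p[d]. A stand-in for an intractable discrete posterior.
p = [0.9 if d < 6 else 0.1 for d in range(D)]

def log_target(x):
    return sum(math.log(p[d] if x[d] else 1.0 - p[d]) for d in range(D))

def de_proposal(x, a, b, jitter=0.05):
    # DE-inspired discrete kernel (illustrative): flip the bits where
    # two other population members disagree, plus a small independent
    # bit-flip "jitter" to keep the chain ergodic.
    y = list(x)
    for d in range(D):
        if a[d] != b[d]:
            y[d] = 1 - y[d]
        elif random.random() < jitter:
            y[d] = 1 - y[d]
    return y

pop = [[random.randint(0, 1) for _ in range(D)] for _ in range(POP)]
for _ in range(STEPS):
    for i in range(POP):
        others = [x for j, x in enumerate(pop) if j != i]
        a, b = random.sample(others, 2)
        prop = de_proposal(pop[i], a, b)
        # Metropolis accept/reject; the proposal above is symmetric.
        if math.log(random.random()) < log_target(prop) - log_target(pop[i]):
            pop[i] = prop

# Marginal frequency of bit 0 across the population; at stationarity
# it should drift toward p[0].
freq_bit0 = sum(x[0] for x in pop) / POP
```

As in DE-MC for continuous parameters, the proposal scale adapts automatically: early on the population disagrees widely and moves are large, while near convergence the disagreement set shrinks and only the jitter perturbs the chains.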


Author(s):  
Cesar A. Fortes‐Lima ◽  
Romain Laurent ◽  
Valentin Thouzeau ◽  
Bruno Toupance ◽  
Paul Verdu

2014 ◽  
Vol 64 (3) ◽  
pp. 416-431 ◽  
Author(s):  
C. Baudet ◽  
B. Donati ◽  
B. Sinaimeri ◽  
P. Crescenzi ◽  
C. Gautier ◽  
...  
