Bayesian Sampling
Recently Published Documents


TOTAL DOCUMENTS

68
(FIVE YEARS 20)

H-INDEX

12
(FIVE YEARS 1)

2021 ◽  
Author(s):  
Max Berg ◽  
Matthias Feldmann ◽  
Tobias Kube

Rumination is a widely recognized cognitive deviation in depression. An integrative view that combines clinical findings on rumination with theories of mental simulation and cognitive problem-solving could help explain the development and maintenance of rumination in a computationally and biologically plausible framework. In this review, we connect insights from neuroscience and computational psychiatry to elucidate rumination as repetitive but unsuccessful attempts at mental problem-solving. Appealing to a predictive processing account, we suggest that problem-solving is based on an algorithm that generates candidate behavior (policy primitives for problem solutions) using a Bayesian sampling approach, evaluates resulting policies for action, and then engages in instrumental learning to reduce prediction errors. We present evidence suggesting that this problem-solving algorithm is distorted in depression: Specifically, depressive rumination is regarded as excessive Bayesian sampling of candidates that is associated with high prediction errors without activation of the successive steps (policy evaluation, instrumental learning) of the algorithm. Thus, prediction errors cannot be decreased, and excessive resampling of the same problems occurs. This then leads to reduced precision weighting attributed to external, “online” stimuli, low behavioral output, and high opportunity costs due to the time-consuming nature of the sampling process itself. We review different computational reasons that make the proposed Bayesian sampling algorithm vulnerable to a ruminative “halting problem”. We also identify neurophysiological correlates of these deviations in pathological connectivity patterns of different brain networks. We conclude by suggesting future directions for research into behavioral and neurophysiological features of the model and point to clinical implications.
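The three-step loop described above (candidate sampling, policy evaluation, instrumental learning) can be caricatured in code. This is a toy illustration, not the authors' formal model: the function names, the Gaussian candidate generator, the learning rate, and the "skip evaluation" switch standing in for rumination are all assumptions made for the sketch.

```python
# Toy rendering of the review's problem-solving loop: sample candidate
# policies, compute the prediction error, and (optionally) learn from it.
# Disabling the evaluation/learning step mimics ruminative resampling,
# where the same problem is revisited but the error never decreases.
import random

def solve(target, n_steps=200, evaluate=True, seed=0):
    """Return the final absolute prediction error after n_steps."""
    rng = random.Random(seed)
    belief = 0.0
    for _ in range(n_steps):
        candidate = belief + rng.gauss(0.0, 1.0)  # Bayesian-style candidate sampling
        error = target - candidate                # prediction error of this candidate
        if evaluate:                              # policy evaluation feeding into
            belief += 0.1 * error                 # instrumental learning
    return abs(target - belief)

healthy = solve(target=5.0, evaluate=True)       # error shrinks over iterations
ruminative = solve(target=5.0, evaluate=False)   # resampling without learning
print(f"final error with learning   : {healthy:.2f}")
print(f"final error without learning: {ruminative:.2f}")
```

Without the evaluation step the belief never moves, so the error stays at its initial value however long the sampling runs — a minimal picture of the "halting problem" the review discusses.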


2021 ◽  
Vol 12 ◽  
Author(s):  
Jing Lu ◽  
Jiwei Zhang ◽  
Zhaoyuan Zhang ◽  
Bao Xu ◽  
Jian Tao

In this paper, a new two-parameter logistic testlet response theory model for dichotomous items is proposed by introducing testlet discrimination parameters to model the local dependence among items within a common testlet. In addition, a highly effective Bayesian sampling algorithm based on auxiliary variables is proposed to estimate the testlet effect models. The new algorithm not only avoids the Metropolis-Hastings algorithm's tedious adjustment of tuning parameters to achieve an appropriate acceptance probability, but also removes the Gibbs sampling algorithm's dependence on conjugate prior distributions. Compared with traditional Bayesian estimation methods, the advantages of the new algorithm are analyzed across various types of prior distributions. Based on the Markov chain Monte Carlo (MCMC) output, two Bayesian model assessment methods are investigated concerning the goodness of fit between models. Finally, three simulation studies and an empirical example analysis are given to further illustrate the advantages of the new testlet effect model and Bayesian sampling algorithm.
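The tuning burden that the abstract's auxiliary-variable algorithm avoids can be seen in a plain random-walk Metropolis-Hastings sampler: the acceptance rate depends strongly on the proposal scale, which must be adjusted by hand. This sketch is a generic illustration of that dependence (standard normal target, step sizes chosen for contrast), not the paper's algorithm.

```python
# Random-walk Metropolis-Hastings on an unnormalised N(0,1) target.
# The acceptance rate swings from near 1 to near 0 as the proposal
# scale grows, which is exactly the tuning problem being avoided.
import math
import random

def rw_metropolis(log_post, x0, step, n_iter, seed=0):
    """Run random-walk MH; return (samples, acceptance_rate)."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples, accepted = [], 0
    for _ in range(n_iter):
        prop = x + rng.gauss(0.0, step)          # symmetric proposal
        lp_prop = log_post(prop)
        if math.log(rng.random()) < lp_prop - lp:  # accept/reject step
            x, lp = prop, lp_prop
            accepted += 1
        samples.append(x)
    return samples, accepted / n_iter

log_std_normal = lambda x: -0.5 * x * x  # log density up to a constant

for step in (0.1, 2.4, 20.0):
    _, rate = rw_metropolis(log_std_normal, 0.0, step, 5000)
    print(f"step={step:>5}: acceptance rate = {rate:.2f}")
```

Too small a step accepts almost everything but explores slowly; too large a step rejects almost everything. An auxiliary-variable scheme sidesteps this trade-off by construction.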


2021 ◽  
Author(s):  
Andres Fortunato ◽  
Helmut Herwartz ◽  
Ramón E. López ◽  
Eugenio Figueroa

Abstract We study the long-run dynamic and predictive connection between atmospheric carbon dioxide (CO2) concentration and the probability of hydrometeorological disasters. For a panel of 193 countries over the period 1970-2016 we estimate the probabilities of hydrometeorological disasters at country levels by means of Bayesian sampling techniques. We then separate the effects of climatological and socio-demographic factors (used as proxies for exposure and vulnerability) and other country-specific factors, from a global probability of disasters (GPOD). Finally, we subject these global probability time paths to a cointegration analysis with CO2 concentration and run projections to year 2040 of the GPOD conditional on nine Shared Socioeconomic Pathways scenarios. We detect a stable long-term relation between CO2 accumulation and the GPOD that allows us to determine projections of the latter process conditional on the former. In this way, we demonstrate that generally and readily available statistical data on CO2 global atmospheric concentrations can be used as a conceptually meaningful, statistically valid, and policy-relevant predictor of the probability of occurrence of (global) hydrometeorological disasters.
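A minimal version of estimating a country-level disaster probability by Bayesian sampling is the Beta-Binomial model: with a Beta prior and k disaster-years out of n observed years, the posterior is itself a Beta distribution that can be sampled directly. The counts below are hypothetical, and the paper's actual panel model is far richer; this only sketches the sampling step.

```python
# Beta-Binomial sketch: posterior draws for a yearly disaster probability.
# Prior Beta(a, b) + k disaster-years in n years -> posterior Beta(a+k, b+n-k).
import random

def posterior_prob_samples(k, n, a=1.0, b=1.0, n_draws=10000, seed=0):
    """Draw samples from the Beta posterior of the event probability."""
    rng = random.Random(seed)
    return [rng.betavariate(a + k, b + n - k) for _ in range(n_draws)]

# Hypothetical country: 12 disaster-years in the 47 years 1970-2016.
draws = posterior_prob_samples(k=12, n=47)
mean = sum(draws) / len(draws)
print(f"posterior mean probability ≈ {mean:.3f}")
```

The posterior mean under a flat prior is (k+1)/(n+2), here about 0.265; the spread of the draws gives the credible interval that a downstream cointegration analysis could then work with.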


2020 ◽  
Vol 224 (2) ◽  
pp. 1404-1421
Author(s):  
Théa Ragon ◽  
Mark Simons

SUMMARY Earthquake source estimates are affected by many types of uncertainties, deriving from observational errors, modelling choices and our simplified description of the Earth’s interior. While observational errors are often accounted for, epistemic uncertainties, which stem from our imperfect description of the forward model, are usually neglected. In particular, 3-D variations in crustal properties are rarely considered. 3-D crustal heterogeneity is known to largely affect estimates of the seismic source inferred from either geodetic or seismic data. Here, we use a perturbation approach to investigate, and account for, the impact of epistemic uncertainties related to 3-D variations of the mechanical properties of the crust. We validate our approach using a Bayesian sampling procedure applied to synthetic geodetic data generated from 2-D and 3-D finite-fault models. We show that accounting for uncertainties in crustal structure systematically increases the reliability of source estimates.
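The core idea of accounting for epistemic uncertainty can be shown in the simplest possible setting: a scalar source parameter with a linear forward model, where the forward-model (prediction) error covariance is added to the observational covariance, widening the posterior. All numbers here are illustrative and the scalar-Gaussian setup is an assumption of the sketch, not the paper's perturbation approach.

```python
# Schematic: folding forward-model uncertainty into a Gaussian linear
# inversion. The total data variance is sigma_obs^2 + sigma_model^2;
# ignoring sigma_model makes the posterior artificially tight.
import math

def posterior_sd(g, sigma_obs, sigma_model, sigma_prior):
    """Posterior std of m for d = g*m with Gaussian noise and prior."""
    sigma2 = sigma_obs**2 + sigma_model**2           # inflated data covariance
    post_var = 1.0 / (g**2 / sigma2 + 1.0 / sigma_prior**2)
    return math.sqrt(post_var)

sd_naive = posterior_sd(g=1.0, sigma_obs=0.1, sigma_model=0.0, sigma_prior=1.0)
sd_robust = posterior_sd(g=1.0, sigma_obs=0.1, sigma_model=0.3, sigma_prior=1.0)
print(f"posterior sd, observational errors only : {sd_naive:.3f}")
print(f"posterior sd, with model-error covariance: {sd_robust:.3f}")
```

The wider posterior is the honest one: it no longer attributes forward-model error to the source parameter, which is the sense in which accounting for crustal-structure uncertainty increases the reliability of source estimates.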


2020 ◽  
Author(s):  
Geoffrey Hutchison ◽  
Leung Sing Chan ◽  
Garrett M. Morris ◽  
Dakota Folmsbee
