posterior distributions
Recently Published Documents


TOTAL DOCUMENTS: 399 (FIVE YEARS: 91)

H-INDEX: 29 (FIVE YEARS: 5)

2022 ◽  
Vol 11 ◽  
Author(s):  
Sebastian Regnery ◽  
Carolin Buchele ◽  
Fabian Weykamp ◽  
Moritz Pohl ◽  
Philipp Hoegen ◽  
...  

Purpose: To explore the benefit of adaptive magnetic resonance-guided stereotactic body radiotherapy (MRgSBRT) for treatment of lung tumors in different locations, with a focus on ultracentral lung tumors (ULT).
Patients & Methods: A prospective cohort of 21 patients with 23 primary and secondary lung tumors was analyzed. Tumors were located peripherally (N = 10), centrally (N = 2) and ultracentrally (N = 11; planning target volume (PTV) overlap with proximal bronchi, esophagus and/or pulmonary artery). All patients received MRgSBRT with gated dose delivery and risk-adapted fractionation. Before each fraction, the baseline plan was recalculated on the anatomy of the day (predicted plan). Plan adaptation was performed in 154/165 fractions (93.3%). Comparison of dose characteristics between predicted and adapted plans employed descriptive statistics and Bayesian linear multilevel models. The posterior distributions resulting from the Bayesian models are presented as the mean together with the corresponding 95% compatibility interval (CI).
Results: Plan adaptation decreased the proportion of fractions with violated planning objectives from 94% (predicted plans) to 17% (adapted plans). In most cases, inadequate PTV coverage was remedied (predicted: 86%, adapted: 13%), corresponding to a moderate increase of PTV coverage (mean +6.3%, 95% CI: [5.3–7.4%]) and of biologically effective PTV doses (BED10) (BEDmin: +9.0 Gy [6.7–11.3 Gy], BEDmean: +1.4 Gy [0.8–2.1 Gy]). This benefit was smaller in larger tumors (−0.1%/10 cm³ PTV [−0.2 to −0.02%/10 cm³ PTV]) and in ULT (−2.0% [−3.1 to −0.9%]). Occurrence of exceeded maximum doses inside the PTV (predicted: 21%, adapted: 4%) and violations of organ-at-risk (OAR) constraints (predicted: 12%, adapted: 1%, OR: 0.14 [0.04–0.44]) was effectively reduced. OAR constraint violations occurred almost exclusively when the PTV touched the corresponding OAR in the baseline plan (18/19, 95%).
Conclusion: Adaptive MRgSBRT is highly recommended for ablative treatment of lung tumors whose PTV initially contacts a sensitive OAR, such as ULT. Here, plan adaptation protects the OAR while maintaining the best possible PTV coverage.


2022 ◽  
Author(s):  
Hanne Kekkonen

Abstract We consider the statistical non-linear inverse problem of recovering the absorption term f > 0 in the heat equation, where the boundary and initial values are given by sufficiently smooth functions. The data consist of N discrete noisy point evaluations of the solution u_f. We study the statistical performance of Bayesian nonparametric procedures based on a large class of Gaussian process priors. We show that, as the number of measurements increases, the resulting posterior distributions concentrate around the true parameter generating the data, and we derive a convergence rate for the reconstruction error of the associated posterior means. We also consider the optimality of the contraction rates: we prove a lower bound on the minimax convergence rate for inferring f from the data, and show that optimal rates can be achieved with truncated Gaussian priors.
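The contraction phenomenon described above can be illustrated in the simplest conjugate setting (a toy sketch with an assumed scalar Gaussian model, not the heat-equation problem of the paper): with a Gaussian prior and N noisy Gaussian observations of a parameter, the posterior standard deviation shrinks like 1/√N.

```python
import math

def posterior_sd(n_obs, prior_sd=1.0, noise_sd=0.5):
    """Posterior standard deviation of a scalar parameter under a Gaussian
    prior and n_obs independent Gaussian observations (conjugate model):
    posterior precision = prior precision + n_obs * likelihood precision."""
    precision = 1.0 / prior_sd**2 + n_obs / noise_sd**2
    return 1.0 / math.sqrt(precision)

# The posterior concentrates around the truth as measurements accumulate.
for n in (10, 100, 1000):
    print(n, posterior_sd(n))
```

For large N the prior term is negligible and the posterior standard deviation behaves like noise_sd/√N, the parametric analogue of the contraction rates studied in the paper.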


Author(s):  
Aijaz Ahmad ◽  
Rajnee Tripathi

In this study, the shape parameter of the weighted Inverse Maxwell distribution is estimated by employing Bayesian techniques. To produce posterior distributions, the extended Jeffreys prior and the Erlang prior are utilised. The estimators are derived under the squared error loss function, the entropy loss function, the precautionary loss function, and the Linex loss function. Furthermore, a real data set is analysed to assess the performance of the various estimators under the different loss functions.
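To make concrete how the Bayes estimator changes with the loss function, the sketch below computes three of them from generic posterior samples (hypothetical gamma draws standing in for the weighted Inverse Maxwell posterior). The formulas are standard results: squared error loss gives the posterior mean, Linex loss with parameter a gives −(1/a) log E[exp(−aθ)], and the entropy loss gives the posterior harmonic mean.

```python
import math
import random

def bayes_estimators(samples, a=0.5):
    """Bayes estimators from posterior samples under three loss functions:
    squared error -> posterior mean
    Linex(a)      -> -(1/a) * log E[exp(-a * theta)]
    entropy       -> 1 / E[1 / theta]  (posterior harmonic mean)"""
    n = len(samples)
    mean = sum(samples) / n
    linex = -math.log(sum(math.exp(-a * t) for t in samples) / n) / a
    entropy = n / sum(1.0 / t for t in samples)
    return {"squared": mean, "linex": linex, "entropy": entropy}

random.seed(0)
# Stand-in posterior draws for a positive shape parameter (mean 2.0).
draws = [random.gammavariate(5.0, 0.4) for _ in range(20000)]
print(bayes_estimators(draws))
```

By Jensen's inequality the Linex and entropy estimators sit below the posterior mean for a positive parameter, which is why the choice of loss function matters in practice.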


Author(s):  
Yuming Ba ◽  
Jana de Wiljes ◽  
Dean S. Oliver ◽  
Sebastian Reich

Abstract Minimization of a stochastic cost function is commonly used for approximate sampling in high-dimensional Bayesian inverse problems with Gaussian prior distributions and multimodal posterior distributions. The density of the samples generated by minimization is not the desired target density, unless the observation operator is linear, but the distribution of samples is useful as a proposal density for importance sampling or for Markov chain Monte Carlo methods. In this paper, we focus on applications to sampling from multimodal posterior distributions in high dimensions. We first show that sampling from multimodal distributions is improved by computing all critical points instead of only minimizers of the objective function. For applications to high-dimensional geoscience inverse problems, we demonstrate an efficient approximate weighting that uses a low-rank Gauss-Newton approximation of the determinant of the Jacobian. The method is applied to two toy problems with known posterior distributions and a Darcy flow problem with multiple modes in the posterior.
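The correction step referenced above — proposal samples reweighted toward the target density — can be sketched with generic self-normalized importance sampling (a simplified stand-in: the paper's weights additionally involve the low-rank Gauss-Newton approximation of the Jacobian determinant, which is not reproduced here).

```python
import math
import random

def log_normal_pdf(x, mu, sd):
    """Log-density of a univariate Gaussian."""
    return -0.5 * ((x - mu) / sd) ** 2 - math.log(sd * math.sqrt(2 * math.pi))

random.seed(1)
# Proposal draws (standing in for minimization-based samples): N(0, 2).
proposal = [random.gauss(0.0, 2.0) for _ in range(100000)]

# Self-normalized importance weights toward the target density N(1, 1).
log_w = [log_normal_pdf(x, 1.0, 1.0) - log_normal_pdf(x, 0.0, 2.0)
         for x in proposal]
m = max(log_w)                      # subtract the max for numerical stability
w = [math.exp(lw - m) for lw in log_w]
total = sum(w)
weights = [wi / total for wi in w]

weighted_mean = sum(wi * xi for wi, xi in zip(weights, proposal))
print(weighted_mean)  # approaches the target mean of 1.0
```

The same weights could instead seed an independence Metropolis-Hastings chain, the other use of the proposal density mentioned in the abstract.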


2021 ◽  
Vol 162 (6) ◽  
pp. 304
Author(s):  
Jacob Golomb ◽  
Graça Rocha ◽  
Tiffany Meshkat ◽  
Michael Bottom ◽  
Dimitri Mawet ◽  
...  

Abstract The work presented here attempts to answer the following question: how do we decide whether a given detection is a planet or just residual noise in exoplanet direct imaging data? To this end we implement a metric meant to replace the empirical frequentist-based thresholds for detection. Our method, implemented within a Bayesian framework, introduces an "evidence-based" approach to help decide whether a given detection is a true planet or just noise. We apply this metric jointly with a postprocessing technique, Karhunen–Loève Image Processing (KLIP), which models and subtracts the stellar PSF from the image. As a proof of concept we implemented a new routine named PlanetEvidence that integrates the nested sampling technique (MultiNest) with the KLIP algorithm. This is a first step toward recasting such postprocessing methods in a fully Bayesian perspective. We test our approach on real direct imaging data, specifically GPI data of β Pictoris b, and on synthetic data. We find that for the former the method strongly favors the presence of a planet (as expected) and recovers the true parameter posterior distributions. For the latter, our approach allows us to detect (true) dim sources invisible to the naked eye as real planets rather than background noise, and to set a new lower threshold for detection at the ∼2.5σ level. Further, it allows us to quantify our confidence that a given detection is a real planet and not just residual noise.
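The evidence-based decision rule can be sketched in a toy setting (illustrative only; the paper computes evidences with MultiNest inside the PlanetEvidence routine): compare the marginal likelihood of a "source present" model, with its amplitude integrated over a prior on a grid, against a "noise only" model, and threshold the resulting Bayes factor.

```python
import math

def log_likelihood(data, amp, noise_sd=1.0):
    """Gaussian log-likelihood of pixel values given a source amplitude."""
    return sum(-0.5 * ((y - amp) / noise_sd) ** 2
               - math.log(noise_sd * math.sqrt(2 * math.pi)) for y in data)

def log_evidence_source(data, n_grid=200, amp_max=2.0):
    """Marginal likelihood of the 'source' model under a uniform amplitude
    prior on [0, amp_max], approximated by a grid average (log-sum-exp)."""
    logs = [log_likelihood(data, amp_max * (k + 0.5) / n_grid)
            for k in range(n_grid)]
    m = max(logs)
    return m + math.log(sum(math.exp(lv - m) for lv in logs) / n_grid)

data = [0.9, 1.1, 1.0, 0.8, 1.2]   # toy detection with true amplitude ~1
# Log Bayes factor: 'source' evidence vs. the fixed noise-only model.
log_bf = log_evidence_source(data) - log_likelihood(data, 0.0)
print(log_bf)  # positive: the data favor a real source over noise
```

Because the source model pays an Occam penalty for its prior volume, its evidence is always below its best-fit likelihood; a detection is claimed only when the data overcome that penalty.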


2021 ◽  
Vol 5 (12) ◽  
pp. 276
Author(s):  
Carter Rhea ◽  
Julie Hlavacek-Larrondo ◽  
Laurie Rousseau-Nepton ◽  
Simon Prunet

Abstract LUCI is a general-purpose spectral line-fitting pipeline which natively integrates machine learning algorithms to initialize fit functions. LUCI currently uses point estimates obtained from a convolutional neural network (CNN) to inform optimization algorithms; this methodology has shown great promise, reducing computation time and the chance of falling into a local minimum when using convex optimization methods. In this update to LUCI, we expand upon the CNN developed in Rhea et al. so that it outputs Gaussian posterior distributions of the fit parameters of interest (the velocity and broadening) rather than simple point estimates. These posteriors are then used to inform the priors in a Bayesian inference scheme, either emcee or dynesty. The code is publicly available at crhea93:LUCI (https://github.com/crhea93/LUCI).
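The posterior-to-prior handoff can be sketched as follows (a minimal stand-in, not LUCI's actual implementation: a hand-rolled Metropolis sampler replaces emcee/dynesty, and the CNN output numbers are invented for illustration).

```python
import math
import random

# Hypothetical CNN output for one fit parameter (e.g., velocity):
# a Gaussian posterior summarized by its mean and standard deviation.
cnn_mean, cnn_sd = 0.0, 1.0

def log_prob(theta, obs=1.0, noise_sd=1.0):
    """Posterior log-density: the CNN Gaussian serves as the prior,
    multiplied by a Gaussian likelihood for one observed measurement."""
    log_prior = -0.5 * ((theta - cnn_mean) / cnn_sd) ** 2
    log_like = -0.5 * ((obs - theta) / noise_sd) ** 2
    return log_prior + log_like

random.seed(2)
theta, chain = 0.0, []
for _ in range(20000):                       # plain Metropolis updates
    prop = theta + random.gauss(0.0, 1.0)
    if math.log(random.random()) < log_prob(prop) - log_prob(theta):
        theta = prop
    chain.append(theta)

post_mean = sum(chain[2000:]) / len(chain[2000:])   # discard burn-in
print(post_mean)  # the conjugate answer for this toy setup is obs/2 = 0.5
```

In the real pipeline the same log_prob structure is handed to emcee or dynesty; the benefit of the CNN prior is that the sampler starts in, and is concentrated on, the right region of parameter space.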


2021 ◽  
Vol 162 (6) ◽  
pp. 262
Author(s):  
Eliab F. Canul ◽  
Héctor Velázquez ◽  
Yilen Gómez Maqueo Chew

Abstract The transit timing variations method is currently the most successful method for determining dynamical masses and orbital elements of Earth-sized transiting planets. Precise mass determination is fundamental to constraining planetary densities and thus inferring planetary compositions. In this work, we present Nauyaca, a Python package dedicated to finding planetary masses and orbital elements by fitting observed midtransit times with an N-body approach. The fitting strategy consists of performing a sequence of minimization algorithms (optimizers) that are used to identify high-probability regions in the parameter space. The optimizer results are then used to initialize a Markov chain Monte Carlo method using an adaptive parallel-tempering algorithm. A set of runs is performed in order to obtain posterior distributions of planetary masses and orbital elements. To test the tool, we created a mock catalog of synthetic planetary systems with different numbers of planets, all of which transit. We calculated their midtransit times to give as input to Nauyaca, statistically testing its efficiency in recovering the planetary parameters from the catalog. For the recovered planets, we find typical dispersions around the true values of ∼1–14 M⊕ for masses, ∼10–110 s for periods, and ∼0.01–0.03 for eccentricities. We also investigate the effects of the signal-to-noise ratio and the number of transits on the correct determination of the planetary parameters. Finally, we suggest choices of the parameters that govern the tool for use with real planets, according to the complexity of the problem and the available computational facilities.


Computation ◽  
2021 ◽  
Vol 9 (11) ◽  
pp. 119
Author(s):  
Kathrin Hellmuth ◽  
Christian Klingenberg ◽  
Qin Li ◽  
Min Tang

Chemotaxis describes the movement of an organism, such as a single- or multi-cellular organism or a bacterium, in response to a chemical stimulus. Two widely used models to describe the phenomenon are the celebrated Keller–Segel equation and a chemotaxis kinetic equation. These two equations describe the organism's movement at the macro- and mesoscopic levels, respectively, and are asymptotically equivalent in the parabolic regime. The way in which the organism responds to a chemical stimulus is encoded in the diffusion/advection coefficients of the Keller–Segel equation or in the turning kernel of the chemotaxis kinetic equation. Experiments are conducted to measure the time dynamics of the organisms' population-level movement when reacting to a given stimulus. From this, one infers the chemotaxis response, which constitutes an inverse problem. In this paper, we discuss the relation between the macroscopic and mesoscopic inverse problems, which are associated with the two different forward models. The discussion is presented in the Bayesian framework, where the posterior distribution of the turning kernel of the organism population is sought. We prove the asymptotic equivalence of the two posterior distributions.
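The Bayesian structure of such an inverse problem — posterior ∝ likelihood × prior over the unknown response parameter, given observed population dynamics — can be computed exactly on a discretized parameter grid. The sketch below uses a hypothetical scalar decay-rate forward model purely for illustration, not either of the chemotaxis models above.

```python
import math

def grid_posterior(data, grid, noise_sd=0.1):
    """Discrete Bayes rule: posterior ∝ likelihood × prior on a parameter
    grid, with a uniform prior and Gaussian measurement noise."""
    def forward(theta):
        # Hypothetical forward model: exponential decay observed at 3 times.
        return [math.exp(-theta * t) for t in (0.5, 1.0, 1.5)]
    log_post = []
    for theta in grid:
        pred = forward(theta)
        log_post.append(sum(-0.5 * ((d - p) / noise_sd) ** 2
                            for d, p in zip(data, pred)))
    m = max(log_post)
    w = [math.exp(lp - m) for lp in log_post]
    total = sum(w)
    return [x / total for x in w]             # normalized posterior weights

grid = [0.05 * k for k in range(1, 60)]       # candidate parameter values
data = [math.exp(-1.2 * t) for t in (0.5, 1.0, 1.5)]  # noise-free, theta = 1.2
post = grid_posterior(data, grid)
map_theta = grid[post.index(max(post))]
print(map_theta)  # the posterior peaks at the generating value, 1.2
```

The asymptotic-equivalence result of the paper says, roughly, that running this kind of inference through either the Keller–Segel or the kinetic forward model yields posteriors that agree in the parabolic regime.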


Author(s):  
Hongyu Shen ◽  
Eliu Huerta ◽  
Eamonn O’Shea ◽  
Prayush Kumar ◽  
Zhizhen Zhao

Abstract We introduce deep learning models to estimate the masses of the binary components of black hole mergers, (m1, m2), and three astrophysical properties of the post-merger compact remnant, namely the final spin, af, and the frequency and damping time of the ringdown oscillations of the fundamental (l = m = 2) bar mode, (ωR, ωI). Our neural networks combine a modified WaveNet architecture with contrastive learning and normalizing flows. We validate these models against a Gaussian conjugate prior family whose posterior distribution is described by a closed analytical expression. Upon confirming that our models produce statistically consistent results, we use them to estimate the astrophysical parameters (m1, m2, af, ωR, ωI) of five binary black holes: GW150914, GW170104, GW170814, GW190521 and GW190630. We use PyCBC Inference to directly compare traditional Bayesian methodologies for parameter estimation with our deep-learning-based posterior distributions. Our results show that our neural network models predict posterior distributions that encode physical correlations, and that our data-driven median results and 90% confidence intervals are similar to those produced with gravitational wave Bayesian analyses. This methodology requires a single NVIDIA V100 GPU to produce median values and posterior distributions within two milliseconds per event. This neural network, and a tutorial for its use, are available at the Data and Learning Hub for Science.


2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Kristine J. Rosenberger ◽  
Rui Duan ◽  
Yong Chen ◽  
Lifeng Lin

Abstract
Background: Network meta-analysis (NMA) is a widely used tool to compare multiple treatments by synthesizing different sources of evidence. Measures such as the surface under the cumulative ranking curve (SUCRA) and the P-score are increasingly used to quantify treatment ranking. They provide summary scores of treatments among the existing studies in an NMA. Clinicians are frequently interested in applying such evidence from the NMA to decision-making in the future. This prediction process needs to account for the heterogeneity between the existing studies in the NMA and a future study.
Methods: This article introduces the predictive P-score for informing treatment ranking in a future study via Bayesian models. Two NMAs were used to illustrate the proposed measure; the first assessed 4 treatment strategies for smoking cessation, and the second assessed treatments for all-grade treatment-related adverse events. For all treatments in both NMAs, we obtained their conventional frequentist P-scores, Bayesian P-scores, and predictive P-scores.
Results: In the two examples, the Bayesian P-scores were nearly identical to the corresponding frequentist P-scores for most treatments, while noticeable differences existed for some treatments, likely owing to the different assumptions made by the frequentist and Bayesian NMA models. Compared with the P-scores, the predictive P-scores generally tended to converge toward a common value of 0.5 because of the heterogeneity. The predictive P-scores' numerical estimates and the associated plots of posterior distributions provide an intuitive way for clinicians to appraise treatments for new patients in a future study.
Conclusions: The proposed approach adapts the existing frequentist P-score to the Bayesian framework. The predictive P-score can help inform medical decision-making in future studies.
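The P-score computation underlying all three variants can be sketched from posterior (or, for the predictive version, posterior-predictive) samples of treatment effects: each treatment's score is its average probability of beating each competitor. The sketch below uses simulated effect draws for three hypothetical treatments, not the smoking-cessation or adverse-event data.

```python
import random

def p_scores(effect_samples):
    """P-score of each treatment: the mean probability of beating each other
    treatment, estimated from joint posterior draws (higher effect = better)."""
    names = list(effect_samples)
    n = len(next(iter(effect_samples.values())))
    scores = {}
    for j in names:
        probs = []
        for k in names:
            if k == j:
                continue
            wins = sum(a > b for a, b in zip(effect_samples[j],
                                             effect_samples[k]))
            probs.append(wins / n)
        scores[j] = sum(probs) / len(probs)
    return scores

random.seed(3)
# Simulated posterior draws of treatment effects (hypothetical NMA output).
draws = {t: [random.gauss(mu, 1.0) for _ in range(5000)]
         for t, mu in {"A": 0.8, "B": 0.4, "C": 0.0}.items()}
print(p_scores(draws))
```

Feeding predictive-distribution draws (which add the between-study heterogeneity variance) through the same function widens every effect distribution, which is exactly why the predictive P-scores in the paper drift toward 0.5.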

