Performance Testing of a Large-Format X-ray Reflection Grating Prototype for a Suborbital Rocket Payload

2020
Vol 09 (04)
pp. 2050017
Author(s):
Benjamin D. Donovan
Randall L. McEntaffer
Casey T. DeRoo
James H. Tutt
Fabien Grisé
...

The soft X-ray grating spectrometer on board the Off-plane Grating Rocket Experiment (OGRE) aims to achieve the highest-resolution soft X-ray spectrum of an astrophysical object when it is launched via suborbital rocket. Paramount to the success of the spectrometer is the performance of the [Formula: see text] reflection gratings populating its reflection grating assembly. To test current grating fabrication capabilities, a grating prototype for the payload was fabricated via electron-beam lithography at The Pennsylvania State University’s Materials Research Institute and was subsequently tested for performance at the Max Planck Institute for Extraterrestrial Physics’ PANTER X-ray Test Facility. Bayesian modeling of the resulting data via Markov chain Monte Carlo (MCMC) sampling indicated that the grating achieved the OGRE single-grating resolution requirement of [Formula: see text] at the 94% confidence level. The resulting [Formula: see text] posterior probability distribution suggests, however, that this confidence level is likely a conservative estimate, since only a finite [Formula: see text] parameter space was sampled and the model could not constrain the upper bound of [Formula: see text] to less than infinity. Raytrace simulations of the tested system found that the observed data can be reproduced with a grating performing at [Formula: see text]. It is therefore postulated that the behavior of the obtained [Formula: see text] posterior probability distribution is explained by a finite measurement limit of the test system rather than a finite limit on [Formula: see text]. Implications of these results and improvements to the test setup are discussed.
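As a hedged illustration of how such a confidence level can be read off an MCMC run (this is not the authors' analysis pipeline), the sketch below takes stand-in posterior samples of the resolution and computes the fraction of posterior mass above a placeholder requirement; a heavy, unconstrained upper tail is precisely what makes such a figure conservative.

```python
import numpy as np

# Stand-in posterior samples of the resolution R; the lognormal shape and the
# requirement value are placeholders, not values from the paper.
rng = np.random.default_rng(0)
R_samples = rng.lognormal(mean=0.5, sigma=0.6, size=100_000)
R_requirement = 1.0

# The quoted confidence level is the fraction of posterior mass above the requirement.
p_meets = np.mean(R_samples > R_requirement)
print(f"P(R > requirement) = {p_meets:.2f}")

# If the data cannot bound R from above, the posterior keeps a long upper tail,
# so this fraction understates how well the grating may actually perform.
```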

Author(s):
Munir S Pathan
S M Pradhan
T Palani Selvam

Abstract In this study, a Bayesian probabilistic approach is applied to estimate the actual dose from the personnel monitoring dose records of occupational workers. To implement the Bayesian approach, the probability distribution of the uncertainty in the reported dose as a function of the actual dose is derived. Using this uncertainty distribution of the reported dose and prior knowledge of the dose levels generally observed in a monitoring period, the posterior probability distribution of the actual dose is estimated. The posterior distributions for each monitoring period in a year are then convolved to arrive at the annual actual-dose distribution. The estimated actual dose distributions show a significant deviation from the reported annual doses, particularly at low annual doses.
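A minimal sketch of the convolution step described above, with placeholder per-period posteriors (Gaussians around hypothetical reported doses) standing in for the study's derived uncertainty distribution:

```python
import numpy as np

# Each monitoring period yields a posterior over the actual dose on a common
# grid; the annual actual-dose distribution is the distribution of their sum,
# obtained by convolving the per-period posteriors.
dose_grid = np.linspace(0.0, 5.0, 501)            # mSv, placeholder dose grid
h = dose_grid[1] - dose_grid[0]

def period_posterior(reported_dose, sigma=0.15):
    """Stand-in per-period posterior: a Gaussian around the reported dose."""
    p = np.exp(-0.5 * ((dose_grid - reported_dose) / sigma) ** 2)
    return p / (p.sum() * h)                      # normalise as a density on the grid

annual = None
for reported in [0.20, 0.10, 0.30, 0.15]:         # hypothetical quarterly reported doses (mSv)
    p = period_posterior(reported)
    annual = p if annual is None else np.convolve(annual, p) * h

support = np.arange(annual.size) * h              # dose values spanned by the summed posterior
mean_annual = np.sum(support * annual) * h
print(f"posterior mean annual dose: {mean_annual:.2f} mSv")
```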


2013
Vol 807-809
pp. 1570-1574
Author(s):
Hai Dong Yang
Dong Guo Shao
Bi Yu Liu

This paper considers pollution point-source identification for non-shore emissions, the main form of sudden water pollution incident. First, source traceability for sudden water pollution accidents is posed as a Bayesian estimation problem; second, the posterior probability distribution of the source parameters is derived; third, the marginal posterior probability density is obtained using a new traceability method; finally, the proposed method is compared with Bayesian-MCMC in numerical experiments. The conclusions are as follows: the new traceability method reduces the number of iterations, improves recognition accuracy, and noticeably reduces the overall average error; it is more stable and robust than Bayesian-MCMC and can effectively identify the sources of sudden water pollution accidents. It therefore provides a new approach to the difficult traceability problem in sudden water pollution accidents.
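For context, here is a hedged sketch of the kind of Bayesian-MCMC baseline the new traceability method is compared against (this is not the authors' method; the 1D advection-dispersion forward model, station geometry, and parameter values are all hypothetical): an instantaneous point release in a river is inverted for released mass and source location with a random-walk Metropolis sampler.

```python
import numpy as np

rng = np.random.default_rng(1)
u, D = 0.5, 10.0          # assumed-known flow velocity (m/s) and dispersion (m^2/s)

def concentration(M, x0, x, t):
    """1D instantaneous point-source solution (unit cross-section), release at x0, t = 0."""
    return M / np.sqrt(4 * np.pi * D * t) * np.exp(-((x - x0 - u * t) ** 2) / (4 * D * t))

# Synthetic observations at one downstream station (all values hypothetical).
x_obs = 2000.0
t_obs = np.array([3000.0, 3600.0, 4200.0, 4800.0])       # s after the release
true_M, true_x0, sigma = 50.0, 0.0, 0.005                 # kg, m, measurement noise
c_obs = concentration(true_M, true_x0, x_obs, t_obs) + rng.normal(0, sigma, t_obs.size)

def log_posterior(theta):
    M, x0 = theta
    if M <= 0 or not (-1000.0 <= x0 <= x_obs):             # flat prior on a plausible box
        return -np.inf
    resid = c_obs - concentration(M, x0, x_obs, t_obs)
    return -0.5 * np.sum((resid / sigma) ** 2)

# Random-walk Metropolis over (released mass M, source location x0).
theta = np.array([30.0, 500.0])
lp = log_posterior(theta)
samples = []
for _ in range(20_000):
    proposal = theta + rng.normal(0, [2.0, 50.0])
    lp_prop = log_posterior(proposal)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = proposal, lp_prop
    samples.append(theta.copy())
samples = np.array(samples)[5_000:]                        # discard burn-in
print("posterior mean (M, x0):", samples.mean(axis=0))
```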


2014
Vol 10 (S306)
pp. 273-275
Author(s):
Pedro T. P. Viana

Abstract Observational data on clusters of galaxies holds relevant information that can be used to determine the relative plausibility of different models for the large-scale evolution of the Universe, or to estimate the joint posterior probability distribution function of the parameters that pertain to each model. Within the next few years, several surveys of the sky will yield large galaxy cluster catalogues. In order to make use of the vast amount of information they will contain, their selection functions will have to be properly understood. We argue that this, as well as the estimation of the full joint posterior probability distribution function of the most relevant cluster properties, can best be achieved in the framework of Bayesian statistics.


2020
Author(s):
Xin Zhang
Andrew Curtis

In a variety of geoscientific applications we require maps of subsurface properties together with the corresponding maps of uncertainties to assess their reliability. Seismic tomography is a method that is widely used to generate those maps. Since tomography is significantly nonlinear, Monte Carlo sampling methods are often used for this purpose, but they are generally computationally intractable for large data sets and high-dimensional parameter spaces. To extend uncertainty analysis to larger systems, we introduce variational inference methods to conduct seismic tomography. In contrast to Monte Carlo sampling, variational methods solve the Bayesian inference problem as an optimization problem yet still provide fully nonlinear, probabilistic results. This is achieved by minimizing the Kullback-Leibler (KL) divergence between approximate and target probability distributions within a predefined family of probability distributions.

We introduce two variational inference methods: automatic differentiation variational inference (ADVI) and Stein variational gradient descent (SVGD). In ADVI a Gaussian probability distribution is assumed and optimized to approximate the posterior probability distribution. In SVGD a smooth transform is iteratively applied to an initial probability distribution to obtain an approximation to the posterior probability distribution. At each iteration the transform is determined by seeking the steepest descent direction that minimizes the KL-divergence.

We apply the two variational inference methods to 2D travel time tomography using both synthetic and real data, and compare the results to those obtained from two different Monte Carlo sampling methods: Metropolis-Hastings Markov chain Monte Carlo (MH-McMC) and reversible jump Markov chain Monte Carlo (rj-McMC). The results show that ADVI provides a biased approximation because of its Gaussian assumption, whereas SVGD produces more accurate approximations to the results of MH-McMC. In comparison, rj-McMC produces smoother mean velocity models and lower standard deviations because the parameterization used in rj-McMC (Voronoi cells) imposes prior restrictions on the pixelated form of models: all pixels within each Voronoi cell have identical velocities. This suggests that the results of rj-McMC need to be interpreted in the light of the specific prior information imposed by the parameterization. Both variational methods estimate the posterior distribution at significantly lower computational cost, provided that gradients of parameters with respect to data can be calculated efficiently. We therefore expect that the methods can be applied fruitfully to many other types of geophysical inverse problems.
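A minimal, self-contained sketch of the SVGD update described above, applied to a toy 2D Gaussian target rather than a tomographic posterior (the target distribution, kernel bandwidth heuristic, step size, and particle count are illustrative choices, not values from the study):

```python
import numpy as np

# SVGD: a particle cloud is iteratively pushed along the steepest-descent
# direction of the KL divergence within an RBF-kernel Stein space.
rng = np.random.default_rng(2)
target_mean = np.array([1.0, -1.0])
target_cov_inv = np.linalg.inv(np.array([[1.0, 0.6], [0.6, 2.0]]))

def grad_log_p(x):                       # gradient of the log target density, shape (n, d)
    return -(x - target_mean) @ target_cov_inv

def svgd_step(x, step=0.1):
    n = x.shape[0]
    sq_dists = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    h = np.median(sq_dists) / np.log(n + 1) + 1e-8       # median-heuristic bandwidth variant
    k = np.exp(-sq_dists / (2 * h))                      # RBF kernel matrix, shape (n, n)
    grad_k = -(x[:, None, :] - x[None, :, :]) / h * k[:, :, None]  # d k(x_j, x_i)/d x_j
    phi = (k @ grad_log_p(x) + grad_k.sum(axis=0)) / n   # kernelized Stein direction
    return x + step * phi

particles = rng.normal(size=(200, 2))    # initial particle cloud
for _ in range(500):
    particles = svgd_step(particles)
print("particle mean:", particles.mean(axis=0))          # should approach target_mean
```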


2020
Vol 76 (3)
pp. 238-247
Author(s):
Randy J. Read
Robert D. Oeffner
Airlie J. McCoy

The information gained by making a measurement, termed the Kullback–Leibler divergence, assesses how much more precisely the true quantity is known after the measurement was made (the posterior probability distribution) than before (the prior probability distribution). It provides an upper bound for the contribution that an observation can make to the total likelihood score in likelihood-based crystallographic algorithms. This makes information gain a natural criterion for deciding which data can legitimately be omitted from likelihood calculations. Many existing methods use an approximation for the effects of measurement error that breaks down for very weak and poorly measured data. For such methods a different (higher) information threshold is appropriate compared with methods that account well for even large measurement errors. Concerns are raised about a current trend to deposit data that have been corrected for anisotropy, sharpened and pruned without including the original unaltered measurements. If not checked, this trend will have serious consequences for the reuse of deposited data by those who hope to repeat calculations using improved new methods.
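As a hedged illustration with made-up numbers (the threshold below is a placeholder, not the paper's recommended cutoff), the information gain has a closed form when both prior and posterior are univariate Gaussians, which makes the "keep or omit" decision for a single observation easy to sketch:

```python
import numpy as np

def kl_gaussian(mu_post, sigma_post, mu_prior, sigma_prior):
    """KL( N(mu_post, sigma_post^2) || N(mu_prior, sigma_prior^2) ) in nats."""
    return (np.log(sigma_prior / sigma_post)
            + (sigma_post ** 2 + (mu_post - mu_prior) ** 2) / (2 * sigma_prior ** 2)
            - 0.5)

# A precise measurement narrows the posterior a lot -> large information gain;
# a very weak measurement leaves it close to the prior -> gain near zero.
prior_mu, prior_sigma = 0.0, 1.0
strong = kl_gaussian(0.4, 0.2, prior_mu, prior_sigma)
weak = kl_gaussian(0.02, 0.98, prior_mu, prior_sigma)
nats_to_bits = 1 / np.log(2)
print(f"strong measurement: {strong * nats_to_bits:.2f} bits, weak: {weak * nats_to_bits:.4f} bits")

threshold_bits = 0.01                 # placeholder cutoff for omitting data
print("keep weak observation?", weak * nats_to_bits > threshold_bits)
```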


2019
Vol 485 (3)
pp. 4343-4358
Author(s):
Germán Chaparro-Molano
Juan Carlos Cuervo
Oscar Alberto Restrepo Gaitán
Sergio Torres Arzayús

ABSTRACT We propose the use of robust, Bayesian methods for estimating extragalactic distance errors in multimeasurement catalogues. We seek to improve upon the more commonly used frequentist propagation-of-error methods, as they fail to account for both the scatter between different measurements and the effects of skewness in the metric distance probability distribution. For individual galaxies, the most transparent way to assess the variance of redshift-independent distances is to directly sample the posterior probability distribution obtained from the mixture of reported measurements. However, sampling the posterior can be cumbersome for catalogue-wide precision cosmology applications. We compare the performance of frequentist methods versus our proposed measures for estimating the true variance of the metric distance probability distribution. We provide pre-computed distance error data tables for galaxies in three catalogues: NED-D, HyperLEDA, and Cosmicflows-3. Additionally, we develop a Bayesian model that considers systematic and random effects in the estimation of errors for Tully–Fisher (TF) relation derived distances in NED-D. We validate this model with a Bayesian p-value computed using the Freeman–Tukey discrepancy measure as a posterior predictive check. We are then able to predict distance errors for 884 galaxies in the NED-D catalogue and 203 galaxies in the HyperLEDA catalogue that do not report TF distance modulus errors. Our goal is for our estimated and predicted errors to be used in catalogue-wide applications that require acknowledging the true variance of extragalactic distance measurements.
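A hedged sketch of the direct-sampling idea for a single galaxy (placeholder measurements, equal mixture weights, and not the catalogue pipeline itself): draw distance moduli from the mixture of reported measurements, convert them to metric distances, and summarize the resulting skewed distribution.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
moduli = np.array([31.10, 31.35, 30.95])        # hypothetical reported distance moduli (mag)
errors = np.array([0.20, 0.40, 0.15])           # hypothetical reported uncertainties (mag)

n = 200_000
component = rng.integers(0, moduli.size, size=n)           # equal-weight mixture of measurements
mu_samples = rng.normal(moduli[component], errors[component])
d_samples = 10 ** ((mu_samples - 25.0) / 5.0)              # metric distance in Mpc

print(f"median distance: {np.median(d_samples):.1f} Mpc")
print(f"std dev:         {np.std(d_samples):.1f} Mpc")
print(f"skewness:        {stats.skew(d_samples):.2f}")     # positive: lognormal-like tail
```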


Author(s):
Luis D. Couto
Dong Zhang
Antti Aitio
Scott Moura
David Howey

Abstract This paper addresses the parameter estimation problem for lithium-ion battery pack models comprising cells in series. The estimated parameters can be exploited in fault diagnostics to estimate the number of cells exhibiting abnormal behaviour, e.g. large resistances or small capacities. In particular, we use a Bayesian approach to estimate the parameters of a two-cell arrangement modelled using equivalent circuits. Although our modeling framework has been extensively reported in the literature, its structural identifiability properties have not yet been reported, to the best of the authors’ knowledge. Moreover, most contributions in the literature tackle the estimation problem through point estimates under a Gaussian noise assumption, using e.g. least-squares methods (maximum likelihood estimation) or Kalman filters (maximum a posteriori estimation). In contrast, we apply methods that are suitable for nonlinear and non-Gaussian estimation problems and estimate the full posterior probability distribution of the parameters. We study how the model structure, the available measurements and prior knowledge of the model parameters affect the posterior probability distribution that is recovered for the parameters. For two cells in series, a bimodal distribution is obtained whose modes are centered around the real values of the parameters for each cell. Bounds on the model parameters of a battery pack can therefore be derived.
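To illustrate why the exchangeability of the two cells produces a bimodal posterior, here is a toy sketch with a sum-of-exponentials relaxation model and made-up numbers (not the paper's equivalent-circuit model or data): because the two per-cell time constants enter the pack voltage symmetrically, the posterior has one mode for each labelling of the cells.

```python
import numpy as np

rng = np.random.default_rng(4)
t = np.linspace(0.0, 300.0, 60)                        # s
true_tau = (30.0, 120.0)                               # s, hypothetical per-cell time constants
amp, sigma = 0.05, 0.001                               # V, relaxation amplitude and noise level
v = amp * (np.exp(-t / true_tau[0]) + np.exp(-t / true_tau[1])) \
    + rng.normal(0, sigma, t.size)

def log_posterior(tau1, tau2):
    """Gaussian log-likelihood plus a flat prior over the grid."""
    model = amp * (np.exp(-t / tau1) + np.exp(-t / tau2))
    return -0.5 * np.sum(((v - model) / sigma) ** 2)

# Evaluate on a grid: cheap in 2D, and it makes both posterior modes visible directly.
taus = np.linspace(5.0, 300.0, 200)
logp = np.array([[log_posterior(a, b) for b in taus] for a in taus])
post = np.exp(logp - logp.max())

i, j = np.unravel_index(post.argmax(), post.shape)
print(f"one mode near (tau1, tau2) = ({taus[i]:.0f}, {taus[j]:.0f}) s")
print(f"mirror-mode weight ratio:   {post[j, i] / post[i, j]:.2f}")   # ~1 by symmetry
```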


2021
Vol 21 (1)
Author(s):
Kazuhiro Watanabe
Norito Kawakami

Abstract Background Although sedentary behavior is associated with the onset of major depressive disorder, it remains unclear whether sedentary behavior at work increases the risk of depression. The present study used a Bayesian approach to investigate the association between sitting time at work and the onset of major depressive episode (MDE). Methods A 1-year prospective cohort study was conducted among 233 Japanese workers without MDE (response rate: 4.3%). MDE onset was assessed using the self-reported WHO Composite International Diagnostic Interview version 3.0. A Bayesian Cox proportional hazards model was used to estimate the hazard ratio (HR) of MDE onset for long sitting time at work. Results A total of 231 workers were included in the analysis. During the follow-up, 1621 person-months were observed, and six participants experienced MDE onset. Incidence rates per month were 0.34%, 0.11%, and 1.02% for short (< 7.2 h per day), medium (7.2–9.5 h), and long (9.5+ h) sitting time at work, respectively. The estimated median of the posterior probability distribution of the HR for long sitting time was 3.00 (95% highest density interval [HDI]: 0.73–12.03). The estimated median remained above 1 after adjustment for physical activity level and other covariates (HR = 2.11, 95% HDI: 0.42–10.22). The base-10 Bayes factor for H1 (HR ≠ 1.00) compared with the null hypothesis (H0: HR = 1.00) was 0.68 in the adjusted model. The analysis that treated sitting time at work as a continuous variable estimated that the median of the posterior probability distribution of the HR of sitting time was 0.79 (95% HDI: 0.58–1.07). The base-10 Bayes factor was 2.73 for the linear association. Conclusions Long sitting time at work (9.5+ h per day) might be associated with MDE onset among workers. However, the linear association indicated conflicting results. Non-linear associations between sitting time and MDE onset might explain this inconsistency. The evidence for an adverse association between sitting time at work and MDE onset remains inconclusive.
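A hedged sketch of how the summaries reported above (posterior median HR and 95% highest density interval) are computed from posterior draws; the samples below are simulated stand-ins, not draws from the study's Bayesian Cox model.

```python
import numpy as np

rng = np.random.default_rng(5)
log_hr_samples = rng.normal(loc=np.log(2.1), scale=0.75, size=50_000)  # placeholder draws
hr_samples = np.exp(log_hr_samples)

def hdi(samples, mass=0.95):
    """Narrowest interval containing `mass` of the samples."""
    x = np.sort(samples)
    n_in = int(np.ceil(mass * x.size))
    widths = x[n_in - 1:] - x[:x.size - n_in + 1]
    start = widths.argmin()
    return x[start], x[start + n_in - 1]

lo, hi = hdi(hr_samples)
print(f"median HR = {np.median(hr_samples):.2f}, 95% HDI: [{lo:.2f}, {hi:.2f}]")
print("P(HR > 1) =", np.mean(hr_samples > 1).round(2))
```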

