Analysis of COSIMA spectra: Bayesian approach

Author(s):  
H. J. Lehto ◽  
B. Zaprudin ◽  
K. M. Lehto ◽  
T. Lönnberg ◽  
J. Silén ◽  
...  

Abstract. We describe the use of Bayesian analysis methods applied to TOF-SIMS spectra. The method finds the probability density functions of measured line parameters (number of lines and their widths, peak amplitudes, integrated amplitudes and positions) in mass intervals over the whole spectrum. We discuss the results that can be expected from this analysis, as well as the effects that the instrument dead time causes in the COSIMA TOF-SIMS, which we address in a new way. The derived line parameters can be used to further calibrate the mass scaling of the TOF-SIMS and to feed the results into other analysis methods, such as multivariate analyses of spectra. We intend to use the method in two ways: first, as a comprehensive tool to perform quantitative analysis of spectra, and second, as a fast tool for identifying interesting targets for additional TOF-SIMS measurements of the sample, a capability unique to COSIMA. Finally, we point out that the Bayesian method can be thought of as a means of solving inverse problems with forward calculations only.

2015 ◽  
Vol 4 (1) ◽  
pp. 139-148
Author(s):  
H. J. Lehto ◽  
B. Zaprudin ◽  
K. M. Lehto ◽  
T. Lönnberg ◽  
J. Silén ◽  
...  

Abstract. We describe the use of Bayesian analysis methods applied to time-of-flight secondary ion mass spectrometer (TOF-SIMS) spectra. The method is applied to the COmetary Secondary Ion Mass Analyzer (COSIMA) TOF-SIMS mass spectra, where the analysis can be broken into subgroups of lines close to integer mass values. The effects of the instrumental dead time are discussed in a new way. The method finds the joint probability density functions of measured line parameters (number of lines and their widths, peak amplitudes, integrated amplitudes and positions). In the case of two or more lines, these distributions can take complex forms. The derived line parameters can be used to further calibrate the mass scaling of the TOF-SIMS and to feed the results into other analysis methods, such as multivariate analyses of spectra. We intend to use the method first as a comprehensive tool to perform quantitative analysis of spectra, and second as a fast tool for identifying interesting targets for additional TOF-SIMS measurements of the sample, a capability unique to COSIMA. Finally, we point out that the Bayesian method can be thought of as a means of solving inverse problems with forward calculations only, with no iterative corrections or other manipulation of the observed data.
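
As an illustration of the forward-only Bayesian idea described above, the following Python sketch (not the authors' COSIMA code) fits a single Gaussian line plus a flat background to a simulated counts segment from one mass interval, using a Poisson likelihood and a plain Metropolis random walk. The channel grid, simulated spectrum, prior bounds and step sizes are all hypothetical choices made for the illustration.

```python
# Minimal sketch: Bayesian fit of one mass peak (Gaussian line + flat background)
# with a Poisson likelihood, sampled by a plain Metropolis random walk.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(-0.5, 0.5, 200)                      # channel positions within one mass unit
true = dict(amp=120.0, pos=0.05, width=0.04, bkg=2.0)
lam = true["bkg"] + true["amp"] * np.exp(-0.5 * ((t - true["pos"]) / true["width"]) ** 2)
counts = rng.poisson(lam)                            # simulated spectrum segment

def log_posterior(p):
    amp, pos, width, bkg = p
    if amp <= 0 or width <= 0 or bkg <= 0 or abs(pos) > 0.5:
        return -np.inf                               # flat priors with physical bounds
    model = bkg + amp * np.exp(-0.5 * ((t - pos) / width) ** 2)
    return np.sum(counts * np.log(model) - model)    # Poisson log-likelihood (up to a constant)

p = np.array([100.0, 0.0, 0.05, 1.0])                # starting point
step = np.array([5.0, 0.005, 0.005, 0.2])            # random-walk step sizes
samples, lp = [], log_posterior(p)
for _ in range(20000):                               # Metropolis updates (forward model only)
    q = p + step * rng.standard_normal(4)
    lq = log_posterior(q)
    if np.log(rng.random()) < lq - lp:
        p, lp = q, lq
    samples.append(p.copy())
samples = np.array(samples[5000:])                   # discard burn-in
print("posterior mean/std of peak position:", samples[:, 1].mean(), samples[:, 1].std())
```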


2019 ◽  
Vol 285 ◽  
pp. 00013
Author(s):  
Adrian Pawełek ◽  
Piotr Lichota

This article presents a method for analysing selected aspects of past arrival traffic by modelling the distributions of time separations between arriving aircraft at a chosen navigation point of the Terminal Manoeuvring Area with continuous probability distributions. Modelling the distribution of arriving aircraft time separations with continuous probability density functions makes it possible to apply various mathematical tools to the analysis of separation distributions. Moreover, by comparing distribution parameters, a quantitative analysis of separations for days with different arrival traffic intensities can be performed. The assumptions, the mathematical model, its application in an exemplary experimental scenario with an airport and days of low and high traffic intensity, and the results are presented in this article. Real air traffic data were used for the experimental scenario. The outcomes show that the method can be used for air traffic post-analysis, e.g., assessment of whether separation was maintained.
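
The sketch below illustrates the general idea under stated assumptions: synthetic separation samples for a low- and a high-intensity day are fitted with a continuous PDF (here a gamma distribution, one plausible choice), and the fitted parameters are compared. It is not the authors' model, and the samples are placeholders rather than real air traffic data.

```python
# Illustrative sketch: fit a continuous PDF (gamma) to arrival time separations
# at a TMA fix and compare two traffic-intensity days by the fitted parameters.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
sep_low = rng.gamma(shape=4.0, scale=60.0, size=300)    # low-intensity day [s], synthetic
sep_high = rng.gamma(shape=6.0, scale=25.0, size=300)   # high-intensity day [s], synthetic

for label, sep in [("low traffic", sep_low), ("high traffic", sep_high)]:
    a, loc, scale = stats.gamma.fit(sep, floc=0.0)      # maximum-likelihood fit, location fixed at 0
    mean_sep = a * scale                                 # mean of the fitted distribution
    p_below_60 = stats.gamma.cdf(60.0, a, loc=0.0, scale=scale)
    print(f"{label}: shape={a:.2f}, scale={scale:.1f} s, "
          f"mean={mean_sep:.1f} s, P(sep < 60 s)={p_below_60:.3f}")
```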


Geophysics ◽  
2003 ◽  
Vol 68 (6) ◽  
pp. 2000-2009 ◽  
Author(s):  
Arild Buland ◽  
Henning Omre

A Bayesian method for wavelet estimation from seismic and well data is developed. The method works both on stacked data and on prestack data in the form of angle gathers. The seismic forward model is based on the convolutional model, where the reflectivity is calculated from the well logs. Possible misties between the seismic traveltimes and the time axis of the well logs, errors in the log measurements, and seismic noise are included in the model. The estimated wavelets are given as probability density functions, so that the uncertainties of the wavelets are an integral part of the solution. The solution is not analytically obtainable and is therefore computed by Markov chain Monte Carlo simulation. An example from the Sleipner field shows that the estimated wavelet has a higher amplitude, and a lower uncertainty, than a wavelet estimated with the well-log errors neglected.
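
The following sketch shows a much-simplified linear-Gaussian version of the convolutional set-up, not the Buland and Omre algorithm: with d = Gw + e, a Gaussian prior on the wavelet w and Gaussian noise, the wavelet posterior is itself Gaussian, so its mean and covariance express the wavelet and its uncertainty. The reflectivity and trace are synthetic placeholders; the full method additionally models misties and well-log errors, which is why it requires MCMC.

```python
# Simplified sketch: Gaussian posterior for the wavelet in d = G w + e,
# with G built from a (synthetic) reflectivity series derived from well logs.
import numpy as np

rng = np.random.default_rng(2)
n, m = 300, 31                                            # trace length, wavelet length
r = rng.standard_normal(n) * (rng.random(n) < 0.1)        # sparse synthetic reflectivity
tt = np.arange(m) - m // 2
w_true = (1 - 2 * (np.pi * 0.12 * tt) ** 2) * np.exp(-(np.pi * 0.12 * tt) ** 2)  # Ricker-like wavelet

G = np.zeros((n, m))                                      # convolution matrix, column k = shifted reflectivity
for k in range(m):
    shift = k - m // 2
    if shift >= 0:
        G[shift:, k] = r[:n - shift]
    else:
        G[:n + shift, k] = r[-shift:]

sigma_e, sigma_w = 0.05, 1.0                              # noise std, prior std on wavelet samples
d = G @ w_true + sigma_e * rng.standard_normal(n)         # synthetic seismic trace

A = G.T @ G / sigma_e**2 + np.eye(m) / sigma_w**2         # posterior precision
cov_post = np.linalg.inv(A)                               # posterior covariance (wavelet uncertainty)
mean_post = cov_post @ (G.T @ d) / sigma_e**2             # posterior mean (wavelet estimate)
print("max |posterior mean - true wavelet|:", np.abs(mean_post - w_true).max())
print("posterior std at wavelet centre:", np.sqrt(cov_post[m // 2, m // 2]))
```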


2015 ◽  
Vol 137 (4) ◽  
Author(s):  
Dong Wang ◽  
Qiang Miao

In our previous work, a general particle-filter-based Bayesian method was proposed to derive the graphical relationship between wavelet parameters, namely the center frequency and bandwidth, and to find a posteriori the optimal wavelet parameters for extracting bearing fault features. In this work, some improvements on the previous Bayesian method are proposed. First, the previous method depended strongly on an initial uniform distribution to generate random particles, where each random particle represents a candidate set of wavelet parameters. Once the random particles had been drawn, the previous method could not generate new ones. To solve this problem, this paper introduces a Gaussian random walk into the joint posterior probability density functions of the wavelet parameters, so that new random particles can be generated and the optimization of the wavelet parameters improved. Moreover, the Gaussian random walk is automatically initialized by the well-known fast kurtogram. Second, the previous work used the random particles generated from the initial uniform distribution to generate the measurements; because those particles were fixed, the measurements were also fixed. To solve this problem, the first measurement used in this paper is provided by the fast kurtogram, and its linear extrapolations are used to generate monotonically increasing measurements, which further improve the optimization of the wavelet parameters. Finally, because the Gaussian random walk can generate new random particles from the joint posterior probability density functions of the wavelet parameters, the number of random particles need not be set to the high value used in the previous work. Two case studies were investigated to illustrate how the Gaussian-random-walk-based Bayesian method works, and comparisons with the fast kurtogram demonstrate that the proposed method can better extract bearing fault features.
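
To make the role of the Gaussian random walk concrete, the sketch below (an illustration only, not the authors' implementation) treats each particle as a candidate (centre frequency, bandwidth) pair, weights it by the kurtosis of the envelope of the corresponding band-filtered signal, and after resampling perturbs the particles with a Gaussian random walk so that new candidates keep being generated. The test signal, the Gaussian band-pass filter and the fitness choice are all assumptions made for the example.

```python
# Sketch: particle search over (centre frequency, bandwidth) with resampling
# followed by a Gaussian random walk that keeps generating new particles.
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(3)
fs, n = 12000.0, 8192
t = np.arange(n) / fs
sig = 0.3 * rng.standard_normal(n)
for t0 in np.arange(0.01, n / fs, 1 / 97.0):             # synthetic 97 Hz fault impulse train
    idx = t > t0
    sig[idx] += np.exp(-800.0 * (t[idx] - t0)) * np.sin(2 * np.pi * 3000.0 * (t[idx] - t0))

freqs = np.fft.rfftfreq(n, 1 / fs)
spec = np.fft.rfft(sig)

def envelope_kurtosis(fc, bw):
    """Gaussian band-pass around fc, then kurtosis of the analytic envelope."""
    h = np.exp(-0.5 * ((freqs - fc) / bw) ** 2)
    band = np.fft.irfft(spec * h, n)
    env = np.abs(hilbert(band))
    z = (env - env.mean()) / env.std()
    return np.mean(z ** 4)

n_p = 60
particles = np.column_stack([rng.uniform(500, 5500, n_p),     # centre frequency [Hz]
                             rng.uniform(100, 1000, n_p)])    # bandwidth [Hz]
for _ in range(15):
    w = np.array([envelope_kurtosis(fc, bw) for fc, bw in particles])
    w = np.maximum(w - w.min(), 1e-12)
    w /= w.sum()
    idx = rng.choice(n_p, size=n_p, p=w)                      # resample by weight
    particles = particles[idx] + rng.standard_normal((n_p, 2)) * [80.0, 30.0]  # Gaussian random walk
    particles[:, 0] = np.clip(particles[:, 0], 200, 5800)
    particles[:, 1] = np.clip(particles[:, 1], 50, 1500)

best = particles[np.argmax([envelope_kurtosis(fc, bw) for fc, bw in particles])]
print(f"selected centre frequency ~ {best[0]:.0f} Hz, bandwidth ~ {best[1]:.0f} Hz")
```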


Author(s):  
John J. Friel

Committee E-04 on Metallography of the American Society for Testing and Materials (ASTM) conducted an interlaboratory round-robin test program on quantitative energy dispersive spectroscopy (EDS). The test program was designed to produce data on which to base a precision and bias statement for quantitative analysis by EDS. Nine laboratories were sent specimens of two well-characterized materials: a type 308 stainless steel and a complex mechanical alloy from Inco Alloys International, Inconel® MA 6000. The stainless steel was chosen as an example of a straightforward analysis with no special problems. The mechanical alloy was selected because elements were present in a wide range of concentrations; K, L, and M lines were involved; and Ta was severely overlapped with W. The test aimed to establish limits of precision that could be routinely achieved by capable laboratories operating under real-world conditions. The participants were first allowed to use their own best procedures, but later were instructed to repeat the analysis using specified conditions: 20 kV accelerating voltage, 200 s live time, ∼25% dead time and ∼40° takeoff angle. They were also asked to run a standardless analysis.


2021 ◽  
Vol 13 (12) ◽  
pp. 2307
Author(s):  
J. Javier Gorgoso-Varela ◽  
Rafael Alonso Ponce ◽  
Francisco Rodríguez-Puerta

The diameter distributions of trees in 50 temporary sample plots (TSPs) established in Pinus halepensis Mill. stands were recovered from LiDAR metrics by using six probability density functions (PDFs): the Weibull (2P and 3P), Johnson's SB, beta, generalized beta and gamma-2P functions. The parameters were recovered from the first and second moments of the distributions (mean and variance, respectively) by using parameter recovery models (PRM). Linear models were used to predict both moments from LiDAR data. In recovering the functions, the location parameters of the distributions were predetermined as the minimum diameter inventoried, and the scale parameters were established as the maximum diameters predicted from LiDAR metrics. The Kolmogorov–Smirnov (KS) statistic (Dn), the number of acceptances by the KS test, the Cramér–von Mises (W²) statistic, the bias and the mean square error (MSE) were used to evaluate the goodness of fit. The fits for the six recovered functions were compared with the fits to all measured data from 58 TSPs (LiDAR metrics could only be extracted from 50 of the plots). In the fitting phase, the location parameters were fixed at a suitable value determined according to the forestry literature (0.75·dmin). The linear models used to recover the two moments of the distributions and the maximum diameters determined from LiDAR data were accurate, with R² values of 0.750, 0.724 and 0.873 for dg, dmed and dmax, respectively. Reasonable results were obtained with all six recovered functions. The goodness-of-fit statistics indicated that the beta function was the most accurate, followed by the generalized beta function. The Weibull-3P function provided the poorest fits, and the Weibull-2P and Johnson's SB functions also yielded poor fits to the data.
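
A minimal sketch of the moment-based recovery step for one of the PDFs (not the paper's PRM code): given a stand mean diameter and variance predicted from LiDAR metrics, the Weibull-2P shape parameter is solved from the coefficient of variation, and the scale parameter then follows from the mean. The numerical mean and variance used below are hypothetical example values.

```python
# Sketch: recover Weibull-2P parameters whose first two moments match a
# predicted mean diameter and variance (moment-based parameter recovery).
import numpy as np
from math import gamma
from scipy.optimize import brentq

def recover_weibull_2p(mean_d, var_d):
    """Return (shape k, scale lam) whose mean and variance equal mean_d, var_d."""
    cv2 = var_d / mean_d**2                         # squared coefficient of variation
    f = lambda k: gamma(1 + 2 / k) / gamma(1 + 1 / k) ** 2 - 1 - cv2
    k = brentq(f, 0.2, 50.0)                        # cv2 decreases monotonically with k on this range
    lam = mean_d / gamma(1 + 1 / k)                 # scale from the mean, given the shape
    return k, lam

k, lam = recover_weibull_2p(mean_d=18.5, var_d=28.0)   # cm and cm^2, illustrative values
print(f"recovered Weibull-2P: shape k = {k:.3f}, scale = {lam:.2f} cm")
```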


2021 ◽  
Vol 502 (2) ◽  
pp. 1768-1784
Author(s):  
Yue Hu ◽  
A Lazarian

ABSTRACT The velocity gradients technique (VGT) and the probability density functions (PDFs) of mass density are tools for studying turbulence, magnetic fields, and self-gravity in molecular clouds. However, self-absorption can make the observed intensity significantly different from the column density structure. In this work, we study the effects of self-absorption on the VGT and the intensity PDFs using three synthetic emission lines of the CO isotopologues 12CO (1–0), 13CO (1–0), and C18O (1–0). We confirm that the performance of the VGT is insensitive to the radiative transfer effect. We numerically demonstrate the possibility of constructing 3D magnetic field tomography with the VGT. We find that, for supersonic turbulence, the intensity PDFs change their shape from a pure lognormal to a distribution that exhibits a power-law tail, depending on the optical depth. We conclude that the change in the CO isotopologues' intensity PDFs can be independent of self-gravity, which makes the intensity PDFs less reliable for identifying gravitationally collapsing regions. We compute the intensity PDFs for the star-forming region NGC 1333 and find that the observed change in the intensity PDFs agrees with our numerical results. The synergy of the VGT and the column density PDFs confirms that self-gravitating gas occupies a large volume in NGC 1333.
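
The sketch below illustrates, on a synthetic intensity map rather than CO data, how an intensity PDF is built in s = ln(I/⟨I⟩) and compared against the lognormal form (Gaussian in s) expected for isothermal supersonic turbulence; a systematic excess at high s indicates a power-law tail. The synthetic mixture of a lognormal body and a Pareto tail is purely an assumption for the example.

```python
# Sketch: intensity PDF in s = ln(I/<I>) compared with a lognormal reference;
# excess probability at high s flags a power-law tail.
import numpy as np

rng = np.random.default_rng(4)
core = rng.lognormal(mean=0.0, sigma=0.6, size=90000)       # lognormal body
tail = (rng.pareto(a=2.0, size=10000) + 1.0) * 3.0          # synthetic power-law excess at high I
intensity = np.concatenate([core, tail])

s = np.log(intensity / intensity.mean())                    # logarithmic intensity
hist, edges = np.histogram(s, bins=80, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

mu, sigma = s.mean(), s.std()                               # lognormal reference fitted to the data
lognormal_model = np.exp(-0.5 * ((centers - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

excess = hist / np.maximum(lognormal_model, 1e-12)          # PDF ratio: data over lognormal
high = centers > mu + 2 * sigma
print("mean PDF excess over lognormal at s > mu + 2 sigma:", excess[high].mean())
```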

