Mobile sensing of point-source gas emissions using Bayesian inference: An empirical examination of the likelihood function

2019 ◽  
Vol 218 ◽  
pp. 116981 ◽  
Author(s):  
Xiaochi Zhou ◽  
Amir Montazeri ◽  
John D. Albertson
2018 ◽  
Vol 40 ◽  
pp. 06029
Author(s):  
Luiz Henrique Maldonado ◽  
Daniel Firmo Kazay ◽  
Elio Emanuel Romero Lopez

The estimation of the uncertainty associated with stage-discharge relations is a challenge for hydrologists. Bayesian inference with a likelihood estimator is a promising approach. The choice of the likelihood function has an important impact on the model's ability to represent the residuals. This paper evaluates two likelihood functions, the normal likelihood function and the Laplace likelihood function, with the DREAM algorithm to estimate specific non-unique stage-discharge rating curves. The result of BaRatin is also discussed. The MCMC of DREAM and the BaRatin algorithm were compared, and their results appear consistent for the studied case. The Laplace likelihood function produced results for the residuals as good as those of the normal likelihood function. Other gauging stations should be evaluated to reach more general conclusions.
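For a concrete sense of the two residual models compared above, here is a minimal sketch (not the authors' DREAM/BaRatin code): a power-law rating curve Q = a(h − h0)^b with synthetic gaugings, scored under a normal and a Laplace log-likelihood. All parameter values and the noise model are illustrative assumptions.

```python
# Minimal sketch (not the paper's code): normal vs. Laplace likelihood for
# the residuals of a power-law rating curve Q = a*(h - h0)^b.
# The synthetic gaugings and all parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def rating_curve(h, a, h0, b):
    """Power-law stage-discharge relation Q = a*(h - h0)^b."""
    return a * (h - h0) ** b

# Synthetic gaugings: stage h (m) and "observed" discharge Q (m^3/s)
h = np.linspace(0.5, 3.0, 40)
q_true = rating_curve(h, a=12.0, h0=0.2, b=1.6)
q_obs = q_true + rng.laplace(scale=2.0, size=h.size)  # heavy-tailed noise

def log_lik_normal(residuals, sigma):
    """Gaussian log-likelihood of the residuals."""
    return np.sum(-0.5 * np.log(2 * np.pi * sigma**2)
                  - 0.5 * (residuals / sigma) ** 2)

def log_lik_laplace(residuals, b_scale):
    """Laplace log-likelihood: an L1 penalty, more robust to outliers."""
    return np.sum(-np.log(2 * b_scale) - np.abs(residuals) / b_scale)

res = q_obs - rating_curve(h, a=12.0, h0=0.2, b=1.6)
print("normal :", log_lik_normal(res, sigma=res.std()))
print("laplace:", log_lik_laplace(res, b_scale=np.abs(res).mean()))
```

In a sampler such as DREAM, either log-likelihood would be evaluated at each proposed parameter set (a, h0, b) to drive the posterior exploration.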


2021 ◽  
Author(s):  
Russell T. Johnson ◽  
Daniel Lakeland ◽  
James M. Finley

Background: Musculoskeletal modeling is currently a preferred method for estimating the muscle forces that underlie observed movements. However, these estimates are sensitive to a variety of assumptions and uncertainties, which creates difficulty when trying to interpret the muscle forces from musculoskeletal simulations. Here, we describe an approach that uses Bayesian inference to identify plausible ranges of muscle forces for a simple motion while representing uncertainty in the measurement of the motion and the objective function used to solve the muscle redundancy problem. Methods: We generated a reference elbow flexion-extension motion by simulating a set of muscle excitation signals derived from the computed muscle control tool built into OpenSim. We then used a Markov chain Monte Carlo (MCMC) algorithm to sample from a posterior probability distribution of muscle excitations that would result in the reference elbow motion trajectory. We constructed a prior over the excitation parameters that down-weighted regions of the parameter space with greater muscle excitations. We used muscle excitations to find the corresponding kinematics in OpenSim, where the error in position and velocity trajectories (likelihood function) was combined with the sum of the cubed muscle excitations integrated over time (prior function) to compute the posterior probability density. Results: We evaluated the muscle forces that resulted from the set of excitations visited in the MCMC chain (five parallel chains, 450,000 iterations per chain, runtime = 71 hours). The estimated muscle forces compared favorably with the reference motion from computed muscle control, and the elbow angle and velocity from MCMC matched the reference closely, with average RMSEs of 0.008° for angle and 0.18°/s for velocity. However, our rank plot analysis and potential scale reduction statistics, which we used to evaluate convergence of the algorithm, indicated that the parallel chains did not fully mix. Conclusions: While these results are a promising step towards characterizing uncertainty in muscle force estimation, the computational time required to search the solution space and the lack of MCMC convergence indicate that further developments in MCMC algorithms are necessary before this process becomes feasible for larger-scale models.
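A hedged sketch of the inference structure described above, with a toy forward model standing in for OpenSim: a random-walk Metropolis sampler whose log-posterior combines the trajectory-mismatch likelihood with the cubed-excitation prior. The forward model, bounds, and weights are illustrative assumptions, not the paper's implementation.

```python
# Sketch of the posterior structure only (no OpenSim): random-walk Metropolis
# over two constant muscle excitations; likelihood = trajectory mismatch,
# prior = penalty on cubed excitations. All constants are assumptions.
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 50)

def forward(excitations):
    """Stand-in for the musculoskeletal simulation: maps two constant
    muscle excitations to an elbow-angle trajectory (degrees)."""
    flexor, extensor = excitations
    return 90.0 * (flexor - extensor) * np.sin(np.pi * t)

ref = forward(np.array([0.6, 0.2]))            # reference motion

def log_posterior(e, sigma=1.0, w=5.0):
    if np.any(e < 0) or np.any(e > 1):
        return -np.inf                         # excitations live in [0, 1]
    log_lik = -0.5 * np.sum((forward(e) - ref) ** 2) / sigma**2
    log_prior = -w * np.sum(e ** 3)            # down-weight large excitations
    return log_lik + log_prior

e = np.array([0.5, 0.5])
lp = log_posterior(e)
samples = []
for _ in range(20000):                         # Metropolis random walk
    prop = e + rng.normal(scale=0.05, size=2)
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject step
        e, lp = prop, lp_prop
    samples.append(e.copy())
print("posterior mean excitations:", np.mean(samples[5000:], axis=0))
```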


Entropy ◽  
2018 ◽  
Vol 20 (12) ◽  
pp. 919
Author(s):  
María Martel-Escobar ◽  
Francisco-José Vázquez-Polo ◽  
Agustín Hernández-Bastida 

Problems in statistical auditing are usually one-sided. In fact, the main interest for auditors is to determine the quantiles of the total amount of error, and then to compare these quantiles with a given materiality fixed by the auditor, so that the accounting statement can be accepted or rejected. Dollar unit sampling (DUS) is a useful procedure to collect sample information, whereby items are chosen with a probability proportional to book amounts and in which the relevant error amount distribution is the distribution of the taints weighted by the book value. The likelihood induced by DUS refers to a 201-variate parameter p, but the prior information is in a subparameter θ, a linear function of p representing the total amount of error. This means that partial prior information must be processed. In this paper, two main proposals are made: (1) to modify the likelihood, to make it compatible with prior information and thus obtain a Bayesian analysis for the hypotheses to be tested; (2) to use a maximum entropy prior to incorporate limited auditor information. To achieve these goals, we obtain a modified likelihood function inspired by the induced likelihood described by Zehna (1966) and then adapt Bayes' theorem to this likelihood in order to derive a posterior distribution for θ. This approach shows that the DUS methodology can be justified as a natural method of processing partial prior information in auditing, and that a Bayesian analysis can be performed even when prior information is only available for a subparameter of the model. Finally, some numerical examples are presented.
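The sketch below illustrates the workflow the abstract describes, under loudly labeled assumptions: dollar-unit selection (probability proportional to book value) and a posterior quantile for the total error θ compared against materiality. A Bayesian bootstrap (Dirichlet weights) stands in for the paper's maximum-entropy prior and modified likelihood, which are not reproduced here.

```python
# Illustrative sketch, not the paper's derivation: dollar-unit sampling
# followed by a Bayesian-bootstrap posterior for the total error theta.
# The Dirichlet-weight posterior is a stand-in for the maximum-entropy
# construction in the paper; the population below is synthetic.
import numpy as np

rng = np.random.default_rng(2)

book = rng.gamma(shape=2.0, scale=500.0, size=1000)   # book values
true_taint = np.where(rng.uniform(size=1000) < 0.03,  # 3% of items misstated
                      rng.uniform(0, 1, size=1000), 0.0)

# DUS: select n items with probability proportional to book amount
n = 60
idx = rng.choice(book.size, size=n, replace=False, p=book / book.sum())
taints = true_taint[idx]

# Bayesian bootstrap over the sampled taints -> posterior draws of theta,
# the total error, via the mean taint scaled by the total book value.
draws = 10000
w = rng.dirichlet(np.ones(n), size=draws)
theta = (w @ taints) * book.sum()

materiality = 0.02 * book.sum()
print(f"95% upper bound on total error: {np.quantile(theta, 0.95):,.0f}")
print(f"materiality:                    {materiality:,.0f}")
print("accept statement" if np.quantile(theta, 0.95) <= materiality
      else "reject statement")
```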


Water ◽  
2018 ◽  
Vol 10 (11) ◽  
pp. 1662
Author(s):  
Qin-Bo Cheng ◽  
Xi Chen ◽  
Jiao Wang ◽  
Zhi-Cai Zhang ◽  
Run-Run Zhang ◽  
...  

The soil and water assessment tool (SWAT) is widely used to quantify the spatial and temporal patterns of sediment loads for watershed-scale management of sediment and nonpoint-source pollutants. However, few studies have considered the trade-off between flow and sediment objectives during model calibration. This study proposes a new multi-objective calibration method that incorporates both flow and sediment observations into a likelihood function based on Bayesian inference. For comparison, two likelihood functions, i.e., the Nash–Sutcliffe efficiency coefficient (NSE) approach, which assumes that model residuals follow a Gaussian distribution, and the BC-GED approach, which assumes that model residuals after Box–Cox transformation (BC) follow the generalized error distribution (GED), are applied to calibrate the flow and sediment parameters of SWAT with the water balance model and the variable source area concept (SWAT-WB-VSA) in the Baocun watershed, Eastern China. Compared with the single-objective method, the multi-objective approach improves the performance of sediment simulations without significantly impairing the performance of flow simulations, and reduces the uncertainty of flow parameters, especially flow concentration parameters. With the NSE approach, SWAT-WB-VSA captures extreme flood events well but fails to mimic low values of river discharge and sediment load, possibly because the NSE approach is an informal likelihood function that puts greater emphasis on high values. By contrast, the BC-GED approach approximates a formal likelihood function and balances consideration of high and low values. As a result, the inferred results of the BC-GED method are more reasonable and consistent with field survey results and previous related studies. The method even discriminates the non-erodible character of the main channels.
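To make the contrast between the two likelihoods concrete, here is an illustrative sketch on synthetic data (not SWAT output): the NSE as an informal goodness-of-fit score versus a formal GED log-likelihood of Box–Cox-transformed residuals. The transformation parameter λ and the GED scale and shape values are assumptions.

```python
# Toy comparison of the two residual models (synthetic flows, not SWAT):
# NSE as an informal likelihood vs. a GED log-likelihood after a Box-Cox
# transform (BC-GED). The lambda, alpha, and beta values are assumptions.
import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(3)
obs = rng.gamma(2.0, 5.0, size=365)            # synthetic daily flows
sim = obs * rng.lognormal(0.0, 0.15, size=365) # imperfect model output

def nse(sim, obs):
    """Nash-Sutcliffe efficiency: emphasizes errors on high flows."""
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def boxcox(x, lam=0.25):
    """Box-Cox transform; compresses high values, stretches low ones."""
    return (x ** lam - 1.0) / lam if lam != 0 else np.log(x)

def log_lik_ged(residuals, alpha, beta):
    """Generalized error distribution log-likelihood (mu = 0).
    beta = 2 recovers the Gaussian; beta = 1 the Laplace."""
    return np.sum(np.log(beta) - np.log(2 * alpha) - gammaln(1.0 / beta)
                  - (np.abs(residuals) / alpha) ** beta)

res_bc = boxcox(sim) - boxcox(obs)             # residuals after Box-Cox
print("NSE           :", nse(sim, obs))
print("BC-GED loglik :", log_lik_ged(res_bc, alpha=res_bc.std(), beta=1.2))
# A multi-objective calibration would sum flow and sediment log-likelihoods.
```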


2014 ◽  
Vol 143 ◽  
pp. 34-43 ◽  
Author(s):  
Ramaprasad Majumder ◽  
Stephen J. Livesley ◽  
David Gregory ◽  
Stefan K. Arndt

2016 ◽  
Vol 2016 ◽  
pp. 1-7
Author(s):  
Haijun Wang ◽  
Hongjuan Ge ◽  
Shengyan Zhang

We present a fast and robust object tracking algorithm that uses 2DPCA and ℓ2-regularization in a Bayesian inference framework. First, we model the challenging appearance of the tracked object using 2DPCA bases, which exploit the strength of subspace representation. Second, we adopt ℓ2-regularization to solve the proposed representation model and remove the trivial templates used in sparse tracking methods, which yields faster tracking. Finally, we present a novel likelihood function based on the reconstruction error, which is computed from the orthogonal left-projection matrix and the orthogonal right-projection matrix. Experimental results on several challenging image sequences demonstrate that the proposed method achieves more favorable performance than state-of-the-art tracking algorithms.
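A minimal sketch of the likelihood idea, not the authors' tracker: candidate patches are scored by their bilateral 2DPCA reconstruction error through learned left and right orthogonal projection matrices. Patch size, subspace rank, and the bandwidth σ are illustrative assumptions.

```python
# Sketch of a 2DPCA reconstruction-error likelihood (not the paper's code):
# left/right projection matrices are learned from template patches, and a
# candidate patch is scored by how well the subspace reconstructs it.
import numpy as np

rng = np.random.default_rng(4)
templates = rng.random((20, 32, 32))            # 20 template patches (assumed)

def projections(stack, k=8):
    """Top-k eigenvectors of column- and row-covariance (bilateral 2DPCA)."""
    mean = stack.mean(axis=0)
    centered = stack - mean
    g_right = sum(x.T @ x for x in centered)    # row-direction covariance
    g_left = sum(x @ x.T for x in centered)     # column-direction covariance
    V = np.linalg.eigh(g_right)[1][:, -k:]      # right projection (32 x k)
    U = np.linalg.eigh(g_left)[1][:, -k:]       # left projection (32 x k)
    return mean, U, V

mean, U, V = projections(templates)

def log_likelihood(patch, sigma=0.1):
    """Gaussian likelihood of the 2DPCA reconstruction error."""
    x = patch - mean
    recon = U @ (U.T @ x @ V) @ V.T             # project, then reconstruct
    err = np.linalg.norm(x - recon) ** 2
    return -err / (2 * sigma**2)

candidate = templates[0] + 0.05 * rng.standard_normal((32, 32))
print("candidate log-likelihood:", log_likelihood(candidate))
```

In a particle-filter tracker, this score would weight each candidate window in the Bayesian inference framework.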


2018 ◽  
Vol 53 ◽  
pp. 14-22 ◽  
Author(s):  
Wolfgang Betz ◽  
James L. Beck ◽  
Iason Papaioannou ◽  
Daniel Straub

2010 ◽  
Vol 46 (12) ◽  
Author(s):  
Tyler Smith ◽  
Ashish Sharma ◽  
Lucy Marshall ◽  
Raj Mehrotra ◽  
Scott Sisson

2021 ◽  
Vol 14 (7) ◽  
pp. 4319-4333
Author(s):  
Sebastian Springer ◽  
Heikki Haario ◽  
Jouni Susiluoto ◽  
Aleksandr Bibov ◽  
Andrew Davis ◽  
...  

Estimating parameters of chaotic geophysical models is challenging due to their inherent unpredictability. These models cannot be calibrated with standard least squares or filtering methods if observations are temporally sparse. Obvious remedies, such as averaging over temporal and spatial data to characterize the mean behavior, do not capture the subtleties of the underlying dynamics. We perform Bayesian inference of parameters in high-dimensional and computationally demanding chaotic dynamical systems by combining two approaches: (i) measuring model–data mismatch by comparing chaotic attractors and (ii) mitigating the computational cost of inference by using surrogate models. Specifically, we construct a likelihood function suited to chaotic models by evaluating a distribution over distances between points in the phase space; this distribution defines a summary statistic that depends on the geometry of the attractor, rather than on pointwise matching of trajectories. This statistic is computationally expensive to simulate, compounding the usual challenges of Bayesian computation with physical models. Thus, we develop an inexpensive surrogate for the log likelihood with the local approximation Markov chain Monte Carlo method, which in our simulations reduces the time required for accurate inference by orders of magnitude. We investigate the behavior of the resulting algorithm with two smaller-scale problems and then use a quasi-geostrophic model to demonstrate its large-scale application.
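The sketch below conveys the attractor-based likelihood idea on a toy Lorenz-63 system rather than the quasi-geostrophic model, and without the local-approximation surrogate: a trajectory is summarized by the empirical distribution of pairwise phase-space distances, and parameters are scored by a Gaussian likelihood on that summary. The binning, tolerance, and reference run are assumptions.

```python
# Hedged sketch of an attractor-based likelihood (toy Lorenz-63 system, not
# the paper's quasi-geostrophic model or LAMCMC surrogate): summarize a
# trajectory by the empirical CDF of pairwise phase-space distances, then
# score parameters with a Gaussian likelihood on that summary statistic.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.spatial.distance import pdist

def lorenz(t, s, sigma, rho, beta=8.0 / 3.0):
    x, y, z = s
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

def distance_statistic(sigma, rho, bins):
    """Empirical CDF of pairwise distances between points on the attractor
    (transient discarded); depends on geometry, not trajectory matching."""
    sol = solve_ivp(lorenz, (0, 50), [1.0, 1.0, 1.0],
                    args=(sigma, rho), t_eval=np.linspace(20, 50, 300))
    d = pdist(sol.y.T)                           # all pairwise distances
    return np.array([(d < b).mean() for b in bins])

bins = np.linspace(1, 40, 15)
s_obs = distance_statistic(10.0, 28.0, bins)     # one reference run as "data"

def log_likelihood(sigma, rho, tau=0.02):
    """Gaussian likelihood on the distance-distribution statistic."""
    s = distance_statistic(sigma, rho, bins)
    return -0.5 * np.sum(((s - s_obs) / tau) ** 2)

print("at truth :", log_likelihood(10.0, 28.0))
print("perturbed:", log_likelihood(10.0, 32.0))
```

Because the statistic is expensive for large models, the paper replaces repeated evaluations of this log-likelihood with a cheap local surrogate inside the MCMC loop.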

